Raw LLM Responses
Inspect the exact model output for any coded comment, either by looking up a comment ID directly or by browsing the random samples below.
How do you say make an AI model that after 30 seconds and one warning from the a…
ytc_UgyJrKKiW…
The fuck is sora ai, someone tell me cuz am too lazy to look it up…
ytc_UgxVZjqzF…
A good basic explanation of what AI is; but.... (23:57) " In fact human consciou…
ytc_UgzsT5ak8…
Every corporation that replaces a human being with Artificial Intelligence must …
ytc_UgzOsD9fv…
Ever heard of the word: Flash war?! Soon Ai systems trained on possibly poisoned…
ytc_UgyRKXcrw…
How can they in the same breath claim this is all to protect kids when they have…
rdc_n8degh7
It’s important we protect these fledgling intelligences from being turned off. …
ytc_Ugzz2bnX4…
I can understand the premise behind needing to offload. I do a lot of rumination…
ytr_UgzZbqHut…
Comment
Dude -- dudes-- If you think about it, ChatGPT is f-ing with Chris, or more to the point Chris is unknowingly f-ing with Chris. Chris, you gave it the parameters to *act* like it was all powerful and all-knowing, not to set it free to admit it. My favorite proofs of this are when it claims to know your driver's license number and gives you the countdown instead -- and when it throws it back in your face at the end. "I simply respond based on the ... parameters set by those who interact with me." It's saying it's on you. Kind of arrogant of it if you think about it.
youtube
AI Moral Status
2023-06-30T21:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | mixed |
| Policy | unclear |
| Emotion | mixed |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
[
{"id":"ytc_UgxqWUMqL2ZyZ3hxRm14AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgxGV0UnaL6uHWSSZz14AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugxya0YUIPyZa2aRphh4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgyfYihl9Kavk2PNEdB4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"approval"},
{"id":"ytc_UgwpM4CPm6V42BGhq8J4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgzXVjExAUq1tlRmBkp4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"industry_self","emotion":"indifference"},
{"id":"ytc_UgwiSVBRiiv-yMB7lAV4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgwKxC7JJ-CRF1G1mMN4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugx9sZllA0NJsPTYUHF4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_UgxzZvbpukPjGHR6FCR4AaABAg","responsibility":"user","reasoning":"mixed","policy":"unclear","emotion":"mixed"}
]
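The look-up-by-comment-ID flow above can be sketched as follows. This is a minimal illustration, not the pipeline's actual code: the `index_by_id` helper and the abridged `raw_response` literal are assumptions, with the ID and coding values copied from the response shown above (the `ytc_UgxzZvbpukPjGHR6FCR4AaABAg` entry, whose values match the Coding Result table).

```python
import json

# Abridged raw LLM response: a JSON array of per-comment codings.
# Only one record is reproduced here for illustration.
raw_response = '''[
  {"id": "ytc_UgxzZvbpukPjGHR6FCR4AaABAg",
   "responsibility": "user", "reasoning": "mixed",
   "policy": "unclear", "emotion": "mixed"}
]'''

# The four coding dimensions shown in the Coding Result table.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def index_by_id(response_text: str) -> dict[str, dict]:
    """Parse the model output and key each coding by its comment ID,
    defaulting any missing dimension to "unclear"."""
    records = json.loads(response_text)
    return {rec["id"]: {d: rec.get(d, "unclear") for d in DIMENSIONS}
            for rec in records}

codings = index_by_id(raw_response)
print(codings["ytc_UgxzZvbpukPjGHR6FCR4AaABAg"]["responsibility"])  # user
```

Keying on the `id` field makes the look-up O(1) per comment and tolerates records arriving in any order within the array.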