Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment

> I don't understand the point of this. Obviously the AI model does not have actual feelings so he can't "feel" sorry, he can't feel anything. Saying he/it is sorry is not a lie, it is simply a pre-programmed mimicking of human behavior that was purposely designed by human developers

Source: youtube · AI Moral Status · 2024-08-15T18:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | deontological |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response

```json
[
  {"id":"ytc_Ugw5ZxMeRP7qFRyDOL14AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgzAvcY8vb5NsT_y3Wh4AaABAg","responsibility":"developer","reasoning":"mixed","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgyzJD-dwCN0zvLe8SR4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgwZ-LP7skGgKk_iDSd4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"amusement"},
  {"id":"ytc_UgxwWMsYQhIPMAlRS_Z4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgztYiLtswjPBfb0gcJ4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgzGmInEqS0wJ6EwU354AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"fear"},
  {"id":"ytc_UgyL4an_pKq591U0lMd4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgxEq1iXt6eAc4djf5R4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"approval"},
  {"id":"ytc_UgxBtquHULDHSHx9_j54AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"indifference"}
]
```
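A raw response like the one above is a JSON array with one object per coded comment. Before storing codings, it is worth parsing the response and checking each dimension against the codebook. The sketch below shows one way to do that in Python; the allowed value sets are inferred only from the examples shown on this page (the real codebook may define more categories), and `parse_codings` is a hypothetical helper, not part of the actual pipeline.

```python
import json

# Allowed values per coding dimension. These sets are assumptions
# reconstructed from the sample output above, not the full codebook.
ALLOWED = {
    "responsibility": {"ai_itself", "developer", "company", "user", "none"},
    "reasoning": {"deontological", "consequentialist", "mixed", "unclear"},
    "policy": {"none", "liability"},
    "emotion": {"outrage", "indifference", "amusement", "fear", "approval"},
}

def parse_codings(raw: str) -> list[dict]:
    """Parse a raw LLM response and reject rows with out-of-codebook values."""
    rows = json.loads(raw)
    for row in rows:
        for dim, allowed in ALLOWED.items():
            if row.get(dim) not in allowed:
                raise ValueError(
                    f"{row.get('id')}: unexpected {dim!r} value {row.get(dim)!r}"
                )
    return rows

# Minimal usage with a one-row response (hypothetical comment ID).
raw = (
    '[{"id":"ytc_example","responsibility":"developer",'
    '"reasoning":"deontological","policy":"none","emotion":"indifference"}]'
)
rows = parse_codings(raw)
print(rows[0]["responsibility"])  # → developer
```

Validating at parse time means a model that drifts outside the codebook (for example, inventing a new emotion label) fails loudly instead of silently contaminating the coded dataset.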