Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment

> You can trick the AI if you don't ask direct questions. Most of the time the AI will reply with information not available which it's supposed to do. However if the question is vague and complex the AI will give you a long list of possible answers. So the AI isn't really at fault.

Platform: youtube · Topic: AI Responsibility · Posted: 2026-04-22T05:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response

```json
[
  {"id": "ytc_UgxVtTScjgJLD5NR6UZ4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgxAvby9-AQLNih61nh4AaABAg", "responsibility": "user", "reasoning": "virtue", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgznaRpQqo1jP7kl6x54AaABAg", "responsibility": "company", "reasoning": "deontological", "policy": "liability", "emotion": "outrage"},
  {"id": "ytc_UgxYt46w4_Z0NR4tmvF4AaABAg", "responsibility": "user", "reasoning": "virtue", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_UgxLzJdYJkmKxFapcw94AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "approval"},
  {"id": "ytc_Ugzgz0LvhMzWEDa5NNV4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytc_Ugwg8lViO60GAsI6AEh4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_Ugz5ARA8OD_JjIu-R0V4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgzR_gpspjd4srClgPF4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_UgxY4GPvX_2_IHcqJyl4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "resignation"}
]
```
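The raw response is a JSON array of per-comment coding records. A minimal sketch of parsing and sanity-checking such a batch in Python, assuming the schema shown above (`id`, `responsibility`, `reasoning`, `policy`, `emotion`); the allowed values below are inferred only from the responses on this page, and the real codebook may define more categories:

```python
import json

# Allowed category values, inferred from the responses shown above (assumption:
# the actual codebook may include additional categories).
DIMENSIONS = {
    "responsibility": {"ai_itself", "user", "company", "none"},
    "reasoning": {"consequentialist", "deontological", "virtue"},
    "policy": {"none", "liability"},
    "emotion": {"indifference", "outrage", "approval", "fear", "mixed", "resignation"},
}

def parse_batch(raw: str) -> list[dict]:
    """Parse a raw LLM response into coding records, rejecting unknown values."""
    records = json.loads(raw)
    for rec in records:
        for dim, allowed in DIMENSIONS.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{rec.get('id')}: unexpected {dim}={rec.get(dim)!r}")
    return records

# Example with a hypothetical record in the same shape as the batch above.
raw = '[{"id": "ytc_example", "responsibility": "user", "reasoning": "virtue", "policy": "none", "emotion": "outrage"}]'
batch = parse_batch(raw)
print(batch[0]["responsibility"])  # user
```

Validating each dimension against a closed set catches the most common failure mode of LLM coders (an off-codebook label) before the record is stored.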