Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- Josh is a rage baiter and trying to make a living off of scaring you! These AI h… (ytc_UgxBVjyRX…)
- Why are these people all getting together to talk about the TRUTH anout AI and o… (ytc_Ugy3kTu5K…)
- Computers like Watson are what I think are going to be the first artificial cons… (ytc_Ugj_FnOm6…)
- The truth is exaggerated. When AI creations begin telling the truth, we won’t kn… (ytc_Ugyy-olZO…)
- The reason AI won't be controlled is because it's fundamentally an arms race bet… (ytc_UgywErIZ7…)
- I can see why Google fired him,, guy is nuts... AI does not have feelings, this … (ytc_UgwuR6D8n…)
- for a lot of people, this video probably explained why people dont like ai art, … (ytc_Ugz1dUic_…)
- "I don't know anyone who went from I worry about AI safety to like there is noth… (ytc_Ugwywr0sj…)
Comment
Hey I’m watching your podcast on AI and you’re discussing nuclear energy compared to the dangers of AI. AGI actually… It would seem reasonable to conclude AGI development should be viewed like nuclear weapons and there should be a global agreement not to develop AGI… There should be some sort of global category for potential destructive weapons that could wipe out humanity… We don’t currently have that ATM
youtube · AI Responsibility · 2026-02-10T06:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | government |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
[
{"id":"ytc_Ugx0t1nM83zwjYqZ0rp4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugx5AaUWvcJeMGOz7nF4AaABAg","responsibility":"company","reasoning":"deontological","policy":"unclear","emotion":"outrage"},
{"id":"ytc_Ugwv5Dj9WJYrELFo2Ch4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgwFLHLS2O4kP-Au_u94AaABAg","responsibility":"developer","reasoning":"virtue","policy":"unclear","emotion":"outrage"},
{"id":"ytc_Ugw4WQfJGJTRRREwQod4AaABAg","responsibility":"user","reasoning":"deontological","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgxwArwzSwrPklRNlq14AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"resignation"},
{"id":"ytc_Ugywvw3Glwft3qiie7t4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_Ugy0pxeOiOttZ6IdGFV4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgyRvSTkwaeM1mjug8d4AaABAg","responsibility":"unclear","reasoning":"virtue","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgyV3C0l0aAX8KjSMFV4AaABAg","responsibility":"company","reasoning":"deontological","policy":"unclear","emotion":"outrage"}
]
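The raw response above is a flat JSON array, one object per coded comment, keyed by `id` with one field per coding dimension. A minimal sketch of how a lookup-by-ID over that array might work (the `find_coding` helper is illustrative, not part of the tool; two rows from the response above are reproduced verbatim for brevity):

```python
import json

# Raw LLM batch output, as shown above (truncated to two rows).
raw_response = """[
{"id":"ytc_Ugy0pxeOiOttZ6IdGFV4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugx0t1nM83zwjYqZ0rp4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"}
]"""

def find_coding(raw, comment_id):
    """Parse the raw LLM batch output and return the coding row for one comment ID."""
    for row in json.loads(raw):
        if row.get("id") == comment_id:
            return row
    return None  # ID not present in this batch
```

Looking up `ytc_Ugy0pxeOiOttZ6IdGFV4AaABAg` returns the government / consequentialist / regulate / fear row, matching the Coding Result table above.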