Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect:

- `ytc_UgzSRt2P2…` · Everyone in that company who signed off on replacing those 700 devs with AI shou…
- `ytc_UgyG8bIvE…` · We need to drop ai for a few decades and focus on our humanity and arts…
- `ytc_UgxPG8vB1…` · 😂😂 its almost like people been talking about AI taking over a lot of jobs and pe…
- `ytc_UgzMpRW0H…` · That why i don’t buy smart home ai devices, smart keylocks, wifi bulbs, smart va…
- `ytc_UgyQQMMQq…` · It depends on who is feeding the information to the intelligence and what inform…
- `ytc_UgwRDdbSg…` · "The AI can do better In seconds what might take you hours" Mod doesn't understa…
- `ytc_Ugyr4iusL…` · you are also gonna replaced by ai tutors very soon just wait and watch. And the …
- `rdc_nipn3pq` · I'm more wondering why people aren't more afraid of the infinity dollars the mil…
Comment

> Something I find interesting is that there is this assumption baked in that we need to make AI safe in the first place. I wonder why there is such a belief that a superintelligent AI would be motivated to do harmful things that we need to protect ourselves against. That feels like a human projection onto a machine.

youtube · AI Governance · 2026-03-09T04:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | unclear |
| Reasoning | deontological |
| Policy | none |
| Emotion | mixed |
| Coded at | 2026-04-26T23:09:12.988011 |
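A coded record like the one above can be sanity-checked against the label set before it is stored. A minimal sketch, assuming only the labels observed in this export (the full codebook may define more values per dimension):

```python
# Label sets observed in this export; an assumption, not the full codebook.
ALLOWED = {
    "responsibility": {"none", "ai_itself", "unclear"},
    "reasoning": {"unclear", "consequentialist", "deontological", "mixed"},
    "policy": {"none", "ban"},
    "emotion": {"resignation", "fear", "approval", "mixed", "indifference", "outrage"},
}

def validate(record: dict) -> list:
    """Return (field, value) pairs that fall outside the allowed labels."""
    return [(field, record.get(field))
            for field, allowed in ALLOWED.items()
            if record.get(field) not in allowed]

# The record shown in the table above: all four dimensions are in range.
coded = {"responsibility": "unclear", "reasoning": "deontological",
         "policy": "none", "emotion": "mixed"}
print(validate(coded))  # -> []
```

An empty list means the record is consistent with the observed schema; any returned pairs flag dimensions the model filled with an unexpected or missing value.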
Raw LLM Response
```json
[
  {"id":"ytc_Ugw-ZL4FE7BBRMgngHZ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"resignation"},
  {"id":"ytc_Ugx-094ipLcEr60KnKN4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgzGjf89ZOyph-cMWzN4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"none","emotion":"fear"},
  {"id":"ytc_UgxfgbL3xXtIZdZ-eu14AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgwNKsyfkMVH9UnQGqV4AaABAg","responsibility":"unclear","reasoning":"deontological","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgyrzL6GgCNeuBK-VwV4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwWYOR2WR05hpVUFgl4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
  {"id":"ytc_Ugw-6jxFficmec5OHzF4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
  {"id":"ytc_UgyClnt3akH8DPkfHMh4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"fear"},
  {"id":"ytc_UgxvYr61US4qcF2UwRl4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"}
]
```
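The lookup-by-ID view above can be reproduced offline from a raw response. A minimal sketch, assuming each raw model output is a JSON array of records with the `id`, `responsibility`, `reasoning`, `policy`, and `emotion` fields shown (the two records below are copied from the batch above for illustration):

```python
import json

# A trimmed-down raw batch response, in the format shown above.
raw_response = """
[
  {"id": "ytc_UgwNKsyfkMVH9UnQGqV4AaABAg", "responsibility": "unclear",
   "reasoning": "deontological", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_UgwWYOR2WR05hpVUFgl4AaABAg", "responsibility": "ai_itself",
   "reasoning": "consequentialist", "policy": "ban", "emotion": "outrage"}
]
"""

def index_by_id(response_text: str) -> dict:
    """Parse a raw batch response and build a comment-ID -> codes lookup."""
    records = json.loads(response_text)
    return {rec["id"]: {k: v for k, v in rec.items() if k != "id"}
            for rec in records}

codes = index_by_id(raw_response)
print(codes["ytc_UgwNKsyfkMVH9UnQGqV4AaABAg"]["reasoning"])  # -> deontological
```

Indexing once per batch makes the "look up by comment ID" operation a constant-time dictionary access rather than a scan over every raw response.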