Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples (click a comment to inspect its coding)
Once I was doing vibe coding and the ai was breaking. I asked it why you are bre…
ytc_Ugyk1a8UC…
Mechanical engineer switch to journalism? Compared to someone like Alex Jones.…
ytc_UgwHvbQp3…
He is right; computers will be always limited for what is described as the 'Chin…
ytc_Ugy-aYtWO…
AI bros are just lazy twats that cant be bothered to devote any amount of time i…
ytc_UgwKc3ylP…
Who cares? What's the big deal? It's not like people can figure out who a specif…
ytc_UgxrhBvPg…
AI must be eliminated for humans to survive- I hate this sh.it- come on, out of …
ytc_UgxPdXXk_…
Based on my experiences with AI in relation to my IT work, it is pretty good at …
ytc_UgzhvuHiu…
This clip is edited, he was talking about AI getting too smart and to put a chip…
ytc_UgyVxOy6H…
Comment
2001 A Space Odyssey is typical prediction of a rogue AI that makes decisions and eliminate humans by any means necessary, but when AJ told us the story of how AI can end the human world, it got to me another film I remember: Fritz Lang's Metropolis. When AJ mentioned Sydney, I think of Evil Maria... ruling mind-draining workers into chaos. This is scary stuff!
youtube
AI Governance
2023-07-07T02:3…
♥ 3
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_UgwzsvT0R8QaLUyNUIx4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_Ugxtb04KfZMnmnwKn4x4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
{"id":"ytc_Ugx5zeaDZkjk9MLbG_54AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzOy-FBaa-ajhMXoMZ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugykacu35_BkzEzD8hN4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"mixed"},
{"id":"ytc_UgyEH-u-qtlab019hkB4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugxl48isDCChCc0PY594AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"},
{"id":"ytc_UgzKB-h2aVWk7JxR03x4AaABAg","responsibility":"government","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzR0GPh_1T1t2ExF0h4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgzX0cewoR8nHZWY4Td4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"}
]
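The raw response is a JSON array in which each object carries a comment ID plus one value per coding dimension (responsibility, reasoning, policy, emotion). A minimal sketch of how such a batch response can be parsed and indexed for the look-up-by-ID view above; the field names follow the JSON shown, but the parsing code itself is illustrative, not the tool's actual implementation:

```python
import json

# Example batch response from the coder model, shaped like the raw
# output above (two rows copied from it for illustration).
raw_response = """
[
  {"id": "ytc_UgzOy-FBaa-ajhMXoMZ4AaABAg", "responsibility": "ai_itself",
   "reasoning": "consequentialist", "policy": "unclear", "emotion": "fear"},
  {"id": "ytc_Ugykacu35_BkzEzD8hN4AaABAg", "responsibility": "company",
   "reasoning": "deontological", "policy": "regulate", "emotion": "mixed"}
]
"""

# Index the rows by comment ID so any single coded comment can be
# looked up directly, as in the "Look up by comment ID" box.
codes = {row["id"]: row for row in json.loads(raw_response)}

row = codes["ytc_UgzOy-FBaa-ajhMXoMZ4AaABAg"]
print(row["responsibility"], row["emotion"])  # ai_itself fear
```

The row retrieved here matches the Coding Result table shown for the Metropolis comment: responsibility `ai_itself`, reasoning `consequentialist`, policy `unclear`, emotion `fear`.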