Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
What a world we live in when Ted Kaczynski is proven to be right. I'll be honest…
ytc_Ugzh8CVJC…
I wonder if AI robots would take the heat in glass factories. I remember those g…
ytc_UgyUqWC9Q…
Truly disapointing you all disreagrd the reality of Lethal Autonomous Weapons, i…
ytc_UgyFKrCrU…
does anyone know any signs in spotting ai art?? I recently hosted an art competi…
ytc_UgwAnthD3…
Lol this hype reminds me of people in the 1880s propaganda about electricity in …
ytc_UgxbXw8_p…
Companies are already talking about implementing AI in HR. We're gonna get hired…
ytc_UgwNfQ8k0…
watching the morons building and promoting AI makes me sick to my stomach They a…
ytc_UgwTv1B9s…
AI for sure, but even if they were legit paintings, how are they able to sell co…
ytc_UgzvBlSVY…
Comment
When we discovered climate change we had people screaming worst case scenarios as well.
We will be able to turn an AI off. Anything that requires power constantly, can be turned off.
Human's are way too power hungry to allow access to every system.
Politics is also an important factor, when unemployment spikes, people will demand change and governments will ban AI workers.
There's the disdain for AI already building in the public due to AI slop.
A super Intelligence will require super processors, those arent eternal. They require maintenance. Once memory units and processors fail, the AI will break down.
youtube · AI Governance · 2025-10-09T07:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | government |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | mixed |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_UgyRNzClKEhbB1pK99B4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgzF2Y3ugnbV8EFz41R4AaABAg","responsibility":"none","reasoning":"deontological","policy":"industry_self","emotion":"indifference"},
{"id":"ytc_UgyaTULpdOuR6e9_f3x4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgwEQBJQZ5o-epBJB1F4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"mixed"},
{"id":"ytc_UgxjP7A_QFfiF-iNtx14AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_UgzR2l4OZ_-BgZ22In54AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"mixed"},
{"id":"ytc_UgyNSh4XfCiKAx1pfox4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgxsvIjn7nEemSsClD14AaABAg","responsibility":"company","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwaNfLwqVJ54vAeSZt4AaABAg","responsibility":"developer","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgxfAlFsbmd6xiVEz3d4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"resignation"}
]
```
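The raw response above is a JSON array of coded records, one per comment, with four dimensions (`responsibility`, `reasoning`, `policy`, `emotion`). A minimal sketch of how such a batch could be parsed and validated before storing it is shown below. The allowed values are only those observed in this sample; the full codebook may define more, and `validate_batch` is a hypothetical helper, not part of any tool shown here.

```python
import json

# Values observed in this sample batch (assumption: the real codebook
# may define additional values for each dimension).
ALLOWED = {
    "responsibility": {"government", "company", "developer", "ai_itself", "none"},
    "reasoning": {"consequentialist", "deontological", "virtue", "unclear"},
    "policy": {"regulate", "industry_self", "none", "unclear"},
    "emotion": {"fear", "approval", "indifference", "resignation", "mixed"},
}

def validate_batch(raw: str) -> list[dict]:
    """Parse a raw LLM response and reject records with missing or unknown codes."""
    records = json.loads(raw)
    for rec in records:
        # Comment IDs in this dataset all carry the ytc_ prefix.
        if not rec.get("id", "").startswith("ytc_"):
            raise ValueError(f"bad comment id: {rec!r}")
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{rec['id']}: unknown {dim} value {rec.get(dim)!r}")
    return records

# One record from the batch above, reproduced verbatim.
raw = ('[{"id":"ytc_UgzR2l4OZ_-BgZ22In54AaABAg","responsibility":"government",'
       '"reasoning":"consequentialist","policy":"regulate","emotion":"mixed"}]')
batch = validate_batch(raw)
print(len(batch))  # 1
```

A check like this catches the two common failure modes of structured LLM output: malformed JSON (raised by `json.loads`) and values that drift outside the codebook (raised by the dimension loop).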