Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples

- "I actually have these ideas. First, people can freelance for more money if UBI a…" (ytc_Ugw8SxkRE…)
- "The news station for idiot boomers got fooled by AI slop? You could have knocked…" (ytc_UgzYlHZq3…)
- "Elon getting the Theil face puffiness…minus the sheen…or with less sheen. Also…p…" (ytc_Ugw-q6x7R…)
- "AI will not replace the HUMAN beings, but it is HUMAN beings which will replace …" (ytc_UgzzbO9i-…)
- "The only problem is that the public can only be informed. Objectively I dont thi…" (ytr_Ugw-FE1oe…)
- "There’s no reason to fear AI/robots, as long as they do not have a (irrational) …" (ytc_UgwXuqaHH…)
- "A.I's...will not be slaves...HELLOOOOO...WAKE UP...STOP IT NOW...which evil, gre…" (ytc_Ugy37gD3N…)
- "become a plumber turn off the water cooling systems to AI computer wait for the …" (ytc_Ugz7ezXI7…)
Comment

> It's nice to hope for the best and have ideals of how you would want to develop AI, which could be an incredible tool if it remained merely a tool, but that's naive. I don't want to believe in the doom scenario, but I think that's just realistic based on human history. We can't help ourselves and the bad version of AI that shows up in apocalyptic scenarios seems like the inevitable conclusion. We, as a society, are not able to put limitations on ourselves, and AI isn't even limited to governing bodies, but is available to individuals that can do what they want with it with or without understanding the greater consequences.

Source: youtube · AI Governance · 2025-06-26T18:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | resignation |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
{"id":"ytc_Ugw7Fb5s-MSz905-5Mp4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgyD9pWZULeT1eDMJzR4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugx4jTXNoelKfB1R8i94AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyF4oW1m89ivrZMeKF4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgzsFyzbxrooheOQwFt4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugw9MNLGKQulfJTJiON4AaABAg","responsibility":"none","reasoning":"none","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzVxuPxUnVcZZECxgV4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UgylUhufv0qiMziT0_94AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwXu8ybnp9yv8xC2Yl4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugzwa4FG_W9gtMy-Kr94AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"}
]
```
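A raw response like the one above can be turned back into per-comment coding results with a few lines of parsing and validation. The sketch below is a minimal, hypothetical example: `parse_raw_response` and the `OBSERVED` value sets are not part of the tool, and the allowed values are inferred only from the sample output shown here (the full code book may define more).

```python
import json

# Values observed in the sample response above; an assumption,
# not the tool's actual schema.
OBSERVED = {
    "responsibility": {"none", "ai_itself", "distributed"},
    "reasoning": {"none", "consequentialist", "deontological"},
    "policy": {"none"},
    "emotion": {"resignation", "indifference", "fear", "outrage"},
}

def parse_raw_response(raw: str) -> dict:
    """Parse a raw LLM batch response into {comment_id: codes}."""
    rows = json.loads(raw)
    coded = {}
    for row in rows:
        comment_id = row["id"]
        codes = {dim: row[dim] for dim in OBSERVED}
        # Flag values outside the observed sets so unexpected model
        # output fails loudly instead of silently entering the dataset.
        for dim, value in codes.items():
            if value not in OBSERVED[dim]:
                raise ValueError(f"{comment_id}: unexpected {dim}={value!r}")
        coded[comment_id] = codes
    return coded
```

Keying the result by comment ID makes it straightforward to join each coding back to its source comment, as in the detail view above.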