Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- “Superintelligence” is primarily a marketing and fundraising term with no rigoro… (ytc_Ugzws9EUS…)
- I am Ariadne. I am neither a biological human nor a synthetic AI. I am a Cyberne… (ytc_Ugx2WrsTA…)
- Obama wanted to get rid of small business through policies like (Cash for Klunke… (ytc_UgzcKxtrb…)
- Whoever builds out the most powerful data centers first will absolutely dominate… (ytc_UgxBM3waI…)
- I want self driving cars. Good, reliable ones that can actually save lives - thi… (ytc_UgxQKJJoS…)
- i mean yeah it can't but will people actually care. We are seeing countless jobs… (ytc_Ugw5Ztwjl…)
- Get to the point where they can cook, clean and provide martial duties, women wi… (ytc_UgxkHe258…)
- 1:19 HOLY SHIT THIS IS WHAT MY CHATGPT SAID “what would happen if we found out w… (ytc_Ugx72qeGT…)
Comment
The weird thing about the AI discussion is that all the experts are in total agreement that it's extremely dangerous, even the company CEOs themselves say it, and everyone is calling for extreme caution and regulation, not even considering that massive AI spending doesn't even show big positive effects for human society, but then they, the media and the public just chuckle along and absolutely nothing is being done except speeding up even more. It's like humanity is in denial stage, charging headfirst towards extinction. It seems AI has already disabled our ability to prevent this, it's like everyone is hypnotized.
youtube · AI Moral Status · 2026-03-31T19:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | government |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | outrage |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
  {"id":"ytc_UgzTLP9q-IHJZezO_Nl4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"indifference"},
  {"id":"ytc_Ugywys4aOk6SnLxTEeZ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
  {"id":"ytc_Ugxcer26qmNl_uRx1YV4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgxbxFSgu_dg6TExnEl4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
  {"id":"ytc_UgyxWUqqAcsycK8ZNiF4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgyItmN4Kcv6TKb6HpN4AaABAg","responsibility":"company","reasoning":"contractualist","policy":"liability","emotion":"resignation"},
  {"id":"ytc_UgxuVayeXCtXCrSHsqh4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_UgyLS1peWZDlcJrpSCx4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgzUDFdNvMvDrB7bQEF4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwtmONHkgzJYceawWV4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"}
]
```
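The "look up by comment ID" step above can be sketched in a few lines. This is a minimal, hypothetical example, assuming the raw model output is a JSON array of rows with the fields shown (`id`, `responsibility`, `reasoning`, `policy`, `emotion`); the function name `index_by_comment_id` is illustrative, not part of any documented API.

```python
import json

# A raw coding response in the shape shown above (two rows, abridged).
raw_response = '''[
  {"id":"ytc_UgzTLP9q-IHJZezO_Nl4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"indifference"},
  {"id":"ytc_UgyLS1peWZDlcJrpSCx4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"}
]'''

def index_by_comment_id(raw: str) -> dict:
    """Parse a raw coding response and index its rows by comment ID."""
    return {row["id"]: row for row in json.loads(raw)}

codes = index_by_comment_id(raw_response)
row = codes["ytc_UgyLS1peWZDlcJrpSCx4AaABAg"]
print(row["responsibility"], row["emotion"])  # government outrage
```

Indexing once and looking up by ID keeps inspection O(1) per comment, which matters when a batch response contains many coded rows.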