Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect:

- "16 year old art student here. When I first heard about AI I felt like giving up.…" (ytc_UgzV3Quvh…)
- "26:00 Super Intelligence will probably treat humans with the same care and compa…" (ytc_UgyKOwW8M…)
- "It is hoped that one day unemployed programmers will take down top AI experts. A…" (ytc_UgyniJKSX…)
- "If we develop something that attains sentience why would we use those systems to…" (ytc_UgiAg_hJ4…)
- "Ok, so for the ppl that are confused, Those are actually robots, and they just …" (ytc_UgzfLV0ap…)
- "Book launch clearly.... kind of obvious and also c**tish really to be blowing th…" (ytc_UgxInHOFO…)
- "I never will, I love listening to JRE but I get annoyed when Joe talks about AI …" (ytc_Ugwb_FieJ…)
- "If nobody has jobs bc of ai, nobody will have money except the super wealthy bc …" (ytc_UgwI-ifCa…)
Comment

> At this point I'm just praying for a f****** solar flare that wipes out technology and reset society / sends us back 50 years. I understand that millions will die, the economies will collapse, but I still believe it will cause less harm than AI.

Source: youtube · Cross-Cultural · 2025-12-31T19:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | outrage |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
  {"id": "ytc_Ugx6yhDuB4OsSjlzLIl4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_UgxFlIzaeEX2VARjQCt4AaABAg", "responsibility": "developer", "reasoning": "deontological", "policy": "unclear", "emotion": "outrage"},
  {"id": "ytc_UgyKxCzlxnC4aibc6Gd4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "unclear", "emotion": "fear"},
  {"id": "ytc_UgyIgQQuV6sq692gNP94AaABAg", "responsibility": "company", "reasoning": "deontological", "policy": "regulate", "emotion": "outrage"},
  {"id": "ytc_UgwsUOuLnRPhwqPywh54AaABAg", "responsibility": "company", "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_UgxwddxMv-8zI6CKLT54AaABAg", "responsibility": "ai_itself", "reasoning": "deontological", "policy": "none", "emotion": "fear"},
  {"id": "ytc_UgwZbuXrSSeCbmGx_wd4AaABAg", "responsibility": "developer", "reasoning": "unclear", "policy": "unclear", "emotion": "mixed"},
  {"id": "ytc_UgziYvIY24rM2yS6B2d4AaABAg", "responsibility": "distributed", "reasoning": "consequentialist", "policy": "liability", "emotion": "approval"},
  {"id": "ytc_UgyShA6j6r1k4CTfjAl4AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "unclear", "emotion": "resignation"},
  {"id": "ytc_UgwiAP-Qx5gROzV7GYR4AaABAg", "responsibility": "unclear", "reasoning": "unclear", "policy": "unclear", "emotion": "mixed"}
]
```
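A batch response like the one above can be indexed by comment ID and sanity-checked before use. The sketch below is a minimal, hypothetical example: the label vocabularies are inferred only from the values visible in this batch (the actual codebook may define more categories), and the function names are illustrative, not part of any tool shown here.

```python
import json

# Allowed labels per dimension, inferred from the values seen in this
# batch -- an assumption, not the authoritative codebook.
ALLOWED = {
    "responsibility": {"ai_itself", "developer", "company", "distributed", "unclear"},
    "reasoning": {"consequentialist", "deontological", "unclear"},
    "policy": {"none", "unclear", "regulate", "liability"},
    "emotion": {"outrage", "fear", "indifference", "mixed", "approval", "resignation"},
}

def parse_batch(raw: str) -> dict:
    """Parse a raw LLM batch response and index the codes by comment ID,
    rejecting any record with a missing or out-of-vocabulary label."""
    index = {}
    for record in json.loads(raw):
        cid = record["id"]
        for dim, allowed in ALLOWED.items():
            if record.get(dim) not in allowed:
                raise ValueError(f"{cid}: bad value for {dim!r}: {record.get(dim)!r}")
        index[cid] = {dim: record[dim] for dim in ALLOWED}
    return index

# Usage with the first record from the batch above:
raw = ('[{"id":"ytc_Ugx6yhDuB4OsSjlzLIl4AaABAg","responsibility":"ai_itself",'
       '"reasoning":"consequentialist","policy":"none","emotion":"outrage"}]')
codes = parse_batch(raw)
print(codes["ytc_Ugx6yhDuB4OsSjlzLIl4AaABAg"]["emotion"])  # -> outrage
```

Validating at parse time catches label drift (e.g. the model inventing a new emotion category) before bad codes reach the coded-comment store.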