Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples:

- "Dude didnt think AI is dangerous because he somehow missed the 1991 terminator 2…" (`ytc_Ugzj05yck…`)
- "So far it’s going very poorly. That recent Goldman Sachs report was pretty damni…" (`ytr_Ugy2CjCFm…`)
- "Couple to this, unlike a human making a decision as a being in a specific place,…" (`ytr_UgyYtD8aJ…`)
- "I don't think they deserve rights, eventually it'll turn to I-Robot(the movie wi…" (`ytc_UgikPCAEU…`)
- "If you're an artist who uses brushes and pencils to create art, switching to oth…" (`ytc_UgwtWm7IG…`)
- "People have said the same thing about flying not being possible in the next cent…" (`ytr_UgwxCTtNX…`)
- "Dave MAYBE an AI punishment is turn down the energy speed, like 'go sit in the …" (`ytc_UgwXeDERc…`)
- "WE NEED TO STOP THIS MADNESS. THESE PEOPLE ARE SATAN WORSHIPPERS WHO ARE PREVENT…" (`ytc_UgyM3OqOD…`)
Comment

> The risk of AI escape is very serious, but the more proximate risk is the targeted use of AI by humans, against other humans, in devastating and potentially species-ending ways. Our chances of preventing that scenario seem even worse than our chances of surviving AI super-intelligence.

youtube · AI Responsibility · 2025-05-21T15:4… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
  {"id":"ytc_UgwDlFP8lyajhQ8xVZ14AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"fear"},
  {"id":"ytc_Ugz7RNufmAypI0qbEth4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugy0yQSui3zk5QEWekF4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_UgyyDoiD6OkaaMT9s4F4AaABAg","responsibility":"user","reasoning":"mixed","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwXhmgNM-FWJ4OwZDl4AaABAg","responsibility":"government","reasoning":"mixed","policy":"none","emotion":"outrage"},
  {"id":"ytc_Ugzvyp0I058q87ioa9N4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"regulate","emotion":"approval"},
  {"id":"ytc_UgzJk9yf-lPD1NDkXKt4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_Ugx2aBaO2VNVOJ6thM14AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytc_UgwcXH68aaKs1dPt5_x4AaABAg","responsibility":"ai_itself","reasoning":"virtue","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_Ugy01vLz5ok3BGH44tB4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"}
]
```
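The raw response above is a JSON array of per-comment records, each keyed by comment `id` with one value per coding dimension. A minimal sketch of how such a batch could be parsed and sanity-checked follows; the allowed value sets are an assumption inferred from the values visible in this sample, not a documented codebook, and the actual scheme may include additional categories.

```python
import json

# Two records copied from the raw LLM response above (shortened for brevity).
raw = '''[
  {"id":"ytc_UgwDlFP8lyajhQ8xVZ14AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"fear"},
  {"id":"ytc_Ugz7RNufmAypI0qbEth4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]'''

# Allowed values per dimension -- an assumption inferred from the sample
# output only; the real codebook may define more categories.
ALLOWED = {
    "responsibility": {"ai_itself", "distributed", "developer", "user",
                       "government", "unclear"},
    "reasoning": {"consequentialist", "deontological", "virtue",
                  "mixed", "unclear"},
    "policy": {"regulate", "liability", "none", "unclear"},
    "emotion": {"fear", "outrage", "approval", "indifference", "mixed"},
}

def validate(records):
    """Return a list of (comment_id, dimension, bad_value) problems."""
    problems = []
    for rec in records:
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                problems.append((rec.get("id"), dim, rec.get(dim)))
    return problems

records = json.loads(raw)
print(validate(records))  # an empty list means every record conforms
```

Validating the batch before storing it catches the common failure mode where the model invents a label outside the scheme or drops a dimension entirely.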