Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples

- "the question is easily answered: Art needs human intervention in its creation. N…" · ytc_Ugwfpoi-7…
- "The biggest problem in robotics is that they are perfect, unlike humans, they do…" · ytc_UgheLFoKv…
- "It's not just posts online every town hall over the past month some executive sa…" · rdc_jaci0x3
- "LoL, I am sure this software has an accuracy of less than 20%. For now, we have …" · ytc_Ugz_JOjsB…
- "Im just waiting until AI starts making the gaming industry great again let it ma…" · ytc_UgzZAC5pg…
- "My college professor actually allows us to use ChatGPT for all assignments in hi…" · ytc_UgwvbLiLj…
- "There is a conversation with Yoshua Bengio that talks about The Catastrophic Ris…" · ytc_UgwOL2VHn…
- "What's the essential difference between hiring a person to create artwork for yo…" · ytc_UgzCqX3B8…
Comment

> Everyone knows that they shouldn't create Ai.. I mean, really, the smartest person in the room is really the most idiotic. Its like heroine. We all know that you shouldn't do heroine, but they do it anyway and then end up either dead or with the most horrible addiction you can imagine. Its the same frickin thing. You know you shouldn't do it but because of money, well there ya go.. But is money really worth human extinction?

Source: youtube · AI Harm Incident · 2025-07-27T02:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | virtue |
| Policy | ban |
| Emotion | outrage |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
 {"id":"ytc_Ugxs8YXNKW7STgNEVOl4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"ban","emotion":"outrage"},
 {"id":"ytc_UgzOugmF6FwknN19yzl4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
 {"id":"ytc_UgzHJ7jlJsyDeuOl-8B4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"resignation"},
 {"id":"ytc_UgyE7Lv9EjqUXusVb5J4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"liability","emotion":"fear"},
 {"id":"ytc_UgyRvrWGDSqf-JyAOXZ4AaABAg","responsibility":"distributed","reasoning":"virtue","policy":"none","emotion":"resignation"},
 {"id":"ytc_UgxXyvfqff3D8Cy6oL54AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"approval"},
 {"id":"ytc_UgymIxPvlHxxGR5s8PB4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"fear"},
 {"id":"ytc_Ugz_seaS_WZJ6FNT9U54AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
 {"id":"ytc_Ugy26dblunshy6EOzeN4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"ban","emotion":"fear"},
 {"id":"ytc_Ugxy3WeqtzQNkL8ypRh4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"outrage"}
]
```
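The look-up-by-comment-ID step can be sketched as follows. This is a minimal example, assuming the raw LLM response is a JSON array of records with the fields shown above (`id`, `responsibility`, `reasoning`, `policy`, `emotion`); the function name is illustrative, not part of the tool.

```python
import json

# Hypothetical two-record batch in the same shape as the raw response above.
RAW_RESPONSE = """
[
 {"id": "ytc_Ugxs8YXNKW7STgNEVOl4AaABAg", "responsibility": "developer",
  "reasoning": "virtue", "policy": "ban", "emotion": "outrage"},
 {"id": "ytc_UgzOugmF6FwknN19yzl4AaABAg", "responsibility": "government",
  "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"}
]
"""

def index_by_comment_id(raw: str) -> dict:
    """Parse a raw batch response and key each coded record by its comment ID."""
    records = json.loads(raw)
    return {record["id"]: record for record in records}

coded = index_by_comment_id(RAW_RESPONSE)
print(coded["ytc_Ugxs8YXNKW7STgNEVOl4AaABAg"]["policy"])  # → ban
```

Indexing once and looking up by ID avoids rescanning the array for every inspected comment.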