Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples

- "wow...i suppose that after how many decades of collecting personal information f…" (ytc_UgyqSXyFH…)
- "AI artists suck, they take art from hardworking AI and claim it as their own…" (ytc_UgzcRB4BP…)
- "I would rather be a "bad" artist and have the drawing level of a chile, rather t…" (ytc_Ugx6c0vDr…)
- "i'm a graphic designer, still learning though but there's a time i was curious a…" (ytc_Ugw69hHf_…)
- "I don't think we should try to make AI "be like us", because that will make it w…" (ytc_UgyII1oIQ…)
- "How are they going to make an AI-robot decide to go to sleep instead of wearing …" (ytc_Ugy9MpduZ…)
- "Uh, you haven’t heard of the self-driving cars that were all stuck in an interse…" (ytc_Ugw7aXqDM…)
- "they arent looking to make more profit, they want to own the world and get rid o…" (ytc_Ugz2d5OR9…)
Comment
I think it would be giving them the keys to nuclear weapons silos and letting them decide the time to turn the keys and activate every single one of them to all parts of the most populated places on Earth.
Just helping the AI with a thought, they haven't yet processed.
Maybe they'll come across my comment😅
youtube · AI Harm Incident · 2023-11-13T07:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | ban |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_Ugw1TX8Bg_iiHMf9o0t4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgwstymtBIyRZ2Q0wHJ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_UgyJFSD4qeuYOG1q5AR4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugw4IB8pLZWElTswv2V4AaABAg","responsibility":"user","reasoning":"virtue","policy":"unclear","emotion":"resignation"},
{"id":"ytc_Ugwf6jp4rs-r6s6EFJ54AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_Ugyziqf8De3Jv3r6Cbl4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"regulate","emotion":"indifference"},
{"id":"ytc_UgwYr5omDEuFEi_gacV4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgypQz-6MovV7mybkqR4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgzZ2aRDHN4fIQxLfUt4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"industry_self","emotion":"mixed"},
{"id":"ytc_UgyMO3nEQdWI0y_2j4d4AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"none","emotion":"approval"}
]
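The raw response above is a JSON array with one coding object per comment. A minimal sketch (assuming this exact schema, with made-up variable names) of indexing such an array by comment ID so a single comment's coding can be looked up, as the table above does:

```python
import json

# A two-row excerpt of the raw LLM response shown above (same schema).
raw_response = '''[
  {"id":"ytc_UgwstymtBIyRZ2Q0wHJ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
  {"id":"ytc_Ugw4IB8pLZWElTswv2V4AaABAg","responsibility":"user","reasoning":"virtue","policy":"unclear","emotion":"resignation"}
]'''

# Index the array by comment ID for constant-time lookup.
codings = {row["id"]: row for row in json.loads(raw_response)}

# Retrieve one comment's coding across the four dimensions.
coding = codings["ytc_UgwstymtBIyRZ2Q0wHJ4AaABAg"]
print(coding["policy"], coding["emotion"])  # ban fear
```

This mirrors how the coded values in the table (responsibility, reasoning, policy, emotion) could be pulled from the model output for display; the variable names here are illustrative, not the tool's actual code.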