Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples

- "AI will definitely take us towards wealth inequality and ultimately towards the …" (ytc_Ugx2WJc9h…)
- "U don't control the AI. U don't control the robots. U don't own the corporations…" (ytc_UgyrSkwGu…)
- "You can solve any problem by simply killing everyone the problem affects. When…" (ytc_UgyMBv9tv…)
- "Yeah I admire the ambition but that's not going to work. The self driving cars w…" (ytc_UgysaU8TD…)
- "the basic mistake is considering AI is doing better and faster. It's REdoing fas…" (ytc_Ugx6nIjWI…)
- "But this could be read to mean that once they get self-driving cars to the point…" (rdc_dmpjcpb)
- "I definitely wouldn't want to listen to podcast hosted by an AI - I need a real …" (ytc_Ugxje7x9n…)
- "Let us have a moment of silence for all of the people who think this is not AI g…" (ytc_UgwYHQPmn…)
Comment
This is it. This is what science-fiction writers were writing about 100 years ago and now it’s finally come to pass. The robots will realize that humans are obsolete and a blight and we don’t have a chance against these guys and they’re bulletproof bodies and cars it look like that guy has quite the aim too..
I thought the robot chicks would be perfected first but I guess I was wrong because what can I say I am human.
youtube · AI Harm Incident · 2023-12-08T02:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
{"id":"ytc_UgyzGQU77lM0CQkXpQF4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"mixed"},
{"id":"ytc_UgyGBIRLIZFJr8KDYP14AaABAg","responsibility":"developer","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgyBLn1QeQJZPptwZAN4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgzODlPL89PTKu3LJt94AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
{"id":"ytc_Ugx-b4BoY1HyXSlKSyZ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugzxa8ZwCIojsAJH-lx4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugyzek_crAb8vcOhW6d4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_UgyAS81bFxfQpM2YVVx4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"ban","emotion":"fear"},
{"id":"ytc_UgyTVRF9n7uBbt-b0VJ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugxhar7KLwoI2Bfd0AN4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"mixed"}
]
```
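A batch response like the one above can be turned into a lookup table keyed by comment ID. The sketch below is a minimal, hypothetical parser: the field names (`id`, `responsibility`, `reasoning`, `policy`, `emotion`) come from the response shown, but the set of allowed codes is inferred from these samples alone and is an assumption, not the full codebook.

```python
import json

# Allowed codes per dimension, inferred from the sample rows above
# (an assumption -- the real codebook may define more values).
ALLOWED = {
    "responsibility": {"ai_itself", "developer", "none", "unclear"},
    "reasoning": {"consequentialist", "deontological", "mixed", "unclear"},
    "policy": {"ban", "regulate", "none"},
    "emotion": {"fear", "outrage", "mixed", "indifference"},
}

def index_codings(raw: str) -> dict:
    """Parse a raw batch response and index codings by comment ID,
    dropping any row whose dimension values fall outside ALLOWED."""
    by_id = {}
    for row in json.loads(raw):
        if all(row.get(dim) in vals for dim, vals in ALLOWED.items()):
            by_id[row["id"]] = row
    return by_id

# Usage with a made-up ID for illustration:
raw = ('[{"id":"ytc_example","responsibility":"ai_itself",'
       '"reasoning":"consequentialist","policy":"none","emotion":"fear"}]')
codings = index_codings(raw)
print(codings["ytc_example"]["emotion"])  # fear
```

Indexing by ID is what makes the "Look up by comment ID" view above cheap: each coded comment resolves to its dimension values in a single dictionary access, and rows the model mis-coded are filtered out before display.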