Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up a comment by its ID, or inspect one of the random samples below.
- @-_cristo_rey_-9579 Thanks for your comment! 😂 Sounds like Yo Robot got you thin… (ytr_UgxJiaHQo…)
- I fear what will happen with the use AI to build plains, fix flying software or … (ytc_Ugxa50FxA…)
- OpenAI is not evil it is a model used to answer questions... It is not meant to … (ytc_Ugx_DR9dL…)
- This is a dramatized misdiagnosis of emergent coherence collapse. What’s being f… (ytc_UgzOlKg3e…)
- As a former amazon delivery driver. Also a warehouse worker. I find that this jo… (ytc_Ugwfndrmv…)
- They don’t want depopulation the people controlling the AI still need slaves. Th… (rdc_ohy621f)
- Here‘s another proof that we human beings are such assholes. This „creature“ Cha… (ytc_Ugxk43pN6…)
- it will be used for capitalist purposes, and so, no matter how "good" the AI is,… (ytc_UgyZLFndi…)
Comment

> I do not understand this. This assumes that robots, AI, will be able to "think" independently after a while. Won't they only know what is programmed?! 🙀💃

Source: youtube · Case: AI Harm Incident · Posted: 2024-08-08T09:3… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | deontological |
| Policy | unclear |
| Emotion | mixed |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[{"id":"ytc_Ugxr6DKbK36qkCDWFvN4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_Ugwb63C_7kelBnOUg4R4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugx3wcSkAle-FCIrgHV4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgyBZc24ZTYasX0UTjZ4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugxgnfbh-ycMVXh3M9J4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugx86SpaaOEFetS121d4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgwFzU77IFqZ9uKPJWR4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgyMBHdv4ipXW1vUrvJ4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"approval"},
{"id":"ytc_Ugx5eMrA_f0LGncA8lN4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgxLeZ4CumTYWRkW_zl4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"liability","emotion":"fear"}]
```
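The raw model response is a JSON array of code assignments, one object per comment. Below is a minimal sketch of parsing and validating such a response before accepting the codes into the dataset; the allowed value sets are inferred only from the records shown on this page (the full codebook may contain more categories), and the function name is illustrative, not part of the tool.

```python
import json

# Allowed values per coding dimension, inferred from the responses shown
# above. Assumption: the real codebook may define additional categories.
SCHEMA = {
    "responsibility": {"ai_itself", "company", "developer", "none", "unclear"},
    "reasoning": {"consequentialist", "deontological", "virtue", "unclear"},
    "policy": {"ban", "regulate", "liability", "none", "unclear"},
    "emotion": {"fear", "outrage", "mixed", "approval"},
}

def validate_response(raw: str) -> dict:
    """Parse a raw LLM response and index valid records by comment ID.

    Records with missing fields or out-of-schema values are skipped, so a
    malformed model output never silently enters the coded dataset.
    """
    coded = {}
    for record in json.loads(raw):
        if "id" not in record:
            continue
        if not all(record.get(dim) in allowed for dim, allowed in SCHEMA.items()):
            continue
        coded[record["id"]] = {dim: record[dim] for dim in SCHEMA}
    return coded

# Hypothetical one-record response, mirroring the format above.
raw = ('[{"id":"ytc_example","responsibility":"developer",'
       '"reasoning":"deontological","policy":"unclear","emotion":"mixed"}]')
print(validate_response(raw))
```

Keying the result by comment ID mirrors the "look up by comment ID" workflow above: a coded record can be fetched directly for any sampled comment.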