Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- "America, being the strongest country, needs to utilize its power to prevent AI a…" (ytc_Ugz7gI_yy…)
- "i dont buy a WORD sophia is saying, she just sounds too PERFECT she sounds like …" (ytc_UgzI-DTed…)
- "He was a little boy when Einstein was alive 😂😂😂😂😂 turns out not only ChatGPT hal…" (ytc_UgzXKVpD_…)
- "that’s so messed up bro.. this is the disgusting side of AI where rules need to …" (ytc_UgxeRlfBO…)
- "The idea of the Fed regulating AI? They screw uo everything they touch. It would…" (ytc_UgwZ-c0Od…)
- "No need to worry everyone.. as enZo and Ai mates can only work on the truth of t…" (ytc_UgwcXAihn…)
- "I stumbled on the AI that's going to replace my job the other day. Edit: He lit…" (ytc_UgyPh5-tw…)
- "surgery is already robotic in most areas with DaVinci and similar robotic aids. …" (ytc_UgzSXnLrb…)
Comment
Ngl I would just treat the robot as a equal if we don’t want a mechine up rising to be fair, I think a future of our machine and it’s creators coexist with a better one because what’s the say human beings and machines have a huge relationship in a way because of we created them we gave them the program and the come alive sure they don’t have a soul but if their programming is very well self-aware and moral then still doesn’t count as well assault but they have the mind of a human so treat machinery, like it’s your son or daughter or mother or grandparents all the best friend, because left to say I Think I prefer a zombie apocalypse instead of a robot apocalypse
youtube · AI Harm Incident · 2025-07-15T22:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | contractualist |
| Policy | none |
| Emotion | approval |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
[{"id":"ytc_UgzIRYP8Wr_PuUa730p4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgyiMzXEi1osMGPQesB4AaABAg","responsibility":"distributed","reasoning":"contractualist","policy":"none","emotion":"approval"},
{"id":"ytc_UgyrrDofqJe8eMOWcyV4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzJB_XZHvCwIqFzP854AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgzfHTFc1cfbHN8LBeN4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugx5mPL24_iJb86ohsR4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxpsJbM-4gDbRGvPZl4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugxbn10fqyeOh-nMud14AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgyU-BEv9ErXajDqtO54AaABAg","responsibility":"distributed","reasoning":"deontological","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugx4OeFibnX1Pr-cTbx4AaABAg","responsibility":"user","reasoning":"deontological","policy":"liability","emotion":"outrage"}]
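The lookup-by-comment-ID view above can be reproduced in a few lines of Python: the raw response is a JSON array, so parsing it and indexing by `id` gives direct access to any coding. The field names (`id`, `responsibility`, `reasoning`, `policy`, `emotion`) come from the raw response shown; the `index_by_id` helper is an illustrative sketch, not part of the tool, and the batch is truncated here to two entries.

```python
import json

# A truncated copy of the raw LLM response above: a JSON array of coded
# comments, one object per comment. (The real batch contains ten entries.)
raw_response = """
[{"id":"ytc_UgzIRYP8Wr_PuUa730p4AaABAg","responsibility":"ai_itself",
  "reasoning":"consequentialist","policy":"none","emotion":"fear"},
 {"id":"ytc_UgyiMzXEi1osMGPQesB4AaABAg","responsibility":"distributed",
  "reasoning":"contractualist","policy":"none","emotion":"approval"}]
"""

def index_by_id(raw: str) -> dict:
    """Parse a batch response and index each coding by its comment ID."""
    return {row["id"]: row for row in json.loads(raw)}

codings = index_by_id(raw_response)
# Look up one coded comment by its ID, as the dashboard does.
print(codings["ytc_UgyiMzXEi1osMGPQesB4AaABAg"]["reasoning"])  # contractualist
```

Missing or malformed IDs would surface as a `KeyError` or `json.JSONDecodeError` here, which is one reason a dashboard like this exposes the raw response for inspection.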