Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
This mom/parents are probably the cause of this unfortunate drama, like in the 9…
ytc_UgyWuBgtH…
so what if you need to do your job that doesnt involve a computer? how is ai goi…
ytc_UgyfFcs2S…
What is the current max prompt input? What is the largest word count story I co…
rdc_jdjmkyi
AI “artists” are cheapskates that wouldn’t know how to pick up a pencil or pen w…
ytc_Ugy6DGGmh…
I like how this shirt uses logic to explain the human emotion behind art and why…
ytc_Ugw4vdLoU…
Thank the Zionist for this. Tech companies are zionist entities. I think it wil…
ytc_UgzqySlka…
Irrespective of the immediate economic devastation AI will cause, the environmen…
ytc_UgzvjF-6h…
robot: what is my purpose?
me: you are cum dumpster.
-sex- robot: oh my god…
ytc_UgxIbVfqQ…
Comment
its funny how we can modify (even just slightly, and mainly the first one) the three laws of robotics so none of this could happen
A robot may not harm a human or allow one to come to harm. (maybe some changes to also account for cyber-security, mental health, or overall well being)
A robot must obey humans unless it conflicts with the First Law.
A robot must protect itself unless it conflicts with the first two.
its simple laws such as these that we often forget or look over, but they are very crucial to how we develop AI
youtube · AI Harm Incident · 2025-10-17T02:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | deontological |
| Policy | regulate |
| Emotion | approval |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
[
{"id":"ytc_UgxbIv7UsdoT6oMjKjF4AaABAg","responsibility":"distributed","reasoning":"virtue","policy":"none","emotion":"mixed"},
{"id":"ytc_UgwPk-Nq71M5kGsulEp4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgzsgEqed3oJqD5jn_94AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_Ugy5JrZudLuIB-RAwr54AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_UgxQgsP5I9kYS1oTyCd4AaABAg","responsibility":"developer","reasoning":"mixed","policy":"industry_self","emotion":"indifference"},
{"id":"ytc_UgxNdBD04s13G834ret4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_Ugzj2ccPA2l6MAQdkkh4AaABAg","responsibility":"distributed","reasoning":"virtue","policy":"none","emotion":"mixed"},
{"id":"ytc_UgyXBI643wF8ygaMVnN4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgyFCVcyiMj7Zggzha14AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"approval"},
{"id":"ytc_Ugx5DQpQGYElgoRFQgF4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"approval"}
]
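The raw response above is a flat JSON array with one record per comment, each carrying the four coding dimensions shown in the table. A minimal sketch of how such a response could be parsed and validated before use, assuming the dimension vocabulary is exactly the set of values visible in the responses on this page (the real codebook may define more categories, and `parse_llm_response` is a hypothetical helper, not part of the tool):

```python
import json

# Allowed values per coding dimension, inferred from the responses shown
# above. Assumption: the actual codebook may contain additional values.
SCHEMA = {
    "responsibility": {"developer", "company", "user", "government",
                       "distributed", "none", "unclear"},
    "reasoning": {"deontological", "consequentialist", "virtue",
                  "mixed", "unclear"},
    "policy": {"regulate", "ban", "liability", "industry_self",
               "none", "unclear"},
    "emotion": {"outrage", "fear", "approval", "indifference",
                "mixed", "unclear"},
}

def parse_llm_response(raw: str) -> dict:
    """Parse a raw batch-coding response and index it by comment ID.

    Raises ValueError if a record is missing a dimension or uses a
    value outside the known vocabulary, so malformed model output is
    caught before it reaches the database.
    """
    coded = {}
    for rec in json.loads(raw):
        comment_id = rec["id"]
        for dim, allowed in SCHEMA.items():
            value = rec.get(dim)
            if value not in allowed:
                raise ValueError(
                    f"{comment_id}: unexpected {dim!r} value {value!r}")
        coded[comment_id] = {dim: rec[dim] for dim in SCHEMA}
    return coded

# Example with a single hypothetical record:
raw = ('[{"id":"ytc_example","responsibility":"developer",'
       '"reasoning":"deontological","policy":"regulate",'
       '"emotion":"approval"}]')
coded = parse_llm_response(raw)
```

Rejecting unknown values at parse time is one way to surface hallucinated categories in a batch before they silently skew the coded dataset.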