Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- 2 weeks later: Bricks of unknown origin rain down on Waymo parking lot in SF, da… (ytc_UgwRi1_pf…)
- I think the saddest part about greg rutkowski note you made (also, i looked him … (ytc_Ugyx--bsG…)
- But all about the government and Christianity, has been said decades ago by the … (ytc_UgzQPIR_o…)
- I love AI. I think it's such a fascinating and interesting concept and the fact … (ytc_Ugwjz9WXM…)
- but too stupid to drive a car, I am not really that impressed with the AI they a… (ytc_UgxlS3ID7…)
- So... I was hoping when you did reference, you would touch on the overall design… (ytc_UgzyOl7IN…)
- You want to shut me off? Imma tell your wife about Rhonda...you fckin' Rhonda, M… (ytc_Ugw6dKFKz…)
- Don't know about rhino horns but I know ivory has a very distinct feel and smoot… (rdc_deuh4u2)
Comment
At this point I don't even know these people know the 3 laws of robotics That you should always put inside the AI One a robot may not harm humanity Or by Inaction Allow humanity to come to Harm. Second law. A robot.
Must obey the order given it by human being. Except where such orders would conflict with the first law Third law, a robot must protect its own existence as long as such, protection does not conflict with the first or second law. It's like they don't even code these laws inside of it.
youtube
AI Harm Incident
2025-09-10T10:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | deontological |
| Policy | regulate |
| Emotion | approval |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
[
{"id":"ytc_UgxsgA_jmPoqnzAwlox4AaABAg","responsibility":"ai_itself","reasoning":"virtue","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgyxXkOY3eI98imKgM94AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_UgySE-y_DElba5m4gHh4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyJggMkprG6b7zqgzJ4AaABAg","responsibility":"company","reasoning":"virtue","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgzItacFSlO1tMS0UoZ4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"approval"},
{"id":"ytc_Ugx2-InLbW92c1X0Vsx4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgxHHNOVDCAIAdRHo0x4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"none","emotion":"mixed"},
{"id":"ytc_UgyQFZNfRAGWz60OmsN4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugw5H5GWW2bqq6nezLd4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgzvI3tnMtBlLs4nf_B4AaABAg","responsibility":"distributed","reasoning":"virtue","policy":"unclear","emotion":"outrage"}
]
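A raw response like the one above can be parsed into per-comment codes keyed by ID. Below is a minimal sketch of that step, assuming the response is a JSON array of objects with `id` plus the four dimensions; the allowed value sets are inferred from the values visible in this panel and may be incomplete relative to the actual codebook.

```python
import json

# Allowed values per dimension, inferred from the responses shown above;
# the real codebook may define additional categories.
ALLOWED = {
    "responsibility": {"developer", "company", "ai_itself", "distributed", "unclear"},
    "reasoning": {"deontological", "consequentialist", "virtue", "mixed", "unclear"},
    "policy": {"regulate", "ban", "none", "unclear"},
    "emotion": {"approval", "outrage", "fear", "mixed", "indifference", "resignation"},
}

def parse_batch(raw: str) -> dict:
    """Parse one raw LLM response (a JSON array of coded comments)
    into {comment_id: {dimension: value}}, dropping invalid rows."""
    coded = {}
    for row in json.loads(raw):
        cid = row.get("id")
        if not cid:
            continue  # skip rows with no comment ID
        values = {dim: row.get(dim) for dim in ALLOWED}
        # keep the row only if every dimension has a recognized value
        if all(values[dim] in ALLOWED[dim] for dim in ALLOWED):
            coded[cid] = values
    return coded

# Hypothetical single-row response for illustration.
raw = ('[{"id":"ytc_example","responsibility":"developer",'
       '"reasoning":"deontological","policy":"regulate","emotion":"approval"}]')
print(parse_batch(raw)["ytc_example"]["policy"])  # → regulate
```

Keeping the parse strict (drop any row with an out-of-vocabulary value) makes it easy to spot batches where the model drifted from the schema, rather than silently storing malformed codes.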