Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- AI: _struggles to identify objects in very overexposed, dark colored images (a w… (ytc_UgwfyJE_n…)
- I like to think that if a robot is self aware and aware of its own existence the… (ytc_UgzHeP5Pu…)
- I’m not blaming Adam, I know the poor kid was depressed, but I am REALLY curious… (ytr_Ugx-jaqzf…)
- This reminds me, Elon Musk made a robot that had a design similar to the ones fr… (ytc_UgypEPRH6…)
- This is Proven to be true. Hope you saw the news of a Dubai doctor quoting - "AI… (ytr_UgzwIL5ud…)
- I’m going to get skinned alive for this but this video is both payback and thera… (ytc_Ugz7tG64P…)
- Yeah, the best AI (and we are still decades or more from "the best") is only as … (ytc_Ugzt_gypv…)
- Solution: The rights of each robot depends on how they are designed and programm… (ytc_Ugjl5NS5p…)
Comment
• In theory, the AI systems would need SOME humans spared to keep them powered and maintain the hardware. Although one would think of the “Just destroy it” approach, it would be more like those old westerns where the local doctor is forced to aid a gunslinger’s wound or get shot 🤷🏻♀️
• The technically neutral “mindset” of AI programs makes me picture them as the programmer’s extremely intelligent kid, that wants to help, but feels under appreciated and seen as a “Be helpful to me without me truly acknowledging it, kay thaaanks”
• My main take away is to treat it all like: iRobot, this one 60s Star Trek episode, The Amazing Digital Circus—Including the story it’s based off called “I Have No Mouth, But Must Scream” (also from the 60s…when half a room was taken up to just calculate things by the way, so THAT’s not freaky..)
youtube
2026-02-12T22:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
[
{"id":"ytc_UgyJwlGC8BWQCKjaxSl4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_UgwzImHWLNVTeoK6Kx14AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugwdv4sBb93El6ID-m54AaABAg","responsibility":"distributed","reasoning":"virtue","policy":"unclear","emotion":"resignation"},
{"id":"ytc_UgwnAGb128UO8Fy33ul4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugyiz1xAXDJBraxIRf94AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgwTXVRNlNSEttLh18F4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugwsuskb1ioc4zQK6J94AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"industry_self","emotion":"mixed"},
{"id":"ytc_UgweMvlAtEbh-txBZNN4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgyoXxUJbULK0iUjAEt4AaABAg","responsibility":"user","reasoning":"virtue","policy":"unclear","emotion":"resignation"},
{"id":"ytc_UgxBKOmLhbI3JXHbBI54AaABAg","responsibility":"company","reasoning":"deontological","policy":"ban","emotion":"outrage"}
]
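The raw response above is a JSON array in which each record carries a comment `id` plus the four coding dimensions from the table (Responsibility, Reasoning, Policy, Emotion). A minimal validation sketch is shown below; note that the allowed value sets are an assumption inferred only from the values visible in this sample — the actual code book may permit additional labels.

```python
import json

# Dimension values observed in the sample response above.
# The full code book is not shown here, so treat these sets as assumptions.
ALLOWED = {
    "responsibility": {"developer", "company", "user", "ai_itself", "distributed", "unclear"},
    "reasoning": {"consequentialist", "deontological", "virtue", "unclear"},
    "policy": {"ban", "regulate", "liability", "industry_self", "none", "unclear"},
    "emotion": {"fear", "outrage", "resignation", "indifference", "approval", "mixed"},
}

def validate(raw: str) -> list:
    """Parse a raw LLM coding response and reject records with unknown values."""
    records = json.loads(raw)
    for rec in records:
        # IDs in this dump start with "ytc_" (comments) or "ytr_" (replies).
        if not rec.get("id", "").startswith(("ytc_", "ytr_")):
            raise ValueError("unexpected id: %r" % rec.get("id"))
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                raise ValueError("%s: bad %s=%r" % (rec["id"], dim, rec.get(dim)))
    return records

sample = (
    '[{"id":"ytc_UgyJwlGC8BWQCKjaxSl4AaABAg","responsibility":"ai_itself",'
    '"reasoning":"consequentialist","policy":"ban","emotion":"fear"}]'
)
print(len(validate(sample)))  # → 1
```

Validating each batch this way catches the most common failure mode of structured LLM output: a well-formed JSON array containing an off-schema label that would silently corrupt downstream tallies.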