Raw LLM Responses
Inspect the exact model output for any coded comment, or look up a coding directly by comment ID.
Random samples
- `rdc_e13m18o`: **Real** self driving cars don’t require people to be alert. The semi-autonomous…
- `ytr_UgwvburQw…`: That’s been my experience too. I was in one of those “strategic realignments” la…
- `ytc_Ugwd0_w4k…`: Rapid data processing with AI only benefits self driving but collision avoidance…
- `ytc_UgyeX_yuj…`: Who honestly cares, now that AI exists it won’t disappear, and it’s only getting…
- `ytr_Ugw2lYaI_…`: To me it sounds more like people that don't want to live in the streets cuz a ro…
- `ytc_UgwvT36vm…`: How is the AI order bot "surprising" you, exactly? By the depths of how wrong it…
- `ytc_UgyvJSFAv…`: Got another reaction to a potentially horrifying comment by both AI’s. During th…
- `ytr_UgxHXgO7p…`: @KeeperofStoies Typical,ignore other point and choose irrelevant topic,it is a…
Comment

> Shouldn't you do first an interview with Elon Musk, before echo (chamber) Elon's bad, Elon's evil!
> Because than you will have the arguments to justify your opinion (your echo).
> Especially about Ani. This AI girl friend, this personage you pulled up in one of your pod casts, that indeed is quite disgusting.
> I would like to hear Elon's commenting on such an atrocity.

youtube · AI Governance · 2025-09-24T16:4… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | deontological |
| Policy | unclear |
| Emotion | outrage |
| Coded at | 2026-04-27T06:24:53.388235 |
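A coding like the one in the table above can be sanity-checked against the label vocabulary before it is stored. This is a minimal sketch: the value sets below are only those observed in this export (the full codebook may define more), and `check_coding` is a hypothetical helper, not part of the tool.

```python
# Dimension values observed in this export; the full codebook may define more.
OBSERVED_VALUES = {
    "responsibility": {"developer", "company", "ai_itself", "distributed", "none"},
    "reasoning": {"deontological", "consequentialist", "virtue", "unclear"},
    "policy": {"unclear", "none"},
    "emotion": {"outrage", "fear", "approval", "resignation"},
}

def check_coding(record: dict) -> list:
    """Return the dimension names whose value falls outside the known vocabulary."""
    return [dim for dim, allowed in OBSERVED_VALUES.items()
            if record.get(dim) not in allowed]

# The coding shown in the table above passes cleanly:
coding = {"responsibility": "developer", "reasoning": "deontological",
          "policy": "unclear", "emotion": "outrage"}
print(check_coding(coding))  # → []
```

An unexpected value (say, a misspelled label from the model) would surface as a non-empty list, which is cheap to flag before the record reaches the table.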
Raw LLM Response

```json
[
  {"id":"ytc_UgyocAXq7lX01V6SBAp4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"fear"},
  {"id":"ytc_Ugxm57ELLuPgLMNSoWx4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgxLJK4DZjr0HnJOurF4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugx1repfljJV3utd2pd4AaABAg","responsibility":"company","reasoning":"unclear","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_UgySdXoqVE6yyiNpxLJ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_UgxVllMw5tudy8G4wRN4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgzQE1GXvK6kNyU6ynF4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_UgxuY23slKUx0BEK5CR4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_Ugz6COEsMHPCRDMwSqt4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"resignation"},
  {"id":"ytc_UgwYuzL0jr_Fn_WDtmt4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"unclear","emotion":"outrage"}
]
```
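The look-up-by-comment-ID step described at the top of this section can be sketched as parsing the raw LLM response and indexing the records by their `id` field. This is an illustrative assumption about how the tool works, not its actual implementation; `index_codings` is a hypothetical helper, and the response below is truncated to two of the ten entries shown above.

```python
import json

# Raw LLM response as exported above (truncated to two entries for brevity).
raw_response = """
[
  {"id": "ytc_UgyocAXq7lX01V6SBAp4AaABAg", "responsibility": "ai_itself",
   "reasoning": "unclear", "policy": "unclear", "emotion": "fear"},
  {"id": "ytc_UgxuY23slKUx0BEK5CR4AaABAg", "responsibility": "developer",
   "reasoning": "deontological", "policy": "unclear", "emotion": "outrage"}
]
"""

def index_codings(raw: str) -> dict:
    """Parse a raw LLM coding response and index the records by comment ID."""
    records = json.loads(raw)
    return {rec["id"]: rec for rec in records}

codings = index_codings(raw_response)
print(codings["ytc_UgxuY23slKUx0BEK5CR4AaABAg"]["emotion"])  # → outrage
```

Indexing by ID makes the coded-comment look-up an O(1) dictionary access rather than a scan over the response array.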