Raw LLM Responses
Inspect the exact model output for any coded comment. Look up a response by its comment ID, or browse the random samples below.
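As a rough illustration of what the lookup does, the sketch below scans a JSON-lines export of coded comments and returns the stored raw response for a given comment ID. The file name `coded_comments.jsonl` and the field names `id` and `raw_llm_response` are assumptions for this sketch, not the tool's actual storage layout.

```python
import json

def lookup_raw_response(comment_id: str, path: str = "coded_comments.jsonl") -> str | None:
    """Return the raw LLM response stored for one coded comment, if any.

    The JSON-lines file and its field names ("id", "raw_llm_response") are
    assumptions for this sketch; the store behind this page may differ.
    """
    with open(path, encoding="utf-8") as fh:
        for line in fh:
            record = json.loads(line)
            if record.get("id") == comment_id:
                return record.get("raw_llm_response")
    return None  # ID not found

# Example: look up one of the IDs that appears in the batch further down.
print(lookup_raw_response("ytc_Ugwu90-aJBcfwMCOhlZ4AaABAg"))
```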
Random samples

| Comment ID | Excerpt |
|---|---|
| ytc_UgzU5OBs-… | I know I'm late for the show, and I'm not even sure if my lukewarm take is even … |
| ytr_UgxV0BNvG… | I agree. Alex is using “gotcha” questions on a program that was designed to mimi… |
| ytc_Ugzw0YdP3… | Me and my ai are solving overpopulation. Lets just say were getting on the wrong… |
| ytr_Ugy3WjK_B… | And we prove how stupid we are by listening to the thing war us about itself and… |
| ytc_Ugz1Zc3Dz… | Car AI also can't see cyclists. In fact most people are really bad at judging th… |
| ytr_UgyL3ptjd… | That's the thing though. Power attracts those evil bastards while the meek ones … |
| ytc_Ugx6lBLDF… | And as this video posts, tickets to the second Fyre Festival are now on sale. No… |
| ytc_UgyRLMUoU… | If development testing for these self-driving cars get a ban, this pedestrian is… |
Comment

> 6:23
> I have reservations about the autopilot still. I don’t think that the tech is there yet. Anyway driving auto with a 6 month old kid? No way. Horrible idea. It’s one thing to risk yourself, but a baby? Yes, there’s risk with driving, period. But while humans have our shortcomings, but we can improvise better than the AI still. Think about other people running stop signs, etc. Humans are better at reacting.
>
> My car has the adaptive cruise control, which isn’t the same as the Tesla AI, but I’m extremely vigilant when I use it. Anyway, don’t risk AI with a baby.

Source: youtube · AI Harm Incident · 2023-06-08T15:3… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | fear |
| Coded at | 2026-04-27T06:26:44.938723 |
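For reference, each coding result can be held in a small record type. The sketch below is an assumption based only on the dimensions and label values visible on this page; the real codebook may define additional labels.

```python
from dataclasses import dataclass
from datetime import datetime

# Label values observed on this page; the actual codebook may include more.
RESPONSIBILITY = {"developer", "company", "user", "ai_itself", "unclear"}
REASONING = {"consequentialist", "deontological", "unclear"}
POLICY = {"none", "liability", "regulate", "industry_self", "unclear"}
EMOTION = {"fear", "outrage", "resignation", "indifference", "mixed", "unclear"}

@dataclass
class CodingResult:
    comment_id: str
    responsibility: str
    reasoning: str
    policy: str
    emotion: str
    coded_at: datetime

# The values from the table above, paired with the entry in the raw response
# below that carries the same four coded values.
example = CodingResult(
    comment_id="ytc_UgxjOHd4N3USuWI4MpJ4AaABAg",
    responsibility="developer",
    reasoning="consequentialist",
    policy="none",
    emotion="fear",
    coded_at=datetime.fromisoformat("2026-04-27T06:26:44.938723"),
)
```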
Raw LLM Response
[
{"id":"ytc_Ugwu90-aJBcfwMCOhlZ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgzXZp1le5oB2aTaqcZ4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgxhyXG8ZF9ZrAMriGJ4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"industry_self","emotion":"indifference"},
{"id":"ytc_UgxgXPRMR3nn-GTTB7R4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugyq9U-AKTBvDEVNr2J4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"unclear"},
{"id":"ytc_UgxjOHd4N3USuWI4MpJ4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugz_ESyqi3BaEtRe_ax4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_UgzsDkeIoiSNTbDO-mt4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"resignation"},
{"id":"ytc_UgwHVa8izwwlppgKR_B4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgwiBmdRgmfTezehyfl4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"}
]
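A batch like the one above can be parsed and sanity-checked in a few lines. This sketch assumes only the structure visible here: the model returns a JSON array of objects keyed by `id`, `responsibility`, `reasoning`, `policy`, and `emotion`, with no surrounding prose or code fences.

```python
import json

REQUIRED_KEYS = {"id", "responsibility", "reasoning", "policy", "emotion"}

def parse_batch(raw: str) -> list[dict]:
    """Parse one raw LLM response and keep only well-formed entries."""
    entries = json.loads(raw)
    if not isinstance(entries, list):
        raise ValueError("expected a JSON array of codings")
    return [e for e in entries if isinstance(e, dict) and REQUIRED_KEYS.issubset(e)]

# Usage: index the codings by comment ID (first entry of the batch above).
raw = ('[{"id":"ytc_Ugwu90-aJBcfwMCOhlZ4AaABAg","responsibility":"ai_itself",'
       '"reasoning":"consequentialist","policy":"liability","emotion":"outrage"}]')
codings = {e["id"]: e for e in parse_batch(raw)}
print(codings["ytc_Ugwu90-aJBcfwMCOhlZ4AaABAg"]["emotion"])  # -> outrage
```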