Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect

- `ytc_UgyTtJrvR…`: "The only most nightmare scenario I can think of is being Zip tied to A chair ins…"
- `ytc_Ugzz6X4XS…`: "Claude is right on third one tho. Research is done on this topic and it would ca…"
- `ytr_UgyKcL0L7…`: "@aldebaran5108stop this nonsense digital artist “cheat” all the time. Just start…"
- `ytc_UgxQKayT2…`: "Lmao ai isnt thinking this is some wild shit these people dont know anything abo…"
- `ytc_Ugyf5zG-P…`: "I think the world would be better off with AI in charge. They can't be worse tha…"
- `rdc_dbz4wnx`: "If a car is self driving theres no reason to own it. The cost could be 150grand …"
- `ytc_UgwNMofyX…`: "# Unfortunately these sort of happenings ain't OBNORMAL to the \" POWERS that Be…"
- `ytc_UgwbEuxnk…`: "The ghost characters in Pac-Man were claimed to use \"AI\" in the 1980s. The clai…"
Comment

> Fast-moving car can't just "quickly stop", self-driving cars still abide the same laws of physics. Besides, emergency braking can put a car behind in danger too. It should immediately start braking, though, and choose a way of action that would turn direct hit into tangential one, of course.

- Source: youtube
- Topic: AI Harm Incident
- Posted: 2019-05-15T16:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:59.937377 |
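The coding result above assigns one value per dimension. As a minimal sketch of how such a record could be represented and validated, the following uses only the category values visible on this page — the full codebook may define additional categories, so the vocabularies here are an assumption:

```python
from dataclasses import dataclass

# Controlled vocabularies inferred from the values visible in this page's
# raw response; the real codebook may contain more categories (assumption).
RESPONSIBILITY = {"none", "company", "ai_itself", "user"}
REASONING = {"consequentialist", "deontological", "mixed"}
POLICY = {"none", "liability", "ban", "regulate"}
EMOTION = {"indifference", "outrage", "fear", "approval"}


@dataclass
class CodingResult:
    """One coded comment: a value for each of the four dimensions."""
    responsibility: str
    reasoning: str
    policy: str
    emotion: str

    def validate(self) -> bool:
        # True only if every dimension holds a known category value.
        return (self.responsibility in RESPONSIBILITY
                and self.reasoning in REASONING
                and self.policy in POLICY
                and self.emotion in EMOTION)


# The record shown in the table above.
result = CodingResult("none", "consequentialist", "none", "indifference")
print(result.validate())  # → True
```

Validating each batch this way catches malformed or hallucinated category labels before they enter the dataset.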
Raw LLM Response
```json
[
{"id":"ytr_UgxN7dBvLWbIzar2GFB4AaABAg.9FsYH3109wv9c-arIheGkR","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytr_UgwvKTI5iOHqUc77Qtp4AaABAg.9Emzn9yN-J89_jAbaATyHk","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytr_Ugyqv7_JVfSG_o5VWU54AaABAg.97cWD6IQqik9cTluYRw8CR","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytr_UgwrxIeuEeoiWz8JEz54AaABAg.8uvB89dBPgQ8uxpXqY3744","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytr_Ugz3QPBbEnbiLKmriZl4AaABAg.8usrc3ckB3h8xUS_XIK-8F","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytr_Ugz3QPBbEnbiLKmriZl4AaABAg.8usrc3ckB3h9DN54KkTRKX","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytr_UgwgfBkZSlB2KJ9236h4AaABAg.8oXh_73eiDm8pHQySSr6ZW","responsibility":"ai_itself","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
{"id":"ytr_UgzhU9chLDYdm8fJHgZ4AaABAg.8jLB6Q8mQes8kbtrx_V3HP","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytr_UgzhU9chLDYdm8fJHgZ4AaABAg.8jLB6Q8mQes8kcl_vQRvKG","responsibility":"none","reasoning":"mixed","policy":"ban","emotion":"fear"},
{"id":"ytr_UgwthwxckX7tANrl4m94AaABAg.8i8-3K5_VkE98bL4GvQY34","responsibility":"none","reasoning":"consequentialist","policy":"regulate","emotion":"indifference"}
]
```
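The raw response is a JSON array of per-comment records keyed by `id`, so looking up a single comment's codes (as the "Look up by comment ID" feature above does) amounts to indexing the batch by that field. A minimal sketch, using two records from the response above and a hypothetical `index_by_id` helper:

```python
import json

# Two records copied from the raw LLM response above.
raw_response = """[
{"id":"ytr_UgxN7dBvLWbIzar2GFB4AaABAg.9FsYH3109wv9c-arIheGkR","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytr_UgwvKTI5iOHqUc77Qtp4AaABAg.9Emzn9yN-J89_jAbaATyHk","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"}
]"""


def index_by_id(payload: str) -> dict:
    """Parse a raw response batch and map comment ID -> coded record.

    Hypothetical helper name, not part of the tool's actual API.
    """
    return {row["id"]: row for row in json.loads(payload)}


codes = index_by_id(raw_response)
row = codes["ytr_UgwvKTI5iOHqUc77Qtp4AaABAg.9Emzn9yN-J89_jAbaATyHk"]
print(row["responsibility"], row["emotion"])  # → company outrage
```

Indexing once per batch keeps each subsequent ID lookup O(1), which matters when inspecting individual comments across many coded batches.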