Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
Just wanted to say that Mercy Ships is actually a really solid charity and the r…
rdc_dpc4yus
I would go only if there is a similar call and effort to seek the end of Yemen w…
rdc_f1xwgvb
Cops with barely any training and a room temp IQ with hallucinating AI is a dyst…
ytc_UgxtZmYEz…
The robot just need an open hand 👋 slap to reset it .. i work with lots of mach…
ytc_Ugw4QYlR0…
If in "jailbroken" you mean "the reset every prompt had been stopped" then yes. …
ytc_UgydMA-Id…
So sick and tired of the AIs. An AI is only as good as the persons that program …
ytc_UgzqXynRP…
13:29 It is a human problem, but the problem isn’t with the consumer; it’s with …
ytc_UgxSRj_CU…
There is more meanig to those terminator movies than just killing humans. It for…
ytc_Ugz76W5pw…
Comment
Seems bonkers that there isn't something akin to a parking sensor that slows the car to a stop before impact in the direction of travel. Social media is littered with videos of Teslas crashing into cardboard boxes, dummy children and all sorts of other unrecognised impediments. I won't be happy until I see Elon Musk running out in front of a speeding Tesla in full self driving mode.
youtube
AI Harm Incident
2022-09-03T16:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | liability |
| Emotion | outrage |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
{"id":"ytc_UgwnN6Gfy7to_CUlYk14AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"indifference"},
{"id":"ytc_UgyCshs6uNh8BQlE6uF4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"approval"},
{"id":"ytc_Ugw-S9UVeTwsMijGxtl4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"outrage"},
{"id":"ytc_UgzT0cWw43g51ynIM-d4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"mixed"},
{"id":"ytc_UgyG4h40amsvpgUuWDh4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
{"id":"ytc_Ugznw35WDb47qd6BuKB4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"none","emotion":"fear"},
{"id":"ytc_UgxMgNUwF0snTvUVMTp4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyaIkhtf3VDkfvdsm94AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_Ugy8gvYOO1wDBZ0nlQd4AaABAg","responsibility":"user","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgwpLDC9nKzNvjhydVR4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"ban","emotion":"fear"}
]
```
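
A batch like the one above can be validated before it is stored as a coding result. This is a minimal sketch, assuming the category sets visible in this view are the allowed ones (the real codebook may define more values, and the function name `parse_coded_batch` is illustrative, not part of the tool):

```python
import json

# Allowed values per coding dimension. These sets are an assumption
# inferred from the examples shown on this page, not the full codebook.
SCHEMA = {
    "responsibility": {"company", "developer", "user", "distributed", "ai_itself"},
    "reasoning": {"consequentialist", "deontological", "virtue", "mixed"},
    "policy": {"regulate", "liability", "ban", "none"},
    "emotion": {"outrage", "fear", "approval", "indifference", "mixed"},
}

def parse_coded_batch(raw: str) -> dict:
    """Parse a raw LLM response into {comment_id: codes}, rejecting
    entries that are missing a dimension or use an unknown category."""
    coded = {}
    for entry in json.loads(raw):
        cid = entry["id"]
        codes = {}
        for dim, allowed in SCHEMA.items():
            value = entry[dim]  # KeyError here flags a missing dimension
            if value not in allowed:
                raise ValueError(f"{cid}: unexpected {dim} value {value!r}")
            codes[dim] = value
        coded[cid] = codes
    return coded
```

Looking up a single comment's codes then becomes a plain dictionary access on the parsed result, which is what the "Look up by comment ID" view above does.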