Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- Alex: CHAtGPT you're starting to sound a little bit like Jordan Peterson ChatGPT… (ytc_Ugw1zfMgp…)
- The thumbnail here is a little misleading as it is not possible yet for articula… (ytc_UgxkUPsWx…)
- If AI gets rid of humans, what will be their purpose? They’ll have nothing to do… (ytc_Ugzp02K36…)
- The guy with the black carpet stuck to his head is so dull.....yet I wasted my t… (ytc_UgxEiXQ5h…)
- Ai vs human, the problem with AI is we don't control it, it's like the differant… (ytc_Ugxmtvoal…)
- The problem isn't AI and automation... technology will naturally take all our jo… (ytc_UgwGlu7a9…)
- Yo-yo: "Why would he do that? Hasn't he seen those 80's American Robot movies." … (ytc_UgzPFIwwt…)
- the worst thing you can do in this era of humanity is contribute to the vast amo… (ytc_Ugzu1_1Nd…)
Comment
But that's the problem. With driverless cars, people will naturally start tuning out their surroundings as the computer takes over. The same thing happened when automatics became popular, and then again when car navigation became popular. This is the ultimate culmination in that. If we don't have to pay attention, because it is being done for us, our minds wander. The safety driver was wrong, but this is a perfect example of how dangerous these cars can be when the technology isn't 100%.
Platform: youtube · Topic: AI Harm Incident · Posted: 2018-03-22T08:0… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | virtue |
| Policy | unclear |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
{"id":"ytr_UgzQOZ0xeR_MvbxaTPl4AaABAg.8e4YBJWuzkA8e5gQOfUZmh","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"},
{"id":"ytr_Ugz7AxvoeP0cNLIk5-p4AaABAg.8e4WM0WYx8O8e6NBFp_Rvl","responsibility":"none","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},
{"id":"ytr_Ugz7AxvoeP0cNLIk5-p4AaABAg.8e4WM0WYx8O8e6SYyLtDOe","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytr_UgzfgGIMf0YW9DpIzxV4AaABAg.8e4W678ktNY8e54DRFXGnj","responsibility":"user","reasoning":"virtue","policy":"unclear","emotion":"fear"},
{"id":"ytr_UgzEnXkqTqC0O-XZLOV4AaABAg.8e4VnUp5aqP8e5xQyebu9V","responsibility":"user","reasoning":"deontological","policy":"unclear","emotion":"outrage"},
{"id":"ytr_UgyU2UngCDg136H8NUx4AaABAg.8e4GR1FHZx08e5dkLBbx8h","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytr_UgzF7MnIXWPYaBn_UUV4AaABAg.8e4FfPJnYWM8e4PBgB0RqO","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"resignation"},
{"id":"ytr_UgyV019tyG9R0g7rw7F4AaABAg.8e4EMi_4IbY8e6ooGklDfK","responsibility":"ai_itself","reasoning":"deontological","policy":"unclear","emotion":"indifference"},
{"id":"ytr_UgyiyW43qT6sIpx0xW14AaABAg.8e4E7Ts8Gs58e4l71ZHvLA","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytr_UgyiJmvjw7Yhygp30C14AaABAg.8e4D0yxQ5F_8e68DNiMAmd","responsibility":"unclear","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"}
]
```
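A batch response like the one above can be parsed and sanity-checked before the codes are stored. The sketch below is a minimal, hypothetical validator: the allowed value sets are inferred only from the codes visible in this sample output (the project's full codebook may define others), and the function name `parse_llm_batch` is an illustration, not part of the actual pipeline.

```python
import json

# Allowed codes per dimension, inferred from this sample output
# (hypothetical -- the real codebook may permit additional values).
ALLOWED = {
    "responsibility": {"user", "company", "ai_itself", "distributed", "none", "unclear"},
    "reasoning": {"consequentialist", "deontological", "virtue", "unclear"},
    "policy": {"regulate", "none", "unclear"},
    "emotion": {"fear", "outrage", "approval", "resignation",
                "indifference", "mixed", "unclear"},
}

def parse_llm_batch(raw: str) -> list[dict]:
    """Parse a raw LLM batch response, keeping only records whose
    codes all fall within the allowed value sets."""
    records = json.loads(raw)
    return [
        rec for rec in records
        if all(rec.get(dim) in codes for dim, codes in ALLOWED.items())
    ]

# Example: one well-formed record passes, an out-of-vocabulary code is dropped.
good = '[{"id":"ytr_example","responsibility":"user","reasoning":"virtue","policy":"unclear","emotion":"fear"}]'
bad = '[{"id":"ytr_example2","responsibility":"bogus","reasoning":"virtue","policy":"unclear","emotion":"fear"}]'
print(len(parse_llm_batch(good)), len(parse_llm_batch(bad)))
```

Dropping (rather than repairing) invalid records keeps the stored codes trustworthy; rejected IDs can be re-queued for recoding.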