Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- "I went back and listened to the section twice, and your argument on the topic of…" (ytc_Ugxg1StcQ…)
- "I remember I was roleplaying in berry avenue cuz I was bored and this girl was b…" (ytc_Ugw8nfeUO…)
- "She is right. No matter how many accidents or deaths coming from Tesla autopilot…" (ytc_UgzW7kTwl…)
- "You can keep drawing the old style or learn how to program and feed an AI to get…" (ytc_UgyC4Kuch…)
- "I realized there is one very important difference. When we humans read a novel, …" (ytc_Ugz3SYlDV…)
- "Only AI will be able to monitor the billions of interactions online. Wait until …" (ytc_UgwxB62sm…)
- "i got a feeling you made up this aj to bash ai and wrapped this agenda in decade…" (ytc_UgxD8FRJH…)
- "It's awful. Part of the tech industry's push to replace every good vocation so a…" (ytc_UgwWTmefN…)
Comment
BOOOO. This entire question is moot. Know why?
"Your car doesn't have time to stop" is a stupid condition. Self-driving cars will be programmed to drive at a safe distance from any car. *If something can fall off the car in front of you and you can't slow down to avoid it, you are following too close.* *PERIOD.*
Also, if you're the car behind the driverless car, if they slam on their brakes and you hit them, you are following too close/aren't paying attention too. You should both be able to slam on your brakes and slow down together (with you a second or so behind) and you should not hit them.
That's the rules of TODAY'S roads, let alone with driverless cars!
Thankfully, these kinds of armchair thinking distracted drivers won't be an issue when most cars are driverless. When that happens, an entire lane of cars can slam on their brakes and no one will have to hit each other, because they'll all be driving safely--by the computer.
youtube · AI Harm Incident · 2017-06-15T17:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | outrage |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
{"id":"ytc_UggttszQdOIT0XgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugjeinq77JWQDngCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugi9KaGK7Pz36HgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgjT75c_hYfYXngCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UghemGmrqvMpTHgCoAEC","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
{"id":"ytc_Ugg9VdcEV0AJu3gCoAEC","responsibility":"none","reasoning":"unclear","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgiMecIFIV9nLHgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugio-4_UVi4xP3gCoAEC","responsibility":"user","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_Ugh2ii7t431fIXgCoAEC","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_UggBZZ06kKai4ngCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"}
]
```
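A raw response like the one above can be turned into a lookup table keyed by comment ID, with a sanity check on the coded values. This is a minimal sketch, not the project's actual pipeline code; the allowed-value sets below are only the values observed in this sample (the full codebook may define more), and `parse_coding_response` is a hypothetical helper name.

```python
import json

# Dimension values observed in this sample response only; the
# project's full codebook may allow additional values.
OBSERVED_VALUES = {
    "responsibility": {"none", "user", "developer", "company"},
    "reasoning": {"consequentialist", "deontological", "unclear"},
    "policy": {"none", "regulate", "ban", "liability"},
    "emotion": {"outrage", "indifference", "approval"},
}

def parse_coding_response(raw: str) -> dict:
    """Parse a raw LLM coding response into {comment_id: codes}.

    Raises ValueError if a coded value falls outside the observed sets,
    which catches malformed or hallucinated model output early.
    """
    coded = {}
    for row in json.loads(raw):
        comment_id = row["id"]
        codes = {k: v for k, v in row.items() if k != "id"}
        for dim, value in codes.items():
            allowed = OBSERVED_VALUES.get(dim)
            if allowed is not None and value not in allowed:
                raise ValueError(f"{comment_id}: unexpected {dim}={value!r}")
        coded[comment_id] = codes
    return coded

raw = ('[{"id":"ytc_Ugio-4_UVi4xP3gCoAEC","responsibility":"user",'
       '"reasoning":"deontological","policy":"liability","emotion":"outrage"}]')
coded = parse_coding_response(raw)
# coded["ytc_Ugio-4_UVi4xP3gCoAEC"] → {"responsibility": "user", ...}
```

Keying by ID rather than by list position matters here because the model codes a batch of comments per call, and nothing guarantees the output order matches the input order.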