Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- "If the driver infront of me suddenly slams on the brakes giving …" (ytr_UgjY_cocR…)
- "One wonders how govts are governing so many people without having at least start…" (ytc_UgwVtYBF7…)
- "The problem now for me is how to survive in the next few years. What is happenin…" (ytc_UgylflwkI…)
- "Bro, Meta AI needs AICarma ASAP. Mixing up cats and kids? That's a next-level oo…" (ytc_UgxokJCuJ…)
- "I see The Terminator coming to a city near you or a rural area. That's what I se…" (ytc_UgxPh1yFt…)
- "Exactly, this should stop right now. Or it will have extremely serious repercuss…" (ytr_Ugycb7zkx…)
- "A.I. can fuck up any human as it has FOCUS which human dont have at all…" (ytc_UgwmjuPYN…)
- "im confused he was shot by who its not like the ai did it?? i assume he posted s…" (ytc_Ugw4NyqIa…)
Comment
Major flaw, his initial assumption is incorrect. A self driving car is programmed to obey the "4 second" rule. So if the vehicle in front comes to an immediate halt (or drops a large object) it will be able to come to a safe stop. If the self driving car is not obeying the "4 second" rule, then it is faulty and requires an immediate service. This means the responsibility is the lack of maintenance of the owner. His argument is similar to saying "you are using your computer and suddenly it catches fire because the owner has blocked the fans with paper. Is this premeditated destruction on behalf of the computer manufacturer?"
youtube · AI Harm Incident · 2015-12-14T06:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | deontological |
| Policy | liability |
| Emotion | mixed |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
{"id":"ytc_UgjItq0wivzFzHgCoAEC","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugi3UjQWwYBga3gCoAEC","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"mixed"},
{"id":"ytc_Ugil7mqZ96nRsXgCoAEC","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UgizhDQN0tfbqXgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgjdML6iup9kxHgCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgiDITa8mouAQXgCoAEC","responsibility":"distributed","reasoning":"mixed","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Uggk2g1O4hSYuXgCoAEC","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgjdS9_U-Ytg-3gCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"},
{"id":"ytc_UgjIfcNAortGP3gCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UghasmfeHrS-OHgCoAEC","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"mixed"}
]
```
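A raw batch response like the one above can be parsed and sanity-checked before the codes are stored. The sketch below is a minimal example: the `SCHEMA` sets are inferred only from the values visible on this page (the full codebook may define more), and the `ytc_`/`ytr_` ID prefixes are assumed from the sample IDs shown.

```python
import json

# Allowed values per coding dimension, inferred from the labels visible
# on this page -- an assumption, not the authoritative codebook.
SCHEMA = {
    "responsibility": {"company", "developer", "user", "ai_itself", "distributed", "none"},
    "reasoning": {"deontological", "consequentialist", "mixed"},
    "policy": {"liability", "regulate", "industry_self", "none"},
    "emotion": {"outrage", "approval", "indifference", "mixed"},
}

def validate_batch(raw: str) -> list[dict]:
    """Parse a raw LLM batch response and reject malformed rows."""
    rows = json.loads(raw)
    for row in rows:
        # Comment IDs on this page start with ytc_ (comment) or ytr_ (reply).
        if not row.get("id", "").startswith(("ytc_", "ytr_")):
            raise ValueError(f"bad comment id: {row.get('id')!r}")
        for dim, allowed in SCHEMA.items():
            if row.get(dim) not in allowed:
                raise ValueError(f"{row['id']}: invalid {dim}={row.get(dim)!r}")
    return rows

# One well-formed row taken from the response above.
raw = ('[{"id":"ytc_Ugi3UjQWwYBga3gCoAEC","responsibility":"company",'
       '"reasoning":"deontological","policy":"liability","emotion":"mixed"}]')
rows = validate_batch(raw)
print(len(rows))  # 1
```

Validating before storage catches the common LLM failure modes for this kind of task: truncated JSON, hallucinated IDs, and labels outside the codebook.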