Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect

- “The FOMO on Kardeun is insane! The next leg up is going to be massive.…” (ytc_UgxkmykN-…)
- “I think all this discussion about "AI replacing developers" ignores one crucial …” (rdc_mpg505n)
- “But then... what happens to the board members!? Do they get replaced my an AI bo…” (ytc_Ugzff53kF…)
- “Yet! Society will reate and Ai will effectively control many. This a relationshi…” (ytc_UgxSynWrX…)
- “@joeblow2286 yeah she's definitely whining because she doesn't want her work get…” (ytr_UgwZNIviA…)
- “I fix robots for a living so I’m good until they make a robot that fixes other r…” (ytc_UgzeEHc8S…)
- “He shrunk the kids in ‘89 / He blew up the kid in ‘92 / He warned us about AI in ‘25…” (ytc_UgzXBZs_5…)
- “It amazes me how people trust technology that man creates. Everything about a d…” (ytc_UgzXkMP5s…)
Comment
When approaching a scene with emergency or police vehicles on the side of the road, motorists are REQUIRED to move left if possible to provide a safe space for the police and emergency crews -- the Tesla made zero effort to move left even though there did not appear to be anyone preventing that move. Additionally, the speed should have been greatly reduced but the impact was at more than 50mph. NHTSA and the NTSB should ban autopilot or whatever name is used to describe autonomous driving across the entire USA with exception in only a few places that have chosen a death wish. But, we live in an era where billionaires do as they please and governments at all levels appear willing to endanger the public in order to placate the billionaire class.
And that brings us to the Cybertaxi, with no steering wheel and no brake pedal -- how does such a vehicle do away with a human driver altogether when a regular Tesla, with all the hardware needed for Autopilot or FSD, requires a human hand on the wheel?
Source: youtube · AI Harm Incident · 2024-10-26T18:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
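Each coded row can be checked against the label sets the coder is expected to use. The sets below are inferred only from the values that appear in the samples on this page; the actual codebook may define additional labels, so treat this as a minimal sketch rather than the project's real schema.

```python
# Allowed values per coding dimension, inferred from the sample codes shown
# on this page; the real codebook may include more labels.
SCHEMA = {
    "responsibility": {"company", "user", "ai_itself", "distributed"},
    "reasoning": {"consequentialist", "deontological", "mixed"},
    "policy": {"ban", "regulate", "liability", "none", "unclear"},
    "emotion": {"fear", "outrage", "mixed", "resignation", "indifference"},
}

def validate(record):
    """Return (dimension, bad_value) pairs for any out-of-schema codes."""
    return [(dim, record.get(dim)) for dim in SCHEMA
            if record.get(dim) not in SCHEMA[dim]]

# The coding result shown above, as a record.
row = {"responsibility": "ai_itself", "reasoning": "consequentialist",
       "policy": "unclear", "emotion": "fear"}
print(validate(row))  # [] — the row conforms to the inferred schema
```

A non-empty return value flags which dimension drifted outside the expected labels, which is useful for catching malformed model output before it is stored.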
Raw LLM Response
[
{"id":"ytc_UgzQ482mk7AAlmeAd854AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
{"id":"ytc_Ugwn3RhrYM-bEmfzhEV4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgyuiTCc9Jgr7aqCPz14AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugz6r5vUEb2TTt57CfZ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgwQmizgU6DGL0VC9uZ4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgxgkqOmyMVdhHOCBNp4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"mixed"},
{"id":"ytc_UgyWhxzEsg9xwon7Ry54AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"unclear","emotion":"resignation"},
{"id":"ytc_Ugx4ycPMWYTT3mF6twh4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgyyLYsj9QUrQidExTl4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugxfa_1GIRYapbWzojx4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"}
]
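The "look up by comment ID" step above amounts to parsing the model's JSON array and indexing the records by their `id` field. A minimal sketch, using an excerpt of the actual response shown here (the helper name `index_by_id` is illustrative, not part of the tool):

```python
import json

# Raw LLM response: a JSON array of per-comment codes (excerpt from above).
raw_response = '''
[
 {"id":"ytc_UgzQ482mk7AAlmeAd854AaABAg","responsibility":"company",
  "reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
 {"id":"ytc_Ugz6r5vUEb2TTt57CfZ4AaABAg","responsibility":"ai_itself",
  "reasoning":"consequentialist","policy":"unclear","emotion":"fear"}
]
'''

def index_by_id(response_text):
    """Parse the model output and index each coded record by comment ID."""
    records = json.loads(response_text)
    return {rec["id"]: rec for rec in records}

codes = index_by_id(raw_response)
# Fetch the codes for the comment whose result is shown in the table above.
print(codes["ytc_Ugz6r5vUEb2TTt57CfZ4AaABAg"]["emotion"])  # fear
```

The retrieved record matches the Coding Result table for this comment (responsibility `ai_itself`, reasoning `consequentialist`, policy `unclear`, emotion `fear`).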