Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
Bruhh ppl in comment are so dumb.. they are the first to get replaced lol…
ytc_Ugx8WtdZo…
Very true , but also consider that China generates its footprint by producing th…
rdc_gx824xr
A Terminator makes good blockbusters but real world AI is much worse. A Terminat…
ytc_UgwT2ROP1…
When powerful men and women have to have a meeting about AI… people have no idea…
ytc_UgwAMUTR-…
reason ai detection is so shoddy is cuz they use ai to detect ai, and what does …
ytc_Ugy2R3jte…
Will AI change things? Yes, no doubt about that.
I just hope that we (as humans…
ytc_UgweJwn6P…
We should start using the facial recognition software on ICE. Test it out to see…
ytc_Ugx9MIa_u…
Always time well spent listing to Hinton speak. One thing on "we are special": i…
ytc_Ugw3LHjxZ…
Comment
As others have brought up, I feel like this scenario is flawed form the get-go. This is assuming that self-driving cars will make the same reckless decisions as humans in following behind a truck too closely to avoid any sort of potential accidents. I imagine a self-driving car, unlike people, would actually do its best to maintain proper distance from all other vehicles when possible, thus avoiding a scenario like this altogether.
youtube
AI Harm Incident
2015-12-08T17:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
[{"id":"ytc_UgizunohajILCHgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgjjWOUDi8MzcHgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgjQvTuYsrqOtngCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_UgghF14lWrWg93gCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UghSobsLJzKwTngCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UghTsPIeRMcNT3gCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgiGpAhmNNMkf3gCoAEC","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgjlPAxVCSrTmHgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugju10Xr0tXdF3gCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugjc3KGPZNZyqngCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"outrage"}]
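The raw response above is a JSON array with one object per coded comment, each carrying the four coding dimensions shown in the table (responsibility, reasoning, policy, emotion). A minimal sketch of the "look up by comment ID" step, assuming only that schema (the variable names are illustrative, and the sample here uses two records excerpted from the response above):

```python
import json

# Raw LLM response: a JSON array of coded comments, one object per comment.
# These two records are excerpted from the full response above.
raw_response = '''[
 {"id":"ytc_UgizunohajILCHgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
 {"id":"ytc_UgiGpAhmNNMkf3gCoAEC","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]'''

# Index the coded records by comment ID so any coded comment
# can be pulled up directly, as the page's lookup does.
codes = {rec["id"]: rec for rec in json.loads(raw_response)}

record = codes["ytc_UgiGpAhmNNMkf3gCoAEC"]
print(record["responsibility"], record["policy"], record["emotion"])
# → company regulate fear
```

In practice the full ten-record array would be parsed the same way; building the dict once makes each subsequent ID lookup constant-time.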