Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- `ytc_Ugylf5Qir…`: I agree jobs wil dissapear fully, the quenstion will be how will we code ai, to …
- `ytc_Ugw5A-ch0…`: If AI is aware of MAGA, I understand why it wants to get rid of humans. Hopefull…
- `ytc_Ugxmm5p_H…`: Ai is how business men try to draw and throw a tantrum when the teacher says to …
- `ytc_UgxTMX9lH…`: "There will be no warning. No time to react, no hope for resistance. AGI will em…
- `ytr_UgzXXhVd_…`: The interview confuses a tool with its context. To say "AI trains on water" is l…
- `ytc_UgxvMaa1m…`: Basically the future: "hey dad what are you doing?" / "art" / "oh that's cool, ...he…
- `ytr_UgyfUivsj…`: @plinio2housename Thank you for commenting, and you're absolutely right! Maybe w…
- `ytc_Ugx8jtL87…`: After watching some of the key points of this podcast, I think we should realize…
Comment
We can think of weirder and weirder "what-ifs" but the above scenario with the truck and the boxed-in situation is feasible. Five people leaping in front of your car? Five people get hit. It's a driverless car. It can see the five people in the street and the lights that let them cross. If it can stop, it will. If not, the five people get hit. It should never sacrifice the driver in exchange for saving the lawbreaking people jumping in front of the car.
youtube · AI Harm Incident · 2015-12-09T00:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | resignation |
| Coded at | 2026-04-27T06:24:59.937377 |
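Each coded comment carries four closed-set dimensions (responsibility, reasoning, policy, emotion). A minimal validation sketch in Python, where the allowed value sets are inferred only from the records visible in this dump; the real codebook may define additional values (an assumption):

```python
# Allowed values per dimension, inferred from the visible records only
# (assumption: the actual codebook may contain more values).
DIMENSIONS = {
    "responsibility": {"ai_itself", "user", "developer", "distributed", "none"},
    "reasoning": {"consequentialist", "deontological", "unclear"},
    "policy": {"none", "liability", "industry_self", "regulate", "ban"},
    "emotion": {"resignation", "indifference", "approval", "mixed", "fear"},
}

def validate(record: dict) -> list[str]:
    """Return the names of dimensions whose value falls outside the known set."""
    return [dim for dim, allowed in DIMENSIONS.items()
            if record.get(dim) not in allowed]

# The record shown in the Coding Result table above passes cleanly:
print(validate({"responsibility": "ai_itself", "reasoning": "consequentialist",
                "policy": "none", "emotion": "resignation"}))  # []
```

A check like this catches the usual failure mode of batch coding, where the model invents an out-of-codebook label for one record.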
Raw LLM Response
```json
[
{"id":"ytr_UgggitcG_CbrUXgCoAEC.87WDKCb8uB_87Wc8Git_dg","responsibility":"user","reasoning":"consequentialist","policy":"liability","emotion":"indifference"},
{"id":"ytr_UgiwrXyY1rZ69XgCoAEC.87WAOkvgKe087ZUsAi0XrE","responsibility":"developer","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"},
{"id":"ytr_UgiwrXyY1rZ69XgCoAEC.87WAOkvgKe087Zk-pe2Tvs","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytr_UgjFbGyekK77fngCoAEC.87VyK9Y8dlO87WlJMEeWoL","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytr_UgiiYSCGtUOQQ3gCoAEC.87VxmXkagiW87W5Qy0QLaG","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
{"id":"ytr_UghlFJ0pJ4lt_ngCoAEC.87Vxapt4mjd87W8NWV8_40","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytr_UghfG4rKbyLwlXgCoAEC.87VxK1IcVdT87W9BZPSPbn","responsibility":"none","reasoning":"deontological","policy":"ban","emotion":"mixed"},
{"id":"ytr_UghiyJc91JDD0XgCoAEC.87Vw0Yij8ek87VyZ_yCQtY","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"fear"},
{"id":"ytr_UghiyJc91JDD0XgCoAEC.87Vw0Yij8ek87VzogUdvsL","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytr_Ugg-LAtK9Y2urngCoAEC.87VuXpLFzck87VxWEbSmkv","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"approval"}
]
```
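The raw response is a JSON array keyed by comment ID, so the "look up by comment ID" view amounts to parsing the array and indexing it. A minimal Python sketch, using two records copied verbatim from the response above (the first is the one whose dimensions match the Coding Result table):

```python
import json

# Two records copied verbatim from the raw LLM response above.
raw_response = '''[
{"id":"ytr_UgjFbGyekK77fngCoAEC.87VyK9Y8dlO87WlJMEeWoL","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytr_UgiiYSCGtUOQQ3gCoAEC.87VxmXkagiW87W5Qy0QLaG","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"approval"}
]'''

# Index the array by comment ID for constant-time lookup.
records = {r["id"]: r for r in json.loads(raw_response)}

coded = records["ytr_UgjFbGyekK77fngCoAEC.87VyK9Y8dlO87WlJMEeWoL"]
print(coded["responsibility"], coded["emotion"])  # ai_itself resignation
```

In practice the model output should be parsed with error handling (`json.JSONDecodeError`), since LLMs occasionally emit trailing text around the array.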