Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment
I feel as though in this scenario, the self driving car should chose the option that statistically will cause the least amount of damage. And sure, for this highly improbable it may be safer for the motorcyclist to not wear a helmet, in the grand scheme of things the person who wears a helmet is still more likely to have less serious injuries than someone who doesn't so that point doesn't really matter.
Platform: youtube
Topic: AI Harm Incident
Posted: 2022-06-05T23:2…
Likes: 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id":"ytc_Ugz3NDLJm5vOL8_5Ki14AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugzzec3Twn63agGPyDB4AaABAg","responsibility":"user","reasoning":"contractualist","policy":"industry_self","emotion":"approval"},
  {"id":"ytc_Ugz8wJCpFoQ2L1TPwT94AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgxDg--Hfm2lG0jR6Ut4AaABAg","responsibility":"user","reasoning":"unclear","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgxWoJcDFo_ekiyvEmt4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugx8hBTPSf8XBnRxR9t4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"ban","emotion":"outrage"},
  {"id":"ytc_Ugw4jM93_9cAtGe9wgN4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgwCoTNgNzS8ucWLuet4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgyKUDGVaTLJ7c09rdd4AaABAg","responsibility":"company","reasoning":"contractualist","policy":"regulate","emotion":"approval"},
  {"id":"ytc_UgxIAyCois5Y25HZHYZ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"}
]
```
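Because the model returns one JSON array per batch, matching a coded record back to its comment is a simple lookup by `id`. Below is a minimal sketch of that step; the two records are copied verbatim from the response above (the rest are omitted for brevity), and the variable names are illustrative, not part of any actual pipeline.

```python
import json

# Two records copied from the raw batch response shown above (illustrative subset).
raw_response = '''[
{"id":"ytc_Ugz3NDLJm5vOL8_5Ki14AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwCoTNgNzS8ucWLuet4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"}
]'''

# Index the coded records by comment ID so any comment can be looked up directly.
codes_by_id = {rec["id"]: rec for rec in json.loads(raw_response)}

# The first record corresponds to the Coding Result table shown above.
record = codes_by_id["ytc_Ugz3NDLJm5vOL8_5Ki14AaABAg"]
print(record["responsibility"], record["emotion"])  # -> ai_itself indifference
```

The same dictionary-by-`id` pattern also makes it easy to detect batch failures: any comment ID sent to the model but missing from `codes_by_id` was dropped from the response.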