Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- "AI isn't there yet. Most companies just saw it as an excuse to cut jobs and boos…" (ytc_UgzSeidjw…)
- "Maybe AI will recognize that we dont need politicians because they are ruining e…" (ytc_UgxLkD1IE…)
- "The secrete is that if we keep talking about this subject, AI will eventually be…" (ytc_Ugx0kNmV4…)
- "AI is for lazy people. And even if you want to save “time and effort” is worthle…" (ytc_UgyZV9CLx…)
- "@larslarsen5414 they might also charge for specific data sets when companies mak…" (ytr_UgwagqtOQ…)
- "A lot of IFs need to become true for this scenario to become true: e.g. Hardware…" (ytc_UgxKkHJzJ…)
- "I forgot who said this but my favorite comparison is that ai art is the TEMU kno…" (ytc_UgweaI-qr…)
- "It’s just a picture, how is it different from any other AI art they made…" (ytc_UgzKDM52w…)
Comment

> The car would just stop because it would be able to control it's following distance so it had enough stopping distance, I'm sick of hearing this devil's advocate bullshit when it comes to self driving cars. Most of the problems raised are extremely unlikely, and made more unlikely or next to impossible due to the autonomous nature of the car which take out the need for reaction times and human variables which are almost always the reason behind road accidents. Also the issue with who would be responsible of being resolved as Volvo had already stepped up and said that they would take responsibility and it seems as though many other major companies will soon follow.

youtube · AI Harm Incident · 2015-12-15T14:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | outrage |
| Coded at | 2026-04-27T06:24:59.937377 |
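A coded result like the one in the table above could be carried as a small typed record. This is a minimal sketch, assuming field names mirror the dimension labels; `CodingResult` and the example `comment_id` are hypothetical, since the page does not show the selected comment's full ID.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class CodingResult:
    """One comment's codes across the four dimensions shown above."""
    comment_id: str
    responsibility: str
    reasoning: str
    policy: str
    emotion: str
    coded_at: str  # ISO-8601 timestamp

# The coding shown in the table above (comment_id is a placeholder):
result = CodingResult(
    comment_id="ytc_example",
    responsibility="none",
    reasoning="consequentialist",
    policy="none",
    emotion="outrage",
    coded_at="2026-04-27T06:24:59.937377",
)
```

A frozen dataclass keeps coded records immutable once written, which is useful when the same records feed later aggregation steps.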
Raw LLM Response
```json
[
{"id":"ytc_Ugh8rhSAIlTrjHgCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugj8lU9CWdFWf3gCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UghFWNaMvDiVGngCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgggdoqiWWgg1HgCoAEC","responsibility":"none","reasoning":"unclear","policy":"industry_self","emotion":"approval"},
{"id":"ytc_UggYKBs14QZPoHgCoAEC","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugi9xnzyNGEqq3gCoAEC","responsibility":"society","reasoning":"contractualist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugim4SKNBlRtfHgCoAEC","responsibility":"none","reasoning":"unclear","policy":"industry_self","emotion":"approval"},
{"id":"ytc_UgjXjm0R3slUzXgCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UghxbQR1FcrERngCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_UgipNWetGSuz7ngCoAEC","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"approval"}
]
```
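Before raw batch responses like the one above are accepted into the dataset, each record should be checked for a well-formed ID and for codes the scheme actually defines. A sketch of such a validator, assuming the allowed values are those seen in this sample (the full codebook may define more):

```python
import json

# Allowed codes inferred from the sample response above; the full
# codebook may define additional values for each dimension.
ALLOWED = {
    "responsibility": {"none", "developer", "ai_itself", "society"},
    "reasoning": {"unclear", "consequentialist", "deontological", "contractualist"},
    "policy": {"none", "industry_self", "regulate"},
    "emotion": {"indifference", "approval", "fear", "outrage"},
}

def validate_batch(raw: str) -> list:
    """Parse a raw LLM batch response and reject malformed records."""
    records = json.loads(raw)
    for rec in records:
        # IDs on this page use a ytc_ (comment) or ytr_ (reply) prefix.
        if not rec.get("id", "").startswith(("ytc_", "ytr_")):
            raise ValueError(f"unexpected id: {rec.get('id')!r}")
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{rec['id']}: bad {dim}={rec.get(dim)!r}")
    return records

raw = '[{"id":"ytc_Ugh8rhSAIlTrjHgCoAEC","responsibility":"none",' \
      '"reasoning":"unclear","policy":"none","emotion":"indifference"}]'
records = validate_batch(raw)
print(len(records))  # 1
```

Failing loudly on an unknown code catches the common failure mode where the model improvises a label outside the codebook, so bad records never silently enter downstream tallies.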