Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples

- "This was an incredible, in-depth conversation! I love how it covers everything f…" — ytc_UgzzpAcF_…
- "Sure super intelligence being harmful to humanity is an interesting philosophic …" — ytc_UgzCdcvzP…
- "That warning about dangers from AI was not for nothing. I seriously think anybod…" — ytc_UgxczzyvP…
- "As an artist myself, I enjoy perusing the AI channels for the decent few ideas I…" — ytc_UgwifKTWX…
- "Great video. I'm here because you appeared on the Daily Wire (https://www.yout…" — ytc_UgyG5SSWE…
- "Elites blaming Ai for destroying the world lol. As if they aren't the ones that …" — ytc_UgzQQD1DH…
- "It sounds like you’re looking for something more entertaining! While the convers…" — ytr_UgxzykIZX…
- "The intention is 2 fold. 1) find a way to convince the board for these companies…" — ytc_UgwkQQlvc…
Comment
Not only will it be long term, eventually cars will drive themselves, whether you like it or not. The driver is ultimately still responsible for what happens and should be prepared to take over at any time. The accidents are happening because the drivers are not paying attention and taking over when the system makes an error. The problem is twofold: 1) the marketing is misleading, and 2) the drivers are not paying attention as they should be and as a result are unable to intervene to prevent the accident. The marketing needs to change… it's not full self-driving… and the user should have to complete education on how the system works, including its limitations, before they are able to use the features.
youtube
AI Harm Incident
2025-09-28T13:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | mixed |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id":"ytc_UgwuQRD2kupySFWdpTh4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgyfovMhdm3eLeQV9G14AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgzXhs9F4VXJJur7V4d4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_Ugxm7nng80gPtO_qWEx4AaABAg","responsibility":"company","reasoning":"deontological","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgznIuALORolMMREhxh4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgzRAWB1j9sIV36PwAB4AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_Ugw4a-Xy9TqWSUF5xWB4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgwnoKYJtEYMF6GUwbF4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgwvKXyJ3tTKFOnqNpd4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugw1gx3jQu_InJLPGqp4AaABAg","responsibility":"user","reasoning":"deontological","policy":"liability","emotion":"outrage"}
]
```
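A raw response like the one above should be validated before the codes are stored, since an LLM can emit malformed JSON or labels outside the codebook. Below is a minimal sketch of such a check in Python. The allowed label sets are inferred from the values visible in this page (the real codebook may define additional labels), and `validate_coded_batch` is a hypothetical helper name, not part of this tool.

```python
import json

# Label sets inferred from the sample output above; the actual
# codebook may include additional values (assumption).
ALLOWED = {
    "responsibility": {"user", "company", "ai_itself", "distributed", "none", "unclear"},
    "reasoning": {"consequentialist", "deontological", "unclear"},
    "policy": {"liability", "none", "unclear"},
    "emotion": {"fear", "outrage", "approval", "indifference", "mixed"},
}

def validate_coded_batch(raw: str) -> list[dict]:
    """Parse a raw LLM response and reject malformed or off-schema rows."""
    rows = json.loads(raw)  # raises ValueError on malformed JSON
    if not isinstance(rows, list):
        raise ValueError("expected a JSON array of coded comments")
    for row in rows:
        # Comment IDs in this dataset start with ytc_ (comments) or ytr_ (replies).
        if not str(row.get("id", "")).startswith(("ytc_", "ytr_")):
            raise ValueError(f"bad comment id: {row.get('id')!r}")
        for dim, allowed in ALLOWED.items():
            if row.get(dim) not in allowed:
                raise ValueError(f"{row['id']}: {dim}={row.get(dim)!r} not in codebook")
    return rows

raw = ('[{"id":"ytc_Ugw1gx3jQu_InJLPGqp4AaABAg","responsibility":"user",'
       '"reasoning":"deontological","policy":"liability","emotion":"outrage"}]')
coded = validate_coded_batch(raw)
print(coded[0]["emotion"])  # outrage
```

Rows that fail validation can then be queued for re-coding rather than silently written with an off-schema label.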