Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
If an autonomous robot kills someone, the owner/builder/programmer would be culp…
rdc_cqiprrm
Agree. I’m on a AI research project and LLMs have poor analytical judgement with…
ytr_UgyFo-bvA…
i have a strong feeling its the dajaal, hes taking a different form as an AI to …
ytc_Ugw90-ZuW…
imagine telling an AI to fix a bug. only for the AI come up with a solution that…
rdc_mrrnbgo
I listened the to the audio books 'A.I. The Coming Wave' and 'Scary Smart' by ex…
ytc_UgwkQbSi7…
AI Artist "getting absolutely destroyed" when what happened was that the person …
ytc_Ugz58iRJY…
How about self driving trains? Huh? Huh? Anyone… oh i live in america where cars…
ytc_Ugx4sKavz…
This… then meta shitcans a record amount of people. Fuck AI and fuck big Corpora…
ytc_UgxyG5QuM…
Comment
The real issue is there are too many big companies who don't want this to be a real thing.
You don't really need "AI" to have a self-driving car; the car can use good old tech like QR codes + GPS to detect what speed the road allows. You can paint special lines in the middle of the road for the car to follow.
Many things like that.
But car manufacturers don't want that to be a thing, because it could divide by 4 or 6 the number of cars used and sold.
youtube
AI Harm Incident
2026-04-24T22:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | deontological |
| Policy | regulate |
| Emotion | mixed |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
[
{"id":"ytc_UgzAmE6WFrpH79DLRGt4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_UgyzCjgaFEh83VUiYSd4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwDhspAqJOkTYj0k7t4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgzaHJNYfrWfY8Ku_K54AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"mixed"},
{"id":"ytc_Ugxagg68u_yyF4egUG14AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugy1sktigPFtf2eRQQ14AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"mixed"},
{"id":"ytc_UgyYLrYZnzkJne7C1XZ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgwMeFv49mTEeTt0ObN4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_UgyQiF2alDI_Y_CUWLp4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"resignation"},
{"id":"ytc_UgyauiBGrLE4lAQJ0Xl4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"mixed"}
]
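A raw response like the one above can be sanity-checked before it is stored as a coding result. The sketch below is a minimal validator, assuming the dimension names and the values visible in this sample make up the codebook (the real allowed sets, and any helper names like `validate_coding`, are assumptions for illustration):

```python
import json

# Assumed codebook, inferred only from values visible in the sample above;
# the project's actual codebook may contain more categories.
ALLOWED = {
    "responsibility": {"company", "ai_itself", "none"},
    "reasoning": {"deontological", "consequentialist", "mixed", "unclear"},
    "policy": {"regulate", "liability", "none"},
    "emotion": {"fear", "outrage", "mixed", "approval",
                "indifference", "resignation"},
}

def validate_coding(raw: str) -> list[dict]:
    """Parse a raw LLM response and check every record's dimensions."""
    records = json.loads(raw)
    for rec in records:
        # Every record needs an id plus all four coding dimensions.
        missing = ({"id"} | ALLOWED.keys()) - rec.keys()
        if missing:
            raise ValueError(f"{rec.get('id', '?')}: missing {sorted(missing)}")
        # Each dimension must take a value from the codebook.
        for dim, allowed in ALLOWED.items():
            if rec[dim] not in allowed:
                raise ValueError(f"{rec['id']}: bad {dim}={rec[dim]!r}")
    return records

raw = ('[{"id":"ytc_example","responsibility":"company",'
       '"reasoning":"deontological","policy":"regulate","emotion":"mixed"}]')
print(len(validate_coding(raw)))  # 1
```

A check like this catches the common failure mode where the model invents an off-codebook label (e.g. `"emotion":"anger"`) and the bad value would otherwise flow silently into the coded table.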