Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
I believe that by 2033 everything about antichrist agenda will be implemented by…
ytc_UgwpHxUKB…
if the creator of AI just doesnt know what will happen in the future then we are…
ytc_UgyE-6M-a…
Guys this is entirely fake. None of these ai have voices 🤦♂️ and i asked them a…
ytc_UgzJ-RfHE…
Our robot overlords would be far easier to appease than the psychos we have now.…
ytr_UgyhabpuD…
@inakiballesterospolloni3436 That's valid, AIs don't have understanding so it do…
ytr_Ugwk6NZoU…
> Not sure if I should care what type of labor goes on in India and Banglades…
rdc_d3suoi5
It has already started. There is both a class action lawsuit against Midjourney,…
ytr_UgwQoFQAX…
We can use ai to fix that... really it isn't that bad, it's actually a totally n…
ytr_UgyNSZOWk…
Comment
The driver is a moron. He deserves punishment.
Tesla was irresponsible: since their users can be morons, they need to protect the wider public from those users.
Tbh this is not that bad of a look for Tesla. It's hard to draw a "you can never use our products again" line based on user behavior; I believe this would need to come from actual law on self-driving, not from the company making the cars.
So what I see here is that the driver is absolutely responsible for gross negligence, while Tesla needs more legal clarity, as they can't be expected to invent the rules that nuke their own customers. This needs to come from law, so that they and their competitors are playing on the same field. So the law needs to catch up.
youtube
AI Harm Incident
2025-08-21T17:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | mixed |
| Policy | liability |
| Emotion | mixed |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_UgyR1EF9kaZOvBEaDMh4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
{"id":"ytc_Ugx4IbGWKhGsZ364hsZ4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_Ugw03S5SnYHMdOC_K6p4AaABAg","responsibility":"company","reasoning":"deontological","policy":"none","emotion":"mixed"},
{"id":"ytc_UgwT8E8Mealb8fVM3qJ4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"indifference"},
{"id":"ytc_UgxvHLHN9LUFdEsPAHN4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgxwJAChWjkhUdf-t914AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"mixed"},
{"id":"ytc_UgwYHM_C2ZmNvoVhPut4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugxt4P0B-5Qbr55IEBd4AaABAg","responsibility":"company","reasoning":"deontological","policy":"ban","emotion":"fear"},
{"id":"ytc_UgyfAohMOjjIenXqiOp4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"liability","emotion":"mixed"},
{"id":"ytc_Ugyv0R5RkG5T2ZRRLux4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"}
]
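The raw response above is a JSON array of per-comment codings. A minimal sketch of validating such a response before accepting it, assuming the four dimensions shown in the coding table and value sets inferred only from the samples above (the real schema may allow other values):

```python
import json

# Assumed value sets per coding dimension, inferred from the sample
# responses above; the actual codebook may differ.
ALLOWED = {
    "responsibility": {"company", "user", "distributed", "none"},
    "reasoning": {"consequentialist", "deontological", "mixed"},
    "policy": {"liability", "ban", "regulate", "none"},
    "emotion": {"outrage", "fear", "mixed", "indifference", "resignation"},
}

def validate_codings(raw: str) -> list[dict]:
    """Parse a raw LLM response and keep only well-formed codings."""
    valid = []
    for item in json.loads(raw):
        # Each entry must be an object with an id and a recognized
        # value on every dimension; otherwise it is dropped.
        if not isinstance(item, dict) or "id" not in item:
            continue
        if all(item.get(dim) in vals for dim, vals in ALLOWED.items()):
            valid.append(item)
    return valid

# Hypothetical ids for illustration; the second entry has an
# out-of-schema responsibility value and is filtered out.
raw = (
    '[{"id":"ytc_x","responsibility":"company","reasoning":"mixed",'
    '"policy":"ban","emotion":"fear"},'
    '{"id":"ytc_y","responsibility":"alien","reasoning":"mixed",'
    '"policy":"ban","emotion":"fear"}]'
)
print([c["id"] for c in validate_codings(raw)])  # → ['ytc_x']
```

Dropping malformed entries rather than raising keeps one bad coding from discarding a whole batch of otherwise usable model output.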