Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- ytc_UgwcXl8Xd… — The person who programed the AI to be able to generate the image put more creati…
- ytc_UgwSrH00x… — Sorry but thats the worst design animatronic robot i ever seen, looks like a kid…
- ytc_UgwQIyg_t… — I’ve been writing software for ~25 years. At this point, I don’t manually write …
- ytc_UgzHchonx… — Can you answer my question? If AI replaces a significant number of jobs, leading…
- ytc_UgxeDAZ0i… — The IT guy massively overstates the impact of AI on the job market. Yes, a lot o…
- ytc_Ugw7OVaf4… — I had a revelation but won't say much more - only that not all humans need to fe…
- ytc_Ugwy1yTsp… — It says sorry because its read billions of interactions that are smiliar to your…
- ytc_UgymZdYRP… — If people are using AI to generate "versions" of your art style they weren't goi…
Comment
#15 sadly he is at fault.... Sadly the City did what they are required to do, and just because you have seen bone dry areas before doesn't mean standing water doesn't happen there, a city puts those there when they are aware meaning it is up to you to make sure to understand those warnings and drive with caution to the area ahead in the case flooding did happen. Sadly Tesla has no responsibility to you in this case. Idk how far FSD or AP has come in 2 years or when this happened, but sadly Tesla does have legal loop holes to get out of this as it is stated FSD is a beta program and you are to be always ready to take control of the vehicle as it is marked as a L2 or Level 2 driving assistant (Meaning not autonomous). So the City did not fail and Tesla did not fail. Both did what they are required to do, you failed to be a cautious driver ready to take control in a extreme situation, instead of next time thinking your car will do everything for you maybe understand FSD and AP only can drive they can't think, they can only do the bare bones to avoid a wreck that it can see coming not one it is in the middle of. I am sure the car did do everything it could try to do in the situation but you basically put it in a extreme situation and got mad you failed to take care of your own property and left it to the computer to try and do it alone.
youtube · AI Harm Incident · 2025-12-25T01:1… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | deontological |
| Policy | none |
| Emotion | resignation |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
[{"id":"ytc_Ugz0pyW3_pZJbavViXd4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgzgExwMtvOQR2OMJp14AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgziJjV1bA37o73nUzh4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwQUvHTeUKRxkKpbo54AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyJ1ytaMvtsTp5Z24N4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwT9VQ1PwNKQcKyc0d4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgwKgtPpxg562k3MjYt4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugzxv7T-ebU9zHS40hF4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugz8SyOTXCwXaeWsUJx4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_UgzJbREEoVDp3oXTY-x4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"resignation"}]
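The raw response above is a JSON array of per-comment codes, and the page's "look up by comment ID" view implies an ID-keyed index over it. A minimal sketch of that step, assuming the batch format shown above; the names `parse_codes` and `RAW` (trimmed to two rows here) are illustrative, not part of the actual pipeline:

```python
import json

# Illustrative two-row excerpt in the same shape as the raw LLM response above.
RAW = """[
  {"id": "ytc_Ugz0pyW3_pZJbavViXd4AaABAg", "responsibility": "none",
   "reasoning": "unclear", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgzJbREEoVDp3oXTY-x4AaABAg", "responsibility": "user",
   "reasoning": "deontological", "policy": "none", "emotion": "resignation"}
]"""

def parse_codes(raw: str) -> dict:
    """Parse a batch response and index each coded comment by its ID."""
    return {row["id"]: row for row in json.loads(raw)}

codes = parse_codes(RAW)
# Lookup by comment ID, as in the inspector view.
print(codes["ytc_UgzJbREEoVDp3oXTY-x4AaABAg"]["emotion"])  # resignation
```

Keying by `id` rather than list position keeps the lookup stable even if the model returns rows out of order.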