Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- `ytc_Ugw89gmO-…` — Great way to ground a symbolism of a robot. Yes they want to contribute to the h…
- `ytc_UgzeG5l_J…` — Is sucks that she died but she should have used the cross walk or at least do th…
- `ytr_Ugx1rrc0A…` — @luckysevenow1872 and AI isn't replacing art it's just a new more advanced tool …
- `ytc_UgzSTB8vM…` — What about a hypothetical betrayal of humanity to another species or an off wor…
- `ytc_UggnCeYK8…` — Currently taking a robot and humanity course in college and just a curious stude…
- `ytr_UgxxZBd3_…` — It has been trained on recordings of human speech, which has stutters and sounds…
- `ytc_UgwXHVkQV…` — Because the AI technology boom is being fueled by greed a bubble burst is inevit…
- `ytc_UggK5dZal…` — I recommend not slaving AI, history has showed us how it normally ends and it's…
Comment
> Yes, we should all hate Musk for a lot of very good reasons ... but take close note here of what Legal Eagle DID NOT tell you, specifically to magnify hate rather than to seek out the truth and better justice.
> If someone uses a tool in ways the manufacturer does not intend, or forbids, it usually isn't or shouldn't be the manufacturer's fault. Should Ford or GM be sued every time there's an accident due to a driver intentionally speeding, because they didn't prevent the car from being able to speed against posted speed limits - something that is totally possible for any modern car? Tesla has always made it clear that autopilot (and FSD) users are ultimately responsible and must maintain close supervision, regardless of any claims of what the software can do. This driver intentionally violated this. He was also violating the law not just by speeding but by just handling his phone while driving, let alone trying to recover it after dropping it.
> Also FFS. this video is brutal in how it conflates Autopilot and Full Self Driving, they are not the same thing. They do not have the same abilities. The claims of what FSD can dont apply to Autopilot.
> It's a shame that all the instances where accidents truly have been avoided, where lives have been saved by the software go totally un-noticed, because nothing happened. There are a lot of idiots who think self driving vehicles must be perfect before they can be accepted, as if all the people who will die from an imperfect but better-than-human system not being available, just dont matter.
> ... And after stuffing all that misrepresentation into the video, in a case where likely it was just an ignorant jury that didn't like Musk ruling based on their feelings... Legal Eagle then tries to sell its services to us, in case we need a great lawyer. Why would I go to you when I can't be sure if you were intentionally biased in this video to feed on our hate from Musk, which is bad enough, or if you were actually too incompetent to recognize the problems I mentioned.
youtube · AI Harm Incident · 2025-08-15T21:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | mixed |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_Ugwq6KiREqPGHazGBmF4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"indifference"},
{"id":"ytc_Ugybk-OA1vLU4UcGCbl4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_UgzIGBdXNCotCf2F6M54AaABAg","responsibility":"distributed","reasoning":"deontological","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgxYYpk9W-uEudo72_l4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgxenDH_L2vxATkiBPl4AaABAg","responsibility":"government","reasoning":"contractualist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugw5UbtFLRYhFPr9fIZ4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_UgygLXahYGjAZNrP4ed4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgwEA79KnB9Q2VyJ3AB4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"unclear","emotion":"approval"},
{"id":"ytc_Ugx1oH75K5SbUzene4J4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgxExnuQ80F8HWrRPlR4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"mixed"}
]
```
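A batch response like the one above can be indexed by comment ID to support the "look up by comment ID" workflow. A minimal sketch in Python — the field names (`id`, `responsibility`, `reasoning`, `policy`, `emotion`) come from the JSON above, but the parsing helper itself is an illustration, not the tool's actual code:

```python
import json

# Excerpt of a raw LLM response: a JSON array of coded comments,
# in the same shape as the batch shown above.
raw = """
[
  {"id": "ytc_Ugwq6KiREqPGHazGBmF4AaABAg", "responsibility": "company",
   "reasoning": "consequentialist", "policy": "liability", "emotion": "indifference"},
  {"id": "ytc_Ugybk-OA1vLU4UcGCbl4AaABAg", "responsibility": "user",
   "reasoning": "consequentialist", "policy": "none", "emotion": "outrage"}
]
"""

# Index the coded rows by comment ID so any single comment's
# coding can be pulled up directly.
codes_by_id = {row["id"]: row for row in json.loads(raw)}

row = codes_by_id["ytc_Ugybk-OA1vLU4UcGCbl4AaABAg"]
print(row["responsibility"], row["emotion"])  # user outrage
```

The same dictionary lookup works for any of the four coding dimensions, and a missing ID surfaces immediately as a `KeyError` rather than a silent mismatch.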