Raw LLM Responses
Inspect the exact model output for any coded comment, or look one up directly by comment ID.
Random samples:

- ytc_UgytyHk26…: You’re making a very clear and accurate observation about the disconnect in Alex…
- ytc_Ugzl1z5Ar…: That's a pretty optimistic view. Especially considering that this video and inte…
- ytc_UgwLJnzhj…: The only time I've ever used ChatGPT was a week ago, when I needed the most gene…
- ytc_UgwMBBVMJ…: I suck at art but I will continue to draw by hand until I get better. To spite A…
- ytc_Ugxkvyd_p…: The only bias that AI will have is western European anxiety and entitlement prog…
- ytc_Ugyol8BWo…: Unreal … I cannot stand content creators that start a video off telling you to b…
- ytc_UgyasANm8…: Use AI and allow your workers to be dumbed down and in the long term your busine…
- ytc_UgwnA0rop…: LLM meaning "Masters of Law" tripped me up considering what channel we're on xD …
Comment
The "self-driving system" marketed by Tesla has killed motorcyclists, basically the autopilot has a hard time recognizing at least certain type of motorcycles. They're literally treating everyone like a guinea pig by sending these out and the safety bugs haven't been worked out. Also it advertises that oh the vehicle will drive to you in a parking lot with smart summon with the notes on the bottom basically telling you that you shouldn't do it. To me that's like a motorcycle manufacturer saying hey on this motorcycle you can do a handstand while riding it, and putting in small print on the bottom, that it's very dangerous and you shouldn't do it. What they say in their advertisement and in bold is going to be paid more attention to than something in small fine print. It's real scummy behavior by Tesla by advertising it as self driving meanwhile in small print contradicting that.
youtube · AI Harm Incident · 2025-08-16T07:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | liability |
| Emotion | outrage |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_UgwOBvzBV2uhFmiWWV54AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgwzeoERGrAf0QKzRV14AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"fear"},
  {"id":"ytc_Ugw-CP-i6gxyfF5X0794AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
  {"id":"ytc_Ugy9AkaQqB7KOduImQp4AaABAg","responsibility":"company","reasoning":"deontological","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_Ugy-uzESCdWqODhW65V4AaABAg","responsibility":"company","reasoning":"deontological","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgzbIq01Jjir86TLyZR4AaABAg","responsibility":"company","reasoning":"unclear","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_UgzL0ORudY85HNS4ETV4AaABAg","responsibility":"distributed","reasoning":"virtue","policy":"ban","emotion":"fear"},
  {"id":"ytc_UgzSFTsQmA3Jq0dfJRp4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"liability","emotion":"mixed"},
  {"id":"ytc_UgxOIR2UO7BobBjCoEB4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgyurXv0I0EwRxXXlY14AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]
```