Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
Has there ever been a bigger liar than Elon Musk? (Oh wait, I forgot about Donald Trump. But which lies more? Hard to say.) What's harder to say is why federal regulators even allow such faulty systems to be sold and drive around on the roads in the first place. Missy Cummins of George Mason University states it clearly at the end of the video: having an unreliable system do most of the work but asking humans to attend to every second anyway to correct it when it makes a mistake is a non-viable approach, period. Human brains are not equipped to do that. I don't know why anyone ever thought that would work. You can have humans do all the work and use an automated system as backup for safety, or you can have an automated system THAT YOU CAN RELY ON 100% OF THE TIME do all the work without attention from a person, but this in-between state that invites a person not to pay attention but them blames them if they don't is just insane. Other car manufacturers have not been so foolhardy, besides that their systems are also safer due to also including sensors besides just optical cameras. Furthering the problems is the plainly deceptive marketing. Calling it "Autopilot" or "Full Self Driving" when it is clearly nowhere near any such thing is false advertising pure and simple. No other car manufacturer is making such false claims. Musk confounds that by blatantly lying during speeches and presentations and interviews where he falsely claims that his cars are already safer than human drivers, and the like.
youtube AI Harm Incident 2024-12-15T23:2…
Coding Result
Dimension        Value
Responsibility   government
Reasoning        consequentialist
Policy           regulate
Emotion          outrage
Coded at         2026-04-27T06:24:59.937377
Raw LLM Response
[
  {"id": "ytc_UgyXM95yDqdxmY9oK-94AaABAg", "responsibility": "company", "reasoning": "deontological", "policy": "liability", "emotion": "outrage"},
  {"id": "ytc_UgyRQzE2sS3wFSmDY2B4AaABAg", "responsibility": "government", "reasoning": "consequentialist", "policy": "regulate", "emotion": "outrage"},
  {"id": "ytc_UgzcF9nYUR2tlQ6jR9p4AaABAg", "responsibility": "company", "reasoning": "mixed", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_Ugzf0NqEzcVVpmXu0yl4AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "liability", "emotion": "outrage"},
  {"id": "ytc_UgwL0GAp4oz8OlwyKWd4AaABAg", "responsibility": "company", "reasoning": "deontological", "policy": "regulate", "emotion": "mixed"},
  {"id": "ytc_UgwHCYZkjfQ2oY8dE354AaABAg", "responsibility": "government", "reasoning": "consequentialist", "policy": "regulate", "emotion": "outrage"},
  {"id": "ytc_Ugx6XsqH5tl1KujMBWN4AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "regulate", "emotion": "mixed"},
  {"id": "ytc_Ugw8NuBFZG6T5bRNTjF4AaABAg", "responsibility": "user", "reasoning": "deontological", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgyemoJI3SVCqZA97VB4AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "regulate", "emotion": "outrage"},
  {"id": "ytc_Ugy7FakZ4fx-BUUQAxR4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "regulate", "emotion": "mixed"}
]
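The raw response above is a JSON array with one object per coded comment, keyed by the comment's YouTube id. A minimal sketch of how such a batch response could be parsed and validated before writing codes back to a database (the helper name `parse_batch` and the validation logic are illustrative, not part of the pipeline; the field names come from the JSON itself):

```python
import json

# The four coding dimensions expected in every record, per the response schema above.
DIMENSIONS = {"responsibility", "reasoning", "policy", "emotion"}

def parse_batch(text: str) -> dict[str, dict[str, str]]:
    """Index coded comments by id, rejecting records missing any dimension."""
    records = json.loads(text)
    coded = {}
    for rec in records:
        missing = DIMENSIONS - rec.keys()
        if missing:
            raise ValueError(f"{rec.get('id')}: missing {sorted(missing)}")
        coded[rec["id"]] = {k: rec[k] for k in sorted(DIMENSIONS)}
    return coded

# Example with one record from the batch shown above.
raw = '''[
  {"id": "ytc_UgyRQzE2sS3wFSmDY2B4AaABAg", "responsibility": "government",
   "reasoning": "consequentialist", "policy": "regulate", "emotion": "outrage"}
]'''

codes = parse_batch(raw)
print(codes["ytc_UgyRQzE2sS3wFSmDY2B4AaABAg"]["policy"])  # regulate
```

Matching on the `id` field rather than on array position guards against the model dropping or reordering comments in its reply.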