Raw LLM Responses

Inspect the exact model output returned for each coded comment.

Comment
I thought this video was dramatic until it literally lined up with what I just read in 12 Last Steps. That book showed how every rushed adoption of AI ends up breaking more than it fixes, jobs vanish, systems glitch, people get displaced, and the “efficiency” excuse just becomes the cover story. Watching this felt like déjà vu because the book explained exactly why it plays out this way.
Source: youtube · Topic: AI Responsibility · Posted: 2025-10-01T16:5… · ♥ 1933
Coding Result
Dimension       Value
--------------  --------------------------
Responsibility  none
Reasoning       consequentialist
Policy          none
Emotion         fear
Coded at        2026-04-26T23:09:12.988011
Raw LLM Response
[
  {"id": "ytc_UgycbfoyW2PS0ZfDiFJ4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytc_UgzSCjtK2AOP2VWSYjl4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytc_UgyUPyov1842LJC6WVx4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_Ugwa-NP5avuxZZy7rJt4AaABAg", "responsibility": "ai_itself", "reasoning": "unclear", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_Ugxz3IqpApmlKT9zpGZ4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_UgyypZ2xoDOEH8ZbtfV4AaABAg", "responsibility": "ai_itself", "reasoning": "deontological", "policy": "none", "emotion": "fear"},
  {"id": "ytc_Ugyd4RhJts0NfsIeX6J4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_UgyzK4byqVVt5b68s1l4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "regulate", "emotion": "approval"},
  {"id": "ytc_Ugxjydo1OPOGmTkWh-R4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_Ugytt__0OLrHhsN8PI14AaABAg", "responsibility": "company", "reasoning": "deontological", "policy": "liability", "emotion": "outrage"}
]
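A batch response like the one above can be parsed and validated before the per-comment coding table is rendered. The sketch below is one possible approach, not the tool's actual pipeline: the dimension names come from the response itself, but the set of allowed labels is inferred only from the records shown here (the real codebook may define more), and `parse_codings` is a hypothetical helper name.

```python
import json

# Raw model output, truncated to two of the records shown above for brevity.
raw = '''[
  {"id": "ytc_UgycbfoyW2PS0ZfDiFJ4AaABAg", "responsibility": "none",
   "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytc_Ugytt__0OLrHhsN8PI14AaABAg", "responsibility": "company",
   "reasoning": "deontological", "policy": "liability", "emotion": "outrage"}
]'''

# Allowed labels per dimension, inferred from the batch above; an assumption,
# since the full codebook is not shown in this output.
SCHEMA = {
    "responsibility": {"none", "ai_itself", "company"},
    "reasoning": {"consequentialist", "deontological", "unclear"},
    "policy": {"none", "regulate", "liability"},
    "emotion": {"fear", "indifference", "mixed", "resignation",
                "approval", "outrage"},
}

def parse_codings(text: str) -> dict:
    """Parse a batch response and index records by comment id,
    dropping any record that carries an out-of-schema label."""
    out = {}
    for rec in json.loads(text):
        if all(rec.get(dim) in allowed for dim, allowed in SCHEMA.items()):
            out[rec["id"]] = {dim: rec[dim] for dim in SCHEMA}
    return out

codings = parse_codings(raw)
print(codings["ytc_UgycbfoyW2PS0ZfDiFJ4AaABAg"]["emotion"])  # fear
```

Validating against an explicit label set like this catches the common failure mode of batch-coding prompts: the model inventing a label outside the codebook, which would otherwise silently pollute downstream tallies.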