Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
The world existed before AI. AI just sorta came along. The same way life existed before electronics. But was life harder? Yeah, you might’ve had to source everything by yourself, in libraries, taking hours. But oh, did it teach you a lot. And look, I’m not against electronics, Google OR chatgpt, I think it’s awesome to have so many resources available at the tips of our fingers. But at the same time, we have all of this and literacy rates are falling. People can’t use everyday things that aren’t digitalized. People can’t solve a simple problem without asking AI. _Why on earth_ would you want a chatbot to solve your marital problems? So yeah, I kinda agree with the guy. Remember 2016? iPhones were a thing, we had socials, we had Google. No AI. Was it *that* hard to survive? Why would AI need to be everywhere? No, I don’t need to edit a photo of my breakfast with AI before I send it to the groupchat. No, I don’t need AI to recommend which coat I should put on in the morning. And I sure as hell don’t need AI to be my therapist. There are many advantages to AI, I won’t lie. But it’s all in self-control. And very few people actually have that. I’m afraid I don’t know anyone who would use AI to write a prompt/idea and then not be tempted to let it do the whole assignment when they simply can.
youtube 2025-11-13T11:3…
Coding Result
Dimension      | Value
Responsibility | none
Reasoning      | virtue
Policy         | none
Emotion        | resignation
Coded at       | 2026-04-27T06:26:44.938723
Raw LLM Response
[
  {"id":"ytc_UgxpLy4HDu1b8hdaI_R4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgyO-K0FYFkdYS2bC5x4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_Ugz6oiuN3t5ybY-aohh4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
  {"id":"ytc_UgxOfUqQ3RU1ZLE03Jd4AaABAg","responsibility":"government","reasoning":"mixed","policy":"regulate","emotion":"mixed"},
  {"id":"ytc_Ugwt-HoGZ7jp1g8o2O54AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytc_Ugz8VbfyocQBYBGPLTN4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgxvgcbZCWLdNwrYYsV4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"approval"},
  {"id":"ytc_UgzJrEq1gApKjAA5v254AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"industry_self","emotion":"mixed"},
  {"id":"ytc_UgxYrbgM6bDfs_nzuop4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"resignation"},
  {"id":"ytc_Ugwt_riRLQO-XQLc4TB4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"}
]
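The raw model output is a JSON array of per-comment records, which can be parsed and indexed by comment id to recover the coded dimensions shown in the table above. A minimal sketch (the `raw` string is excerpted to one record from the batch; variable names are illustrative, not part of any tool):

```python
import json

# One record excerpted from the raw LLM response above.
raw = ('[{"id":"ytc_UgxYrbgM6bDfs_nzuop4AaABAg",'
       '"responsibility":"none","reasoning":"virtue",'
       '"policy":"none","emotion":"resignation"}]')

# Parse the array and index each record by its comment id,
# so a single comment's coding can be looked up directly.
records = {r["id"]: r for r in json.loads(raw)}
rec = records["ytc_UgxYrbgM6bDfs_nzuop4AaABAg"]

# The four coded dimensions for this comment.
print(rec["responsibility"], rec["reasoning"], rec["policy"], rec["emotion"])
# → none virtue none resignation
```

The same lookup applied to the full ten-record response would recover the Responsibility/Reasoning/Policy/Emotion values for any comment in the batch.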