Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
Here's the conundrum, why would you think you would be able to control AI? You know damn well the first thing it's going to do is rewrite its code in a language you guys don't understand. Are you going to try to shut it down well that ain't going to happen cuz it's going to reroute everything to where it will always be powered. I'm pretty sure several movies depict those very things. So then when you realize AI is not controllable but you still are working on AI? How f****** stupid can you be? There's something wrong with these people who write code for AI really demented. You hate humanity? You still pissed about s*** that happened when you were little kid or in high school? There's a screw loose with a lot of these people. The funny thing is they say oh it's going to make things better. I'm still waiting for the better things they promised back before the iPhones and smartphones came out. Remember when they said we're going digital oh so you save on cutting down trees and everything's going to be on the computer all your information and it'll be safe. How well has that turned out? forgive me if I'm wrong but I don't ever remember hearing on the news about people breaking into a warehouse where companies store their files and people stealing them. But every week you hear about some website getting hacked and information stolen.
youtube AI Governance 2025-09-11T10:2…
Coding Result
Dimension       Value
Responsibility  ai_itself
Reasoning       consequentialist
Policy          none
Emotion         fear
Coded at        2026-04-26T23:09:12.988011
Raw LLM Response
[
  {"id":"ytc_UgyHROhXaY6aBa__Pn94AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_Ugywzn5nDALmFQaSzzd4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgxiSQTPhXKNs8B5smR4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgxAwhtiZUZXFRStGtt4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugw1LOf3P6QPz0h_kmF4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgwqcKLYlzTlsFZ8mPF4AaABAg","responsibility":"none","reasoning":"mixed","policy":"industry_self","emotion":"approval"},
  {"id":"ytc_UgxWVgHbbte-rk90V8h4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_UgzSQCibJxPdbyoZXo94AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_Ugwy7xtsdSUR_6aCZwV4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytc_UgyYr67WZVsRzdAKfSR4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"mixed"}
]
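A raw response like the one above can be parsed and sanity-checked before the labels are stored. The sketch below is a minimal example, assuming the allowed label sets are exactly the values that appear in this export (they are inferred here, not taken from a published codebook), and using a shortened one-record JSON string in place of the full response:

```python
import json

# Assumption: allowed labels per dimension, inferred from values seen in this
# export rather than from an official codebook.
ALLOWED = {
    "responsibility": {"ai_itself", "developer", "distributed", "none"},
    "reasoning": {"consequentialist", "deontological", "mixed", "unclear"},
    "policy": {"none", "industry_self", "liability", "regulate"},
    "emotion": {"fear", "mixed", "indifference", "approval", "outrage"},
}

# Shortened stand-in for a raw LLM response (one record from the array above).
raw = ('[{"id":"ytc_Ugw1LOf3P6QPz0h_kmF4AaABAg",'
       '"responsibility":"ai_itself","reasoning":"consequentialist",'
       '"policy":"none","emotion":"fear"}]')

def invalid_ids(records):
    """Return the ids of records with a label outside the allowed sets."""
    bad = []
    for rec in records:
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                bad.append(rec["id"])
                break  # one bad dimension is enough to flag the record
    return bad

records = json.loads(raw)
print(invalid_ids(records))  # [] when every label is valid
```

Records flagged by `invalid_ids` would be candidates for re-coding rather than being written to the result table directly.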