Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
Someone smarter than I am please respond: So I watched this video and while I largely agreeded with basically everything isnt the idea of the future supposed to be a point at which no person has to work and can freely do as they please ( obviously no murder and so on ) and us as humans can just live how we want wouldnt that be the most fufilling life we could ask for and I see AI as a tool to get us there because the more jobs we can replace the faster we can switch over to something similar to a Universal Basic Income while those who want to work can because we can never truly get rid of a persons wanting to explore there will always be someone fasinated with the universe facinated with a video games facinated with food that will continue to bring innovation without the need of money thats basically my core hopeful desire fom AI.

That being said I know there are a lot of problems im not that dumb, like for instance we need AI to be regulated like this should never have been release to the public and also I STRONGLY belive that any content made by AI if it is spreading missinformation then I belive the company should be held liable for taking away the human rigfht of information yeah its a little steep but so is drowning all the information we have in slop.

The other important thing is we need to reduce the energy consumption of AI and make it more green, but I feel like that wouldn't be a big issue if the general public didn't have access to it. So this was just a look into my brain and some recent thoughts I've had, while I by no means am an expert, I'm still interested and honestly I'm hoping that if we are able to use it CORRECTLY then it can greatly benefit the whole of humanity.
youtube · AI Jobs · 2025-10-08T16:0…
Coding Result
Dimension        Value
Responsibility   none
Reasoning        mixed
Policy           none
Emotion          approval
Coded at         2026-04-26T23:09:12.988011
Raw LLM Response
[
  {"id":"ytc_UgwTuLyjJWCD_sUD89R4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugy04i1x69meXBWkM0p4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgzvmGqZOi9Ng7KWTdd4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgxaTo811dDN9Kz9OMl4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgyTTvAmhtTmNcuCSFZ4AaABAg","responsibility":"government","reasoning":"mixed","policy":"regulate","emotion":"approval"},
  {"id":"ytc_Ugxu8M98e9_igrFQiZB4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"resignation"},
  {"id":"ytc_Ugy1cDigDMvBmizcWPt4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
  {"id":"ytc_UgxE6jCy8dZHWNmDg0V4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"},
  {"id":"ytc_UgwR70NR2crdPMkvkVx4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgyZqCj3QSx4AfrW-414AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"outrage"}
]
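The raw response is a JSON array with one object per coded comment, each carrying the four coding dimensions. A minimal sketch of parsing and sanity-checking such a response before trusting it downstream; note that `validate_codings` is a hypothetical helper, and the allowed value sets are inferred only from the values visible in this particular response, not from a documented codebook:

```python
import json
from collections import Counter

# Allowed values per dimension, inferred from this response alone;
# the real codebook may include categories not seen here.
ALLOWED = {
    "responsibility": {"none", "ai_itself", "company", "government", "developer"},
    "reasoning": {"mixed", "consequentialist", "virtue"},
    "policy": {"none", "liability", "regulate", "industry_self"},
    "emotion": {"approval", "fear", "outrage", "resignation", "indifference"},
}

def validate_codings(raw: str) -> list[dict]:
    """Parse the raw LLM response and reject records with missing
    or unexpected dimension values."""
    records = json.loads(raw)
    for rec in records:
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{rec.get('id')}: bad {dim}={rec.get(dim)!r}")
    return records

# Usage with the first record from the response above:
raw = ('[{"id":"ytc_UgwTuLyjJWCD_sUD89R4AaABAg","responsibility":"none",'
       '"reasoning":"mixed","policy":"none","emotion":"approval"}]')
records = validate_codings(raw)
print(Counter(r["emotion"] for r in records))
```

Failing loudly on an out-of-vocabulary value catches the common failure mode where the model invents a label outside the coding scheme.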