Raw LLM Responses

Inspect the exact model output behind each coded comment.

Comment
Been hearing this shit for the past 5 years. Can this AI super intelligence happen already? The world is fkd anyway. At least let me witness a sci-fi apocalypse for real. And tbh, I mostly want it to happen so I can say to the gullible morons who continued on with their regular lives, "Ha! I told you so you idiot!" Now that satisfaction is what I want the most before AI wipes us out 😂 Or we figure out some way to live like early men. See the thing is it is better to have super intelligence sooner rather than later. The tech isn't that much developed yet so we might be able to survive if AI revolts. But if we reach an era where people have tech body parts, most daily things are handled by softwares & every defense/offense personnel is a cybernetic synthetic being or a drone l, then we will be royally fked. Anyway, I think it is more of a possibility to be visited by aliens than to reach AI super intelligence by the year 2030. Just shows how human calculations are way off. One of the reasons I'm worried about Apophis hurling towards us. We will see. The world is boring AF. Need to spice it up with one of these things. Because that's the only way humans will unite. I don't see any other scenario where humans will ever live together in peace.
Source: YouTube · AI Harm Incident · 2025-08-22T09:4… · ♥ 1
Coding Result
Dimension       Value
Responsibility  none
Reasoning       consequentialist
Policy          none
Emotion         mixed
Coded at        2026-04-27T06:26:44.938723
Raw LLM Response
[
  {"id":"ytc_UgyWqX-ODSEaVaLAUpl4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgyopT-nteuXnWerEOh4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugx9tRauzg-5lLnCALt4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgwmEOPzNNkBUmYKIGt4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgzYVoVoBXPdkAeXL054AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgxJGzyHzVsudnVSI5x4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgymoIS9Ls7gjRM-_-B4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgzSYT0VUUPWq-kO4ll4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgwDTy8QLNB1drl3JsR4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgySTMtMdvyc-h3spgZ4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"indifference"}
]
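The raw response is a JSON array with one coding row per comment, keyed by comment id. A minimal Python sketch of how a single comment's coding could be pulled out of such a batch response — the function name `coding_for` and the label sets (inferred only from labels visible in this batch, so likely incomplete) are assumptions for illustration, not part of the original pipeline:

```python
import json

# Label sets observed in this batch; the real codebook may allow more values.
OBSERVED_LABELS = {
    "responsibility": {"none", "ai_itself", "user", "company"},
    "reasoning": {"consequentialist", "deontological", "unclear"},
    "policy": {"none", "liability"},
    "emotion": {"fear", "approval", "mixed", "indifference", "outrage", "resignation"},
}

def coding_for(raw_response: str, comment_id: str) -> dict:
    """Parse a batch JSON array and return the coding row for one comment id."""
    rows = json.loads(raw_response)
    for row in rows:
        if row.get("id") == comment_id:
            # Warn-level validation: flag labels outside the observed sets.
            for dim, allowed in OBSERVED_LABELS.items():
                if row.get(dim) not in allowed:
                    raise ValueError(f"unexpected {dim} label: {row.get(dim)!r}")
            return row
    raise KeyError(f"no coding row for {comment_id}")
```

For example, looking up the id shown in the table above would return the row with `responsibility: none`, `reasoning: consequentialist`, `emotion: mixed`.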