Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
reading the comments here I think Kurzgesagt skipped an important video explaining the difference between an AI and a program (though they scratch the surface with siri and all). People really think we just program softwares that will feel pain or happyness, letting us decide what should make the software happy or sad.. that's not what an AI is, or will probably be in the future at least. We're not gonna tell that AI how it should feel under each circumstance, it will learn on it's own what it feels to "be", wether it's through physical stimuli in a robot form or interaction in text or vocal etc. that's what that video is all about, the future AIs that will come to be, not the fucking softwares inside you iphone or fridge And if someday we get such a smart AI we won't really have a debate if they deserve rights or not, they'll take it on their own, it's not animals we can just kill or slap, a powerful AI will be better at breaking and building securities that we ever will.
youtube AI Moral Status 2017-02-24T17:3… ♥ 28
Coding Result
Dimension       Value
--------------  --------------------------
Responsibility  none
Reasoning       unclear
Policy          unclear
Emotion         indifference
Coded at        2026-04-27T06:26:44.938723
Raw LLM Response
[
  {"id": "ytc_Ugi0hj0S4tOJK3gCoAEC", "responsibility": "developer", "reasoning": "deontological",  "policy": "liability", "emotion": "approval"},
  {"id": "ytc_Ughn2l5l5nUY93gCoAEC", "responsibility": "none",      "reasoning": "unclear",        "policy": "unclear",   "emotion": "indifference"},
  {"id": "ytc_UghjyLhFY0N9d3gCoAEC", "responsibility": "developer", "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_Ugj2Jo_uYDf2v3gCoAEC", "responsibility": "developer", "reasoning": "consequentialist", "policy": "ban",      "emotion": "outrage"},
  {"id": "ytc_UgjWcRsFfwSE13gCoAEC", "responsibility": "none",      "reasoning": "deontological",  "policy": "none",      "emotion": "indifference"},
  {"id": "ytc_UggSkZsWg39NxXgCoAEC", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_Ugj0QLN4cIFMF3gCoAEC", "responsibility": "unclear",   "reasoning": "unclear",        "policy": "unclear",   "emotion": "mixed"},
  {"id": "ytc_UggPezFG5S3VS3gCoAEC", "responsibility": "user",      "reasoning": "consequentialist", "policy": "none",     "emotion": "indifference"},
  {"id": "ytc_Ugj22OTCNxaAhHgCoAEC", "responsibility": "unclear",   "reasoning": "deontological",  "policy": "unclear",   "emotion": "unclear"},
  {"id": "ytc_Ugg7RpJojOWA93gCoAEC", "responsibility": "unclear",   "reasoning": "unclear",        "policy": "unclear",   "emotion": "indifference"}
]
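How a coding result is recovered from the raw response can be sketched as follows: the response is a JSON array of per-comment codes, and the coded dimensions shown above are the entry whose `id` matches this comment. This is a minimal illustrative sketch (the variable names and the assumption that lookup is a plain ID match are mine, not part of the pipeline); the entry shown is the one this page's coding result corresponds to.

```python
import json

# Raw LLM response: a JSON array of per-comment codes. Only the entry
# matching this page's coding result is reproduced here for brevity.
raw_response = """[
  {"id": "ytc_Ughn2l5l5nUY93gCoAEC", "responsibility": "none",
   "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"}
]"""

# Index the codes by comment ID, then look up one comment.
codes = {row["id"]: row for row in json.loads(raw_response)}
coding = codes["ytc_Ughn2l5l5nUY93gCoAEC"]

print(coding["responsibility"])  # none
print(coding["emotion"])         # indifference
```

Matching on the `id` field (rather than array position) keeps the lookup robust if the model returns the entries in a different order than the comments were submitted.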