Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
Honestly, I am not scared of an AI takeover, no matter what happens. Imagine living before the industrial revolution, those who didn't see how it works, probably felt the same way, that it would cause their jobs to disappear and whatnot. At the end of the day, it's just a leap for humanity to adapt AI to their lives. Good or bad, at the end of the day, our lives are far from perfect. By continuing this trend which we are going to, there won't be humanity for AI to take over, we will either die from nuclear war or from the profound effects of climate change. Economically there is so much imbalance and those responsible are always on the edge of a global Economic crisis, if AI were to take over, I'd take that over this any time of day. Plus, if quantum science progresses further, sentient AI that runs on quantum processors will be the real ''threat'', this product we are now seeing is just the tip of the iceberg. The main difference is simple, this AI can run and affect any machine possible and learn from information present to it, a quantum AI would not need any information present, it could effectively know every outcome of every situation simultaneously. And believe me when I say, we are no more than 10-30 years from this coming into being.
YouTube · AI Governance · 2023-05-06T04:4… · ♥ 1
Coding Result
Dimension       Value
Responsibility  none
Reasoning       consequentialist
Policy          none
Emotion         indifference
Coded at        2026-04-27T06:24:53.388235
Raw LLM Response
[
  {"id": "ytc_UgypCXKCzQsXsgIsezZ4AaABAg", "responsibility": "ai_itself",   "reasoning": "deontological",    "policy": "none",     "emotion": "fear"},
  {"id": "ytc_UgzPsq25QlhrDmD7SBV4AaABAg", "responsibility": "none",        "reasoning": "consequentialist", "policy": "none",     "emotion": "indifference"},
  {"id": "ytc_UgxPgZPeukKAb6A12EN4AaABAg", "responsibility": "developer",   "reasoning": "virtue",           "policy": "none",     "emotion": "outrage"},
  {"id": "ytc_UgwMObPFnILwBlwORqp4AaABAg", "responsibility": "user",        "reasoning": "virtue",           "policy": "none",     "emotion": "resignation"},
  {"id": "ytc_UgzkeZWFibObjcgjHXN4AaABAg", "responsibility": "developer",   "reasoning": "deontological",    "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_UgwXs5en3K1hWJcip5B4AaABAg", "responsibility": "none",        "reasoning": "consequentialist", "policy": "none",     "emotion": "approval"},
  {"id": "ytc_UgwV0C3pXwoII3dJykR4AaABAg", "responsibility": "none",        "reasoning": "unclear",          "policy": "none",     "emotion": "outrage"},
  {"id": "ytc_UgwhZ9aT6BQXa4avag94AaABAg", "responsibility": "none",        "reasoning": "unclear",          "policy": "none",     "emotion": "indifference"},
  {"id": "ytc_UgwlpdV9Ks2c8sMOuLJ4AaABAg", "responsibility": "distributed", "reasoning": "consequentialist", "policy": "none",     "emotion": "approval"},
  {"id": "ytc_UgwZBZjirUjoBNmLRvB4AaABAg", "responsibility": "ai_itself",   "reasoning": "deontological",    "policy": "regulate", "emotion": "fear"}
]