Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
How about global efforts went into building life RESILIENCE on Earth rather than race for TECH and A.I.? Please no technofix answer, thats a tale for transhumanist children. I would really like to care about robots rights and wish my toaster sweet dreams, but... there seems to be tons of human and natural emergencies to deal with first, like, human rights (including refugees rights), climate change, environmental collapse, global inequalities, raise of populism, mass surveillance, natural resources depletion etc. which my toaster won't fix. Because 1. it will not care about humans in the first place, as we are obsolete and pointless from a robot perspective, 2. the race for natural resources and energy on Earth will benefit robots rather than humans. 3. anyhow, it will soon desire to destroy its creators to become its own God. Sweet dreams toaster.
youtube AI Moral Status 2017-03-01T22:2…
Coding Result
Dimension       Value
--------------  --------------------------
Responsibility  none
Reasoning       consequentialist
Policy          none
Emotion         resignation
Coded at        2026-04-27T06:26:44.938723
Raw LLM Response
[
  {"id":"ytc_Ugg9Dqny3LoDQHgCoAEC","responsibility":"none","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_Ugjl892grkD1CHgCoAEC","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgjusG2XXNsQ8ngCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"},
  {"id":"ytc_UgjdXJQpASsKnXgCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"outrage"},
  {"id":"ytc_UggFqHDoWRfrsXgCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UggRQk_shtKMS3gCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_UgiU0CbkUs7EXngCoAEC","responsibility":"user","reasoning":"deontological","policy":"liability","emotion":"indifference"},
  {"id":"ytc_Ughae_Q7RxIYQHgCoAEC","responsibility":"none","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_Uggczad5RakHtngCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"},
  {"id":"ytc_UgilhY784SZqgHgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"}
]
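The raw response above is a JSON array of coding records, one per comment, each carrying the four dimensions used in the coding result. A minimal sketch of how such a response could be parsed and sanity-checked is shown below. The allowed value sets are inferred only from the values that appear in this document (the real codebook may define more categories), and `parse_codings` is a hypothetical helper name, not part of any described pipeline.

```python
import json

# Vocabularies inferred from the values observed in this document;
# the actual codebook may include additional categories (assumption).
ALLOWED = {
    "responsibility": {"none", "developer", "user", "ai_itself"},
    "reasoning": {"consequentialist", "deontological", "unclear"},
    "policy": {"none", "regulate", "industry_self", "liability", "unclear"},
    "emotion": {"outrage", "indifference", "approval", "fear", "resignation"},
}

def parse_codings(raw: str) -> list[dict]:
    """Parse a raw LLM response (a JSON array of coding objects) and
    keep only records whose values fall inside the known vocabularies."""
    records = json.loads(raw)
    return [
        rec for rec in records
        if all(rec.get(dim) in vals for dim, vals in ALLOWED.items())
    ]

raw = ('[{"id":"ytc_Ugg9Dqny3LoDQHgCoAEC","responsibility":"none",'
       '"reasoning":"deontological","policy":"regulate","emotion":"outrage"}]')
print(parse_codings(raw)[0]["emotion"])  # → outrage
```

Filtering rather than raising keeps one malformed record from discarding a whole batch; rejected records could instead be logged and re-queued for recoding.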