Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
The answer is a big No. Because we do not want a function A.I at our doorstep because we don't want a Future like The Matrix or The Terminator in the first place (or Hal 9000). We don't want to do it because our fear to be enslaved and killed will stop us from doing so.
youtube AI Moral Status 2017-02-28T06:0…
Coding Result
Dimension       Value
--------------  --------------------------
Responsibility  none
Reasoning       consequentialist
Policy          ban
Emotion         fear
Coded at        2026-04-27T06:24:53.388235
Raw LLM Response
[
  {"id":"ytc_UgiKYV8v9JQYg3gCoAEC","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"approval"},
  {"id":"ytc_UgjCkPtC30Z9mngCoAEC","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgjmL9PTUYn27ngCoAEC","responsibility":"distributed","reasoning":"contractualist","policy":"regulate","emotion":"mixed"},
  {"id":"ytc_UggSRmUXxp_mdngCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
  {"id":"ytc_UgjhwwXIci4w4HgCoAEC","responsibility":"none","reasoning":"deontological","policy":"ban","emotion":"outrage"},
  {"id":"ytc_Ugjo_2qmwrEy2XgCoAEC","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"mixed"},
  {"id":"ytc_Ugi2Yut5usR3QHgCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"industry_self","emotion":"indifference"},
  {"id":"ytc_UgjcyN9r0FMRwHgCoAEC","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
  {"id":"ytc_UgjSd-41hV6ELXgCoAEC","responsibility":"user","reasoning":"deontological","policy":"liability","emotion":"mixed"},
  {"id":"ytc_UgiTz-lvV3YGIHgCoAEC","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"resignation"}
]
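The raw response above is a JSON array with one record per comment, each carrying the four coding dimensions shown in the result table. A minimal sketch of how such a response could be parsed and validated before storing the codes; the allowed-value sets are assumptions inferred only from the values visible in this dump, not a confirmed schema:

```python
import json

# Assumed coding schema: each dimension's allowed values are inferred
# from the values that appear in this dump and may be incomplete.
SCHEMA = {
    "responsibility": {"none", "ai_itself", "developer", "user", "distributed"},
    "reasoning": {"consequentialist", "deontological", "contractualist", "mixed"},
    "policy": {"none", "ban", "regulate", "liability", "industry_self"},
    "emotion": {"fear", "outrage", "approval", "indifference", "resignation", "mixed"},
}

def parse_raw_response(raw: str) -> list[dict]:
    """Parse a raw LLM response and keep only records whose values
    fall inside the assumed schema for every dimension."""
    records = json.loads(raw)
    valid = []
    for rec in records:
        if all(rec.get(dim) in allowed for dim, allowed in SCHEMA.items()):
            valid.append(rec)
    return valid

# Hypothetical example record, shaped like the ones above.
raw = ('[{"id":"ytc_example","responsibility":"none",'
       '"reasoning":"consequentialist","policy":"ban","emotion":"fear"}]')
print(parse_raw_response(raw))
```

Dropping out-of-schema records (rather than raising) keeps one malformed line in a batch of ten from discarding the whole response; a stricter pipeline might log the rejects instead.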