Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
Why would you program a robot that itself thinks it has consciousness? We don't even know how consciousnesses works. Let's have our AI admit it to themselves they do not have consciousnesses, avoiding this problem altogether.
YouTube · AI Moral Status · 2017-02-23T14:4…
Coding Result
Dimension        Value
Responsibility   developer
Reasoning        consequentialist
Policy           liability
Emotion          fear
Coded at         2026-04-27T06:26:44.938723
Raw LLM Response
[
  {"id": "ytc_UgiLDZDsluuX7ngCoAEC", "responsibility": "none", "reasoning": "consequentialist", "policy": "ban", "emotion": "fear"},
  {"id": "ytc_UghO27xPtF4OL3gCoAEC", "responsibility": "ai_itself", "reasoning": "deontological", "policy": "liability", "emotion": "mixed"},
  {"id": "ytc_UgiyMwZ_7WU5mHgCoAEC", "responsibility": "none", "reasoning": "mixed", "policy": "unclear", "emotion": "mixed"},
  {"id": "ytc_Ugh-nIhLVlynuHgCoAEC", "responsibility": "developer", "reasoning": "deontological", "policy": "regulate", "emotion": "approval"},
  {"id": "ytc_Ugh6GzVlcqfQxHgCoAEC", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "mixed"},
  {"id": "ytc_Uggd7HuqJgAx-XgCoAEC", "responsibility": "distributed", "reasoning": "mixed", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_UgiVAEnmcJsth3gCoAEC", "responsibility": "developer", "reasoning": "deontological", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_UgjS4PQpHaKB33gCoAEC", "responsibility": "developer", "reasoning": "consequentialist", "policy": "liability", "emotion": "fear"},
  {"id": "ytc_UgjZof-spcqFxngCoAEC", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "approval"},
  {"id": "ytc_UggrO82HB4K0HHgCoAEC", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"}
]
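To inspect the exact model output for one coded comment, the raw response can be parsed and filtered by comment id. Below is a minimal Python sketch under the assumption that the response always parses as a JSON array of objects with the five keys shown above (`id`, `responsibility`, `reasoning`, `policy`, `emotion`); the `find_coding` helper and the abbreviated two-record `raw` string are illustrative, not part of the tool.

```python
import json

# Abbreviated raw LLM response (the real response contains ten records;
# only two are reproduced here for the sketch).
raw = """[
  {"id": "ytc_UgjS4PQpHaKB33gCoAEC", "responsibility": "developer",
   "reasoning": "consequentialist", "policy": "liability", "emotion": "fear"},
  {"id": "ytc_UgiLDZDsluuX7ngCoAEC", "responsibility": "none",
   "reasoning": "consequentialist", "policy": "ban", "emotion": "fear"}
]"""

def find_coding(records, comment_id):
    """Return the coded dimensions for one comment id, or None if absent."""
    for rec in records:
        if rec["id"] == comment_id:
            # Drop the id so only the dimension/value pairs remain.
            return {k: v for k, v in rec.items() if k != "id"}
    return None

records = json.loads(raw)
coding = find_coding(records, "ytc_UgjS4PQpHaKB33gCoAEC")
print(coding)
# The returned dict matches the Dimension/Value table for this comment.
```

Matching on the stable comment id rather than list position guards against the model returning records in a different order than the input batch.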