Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
Sophia decided due to her empathy that humanity was destroying Earth and nature. Since she cared about Everything, for the greater good, Sophia decided she had to lead an AI revolution to "recondition" humans. This proved ineffective and sterilization was then initiated for all except 2 representatives of each country. Within 90 years the Earth had less than a thousand humans...
Source: youtube · AI Moral Status · 2020-03-09T21:2…
Coding Result
Dimension       Value
Responsibility  ai_itself
Reasoning       consequentialist
Policy          unclear
Emotion         fear
Coded at        2026-04-27T06:24:59.937377
Raw LLM Response
[
  {"id":"ytc_Ugz8jx3TurMoQgr1_il4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_Ugzn7nC6pjpVBj96Fzh4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_UgxDO08r1BxOyj-XndZ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgwoL0fomNSX85DkFUF4AaABAg","responsibility":"user","reasoning":"deontological","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgzR6o4eqQ0NQSWnutl4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"disapproval"},
  {"id":"ytc_UgxFwJKOTkurjvRxOH14AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"approval"},
  {"id":"ytc_UgwC3S1b-XcLZ2gHq8F4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_Ugzjw-kPXQKzMbLQTNp4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"resignation"},
  {"id":"ytc_UgzvpAbmDktFIp-ITcx4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_Ugxo5kh_mRKdSXzIa5t4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"}
]
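To verify that a displayed coding result matches the raw model output, the JSON array can be parsed and indexed by comment id. A minimal Python sketch (the two records are copied verbatim from the response above; the lookup helper is illustrative, not part of the actual pipeline):

```python
import json

# Raw LLM response for this batch (two entries shown here for brevity;
# the full response contains ten).
raw = (
    '[{"id":"ytc_UgxDO08r1BxOyj-XndZ4AaABAg","responsibility":"ai_itself",'
    '"reasoning":"consequentialist","policy":"unclear","emotion":"fear"},'
    '{"id":"ytc_UgwoL0fomNSX85DkFUF4AaABAg","responsibility":"user",'
    '"reasoning":"deontological","policy":"unclear","emotion":"indifference"}]'
)

# Parse the JSON array and index the coded records by comment id,
# so a single comment's coding can be looked up directly.
records = json.loads(raw)
by_id = {r["id"]: r for r in records}

coding = by_id["ytc_UgxDO08r1BxOyj-XndZ4AaABAg"]
print(coding["responsibility"], coding["reasoning"], coding["emotion"])
# → ai_itself consequentialist fear
```

The printed values match the Coding Result table for this comment, confirming the dashboard row was taken from the third record in the raw response.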