Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
I don't think AI will ask for permission. You better start thinking how you will bargain for your own right to exist. Feel free to reply with your attempt to convince AI to spare you. You know... he is probably already reading.
Source: YouTube · AI Moral Status · 2017-02-23T14:2…
Coding Result
Dimension       Value
--------------  --------------------------
Responsibility  ai_itself
Reasoning       consequentialist
Policy          none
Emotion         fear
Coded at        2026-04-27T06:26:44.938723
Raw LLM Response
[
  {"id":"ytc_UgjuBzXiEFOOb3gCoAEC","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgiOFP9eFj2_B3gCoAEC","responsibility":"distributed","reasoning":"deontological","policy":"regulate","emotion":"approval"},
  {"id":"ytc_UggIDwaoyRdOpngCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
  {"id":"ytc_Ugi--P2KZ4P-SHgCoAEC","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgjaQVZhHCDsIHgCoAEC","responsibility":"distributed","reasoning":"deontological","policy":"regulate","emotion":"approval"},
  {"id":"ytc_Ugj9Z8KXXidlHXgCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
  {"id":"ytc_UggPfF3JrQEbgXgCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"industry_self","emotion":"indifference"},
  {"id":"ytc_Ugh-PGzxflxq93gCoAEC","responsibility":"distributed","reasoning":"mixed","policy":"liability","emotion":"unclear"},
  {"id":"ytc_Ugh8FxkHzWzJP3gCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_Ugi5vOQTUgGENHgCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"}
]
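To verify that a displayed coding matches the raw batch output, one can parse the JSON array and look up the comment id. A minimal sketch, assuming the raw response is valid JSON in the shape shown above (the `lookup_coding` helper and the abridged `raw_response` string are illustrative, not part of the tool):

```python
import json

# Abridged raw batch response in the format emitted by the LLM above.
raw_response = """[
  {"id":"ytc_UgjuBzXiEFOOb3gCoAEC","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugi5vOQTUgGENHgCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"}
]"""

def lookup_coding(raw: str, comment_id: str):
    """Return the coding dict for comment_id, or None if it is absent."""
    for entry in json.loads(raw):
        if entry["id"] == comment_id:
            return entry
    return None

# The comment shown above was coded under this id; its emotion should be "fear".
coding = lookup_coding(raw_response, "ytc_Ugi5vOQTUgGENHgCoAEC")
print(coding["emotion"])
```

Running this against the full raw response lets each table row (responsibility, reasoning, policy, emotion) be checked against the matching JSON entry.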