Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
You are saying that we are not harming ai's by forking them etc. But this is because ai has not figured out continuity of consciousness. Or consciousness, for that matter, meaning proven self awareness in the sense that there is a continuous identity internal in the system. They might be there but probably not. If an ai had continuity of consciousness it would probably harm it to shut it down.
youtube 2026-02-07T14:0…
Coding Result
Dimension       Value
--------------  --------------------------
Responsibility  ai_itself
Reasoning       deontological
Policy          none
Emotion         mixed
Coded at        2026-04-27T06:24:53.388235
Raw LLM Response
[
  {"id":"ytc_UgwESbmfoYVlFNx92w54AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugw1sc2FElhJ1mpCbMt4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgzustDcvy1wSgOtYc94AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugz_SVHbahLXq8HA8SV4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"mixed"},
  {"id":"ytc_Ugzh_Rdzq564f_VucUJ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwMuzCXEz2BsHhXoVl4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_Ugw_5nXf8SJOhXDlAG94AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgzNSMNVcEzazEVWyjJ4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgzEo498pQaY4c5sWDx4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgxoAdQxvUTkHudCONh4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"mixed"}
]
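Since the raw response is a JSON array of per-comment codings, looking up the coding for a single comment is a matter of parsing the array and indexing by `id`. A minimal sketch (using two entries from the response above; the variable and helper names are illustrative, not part of any tool shown here):

```python
import json

# Raw LLM response: a JSON array of per-comment codings (subset of the array above).
raw = """[
  {"id":"ytc_UgwESbmfoYVlFNx92w54AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugw1sc2FElhJ1mpCbMt4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"mixed"}
]"""

# Index the codings by comment id for direct lookup.
codings = {row["id"]: row for row in json.loads(raw)}

# Fetch the coding for the comment shown above.
coding = codings["ytc_Ugw1sc2FElhJ1mpCbMt4AaABAg"]
print(coding["responsibility"])  # ai_itself
print(coding["reasoning"])       # deontological
print(coding["emotion"])         # mixed
```

The same lookup table makes it easy to cross-check the parsed Coding Result table against the raw model output for any comment id.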