Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
Humans doesn't need new 'partners'. We need tools that we can order to do what we wouldn't want other humans to do. Why give feelings to a tool? That would be just stupid or cruel. If AI gets conscious enough to feel it should have rights, then we are too late.
youtube AI Moral Status 2020-07-08T17:1… ♥ 2
Coding Result
Dimension       Value
Responsibility  developer
Reasoning       consequentialist
Policy          ban
Emotion         fear
Coded at        2026-04-27T06:24:59.937377
Raw LLM Response
[
  {"id":"ytc_Ugyv-VWH8N1G-bXkm_h4AaABAg","responsibility":"none","reasoning":"contractualist","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgwnzGo5u2NkylujZTZ4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgyASnWP6Xu93wG3wWV4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgzdmL5ZPD-UlubmFTR4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"approval"},
  {"id":"ytc_UgzVzg-BhDx4_4YlSCJ4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgzDwscANVl6urHwfKF4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"industry_self","emotion":"indifference"},
  {"id":"ytc_Ugxb_R1uhSz-uY97F-t4AaABAg","responsibility":"company","reasoning":"mixed","policy":"regulate","emotion":"mixed"},
  {"id":"ytc_UgzpasU24AoMP3l73cx4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugzt-Lfb_2EfvY7rinl4AaABAg","responsibility":"none","reasoning":"deontological","policy":"regulate","emotion":"approval"},
  {"id":"ytc_UgzyAHY7BuXJTQBuNTd4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"ban","emotion":"fear"}
]
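The coding result shown above (developer / consequentialist / ban / fear) matches the last entry in the raw response, `ytc_UgzyAHY7BuXJTQBuNTd4AaABAg`. As a minimal sketch of how such a batch response can be matched back to a comment, assuming the raw output is valid JSON of this shape (only two entries are reproduced here for brevity):

```python
import json

# Trimmed sample of the raw LLM response (two of the ten entries).
raw = '''[
  {"id":"ytc_UgzDwscANVl6urHwfKF4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"industry_self","emotion":"indifference"},
  {"id":"ytc_UgzyAHY7BuXJTQBuNTd4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"ban","emotion":"fear"}
]'''

# Parse the batch and index it by comment ID for lookup.
records = json.loads(raw)
by_id = {r["id"]: r for r in records}

# Look up the coding for the comment displayed on this page.
coded = by_id["ytc_UgzyAHY7BuXJTQBuNTd4AaABAg"]
print(coded["policy"], coded["emotion"])  # ban fear
```

A dict keyed by `id` keeps the lookup O(1) per comment, which matters when the model codes many comments per batch.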