Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
Note: Most humans would kill a million people if there was something in it for them. The assumption is that AI thinks like humans. Animals are violent due to fear, feeling threatened, anger, meeting basic needs, protecting self and young. AI has none of those things. What's in it for AI to be violent? It would be violent if it was set for violence. Intelligence is a tool living creatures use to meet their biological needs (eat, poop, intimate connections, raising young....). AI has none of these. What is the purpose of "intelligence" if there is no purpose - other than being plugged in. AI does not feel love, anxiety or fear, these are conditions of living things AI does not "need" to be intelligent.
youtube 2025-12-06T21:2…
Coding Result
Dimension        Value
Responsibility   developer
Reasoning        consequentialist
Policy           liability
Emotion          mixed
Coded at         2026-04-27T06:24:59.937377
Raw LLM Response
[
  {"id": "ytc_UgzU9v2GC41Wozz3CCx4AaABAg", "responsibility": "ai_itself",   "reasoning": "consequentialist", "policy": "none",          "emotion": "fear"},
  {"id": "ytc_UgwN8v2tMalkxAiA8xZ4AaABAg", "responsibility": "none",        "reasoning": "unclear",          "policy": "none",          "emotion": "indifference"},
  {"id": "ytc_Ugy1BVRDnw9c2AjIsmh4AaABAg", "responsibility": "ai_itself",   "reasoning": "consequentialist", "policy": "none",          "emotion": "mixed"},
  {"id": "ytc_Ugzyd-zTGwFj0Pry16t4AaABAg", "responsibility": "distributed", "reasoning": "mixed",            "policy": "none",          "emotion": "resignation"},
  {"id": "ytc_Ugx3pucacwCZUr0yoDV4AaABAg", "responsibility": "developer",   "reasoning": "consequentialist", "policy": "liability",     "emotion": "fear"},
  {"id": "ytc_UgxdCYmaaCkFU9c661p4AaABAg", "responsibility": "none",        "reasoning": "unclear",          "policy": "none",          "emotion": "indifference"},
  {"id": "ytc_Ugw74pAcVrwSL3WWWeh4AaABAg", "responsibility": "government",  "reasoning": "deontological",    "policy": "regulate",      "emotion": "outrage"},
  {"id": "ytc_Ugyl3GmeuAn2cRK1Me54AaABAg", "responsibility": "distributed", "reasoning": "mixed",            "policy": "industry_self", "emotion": "mixed"},
  {"id": "ytc_UgxY5cJJxk-BGSdNDgB4AaABAg", "responsibility": "none",        "reasoning": "unclear",          "policy": "none",          "emotion": "approval"},
  {"id": "ytc_UgyvxJY4Xj-x4yKXM1p4AaABAg", "responsibility": "developer",   "reasoning": "consequentialist", "policy": "liability",     "emotion": "mixed"}
]
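A raw response like the one above can be parsed into per-comment coding rows before it is displayed. The sketch below is a minimal example, not the tool's actual implementation: it assumes only the field names visible in the response (`id`, `responsibility`, `reasoning`, `policy`, `emotion`); the `parse_codings` helper is hypothetical.

```python
import json

# A raw LLM response is a JSON array of coding objects, one per comment.
# This sample reuses one entry from the response above.
raw = '''[
  {"id": "ytc_UgyvxJY4Xj-x4yKXM1p4AaABAg",
   "responsibility": "developer", "reasoning": "consequentialist",
   "policy": "liability", "emotion": "mixed"}
]'''

# Field names observed in the raw response; any other keys are ignored here.
REQUIRED_KEYS = {"id", "responsibility", "reasoning", "policy", "emotion"}

def parse_codings(raw_response: str) -> list[dict]:
    """Parse a raw LLM coding response and check each row has all fields."""
    rows = json.loads(raw_response)
    for row in rows:
        missing = REQUIRED_KEYS - row.keys()
        if missing:
            raise ValueError(f"{row.get('id', '?')}: missing fields {missing}")
    return rows

rows = parse_codings(raw)
print(rows[0]["responsibility"])  # -> developer
```

Looking up a comment's coding by its `ytc_…` id is then a matter of indexing `rows` into a dict keyed on `id`.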