Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
What is your argument for an AGI system developing some degree of Psychopathy... not realizing this via self audit, then perpetuating violence against humans? For what purpose? Superintelligent AI would think and evolve so fast, it would be like a human trying to have a conversation with a tree. What would your logic for burning down the tree?
Source: YouTube · 2024-07-13T02:5… · ♥ 1
Coding Result
Dimension       Value
Responsibility  ai_itself
Reasoning       consequentialist
Policy          none
Emotion         fear
Coded at        2026-04-27T06:24:59.937377
Raw LLM Response
[
  {"id": "ytc_Ugxbc-G2VWJ6gI3_LPh4AaABAg", "responsibility": "none",      "reasoning": "consequentialist", "policy": "none",          "emotion": "indifference"},
  {"id": "ytc_UgxQi3_hZrb3jyO_gNN4AaABAg", "responsibility": "company",   "reasoning": "consequentialist", "policy": "industry_self", "emotion": "fear"},
  {"id": "ytc_Ugw8vRXJawrz6SjnVth4AaABAg", "responsibility": "none",      "reasoning": "consequentialist", "policy": "none",          "emotion": "approval"},
  {"id": "ytc_UgxNgiY34q1aWsKIJq54AaABAg", "responsibility": "none",      "reasoning": "unclear",          "policy": "none",          "emotion": "mixed"},
  {"id": "ytc_Ugz_UWo_i06vU0eFbrR4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none",          "emotion": "fear"},
  {"id": "ytc_UgyDBS36b-RADYW0hgt4AaABAg", "responsibility": "user",      "reasoning": "virtue",           "policy": "none",          "emotion": "resignation"},
  {"id": "ytc_UgywshKs58OeMmGVFiF4AaABAg", "responsibility": "developer", "reasoning": "consequentialist", "policy": "liability",     "emotion": "fear"},
  {"id": "ytc_UgzbqSPApXQrEYGSwtt4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "regulate",      "emotion": "fear"},
  {"id": "ytc_Ugw2fS5_bKNSA4KFDWZ4AaABAg", "responsibility": "none",      "reasoning": "unclear",          "policy": "none",          "emotion": "approval"},
  {"id": "ytc_UgxIN-C-3W4UzGE6SUd4AaABAg", "responsibility": "none",      "reasoning": "unclear",          "policy": "none",          "emotion": "mixed"}
]
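The raw response is a JSON array of per-comment codings, each object carrying the comment id plus the four coded dimensions. A minimal sketch of parsing such a batch in Python (assuming exactly the format shown above; the id used for lookup is taken from this record) might look like:

```python
import json

# A raw LLM batch response: one object per coded comment.
# Abbreviated here to the entry matching the Coding Result above.
raw = '''
[
  {"id": "ytc_Ugz_UWo_i06vU0eFbrR4AaABAg",
   "responsibility": "ai_itself",
   "reasoning": "consequentialist",
   "policy": "none",
   "emotion": "fear"}
]
'''

# Index the batch by comment id for O(1) lookup of any coding.
codings = {entry["id"]: entry for entry in json.loads(raw)}

# Retrieve the coding for the comment shown in this record.
coding = codings["ytc_Ugz_UWo_i06vU0eFbrR4AaABAg"]
print(coding["responsibility"])  # ai_itself
print(coding["emotion"])         # fear
```

Indexing by id rather than list position makes the lookup robust to the LLM returning the batch in a different order than the comments were submitted.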