Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
Great, let's add some more anxiety to our plates! If AI doesn't kill us in 100 years, humans will be extinct in 300 years (some other scientists calculated this based on birth rate trends) because messages like this will certainly not make people very confident to bring children to this world
youtube AI Governance 2025-10-21T20:5…
Coding Result
Dimension        Value
Responsibility   none
Reasoning        consequentialist
Policy           none
Emotion          fear
Coded at         2026-04-26T23:09:12.988011
Raw LLM Response
[
  {"id": "ytc_Ugx7y_he5JPoq1vcbWN4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytc_UgxEv3H051ynPm1eXiF4AaABAg", "responsibility": "developer", "reasoning": "deontological", "policy": "ban", "emotion": "outrage"},
  {"id": "ytc_UgwRCA9tRGydmq61FVZ4AaABAg", "responsibility": "ai_itself", "reasoning": "mixed", "policy": "none", "emotion": "fear"},
  {"id": "ytc_Ugy5OazG-7dgzGK1uU54AaABAg", "responsibility": "developer", "reasoning": "virtue", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgzbptmUht9Z5dOMT8V4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_Ugy5kGiNc6LMiNWlo354AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_UgzbhdIwJFR4Ki-Zkdt4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytc_UgzfRTTQJGel5Ts-Ylx4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytc_Ugwjs_k7qjVM0ogNyLZ4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgxwwTjvstmmUxmSO9R4AaABAg", "responsibility": "none", "reasoning": "mixed", "policy": "none", "emotion": "indifference"}
]
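The raw response is a JSON array with one object per comment in the batch, keyed by comment id. A minimal sketch of how such a response could be parsed back into per-comment codes (the field names and ids come from the JSON above; the variable names are illustrative, not part of any real pipeline):

```python
import json

# Two entries copied from the raw response above, as an example batch.
raw_response = """
[ {"id":"ytc_Ugx7y_he5JPoq1vcbWN4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgxEv3H051ynPm1eXiF4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"ban","emotion":"outrage"} ]
"""

codes = json.loads(raw_response)
# Index the batch by comment id so any single comment's codes can be looked up.
by_id = {entry["id"]: entry for entry in codes}
print(by_id["ytc_Ugx7y_he5JPoq1vcbWN4AaABAg"]["emotion"])  # fear
```

This is how the "Coding Result" table for the comment above can be recovered from the batch response: look up the comment's id and read off the four dimensions plus emotion.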