Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
Ok, so when i thought id grow up, i didnt think that: 1. There could be invented an alien deadly bacteria that kill all life on earth by existing; 2. There would be billionaires who learn history from far-right lunatics that also control the research facilities that could potentially create AI that can kill humanity if it wanted; 3. That I would live in the era where billions of tons of food would be wasted while billions of people suffer from malnutrition; 4. That officials from various countries would be threatening each other with nuclear weapons.
youtube AI Governance 2025-10-04T09:1…
Coding Result
Dimension       Value
Responsibility  distributed
Reasoning       consequentialist
Policy          unclear
Emotion         fear
Coded at        2026-04-26T19:39:26.816318
Raw LLM Response
[
  {"id": "ytc_UgzmFMSaLmQypoAULiV4AaABAg", "responsibility": "company",     "reasoning": "consequentialist", "policy": "regulate",  "emotion": "fear"},
  {"id": "ytc_Ugw3a9XNxBCgMgdrSC54AaABAg", "responsibility": "distributed", "reasoning": "consequentialist", "policy": "unclear",   "emotion": "fear"},
  {"id": "ytc_Ugxqb7hQAKaBVJdo-pF4AaABAg", "responsibility": "user",        "reasoning": "deontological",    "policy": "liability", "emotion": "outrage"},
  {"id": "ytc_UgxKCUfu10NPd5tgm5p4AaABAg", "responsibility": "distributed", "reasoning": "consequentialist", "policy": "unclear",   "emotion": "fear"},
  {"id": "ytc_Ugzb-LMFBGMnxKwFs9F4AaABAg", "responsibility": "unclear",     "reasoning": "unclear",          "policy": "unclear",   "emotion": "fear"}
]
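As a minimal sketch of how the raw response maps back to individual coded comments, the JSON array above can be parsed and indexed by comment id. This assumes plain Python with no tool-specific library; the variable names are illustrative, and the example uses two real ids from the response above.

```python
import json

# Raw LLM response: a JSON array with one coding object per comment id
# (abridged here to two of the five entries shown above).
raw = '''[
  {"id": "ytc_UgzmFMSaLmQypoAULiV4AaABAg", "responsibility": "company",
   "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_Ugw3a9XNxBCgMgdrSC54AaABAg", "responsibility": "distributed",
   "reasoning": "consequentialist", "policy": "unclear", "emotion": "fear"}
]'''

# Index the batch by comment id so each coding can be joined
# back to its source comment.
codings = {entry["id"]: entry for entry in json.loads(raw)}

# Look up the coding for one comment (the one quoted above).
record = codings["ytc_Ugw3a9XNxBCgMgdrSC54AaABAg"]
print(record["responsibility"], record["policy"], record["emotion"])
# → distributed unclear fear
```

Indexing by id rather than by array position guards against the model reordering or dropping entries in a batched response.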