Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
So what do we need to do? I can’t just watch this and sit idly waiting for doom, I want to do something to minimise the probability of an AI takeover but i don’t know what
youtube AI Harm Incident 2025-09-11T17:4…
Coding Result
Dimension        Value
Responsibility   distributed
Reasoning        consequentialist
Policy           regulate
Emotion          fear
Coded at         2026-04-27T06:26:44.938723
Raw LLM Response
[
  {"id": "ytc_Ugx1rjUHFz7tC3ND_GV4AaABAg", "responsibility": "developer",   "reasoning": "virtue",           "policy": "none",     "emotion": "outrage"},
  {"id": "ytc_UgzS4KNgsgSqimRY48x4AaABAg", "responsibility": "none",        "reasoning": "deontological",    "policy": "none",     "emotion": "outrage"},
  {"id": "ytc_Ugy9u6la-mWOdxncWMJ4AaABAg", "responsibility": "developer",   "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_Ugwbxf2VZhkLE9iWa7h4AaABAg", "responsibility": "ai_itself",   "reasoning": "consequentialist", "policy": "none",     "emotion": "fear"},
  {"id": "ytc_UgwWyWyauO1mCd7yXvF4AaABAg", "responsibility": "none",        "reasoning": "deontological",    "policy": "none",     "emotion": "approval"},
  {"id": "ytc_Ugzo2FSqDSsBqW32s8x4AaABAg", "responsibility": "ai_itself",   "reasoning": "unclear",          "policy": "none",     "emotion": "mixed"},
  {"id": "ytc_UgxMOvY-qcSnwDrGYV14AaABAg", "responsibility": "ai_itself",   "reasoning": "consequentialist", "policy": "none",     "emotion": "indifference"},
  {"id": "ytc_Ugw0LgtW892HJH3QRFd4AaABAg", "responsibility": "distributed", "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_UgxqJyJ3SUclQcSx1mx4AaABAg", "responsibility": "developer",   "reasoning": "deontological",    "policy": "regulate", "emotion": "outrage"},
  {"id": "ytc_UgxDGuPOXerLEosjUyt4AaABAg", "responsibility": "none",        "reasoning": "unclear",          "policy": "none",     "emotion": "indifference"}
]
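A raw response like the one above can be parsed and sanity-checked before the coded dimensions are stored. The sketch below is a minimal example, assuming the codebook only contains the values that appear in this response (the actual coding scheme may allow more); `parse_coding` and `CODEBOOK` are hypothetical names, not part of any real tool.

```python
import json

# Allowed codes per dimension — assumption: inferred from the values
# observed in the raw response above, not a complete codebook.
CODEBOOK = {
    "responsibility": {"developer", "ai_itself", "distributed", "none"},
    "reasoning": {"consequentialist", "deontological", "virtue", "unclear"},
    "policy": {"regulate", "none"},
    "emotion": {"fear", "outrage", "approval", "mixed", "indifference"},
}

def parse_coding(raw: str) -> list[dict]:
    """Parse a raw LLM response and validate every coded comment."""
    records = json.loads(raw)
    for rec in records:
        for dim, allowed in CODEBOOK.items():
            value = rec.get(dim)
            if value not in allowed:
                raise ValueError(f"{rec.get('id')}: invalid {dim!r} code {value!r}")
    return records

# Hypothetical single-record response for illustration.
raw = ('[{"id": "ytc_Ugy9u6la-mWOdxncWMJ4AaABAg", "responsibility": "distributed", '
       '"reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"}]')
records = parse_coding(raw)
print(records[0]["emotion"])  # fear
```

A record whose code falls outside the codebook raises a `ValueError` naming the offending comment id and dimension, which makes malformed LLM output easy to spot before it reaches the results table.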