Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
Another mistake. I don't know why everyone thinks so wrongly; perhaps the error comes from the film industry, a bias. The mistake is believing in an off switch... that's a fantasy for the future, and naive negligence for the present. Neither we nor they have control, and we all know it, because what they've built in front of us doesn't come with brakes included. Solution: advanced AIs need enormous amounts of stable electricity for their GPUs to function, and unless the idiots build them very small power plants that allow for redundancy in the electrical system, they'll have no choice but long power lines and large, exposed substations. [Translated from Spanish.]
youtube AI Harm Incident 2026-01-12T14:0…
Coding Result
Dimension        Value
Responsibility   distributed
Reasoning        consequentialist
Policy           regulate
Emotion          fear
Coded at         2026-04-27T06:26:44.938723
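A coding record like the one above can be sanity-checked against the label sets that appear in this page's output. This is a minimal sketch; the allowed values below are only those observed here, and the real codebook may define more (assumption):

```python
# Label sets observed in the coding output on this page; the actual
# codebook may include additional values (assumption).
ALLOWED = {
    "responsibility": {"company", "user", "ai_itself", "distributed", "none"},
    "reasoning": {"consequentialist", "deontological", "virtue", "mixed"},
    "policy": {"regulate", "ban", "liability", "none"},
    "emotion": {"fear", "outrage", "approval", "resignation", "indifference"},
}

def validate(record: dict) -> list:
    """Return a list of problems found in one coding record."""
    problems = []
    for dim, allowed in ALLOWED.items():
        value = record.get(dim)
        if value not in allowed:
            problems.append(f"{dim}: unexpected value {value!r}")
    return problems

# The record shown in the table above passes with no problems:
print(validate({"responsibility": "distributed",
                "reasoning": "consequentialist",
                "policy": "regulate",
                "emotion": "fear"}))  # []
```

A missing or misspelled dimension would surface as an entry in the returned list, which makes this easy to run over a whole batch of coded records.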
Raw LLM Response
[
  {"id": "ytc_UgxCdad37PaDSdzuM9h4AaABAg", "responsibility": "distributed", "reasoning": "mixed", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_UgypwNFJPDOYL6pKyYx4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_UgxuA-gs1JWQr4CdweR4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "ban", "emotion": "fear"},
  {"id": "ytc_UgxduLAxWLbSsZwan1x4AaABAg", "responsibility": "user", "reasoning": "virtue", "policy": "none", "emotion": "approval"},
  {"id": "ytc_Ugysoc9LaFWE9-OkH7V4AaABAg", "responsibility": "user", "reasoning": "deontological", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgwCfURehx-hD6yYnM94AaABAg", "responsibility": "distributed", "reasoning": "consequentialist", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_Ugw4kpq9IrtYy7_eYFd4AaABAg", "responsibility": "distributed", "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_UgxWubqXkex576CJlNV4AaABAg", "responsibility": "user", "reasoning": "virtue", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_Ugy0_qXYTf3QNVdL8nd4AaABAg", "responsibility": "none", "reasoning": "deontological", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_Ugw6uFmCwPPA8PD0tyh4AaABAg", "responsibility": "company", "reasoning": "deontological", "policy": "liability", "emotion": "outrage"}
]
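A raw response like the one above is a JSON array of coding records, one per comment. A minimal sketch for parsing it and looking up the coding for a specific comment id (assuming the field names `id`, `responsibility`, `reasoning`, `policy`, and `emotion` shown in the response; the helper name is hypothetical):

```python
import json

def index_codings(raw_response: str) -> dict:
    """Parse a raw LLM coding response (a JSON array of objects)
    and index each coding record by its comment id."""
    records = json.loads(raw_response)
    return {r["id"]: r for r in records}

# Two records taken verbatim from the response above:
raw = ('[{"id":"ytc_UgxCdad37PaDSdzuM9h4AaABAg","responsibility":"distributed",'
       '"reasoning":"mixed","policy":"none","emotion":"outrage"},'
       '{"id":"ytc_UgypwNFJPDOYL6pKyYx4AaABAg","responsibility":"ai_itself",'
       '"reasoning":"consequentialist","policy":"regulate","emotion":"fear"}]')
codings = index_codings(raw)
print(codings["ytc_UgypwNFJPDOYL6pKyYx4AaABAg"]["emotion"])  # fear
```

Indexing by id makes it straightforward to join the LLM's output back to the original comments, which is how the per-comment coding result shown above would be derived from the batched response.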