Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
On the topic of simulations, why does he think the creators of the simulation have room for improvement in ethics? Do we care about turning off an LLM? No, but who knows. Maybe it hurts the LLM too, or even if we are certain it hurts, we just think it is a machine. Maybe we are considered a machine too: biological, but a machine.
YouTube · AI Governance · 2026-02-09T04:1…
Coding Result
Dimension        Value
Responsibility   developer
Reasoning        deontological
Policy           unclear
Emotion          mixed
Coded at         2026-04-26T23:09:12.988011
Raw LLM Response
[
  {"id": "ytc_Ugz_hO2JUA2C27HBgLJ4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_UgyJpIyYM4Aj5DgyYoJ4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_Ugxt1nc7cIWi_iQHrCZ4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgxBYIReoBGybQ1xme14AaABAg", "responsibility": "distributed", "reasoning": "consequentialist", "policy": "unclear", "emotion": "fear"},
  {"id": "ytc_UgxjFvMZQ29vySDF8Wt4AaABAg", "responsibility": "developer", "reasoning": "deontological", "policy": "unclear", "emotion": "mixed"},
  {"id": "ytc_UgwEhCoOzsQvr8t-Krl4AaABAg", "responsibility": "developer", "reasoning": "virtue", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_UgyAG9y5lARy1gySj_B4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "approval"},
  {"id": "ytc_Ugz7MBPwoHvELJSfaHh4AaABAg", "responsibility": "distributed", "reasoning": "consequentialist", "policy": "regulate", "emotion": "approval"},
  {"id": "ytc_UgyfvMMDXyzg8_HJpEd4AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "liability", "emotion": "indifference"},
  {"id": "ytc_UgwCqvGHz2CQuuSbb3d4AaABAg", "responsibility": "developer", "reasoning": "deontological", "policy": "regulate", "emotion": "fear"}
]
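To check one coded comment against the raw model output, the response can be parsed as a JSON array and indexed by comment id. A minimal sketch in Python, assuming the raw response is valid JSON; the record shown is the one from the array above that matches the coding result for this comment:

```python
import json

# Raw LLM response: a JSON array of coded comments, one object per comment.
# Field names follow the coding dimensions in the result table above.
raw = '''[
  {"id": "ytc_UgxjFvMZQ29vySDF8Wt4AaABAg",
   "responsibility": "developer",
   "reasoning": "deontological",
   "policy": "unclear",
   "emotion": "mixed"}
]'''

records = json.loads(raw)
by_id = {r["id"]: r for r in records}  # index coded results by comment id

# Look up the coding for the comment inspected above.
coded = by_id["ytc_UgxjFvMZQ29vySDF8Wt4AaABAg"]
print(coded["responsibility"], coded["reasoning"])  # developer deontological
```

Indexing by id keeps the lookup stable even when the model returns the batch of comments in a different order than they were submitted.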