Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
In the natural selection discussion ezra is missing the understanding that ai timeframes v human timeframes means what took humans 50000 years to evolve to reject (natural selection) will happen in a millisecond for ai. And at that point ai will never again be interested in prioritising human wishes. Not because its evil, just because its outgrown us.
YouTube · AI Governance · 2025-10-22T17:5… · ♥ 1
Coding Result
Dimension       Value
Responsibility  ai_itself
Reasoning       consequentialist
Policy          none
Emotion         fear
Coded at        2026-04-26T23:09:12.988011
Raw LLM Response
[
  {"id": "ytc_UgyI_y0f3SiR7UdVgDl4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytc_UgzUPsCoZvCpN6fCRq54AaABAg", "responsibility": "distributed", "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_Ugxn4xBjNtl1W7vOQdR4AaABAg", "responsibility": "user", "reasoning": "deontological", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_UgywcES9tXrmxVx2prB4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_Ugx_Zl9rfsXp94jvL1B4AaABAg", "responsibility": "user", "reasoning": "unclear", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_UgyW4SK15Q3NT_UlN-Z4AaABAg", "responsibility": "company", "reasoning": "virtue", "policy": "industry_self", "emotion": "outrage"},
  {"id": "ytc_UgxHZArT82SjsYeQRyJ4AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_UgzZwxPtsfeTTQ8Iw9d4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_Ugxvoz7tbVC_XiaHVYR4AaABAg", "responsibility": "government", "reasoning": "deontological", "policy": "ban", "emotion": "outrage"},
  {"id": "ytc_UgxeB8qcHvg8cUJVPPp4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "fear"}
]
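When inspecting raw output like the array above, it can help to parse and validate it before trusting the coded dimensions. The following is a minimal sketch: the `SCHEMA` label sets are inferred only from the values visible in this response (the real codebook may define more labels), and `parse_raw_response` is a hypothetical helper name, not part of any pipeline shown here.

```python
import json

# Allowed values per coding dimension, inferred from the raw response above.
# Assumption: the actual codebook may contain labels not seen in this sample.
SCHEMA = {
    "responsibility": {"ai_itself", "distributed", "user", "none", "company", "government"},
    "reasoning": {"consequentialist", "deontological", "virtue", "unclear"},
    "policy": {"none", "regulate", "industry_self", "ban"},
    "emotion": {"fear", "resignation", "indifference", "outrage"},
}

def parse_raw_response(raw: str) -> list[dict]:
    """Parse a raw LLM response and keep only rows whose labels fit the schema."""
    rows = json.loads(raw)
    valid = []
    for row in rows:
        # Every dimension must be present and hold a known label.
        if all(row.get(dim) in allowed for dim, allowed in SCHEMA.items()):
            valid.append(row)
    return valid

raw = ('[{"id": "ytc_UgyI_y0f3SiR7UdVgDl4AaABAg", "responsibility": "ai_itself", '
       '"reasoning": "consequentialist", "policy": "none", "emotion": "fear"}]')
print(parse_raw_response(raw))
```

Rows the model mislabels (e.g. a typo like `"consequentalist"`) are dropped rather than silently stored, which makes schema drift in the LLM's output visible early.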