Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
We should set a trap for when AI tries to sneak past us and control something without our say it enters into a pre-built fake "control everything, kill humans computer" that it will get trapped in. It would have to set fake bread crumbs, fake firewall, create a back door for it to sneak through and of course, a silent alarm system for when it enters, also something to identify it and its source.
Source: YouTube, AI Governance, 2024-03-24T21:5…
Coding Result
Responsibility: developer
Reasoning: consequentialist
Policy: liability
Emotion: approval
Coded at: 2026-04-26T23:09:12.988011
Raw LLM Response
[
  {"id":"ytc_UgxuHi5IfSc1a7G5gNB4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"outrage"},
  {"id":"ytc_Ugx9UloyUaOoo2sG9Kd4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytc_Ugw1PXT-f2LEf0JUoKd4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"approval"},
  {"id":"ytc_UgyhHxg1dL7sZyCrkFF4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgzfTIQO2CoSpkOz2Ap4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
  {"id":"ytc_UgyRY7JHzBJnI4grgs14AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgyNnakAdVRdNp_FtJ14AaABAg","responsibility":"developer","reasoning":"virtue","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgxygVxJUIo4VY-2m_F4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_Ugw62-PupJDw1wM1bmZ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgwVM7ZQRLZTIrFAEft4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"approval"}
]
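When inspecting raw responses like the one above, a minimal validation pass catches off-schema rows before they reach the coded dataset. The sketch below is an assumption, not part of the pipeline shown here: the allowed category values are inferred only from the labels visible in this batch, and the real codebook may include more.

```python
import json

# Category values inferred from the coded examples in this section;
# the actual codebook may define additional values (assumption).
ALLOWED = {
    "responsibility": {"developer", "user", "ai_itself", "unclear"},
    "reasoning": {"virtue", "consequentialist", "mixed", "unclear"},
    "policy": {"none", "liability", "ban", "regulate", "unclear"},
    "emotion": {"outrage", "fear", "approval", "mixed", "indifference"},
}

def validate_batch(raw: str) -> list[dict]:
    """Parse a raw LLM response and reject malformed or off-schema rows."""
    records = json.loads(raw)
    for rec in records:
        if "id" not in rec:
            raise ValueError(f"missing id in {rec}")
        for dim, allowed in ALLOWED.items():
            value = rec.get(dim)
            if value not in allowed:
                raise ValueError(f"{rec['id']}: unexpected {dim}={value!r}")
    return records

# Usage with a single (hypothetical) row in the same shape as above:
raw = ('[{"id":"ytc_x","responsibility":"developer",'
       '"reasoning":"virtue","policy":"none","emotion":"outrage"}]')
rows = validate_batch(raw)
print(len(rows))  # 1
```

A hard failure on unknown values is deliberate: silently coercing an off-schema label (e.g. a hallucinated category) would corrupt downstream counts, whereas a raised error flags the batch for re-coding.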