Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
If an AI can't figure out the priority between killing humanity or the planet in order to produce paperclips its dumb rather than smart and we shouldnt have it.
youtube 2025-11-06T18:3…
Coding Result
Dimension       Value
Responsibility  developer
Reasoning       deontological
Policy          regulate
Emotion         indifference
Coded at        2026-04-26T23:09:12.988011
Raw LLM Response
[
  {"id": "ytc_UgwSiQ4ItBFS0A-jRN14AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "ban", "emotion": "fear"},
  {"id": "ytc_UgzPiStvWPn_9-0-5a94AaABAg", "responsibility": "company", "reasoning": "virtue", "policy": "liability", "emotion": "resignation"},
  {"id": "ytc_Ugw7kgIekrMT9S4KBFl4AaABAg", "responsibility": "developer", "reasoning": "deontological", "policy": "regulate", "emotion": "indifference"},
  {"id": "ytc_UgzsCsQ0QNNapDVkmIR4AaABAg", "responsibility": "company", "reasoning": "virtue", "policy": "liability", "emotion": "outrage"},
  {"id": "ytc_Ugza5pGpm3hKU2Wz8id4AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_Ugy2ZD7DRUSTSGUlxoR4AaABAg", "responsibility": "distributed", "reasoning": "consequentialist", "policy": "liability", "emotion": "fear"},
  {"id": "ytc_UgzUt7r9B-ednZNczCR4AaABAg", "responsibility": "user", "reasoning": "virtue", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_Ugxk4DjdHw2DA-uZh7p4AaABAg", "responsibility": "user", "reasoning": "deontological", "policy": "industry_self", "emotion": "approval"},
  {"id": "ytc_UgwXIGDL_jJnG2dij1h4AaABAg", "responsibility": "distributed", "reasoning": "virtue", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_Ugws3dZOuqyms35J0HR4AaABAg", "responsibility": "unclear", "reasoning": "consequentialist", "policy": "regulate", "emotion": "approval"}
]
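The coding result shown above is obtained by matching the comment's id against the records in the raw batch response. A minimal sketch of that lookup, assuming the JSON shape shown here (`id`, `responsibility`, `reasoning`, `policy`, `emotion`); the helper name `codes_for` is illustrative, not part of any real tool:

```python
import json
from typing import Optional

# Hypothetical sketch: the raw LLM response is a JSON array of coding records,
# one per comment. A single-record example in the same shape as above:
raw = """[
  {"id": "ytc_Ugw7kgIekrMT9S4KBFl4AaABAg",
   "responsibility": "developer",
   "reasoning": "deontological",
   "policy": "regulate",
   "emotion": "indifference"}
]"""

def codes_for(raw_json: str, comment_id: str) -> Optional[dict]:
    """Return the coding record whose id matches comment_id, or None."""
    for record in json.loads(raw_json):
        if record.get("id") == comment_id:
            return record
    return None

print(codes_for(raw, "ytc_Ugw7kgIekrMT9S4KBFl4AaABAg")["policy"])  # regulate
```

Scanning the array rather than indexing by position keeps the lookup correct even if the model returns records out of order or drops one.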