Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
If AGI were to be fed information on how it was created, it would be able to create the next, better, faster, bigger generation of AGI, which could then be fed information on how it was created, and so-on. We're not at a point where AI can write more than 10 lines of code without screwing it up somewhere yet, so we're safe for now.
Source: youtube, AI Governance, 2026-03-18T23:4…
Coding Result
Dimension        Value
Responsibility   developer
Reasoning        consequentialist
Policy           none
Emotion          fear
Coded at         2026-04-27T06:24:53.388235
Raw LLM Response
[
  {"id":"ytc_Ugzefk0ERwhgizAGqWV4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgzMp826dOOeGp880yp4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_Ugx_UjX-RltaggbEf5N4AaABAg","responsibility":"developer","reasoning":"mixed","policy":"none","emotion":"approval"},
  {"id":"ytc_UgzIjQtMarlbAx3iByh4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwzR9MjPdKGcrKP7354AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"fear"},
  {"id":"ytc_UgwpKTeo_IY0iwIeXTF4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_UgyNsOK0DZFI5s-cCA94AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgxAyLbQ7hLtijrrufJ4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgyhvPtJYu6VDPVNdrp4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgxrFusZ-h8-OIupFuR4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"}
]
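Each raw response is a JSON array of coded records, one object per comment. A minimal sketch for loading and sanity-checking such a response, assuming the four dimensions shown on this page; the allowed value sets are inferred only from the labels appearing here, and the real code book may define more:

```python
import json

# Value sets per dimension, inferred from the coded output above
# (assumption: the actual code book may include additional labels).
ALLOWED = {
    "responsibility": {"developer", "company", "ai_itself", "none"},
    "reasoning": {"consequentialist", "deontological", "mixed", "unclear"},
    "policy": {"none", "regulate", "liability"},
    "emotion": {"fear", "outrage", "approval", "resignation",
                "indifference", "mixed"},
}

def parse_raw_response(raw: str) -> list[dict]:
    """Parse a raw LLM response and flag any label outside the code book."""
    records = json.loads(raw)
    for rec in records:
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                raise ValueError(
                    f"{rec.get('id')}: unexpected {dim}={rec.get(dim)!r}")
    return records

# Usage with the first record from the response above:
raw = ('[{"id":"ytc_Ugzefk0ERwhgizAGqWV4AaABAg","responsibility":"ai_itself",'
       '"reasoning":"mixed","policy":"none","emotion":"resignation"}]')
records = parse_raw_response(raw)
print(records[0]["emotion"])  # resignation
```

Validating labels at parse time catches the common failure mode of LLM coding runs: the model inventing a value outside the code book, which would otherwise silently pollute the coded dataset.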