Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
I think the scientists definition of "smarter", "learn" or more "intelligent" should be taken with a grain of salt. Over the last decade those terms have gone under a lot of scrutiny in the scientific community. These AI systems process data in ways modeled off the biological system of the brain, and vastly imperfectly any engineer will tell you. To think the electrical equivalent has somehow surpassed the biological original b/c of storage, ease of interface or speed is perhaps applying a metric that isn't really derived from the machine operating in real world applications / phenomenon, in the wild, as it were. Put it this way, if you ask an AI to solve poverty, war or immortality. It's not going to draw you a blueprint. It can't synthesize or imagine in that way. If AI could or come close to behaving in these abstract complex problem solving scenarios, you would know by now and movies would suck way less b/c AI algos could write brilliant scripts over and over no problem, and if that solution existed, which it doesn't, Hollywood producers could afford those writing AI hardware farms trust me.
youtube AI Governance 2023-06-12T10:0…
Coding Result
Dimension        Value
Responsibility   developer
Reasoning        consequentialist
Policy           unclear
Emotion          mixed
Coded at         2026-04-26T23:09:12.988011
Raw LLM Response
[
  {"id":"ytc_UgzLhEvlGTAsr2vr0PN4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"fear"},
  {"id":"ytc_Ugy_byUMtkeZGDY1LSx4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgxDpH0EcphvghtEnI54AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_Ugx8ikakNUwctDPNtbt4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgyuGjjOPE4LcOA6UaN4AaABAg","responsibility":"government","reasoning":"deontological","policy":"liability","emotion":"fear"},
  {"id":"ytc_UgxwGkojt6fmzz6JlQ14AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"fear"},
  {"id":"ytc_Ugybyt22LjIZI049n8Z4AaABAg","responsibility":"ai_itself","reasoning":"virtue","policy":"ban","emotion":"outrage"},
  {"id":"ytc_UgzcMrYoA-ipcVjGLM94AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"resignation"},
  {"id":"ytc_UgxnV9vA7BjJpqfFZwl4AaABAg","responsibility":"none","reasoning":"virtue","policy":"unclear","emotion":"approval"},
  {"id":"ytc_UgwTBdOuauIurBSPlZ94AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"industry_self","emotion":"mixed"}
]
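To inspect the raw model output for a particular coded comment, the JSON array above can be parsed and indexed by comment id. The following is a minimal sketch, assuming only the response format shown above; `lookup_coding` is a hypothetical helper (not part of any pipeline described here), and the string below is truncated to two of the ten entries for brevity.

```python
import json

# Two entries copied verbatim from the raw LLM response above (the full
# response contains ten such objects, one per coded comment).
raw_response = '''
[
  {"id":"ytc_Ugx8ikakNUwctDPNtbt4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgzLhEvlGTAsr2vr0PN4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"fear"}
]
'''

def lookup_coding(raw: str, comment_id: str) -> dict:
    """Parse a raw LLM response and return the coding record for one comment id."""
    codings = {entry["id"]: entry for entry in json.loads(raw)}
    return codings[comment_id]

# The entry for this comment matches the Coding Result table above.
coding = lookup_coding(raw_response, "ytc_Ugx8ikakNUwctDPNtbt4AaABAg")
print(coding["responsibility"], coding["emotion"])  # developer mixed
```

Building the id-keyed dict once lets the same parsed response serve repeated lookups when auditing several comments against one raw response.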