Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
An Ai model cannot replicate itself, how will it take over ui which is monitoring it? At the end of the its really comes down to the parameter files, any kind of tampering can make the model useless.. models cannot be directly in charge of systems like airlines and all, if it can take actions without any intervention and monitoring of humans, then it will be disastrous... i don't fear agi, i fear human lathergy and human greed
YouTube · AI Governance · 2025-08-26T17:1…
Coding Result
Dimension       Value
--------------  --------------------------
Responsibility  developer
Reasoning       consequentialist
Policy          regulate
Emotion         fear
Coded at        2026-04-26T19:39:26.816318
Raw LLM Response
[
  {"id": "ytc_Ugy4cfEH_P7jJX2jcOd4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgwZjzMvwG6O12aGXNh4AaABAg", "responsibility": "unclear", "reasoning": "unclear", "policy": "unclear", "emotion": "mixed"},
  {"id": "ytc_Ugy30IcFVRHx0cVsRKJ4AaABAg", "responsibility": "user", "reasoning": "consequentialist", "policy": "unclear", "emotion": "outrage"},
  {"id": "ytc_UgyASoe-xOA_69KWKaN4AaABAg", "responsibility": "distributed", "reasoning": "consequentialist", "policy": "unclear", "emotion": "resignation"},
  {"id": "ytc_UgzEDVuOHc6W-4fCIzN4AaABAg", "responsibility": "developer", "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"}
]
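The raw response above is a JSON array of per-comment coding records. A minimal Python sketch for loading such a response and looking up the record that produced the coding result shown (the `parse_codings` helper is hypothetical, not part of the actual pipeline; two records are copied verbatim from the response above):

```python
import json

# The five fields every coding record is expected to carry,
# matching the dimensions in the coding-result table above.
REQUIRED_FIELDS = {"id", "responsibility", "reasoning", "policy", "emotion"}

def parse_codings(raw: str) -> dict:
    """Parse a JSON array of coding records into {comment_id: record},
    rejecting any record that is missing one of the expected fields."""
    codings = {}
    for record in json.loads(raw):
        missing = REQUIRED_FIELDS - record.keys()
        if missing:
            raise ValueError(f"record {record.get('id')!r} missing {missing}")
        codings[record["id"]] = record
    return codings

# Two records copied verbatim from the raw LLM response above.
raw = json.dumps([
    {"id": "ytc_Ugy4cfEH_P7jJX2jcOd4AaABAg", "responsibility": "none",
     "reasoning": "consequentialist", "policy": "none",
     "emotion": "indifference"},
    {"id": "ytc_UgzEDVuOHc6W-4fCIzN4AaABAg", "responsibility": "developer",
     "reasoning": "consequentialist", "policy": "regulate",
     "emotion": "fear"},
])

codings = parse_codings(raw)
print(codings["ytc_UgzEDVuOHc6W-4fCIzN4AaABAg"]["policy"])  # regulate
```

Keying the records by comment id makes it straightforward to join the model's codings back to the original comments for inspection, as this page does.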