Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
And the big question is....why Super AI would need a "pet human" that multiplies and kills the environment that gives him food + AI doesn't fell or need to be with someone. Not a good time to train the AI during war times or for war because it will learn first to kill not to protect life on this Planet.
youtube AI Governance 2025-09-21T18:0…
Coding Result
Dimension        Value
Responsibility   unclear
Reasoning        unclear
Policy           unclear
Emotion          unclear
Coded at         2026-04-26T23:09:12.988011
Raw LLM Response
[{"id":"ytc_UgzzEtldBgnEFCqYX8d4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},{"id":"ytc_UgyfyUu7Be4H3bb02G94AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},{"id":"ytc_Ugx23Q-jhglYBq-U1ox4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"outrage"},{"id":"ytc_UgyXrkQ-vIydhUAwBmV4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},{"id":"ytc_UgzWphUfkI2P48H3Wrh4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"mixed"},{"id":"ytc_UgylmXBsGAG2ZZucLcR4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"fear"},{"id":"ytc_Ugz7Qiqi2n_U1OTzMqR4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},{"id":"ytc_UgxqXUrebrs_K6BSKOJ4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},{"id":"ytc_UgzAGIK-aoFGH6rjk0l4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"approval"},{"id":"ytc_UgwxAI2U1sozsvebMah4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"approval"})
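One way every dimension can end up "unclear" is a raw response that fails to parse: note the response above terminates with `)` rather than the `]` needed to close the JSON array. A minimal sketch of how such a response might be handled is below; the function and constant names (`parse_llm_codes`, `DIMENSIONS`) are hypothetical illustrations, not the tool's actual API.

```python
import json

# Dimensions coded for each comment (assumed from the table above).
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def parse_llm_codes(raw: str) -> dict:
    """Map comment id -> {dimension: value}.

    If the raw response is not valid JSON (e.g. a stray ')' where ']'
    belongs), return an empty mapping so every comment's dimensions
    fall back to 'unclear' downstream.
    """
    try:
        records = json.loads(raw)
    except json.JSONDecodeError:
        return {}
    return {
        rec["id"]: {dim: rec.get(dim, "unclear") for dim in DIMENSIONS}
        for rec in records
    }
```

With this sketch, a response ending in `)` yields `{}`, and a lookup for any comment id then falls through to the "unclear" defaults shown in the coding result.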