Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
Isn’t it more likely that there will be an AI versus AI situation? And how an AI decide it doesn’t need us Aren’t we more likely heading to a matrix scenario? Where at the very least we become their batteries. lol.
youtube AI Governance 2025-07-09T00:5…
Coding Result
Dimension        Value
Responsibility   ai_itself
Reasoning        consequentialist
Policy           unclear
Emotion          fear
Coded at         2026-04-27T06:24:59.937377
Raw LLM Response
[
  {"id":"ytc_Ugx0-Ab0ztP-N3rQjch4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgyCjyNrJu3uyRlJYiV4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_Ugxn17oeIsJO40Fzch14AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgxPT5BydYMp21XTSrp4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
  {"id":"ytc_UgxYaqen3grVIBSRjE94AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_Ugzzcmq5gCuab_iZ16F4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgyktuLdmUUxUCmxhXh4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_Ugxk6-T7ZnYfEM7wp1V4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"approval"},
  {"id":"ytc_Ugwk9MWe0HSgT7VMdcd4AaABAg","responsibility":"company","reasoning":"contractualist","policy":"liability","emotion":"approval"},
  {"id":"ytc_Ugw6ogyvDUKzq662lIV4AaABAg","responsibility":"user","reasoning":"deontological","policy":"ban","emotion":"mixed"}
]
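The raw response is a JSON array with one record per comment, keyed by comment id. A minimal sketch of how such a batch response could be matched back to an individual comment (the variable names and the truncated example payload here are illustrative, not the tool's actual code):

```python
import json

# Abbreviated stand-in for the raw LLM response shown above;
# the real array contains one object per coded comment.
raw_response = '''[
  {"id": "ytc_UgyktuLdmUUxUCmxhXh4AaABAg",
   "responsibility": "ai_itself",
   "reasoning": "consequentialist",
   "policy": "unclear",
   "emotion": "fear"}
]'''

# Index the records by comment id for direct lookup.
records = {rec["id"]: rec for rec in json.loads(raw_response)}

coding = records["ytc_UgyktuLdmUUxUCmxhXh4AaABAg"]
print(coding["emotion"])  # fear
```

Indexing by id rather than relying on array order guards against the model returning records in a different order than the comments were submitted.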