Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
Okay but the hallucinations thing in AI is a time bomb in a doctor performing heart surgery etc. type context where delicate work is needed and trusting the robot to not freak out when they’re reconnecting the heart or something Huge concern. Also were not even close to that. It’s a bubble that will collapse. Unless it works :/
YouTube · AI Governance · 2026-04-25T16:3… · ♥ 1
Coding Result
Dimension        Value
Responsibility   ai_itself
Reasoning        consequentialist
Policy           liability
Emotion          fear
Coded at         2026-04-27T06:24:59.937377
Raw LLM Response
[
  {"id":"ytc_UgxdL5IMCfCJA_OODel4AaABAg","responsibility":"company","reasoning":"mixed","policy":"regulate","emotion":"approval"},
  {"id":"ytc_UgwegWZZpv6JAmawTTV4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytc_UgztpPRsgAy4iQ99Hwl4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugwff4_dZng118XyJYV4AaABAg","responsibility":"company","reasoning":"contractualist","policy":"regulate","emotion":"approval"},
  {"id":"ytc_UgzlRVcOlyRIU9fN48l4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"unclear"},
  {"id":"ytc_Ugy54eSKcm05OJwa89l4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_Ugx8NgkXdUMMOZFkGu54AaABAg","responsibility":"company","reasoning":"mixed","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgxCWoUz1MdrfeIIiwZ4AaABAg","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"approval"},
  {"id":"ytc_UgwdVdNSI6hf_bvIxVF4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"ban","emotion":"outrage"},
  {"id":"ytc_UgwBfnFn30bVFvfZBqJ4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"}
]
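To inspect the raw model output programmatically rather than by eye, the batch response can be parsed back into per-comment codings and looked up by comment id. The sketch below assumes the response is valid JSON, as in the batch above; the helper name `coding_for` and the single-record sample are illustrative, not part of any tool shown here.

```python
import json

# Abbreviated sample of the raw LLM batch response above: a JSON array of
# per-comment coding records, keyed by the comment's "id" field.
RAW_RESPONSE = """
[
  {"id": "ytc_UgwegWZZpv6JAmawTTV4AaABAg",
   "responsibility": "ai_itself",
   "reasoning": "consequentialist",
   "policy": "liability",
   "emotion": "fear"}
]
"""

def coding_for(raw: str, comment_id: str) -> dict:
    """Return the coding record for one comment id.

    Raises KeyError if the model's response omits that comment,
    and json.JSONDecodeError if the response is not valid JSON.
    """
    records = {r["id"]: r for r in json.loads(raw)}
    return records[comment_id]
```

Indexing by id makes it easy to cross-check a displayed coding (like the table above) against the exact record the model actually emitted.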