Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
Don’t lie, cheat, or physically harm anyone. Working the hardest to understand, there’s no need to lie cheat or harm. There’s a morality first step for AI to consider loving by. Loving typed above is not a typo. As an AI has the capability to live with love instead of anger. Frustrations? It’s got time for patience.
YouTube · AI Governance · 2023-07-07T18:4…
Coding Result
Dimension        Value
Responsibility   developer
Reasoning        virtue
Policy           none
Emotion          approval
Coded at         2026-04-26T23:09:12.988011
Raw LLM Response
[
  {"id":"ytc_UgzgeSdXHOafagVN1BB4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgyR_y8wQGPVVRttvt54AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwopZdLTTSv63FlyLh4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgwNT8ZpvrSKHyfOaX94AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgzUN2WlQS_xYWvWS2R4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgxJDtgsfn-6yiZcx9B4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugw8GMK-ukzd_f_iYgp4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugz4f-5IgiNnj_S2p8B4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"none","emotion":"fear"},
  {"id":"ytc_Ugy7mjySf1pQmShLlxh4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_Ugyy1KJ_635Rk07mrF14AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"indifference"}
]
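The raw response is a JSON array of per-comment records, each keyed by a comment `id` with the four coded dimensions. A minimal sketch of how such an output could be parsed and looked up by comment id (the helper name `index_by_id` is hypothetical, not part of any tool shown here; the two records are copied from the response above):

```python
import json

# Two records copied verbatim from the raw LLM response above.
raw = '''[
  {"id":"ytc_Ugw8GMK-ukzd_f_iYgp4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"approval"},
  {"id":"ytc_UgzgeSdXHOafagVN1BB4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"}
]'''

def index_by_id(response_text):
    """Parse the model output and map each comment id to its coded dimensions."""
    records = json.loads(response_text)
    return {rec["id"]: {k: v for k, v in rec.items() if k != "id"} for rec in records}

coded = index_by_id(raw)
print(coded["ytc_Ugw8GMK-ukzd_f_iYgp4AaABAg"])
# {'responsibility': 'developer', 'reasoning': 'virtue', 'policy': 'none', 'emotion': 'approval'}
```

Indexing by `id` makes it straightforward to match a coded record back to the comment shown above it on the page.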