Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
There is no need for A.I it's an ego trip for the developers. You can of course talk about how it would be useful but the dangers far out weigh the vanity of it's creation. "Your scientists were so preoccupied with whether or not they could, they didn't stop to think if they should"
YouTube · AI Governance · 2024-02-01T02:2…
Coding Result
Dimension       Value
Responsibility  developer
Reasoning       virtue
Policy          ban
Emotion         outrage

Coded at: 2026-04-26T23:09:12.988011
Raw LLM Response
[
  {"id":"ytc_UgxkMpXwzOgJu0sdf2p4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgxSaXpuXYcDvcypYpV4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"approval"},
  {"id":"ytc_UgxoH9ulG_duymgDFP54AaABAg","responsibility":"distributed","reasoning":"contractualist","policy":"regulate","emotion":"approval"},
  {"id":"ytc_UgwLZJn06iPrZWB0T9p4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugw57Ow55EplBC_kkX14AaABAg","responsibility":"developer","reasoning":"virtue","policy":"ban","emotion":"outrage"},
  {"id":"ytc_UgxegNf6KdPZOMhwbjR4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_Ugz7uQJH78vv70ptwUJ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgyhjjCTQ6ibb3ckYMh4AaABAg","responsibility":"developer","reasoning":"mixed","policy":"liability","emotion":"mixed"},
  {"id":"ytc_UgzPFRRUaMj6BMol8G14AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
  {"id":"ytc_UgyzzTBGOiOKa-3QnsJ4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"ban","emotion":"outrage"}
]
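A minimal sketch of how such a raw response can be checked against the displayed coding result: parse the JSON array the model returned and look up the four coded dimensions for one comment id. The `coding_for` helper and the truncated sample data are illustrative assumptions, not part of the tool itself.

```python
import json

# Assumed shape of the raw LLM response: a JSON array of objects, each
# with an "id" plus the four coding dimensions shown in the result table.
RAW_RESPONSE = '''[
  {"id": "ytc_Ugw57Ow55EplBC_kkX14AaABAg",
   "responsibility": "developer", "reasoning": "virtue",
   "policy": "ban", "emotion": "outrage"},
  {"id": "ytc_UgxkMpXwzOgJu0sdf2p4AaABAg",
   "responsibility": "ai_itself", "reasoning": "consequentialist",
   "policy": "none", "emotion": "fear"}
]'''

DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def coding_for(raw: str, comment_id: str) -> dict:
    """Return the {dimension: value} coding for a single comment id."""
    for entry in json.loads(raw):
        if entry.get("id") == comment_id:
            return {dim: entry.get(dim) for dim in DIMENSIONS}
    raise KeyError(f"no coding found for {comment_id}")

print(coding_for(RAW_RESPONSE, "ytc_Ugw57Ow55EplBC_kkX14AaABAg"))
# {'responsibility': 'developer', 'reasoning': 'virtue', 'policy': 'ban', 'emotion': 'outrage'}
```

Looking up the id of the quoted comment above reproduces the values in the coding-result table (responsibility: developer, reasoning: virtue, policy: ban, emotion: outrage).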