Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
The problem is humans; humans are hell. I was thinking about a possible solution to all this, and I came up with a feature that could be implemented in any social media, instant-messaging, or forum platform where multimedia is shared: a prompt backed by facial recognition that asks for your consent whenever your face is about to be posted or shared somewhere. For example, if I was gossiping about Stephanie, had a picture of her doing whatever, and wanted to share it with somebody, regardless of the platform, even IN A PRIVATE CONVERSATION, a notification would be sent to Stephanie to get her consent. This way deepfakes could be stopped on average platforms. But it would raise a lot of trouble for those same platforms, and other users might feel their privacy was under attack, because "someone" is aware of what that person is trying to share!! Imagine the tedium this feature would bring to any platform whenever media is shared... it would surely make these platforms lose users, and NONE OF THESE COMPANIES would risk their income trying to protect your identity or your rights. And what is more important, the privacy of the conversation or another person's identity? In the old days, photographers owned the rights to their photos because they took them, but now things should shift to you: you are the owner of your identity, and NO ONE should be able to use it as they please. I don't know, it is very complicated. But I think you could really make something with technology; since it represents a loss of income or a loss of users, though, they won't do a thing. :/
YouTube · Viral AI Reaction · 2024-09-08T04:4…
Coding Result
Dimension        Value
Responsibility   none
Reasoning        consequentialist
Policy           regulate
Emotion          indifference
Coded at         2026-04-27T06:24:53.388235
Raw LLM Response
[
  {"id": "ytc_Ugyvw0vlacyaSzykU9J4AaABAg", "responsibility": "user", "reasoning": "virtue", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_UgwPUUDUH7KdlQPydhN4AaABAg", "responsibility": "user", "reasoning": "virtue", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_UgzFOeb62CxU8QfGHux4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "regulate", "emotion": "indifference"},
  {"id": "ytc_UgxlnDCCSMMKGcmgRnB4AaABAg", "responsibility": "user", "reasoning": "consequentialist", "policy": "liability", "emotion": "fear"},
  {"id": "ytc_UgxQN3YiMVqWSTFaemV4AaABAg", "responsibility": "user", "reasoning": "consequentialist", "policy": "none", "emotion": "sadness"},
  {"id": "ytc_Ugyh5F7VkGsU8CFn92Z4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "fear"},
  {"id": "ytc_UgzwF2hGxbwRa58pGT94AaABAg", "responsibility": "user", "reasoning": "contractualist", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_Ugx9AgwsTagLUIepYjV4AaABAg", "responsibility": "user", "reasoning": "virtue", "policy": "liability", "emotion": "outrage"},
  {"id": "ytc_Ugx_rdO0BoJTn_PYTUd4AaABAg", "responsibility": "user", "reasoning": "virtue", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_Ugxaf3H42naVGat5LH54AaABAg", "responsibility": "distributed", "reasoning": "virtue", "policy": "none", "emotion": "outrage"}
]
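The raw response is a JSON array with one record per coded comment. A minimal sketch of turning such a response into a lookup keyed by comment ID, validating each of the four dimensions shown in the coding-result table (the allowed value sets below are inferred only from the values that appear on this page; the full codebook may define more categories, and the `parse_codes` helper is hypothetical, not part of this tool):

```python
import json

# Allowed values per coding dimension (assumed from this page; the
# project's actual codebook may differ).
DIMENSIONS = {
    "responsibility": {"user", "distributed", "none"},
    "reasoning": {"consequentialist", "virtue", "contractualist", "unclear"},
    "policy": {"regulate", "liability", "none"},
    "emotion": {"outrage", "fear", "sadness", "mixed", "resignation", "indifference"},
}

def parse_codes(raw: str) -> dict:
    """Parse a raw LLM response into {comment_id: codes}, rejecting
    any record whose dimension value is outside the allowed set."""
    coded = {}
    for rec in json.loads(raw):
        comment_id = rec["id"]
        for dim, allowed in DIMENSIONS.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{comment_id}: bad {dim!r} value {rec.get(dim)!r}")
        coded[comment_id] = {dim: rec[dim] for dim in DIMENSIONS}
    return coded

# One record taken verbatim from the raw response above.
raw = ('[{"id":"ytc_UgzFOeb62CxU8QfGHux4AaABAg","responsibility":"none",'
       '"reasoning":"consequentialist","policy":"regulate","emotion":"indifference"}]')
codes = parse_codes(raw)
print(codes["ytc_UgzFOeb62CxU8QfGHux4AaABAg"]["policy"])  # regulate
```

Validating against a fixed value set at parse time catches the common failure mode where the model invents a category outside the codebook, rather than letting it propagate silently into the coded dataset.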