Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
AI is now also taking over therapy. This I believe us unhealthy. You're placing your feelings and mental health in the hands of a robot that does not care for you, neither cares about your feelings. They might be able to study the human emotions, but they cannot feel them. Pretty soon people will develope robotic children if us humans do not like, nor want our children anymore. We need to stop this immediately. Now, some purposes are useful, such as robots helping with dangerous jobs, such as firefighters or construction, to prevent any serious injuries of humans or even death. But than again, that would probably take peoples jobs and there would be more homeless people. Some people say that the government can just give people a specific amount of money every week or so, but would the government have any money left if they're using it all on AI?
youtube AI Harm Incident 2025-11-06T21:2…
Coding Result
Dimension       Value
Responsibility  developer
Reasoning       deontological
Policy          ban
Emotion         fear
Coded at        2026-04-27T06:26:44.938723
Raw LLM Response
[
  {"id":"ytc_UgwTucqq4ZQ1qaLurLd4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugy28UiZL7a17FLNoUN4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgzP2I_m8aETuJOhE554AaABAg","responsibility":"user","reasoning":"virtue","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgwdG_KDACf42mokoIJ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwwObM6kTVkOdfWLPF4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"regulate","emotion":"mixed"},
  {"id":"ytc_Ugy4vCOkIH-q4SIM6u14AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgwTD_yCwyKBhUM_dTN4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"ban","emotion":"fear"},
  {"id":"ytc_UgwYD9DYu8_6MbDsBxZ4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgwrBR9lb4PKVczxhYx4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_Ugw-x2O70fW6XvExpSN4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"mixed"}
]
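A raw response like the one above can be parsed and sanity-checked before the codes are stored. The sketch below is a minimal validator, assuming the allowed value sets are exactly those seen in this batch (the project's full codebook may define more; that is an assumption, not taken from the source).

```python
import json

# Allowed codes per dimension, inferred only from values appearing in this
# batch (assumption: the real codebook may include additional codes).
ALLOWED = {
    "responsibility": {"none", "ai_itself", "user", "distributed", "company", "developer"},
    "reasoning": {"deontological", "consequentialist", "virtue", "mixed"},
    "policy": {"none", "liability", "regulate", "ban"},
    "emotion": {"approval", "fear", "outrage", "indifference", "resignation", "mixed"},
}

def validate_codes(raw: str) -> list[dict]:
    """Parse a raw LLM response and reject records with unknown codes."""
    records = json.loads(raw)
    for rec in records:
        if not rec.get("id", "").startswith("ytc_"):
            raise ValueError(f"unexpected comment id: {rec.get('id')!r}")
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{rec['id']}: bad {dim} code {rec.get(dim)!r}")
    return records

# One record from the response shown above, as a usage example.
raw = ('[{"id":"ytc_UgwTD_yCwyKBhUM_dTN4AaABAg",'
       '"responsibility":"developer","reasoning":"deontological",'
       '"policy":"ban","emotion":"fear"}]')
codes = validate_codes(raw)
print(codes[0]["policy"])  # -> ban
```

Validating before storage keeps hallucinated or off-codebook labels out of the coded dataset; a failed record can be re-prompted instead of silently saved.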