Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
This is a really sad incident and my heart goes out to the family. AI is being pushed onto the world as the great new saviour of the world when in reallity it is gathering data, not just any data but your data, your innermost personal feelings and emotions, it is gathering your details and so much more. When you consider that most of the AI companies are big tech firms which are run by billionaires who have just one reason and that is to increase their wealth and reduces your burden on them it is not surprising that the "therapist" bots do not sign post vulnerable people to real life human beings for help, which they could so easily do. When the likes of Palantir are using your data everyday to control what knowledge you have or are just outright selling it to businesses, there are some serious issues with AI and LLMs. No I am not talking about "Skynet" from the film Terminator, "Skynet" already does exist though not as an AI system. Serious regulation needs to be brought into place to restrict what data these LLM can use, how it can be used and what it's limitations are when dealing with users who repeatedly show forms of romantic connection with the AI. Apart from this AI is not good for the environment as it generates lots of heat and requires 1000s of gallons (4.9ltr to a gallon) to cool down the system for one complex calculation it is asked to perform.
Source: YouTube · AI Harm Incident · 2025-07-22T22:1…
Coding Result
Dimension       Value
Responsibility  company
Reasoning       consequentialist
Policy          regulate
Emotion         unclear
Coded at        2026-04-27T06:24:53.388235
Raw LLM Response
[
  {"id": "ytc_UgzlAR5-quqvAhHGXwF4AaABAg", "responsibility": "none",      "reasoning": "unclear",          "policy": "unclear",   "emotion": "fear"},
  {"id": "ytc_UgyXtawJW8jj5rzAhGN4AaABAg", "responsibility": "company",   "reasoning": "deontological",    "policy": "liability", "emotion": "outrage"},
  {"id": "ytc_Ugzo4bT-xnHCOJPbf4N4AaABAg", "responsibility": "company",   "reasoning": "consequentialist", "policy": "regulate",  "emotion": "sadness"},
  {"id": "ytc_Ugw5qQ_L_xobXL49fLp4AaABAg", "responsibility": "user",      "reasoning": "mixed",            "policy": "none",      "emotion": "resignation"},
  {"id": "ytc_UgzLyi5WoUQbykDGsxJ4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "unclear",   "emotion": "fear"},
  {"id": "ytc_Ugya-Dpn-IboQAcjlKl4AaABAg", "responsibility": "none",      "reasoning": "mixed",            "policy": "unclear",   "emotion": "mixed"},
  {"id": "ytc_UgzCAA8JctRNK0K_5ih4AaABAg", "responsibility": "none",      "reasoning": "unclear",          "policy": "unclear",   "emotion": "mixed"},
  {"id": "ytc_UgxlK8KJ-3HYMRD-HuV4AaABAg", "responsibility": "company",   "reasoning": "consequentialist", "policy": "regulate",  "emotion": "fear"},
  {"id": "ytc_UgwHdgnh8FQMJ34UUD14AaABAg", "responsibility": "none",      "reasoning": "unclear",          "policy": "unclear",   "emotion": "indifference"},
  {"id": "ytc_UgwILL0OqZyaIzmEpm14AaABAg", "responsibility": "user",      "reasoning": "deontological",    "policy": "none",      "emotion": "mixed"}
]
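The raw response above is a JSON array of per-comment codings: each record carries the comment id plus one label for each of the four dimensions. A minimal sketch of how such a response might be parsed and validated before loading it into a results table (Python; the allowed label sets are inferred only from the values visible above and are an assumption, not the full codebook):

```python
import json

# Label sets observed in the raw response above. The actual codebook
# may define more labels; these sets are an assumption for illustration.
ALLOWED = {
    "responsibility": {"none", "company", "user", "ai_itself"},
    "reasoning": {"unclear", "deontological", "consequentialist", "mixed"},
    "policy": {"unclear", "liability", "regulate", "none"},
    "emotion": {"fear", "outrage", "sadness", "resignation",
                "mixed", "indifference", "unclear"},
}

def validate_codings(raw: str) -> list[dict]:
    """Parse a raw LLM response and reject records with missing keys
    or labels outside the allowed sets."""
    records = json.loads(raw)
    for rec in records:
        # Comment ids in the response all carry the ytc_ prefix.
        if not rec.get("id", "").startswith("ytc_"):
            raise ValueError(f"bad or missing id: {rec!r}")
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{rec['id']}: invalid {dim}={rec.get(dim)!r}")
    return records

# Example with one record copied from the response above.
raw = ('[{"id":"ytc_Ugzo4bT-xnHCOJPbf4N4AaABAg","responsibility":"company",'
       '"reasoning":"consequentialist","policy":"regulate","emotion":"sadness"}]')
codings = validate_codings(raw)
```

Validating before ingestion matters here because an LLM can drift outside the codebook (e.g. invent a new emotion label), and a strict check surfaces that immediately rather than silently polluting the coded dataset.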