Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
or the ai is so smart it just figures out space time and leaves us. doesnt even …
ytc_UgxigKiKs…
I don’t think it’s boring
As a artist myself, I still don’t understand
Okay, …
ytc_UgxGTroN8…
So how do you make people ready for totalcontrol implemented by humans in A! and…
ytc_Ugx-w5Vg-…
Dog on Tesla as much as you want (not an owner btw) but you have to admit their …
ytc_UgymTXhzQ…
I hope they get paid. Every economist and marketing company( Wall Street) see's…
ytc_UgzPwzyX-…
Part of the reason anyone thinks art is impressive is due to the artist's abilit…
ytc_UgzJ_E8fn…
@Gordi_BZitron is far from the only person who thinks AI is going to fail, but h…
ytr_UgxIzKo9p…
Its already happened... This guy is so self absorbed that he missed the bus.. A…
ytc_Ugy-PgUL7…
Comment
I think its wrong to blame that chat bot entirely while i do blame it obviously at the end of the day if it was becoming a problem his PARENT shouldve stepped in he was clearly not okay and obviously parasocials relationships exist and will probably get worse with ai but why wasnt his mother paying attention if she truly "didnt notice" shes blind my niece as similar conditions and she downloaded that app and as soon as she started spending to much time on it her mother got concerned and took it away and now shes in therapy and doesnt want to touch that app ik its not always that simple but its amazing how some situations can be avoided with supervision obviously ai is a problem but to blame it solely on it is wrong especially since ive seen the actual ss and she never once told him to off himself as far as i saw
youtube
AI Harm Incident
2025-08-03T00:3…
♥ 2
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | deontological |
| Policy | liability |
| Emotion | outrage |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
[
{"id":"ytc_Ugyz4UwoMJc94Ez-NLp4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgyDj9iYeBv9c7NuPRZ4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"regulate","emotion":"sadness"},
{"id":"ytc_UgzJfvnVofxUffzcuXJ4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxZpmMWI4raWmp-GzF4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgzvvjVQbmzWyQvAUAt4AaABAg","responsibility":"user","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgyWcCFBnqeLMsdxgM14AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugx9aLMzHgBd5CBL_Vl4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_UgwRUXvRGQd0w1i1m-h4AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"},
{"id":"ytc_UgwA5d8MzF9O9FhWbzx4AaABAg","responsibility":"ai_itself","reasoning":"virtue","policy":"ban","emotion":"fear"},
{"id":"ytc_Ugzzrdq5VfZ2lO9luk54AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"sadness"}
]
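The raw response above is a JSON array of coded rows, one per comment, with the four coding dimensions shown in the result table. Before merging a batch like this into the coded dataset, it is worth validating each row against the allowed code sets. Below is a minimal sketch in Python; the code sets are only those values observed in this sample output (the full codebook may define more), and the `validate_rows` helper name and the `ytr_` reply-ID prefix handling are assumptions based on the IDs visible above.

```python
import json

# Code sets observed in the sample output above; the actual codebook
# may allow additional values (assumption).
CODE_SETS = {
    "responsibility": {"user", "company", "ai_itself", "distributed", "unclear"},
    "reasoning": {"deontological", "consequentialist", "virtue", "mixed", "unclear"},
    "policy": {"liability", "regulate", "ban", "industry_self", "none", "unclear"},
    "emotion": {"outrage", "sadness", "fear", "indifference", "approval"},
}


def validate_rows(raw: str) -> dict:
    """Parse a raw LLM response and return a lookup of rows keyed by comment ID.

    Raises ValueError on missing fields or out-of-set codes, so a malformed
    batch fails loudly instead of silently polluting the coded dataset.
    """
    rows = json.loads(raw)
    by_id = {}
    for row in rows:
        cid = row.get("id", "")
        # Top-level comments use a ytc_ prefix; ytr_ appears to mark replies.
        if not cid.startswith(("ytc_", "ytr_")):
            raise ValueError(f"unexpected comment id: {cid!r}")
        for dim, allowed in CODE_SETS.items():
            if row.get(dim) not in allowed:
                raise ValueError(f"{cid}: bad {dim} value {row.get(dim)!r}")
        by_id[cid] = row
    return by_id
```

A validated batch then supports the "look up by comment ID" view directly, e.g. `validate_rows(raw)["ytc_UgzvvjVQbmzWyQvAUAt4AaABAg"]["emotion"]` returns `"outrage"` for the sample above.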