Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples:
- `ytc_Ugw821E_h…` — "I think robots will be able to achieve a self-consciousness like us. The Robot…"
- `ytc_Ugz4yNu2g…` — "What if AI succeeds in exploring its environment via robotics? Jules Verne's boo…"
- `ytc_UgzKTDLwi…` — "No mention of the charging infrastructure needed for a future with all robo-taxi…"
- `ytc_Ugy9D8-cv…` — "Always in 5 years. Just like we were all going to be in flying cars by 2020? C'm…"
- `ytc_UgwVjP1YC…` — "Only reason / Is ai is dangerous / Cause this is technology era / And we use tech as…"
- `ytc_UgwHWvWZy…` — "As someone that appreciates art I can tell you that as soon as I think it's AI I…"
- `ytc_UgzZ0wDMP…` — "I don’t like the communism aspect of having all of our basic needs supplied by t…"
- `ytc_UgzSGyED_…` — "Art is a method os abstraction. You take from the world and with the thoughts an…"
Comment
I’m so worried after watching this, my just turned 16 year old has switched off to the world since using ai, he doesn’t communicate, doesn’t leave the house, doesn’t have friends, he talks sexual with it, he has adhd autism and a rare muscle and skin disease which he already feels alone because of that, he’s been through abuse with his dad, I’m so concerned about him, I’ve tried everything, he sees Cahms but she tells me it’s normal. And he’s a teenager and that his phones helping him, I’m lost on would to do, I’m so worried,
youtube · AI Harm Incident · 2025-07-24T13:5… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
  {"id":"ytc_UgypDWy2FWhGCa-0_8R4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
  {"id":"ytc_Ugzz3tgkeEPyVNH5Kfh4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_Ugxuu5u2xn_10dFz4WZ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgxrJFfGbjnZE6zWy0d4AaABAg","responsibility":"user","reasoning":"deontological","policy":"ban","emotion":"outrage"},
  {"id":"ytc_UgyhMfTaPDaTjWwLB_l4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgzeucPDxS_NPIq5snt4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_Ugzcg1d6HONDrS56P6p4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgygC9R9EOK-jVHxCCV4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"industry_self","emotion":"indifference"},
  {"id":"ytc_UgzTiEtAxdMrEFM_ooR4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
  {"id":"ytc_UgwkeYdd8jumPnc7gq54AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"}
]
```
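A raw response like the one above can be parsed, lightly validated, and indexed so that a coded record can be looked up by comment ID. The sketch below is a minimal example, not the project's actual code; the allowed values in `SCHEMA` are only those observed in this dump, and the real coding scheme may include more categories.

```python
import json

# Allowed values per coding dimension (assumed: only the values
# observed in this dump; the real codebook may define more).
SCHEMA = {
    "responsibility": {"none", "ai_itself", "user", "company", "developer", "distributed"},
    "reasoning": {"consequentialist", "deontological", "unclear"},
    "policy": {"none", "liability", "ban", "regulate", "industry_self"},
    "emotion": {"fear", "outrage", "resignation", "indifference"},
}

def index_response(raw: str) -> dict:
    """Parse a raw LLM response and index the records by comment ID,
    raising on any value outside the known coding scheme."""
    records = json.loads(raw)
    indexed = {}
    for rec in records:
        for dim, allowed in SCHEMA.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{rec.get('id')}: unexpected {dim} value {rec.get(dim)!r}")
        indexed[rec["id"]] = rec
    return indexed

# Usage with a shortened, hypothetical record:
raw = '[{"id":"ytc_x","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"}]'
coded = index_response(raw)
print(coded["ytc_x"]["emotion"])  # fear
```

Indexing by ID is what makes the "look up by comment ID" view above cheap: each lookup is a dictionary access rather than a scan over every raw response.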