Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up a comment by its ID, or inspect one of the random samples below.
- "U can see that there is a star on ever art, that means it's actually Ai…" (ytc_UgxMpreZ8…)
- "Stupid, can't knock out a robot with a fight. Intelligence wasn't part of the fi…" (ytc_Ugwwr36VS…)
- "AI will never be conscious. It is an impossibility. Without the spark of source,…" (ytc_UgwN9fSFJ…)
- "Human stupidy will eventualy kill itself. Ppl are opening pandora box, and what …" (ytc_Ugy1qNwfk…)
- "No Computer program can be more “Amoral Psychopath” then the only Genetic line t…" (ytc_Ugy3FAeSi…)
- "Love the card trick and the content! AI is indeed starting to take over everythi…" (ytc_Ugw9xTMAa…)
- "Well no because just listen to how they sound. Thats ai generated voices. And ai…" (ytc_UgxyPVeAt…)
- "Really balanced takes here. The main thing I'm concerned about with AI is the us…" (ytc_Ugz0IW1_P…)
Comment
I don't understand how an LLM is able to provide this answer, when there were so many reference points from 1950 onward for why sodium bromide is bad. Considering most of these companies are currently in the midst of lawsuits for pirating infinite information to train their models, it seems unlikely that an LLM model would suggest such a dangerous substitute. But also maybe I'm ignorant.
Source: youtube · AI Harm Incident · 2026-03-28T13:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | outrage |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
  {"id":"ytc_UgzV-F_60Sfovnz-dkV4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_Ugz5jD0Xf0wkDn7Cnd14AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"approval"},
  {"id":"ytc_UgwOJ9wBH40WIZAJBUB4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwS09-Iq74TKVoWUQp4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"outrage"},
  {"id":"ytc_Ugw0tG5aByWkkzR_zdV4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugx-m7z6oZR4lXUHD494AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
  {"id":"ytc_UgyqfTWUSg1XY1uC1nh4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"fear"},
  {"id":"ytc_UgzMwffmn05kpK8F21h4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgxXVYi2jMP-WueoI9R4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgyThoe_cT5Kc-Ov8uV4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"approval"}
]
```
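The model codes comments in batches and returns a JSON array, one object per comment ID, which is what the lookup-by-ID view above is built from. A minimal sketch of parsing and indexing such a response might look like the following; the allowed-value sets are inferred only from the values visible in this section and may be incomplete, and the helper names are hypothetical, not part of any tool shown here.

```python
import json

# A small batch response in the same shape as the raw output above.
raw_response = """
[
  {"id": "ytc_UgzMwffmn05kpK8F21h4AaABAg", "responsibility": "company",
   "reasoning": "consequentialist", "policy": "regulate", "emotion": "outrage"},
  {"id": "ytc_UgyqfTWUSg1XY1uC1nh4AaABAg", "responsibility": "company",
   "reasoning": "deontological", "policy": "liability", "emotion": "fear"}
]
"""

# Assumed codebook: only the values that actually appear in this section.
VALID = {
    "responsibility": {"company", "user", "ai_itself"},
    "reasoning": {"consequentialist", "deontological", "virtue"},
    "policy": {"none", "regulate", "ban", "liability"},
    "emotion": {"outrage", "fear", "approval", "indifference"},
}

def index_codings(text):
    """Parse a raw batch response and index codings by comment ID,
    dropping any row whose dimension values fall outside the codebook."""
    out = {}
    for row in json.loads(text):
        if all(row.get(dim) in allowed for dim, allowed in VALID.items()):
            out[row["id"]] = {dim: row[dim] for dim in VALID}
    return out

codings = index_codings(raw_response)
print(codings["ytc_UgzMwffmn05kpK8F21h4AaABAg"]["policy"])  # -> regulate
```

Validating against a fixed codebook before indexing is one way to catch the occasional malformed or out-of-vocabulary row in model output instead of letting it propagate into the coded dataset.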