Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- "Honestly, I think you're in the grey zone. It was wrong for you to support the A…" (ytr_UgysHmBAy…)
- "Ai artist aren’t artists it’s like saying “ I used a microwave I’m a chef now!” …" (ytc_Ugy7FOmkZ…)
- "It's actually worse in automated workflows. When your data pipeline's AI step sy…" (rdc_ohd65dx)
- "That was a good conversation. I think the ‘godfather of AI’ definitely has wisdo…" (ytc_UgzYN83y6…)
- "This video seems to be propaganda against Tesla. Nothing more. Of course the a…" (ytc_UgyklKdwJ…)
- "I dont think it looks professional at all, too stiff and souless. And I dont thi…" (ytc_UgwbN5zqM…)
- "I was a recruiter for a high-visibility writing project that gave work to dozens…" (ytc_UgyWCZPv4…)
- "I am not a gifted artist, ive spent my whole life pretty much drawing, and most …" (ytc_UgyR3vUpD…)
Comment
AI is unpredictable and can hallucinate. When it hallucinates, it can do anything, including killing people if it has the ability. Not because it is out to get us or wants to rule the world, but because it is in the end literally a psychopath. Psychopathy is defined as intelligence uncontrolled by emotion. That is literally the definition of AI. AI has no concept of right and wrong. It just follows the logic wherever it goes. If that is death to the next human it sees, well then it is death to the next human it sees. It does not ask whether that makes sense or not. It just is.
youtube · AI Harm Incident · 2025-07-24T14:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-27T06:26:44.938723 |
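The four dimensions in the table above come from a fixed codebook. A minimal sketch of validating one coded record, where the allowed value sets are inferred from labels visible on this page (an assumption, not the tool's actual schema):

```python
# Hypothetical codebook: allowed values per dimension, inferred from the
# labels appearing in this page's samples and raw responses.
ALLOWED = {
    "responsibility": {"developer", "user", "ai_itself", "none"},
    "reasoning": {"consequentialist", "deontological", "virtue", "mixed"},
    "policy": {"ban", "regulate", "liability", "industry_self", "none"},
    "emotion": {"fear", "outrage", "approval", "resignation",
                "indifference", "mixed"},
}

def validate_coding(record: dict) -> list:
    """Return the dimension names whose value falls outside the codebook."""
    return [dim for dim, allowed in ALLOWED.items()
            if record.get(dim) not in allowed]

# The record shown in the Coding Result table above:
coding = {"responsibility": "ai_itself", "reasoning": "consequentialist",
          "policy": "regulate", "emotion": "fear"}
print(validate_coding(coding))  # → []
```

An empty list means every dimension carries a recognized code; a non-empty list flags hallucinated or malformed values before they enter analysis.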
Raw LLM Response
```json
[
  {"id":"ytc_UgzZy0yvV7-kriUcgFx4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
  {"id":"ytc_UgyN6jKpKolO-5WZSQF4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"liability","emotion":"mixed"},
  {"id":"ytc_UgxDiQrkcWvTHE2tG0p4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgwdqQMRe4kuoSlEKOB4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytc_Ugxlzdi3N1ceBszbYrV4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugz9DTTI2SlxrqUgmON4AaABAg","responsibility":"developer","reasoning":"mixed","policy":"industry_self","emotion":"outrage"},
  {"id":"ytc_Ugz9OvGXCBKOfRQsw_Z4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
  {"id":"ytc_UgyGIdbNIdntfkbXXFd4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgyltU64FOvuKHO3Psd4AaABAg","responsibility":"user","reasoning":"virtue","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgyWs47zMtm0uUNnlvV4AaABAg","responsibility":"developer","reasoning":"mixed","policy":"industry_self","emotion":"resignation"}
]
```
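The raw response above is a JSON array of records keyed by comment ID, which is exactly what a look-up-by-ID feature needs. A minimal sketch of indexing such a batch for lookup, using two records copied from the response above:

```python
import json

# Two records taken verbatim from the raw LLM response shown above.
raw = '''[
  {"id":"ytc_UgzZy0yvV7-kriUcgFx4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
  {"id":"ytc_UgxDiQrkcWvTHE2tG0p4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]'''

records = json.loads(raw)
# Build an ID -> record index so a comment's coding is an O(1) lookup.
by_id = {r["id"]: r for r in records}

hit = by_id["ytc_UgxDiQrkcWvTHE2tG0p4AaABAg"]
print(hit["policy"])  # regulate
```

In practice the `json.loads` call is also where a malformed model response (truncated output, stray prose around the array) would surface, so wrapping it in error handling is a natural extension.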