Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- Ai. I think Game/VFX artists grew beyond the market needs and now the fall down … (ytr_UgwPOFWEf…)
- No offense it's 10 time smarter than most already here is why: knowledge is reta… (ytc_UgxLkSYjd…)
- I like a lot of your other videos, but not a big fan of these human vs AI art vi… (ytc_UgzViB6FY…)
- If ai is a tool like a pencil i want to see the ai "artists" use a pencil and se… (ytc_UgxV5KyNz…)
- Most artists aren't/won't make money from their art anyway tbh. The ones that ar… (ytr_UgyeRXdD_…)
- Why’s that? The AI isn’t going to mass produce terminators because it can genera… (ytr_UgwhPOn87…)
- @kittyspeedy87”lost your job because of ai? skill issue”.This the type of answe… (ytr_UgzT7yL1c…)
- If you can prompt the AI in such a way you can get most of the article verbatim … (ytc_UgwvQK-le…)
Comment

> Absolutely wrong, you say AI has many purposes. AI is too dangerous for humanity. What I see is that people play roles and gamble on luck that everything will go well. They try to be more efficient, that's the reason, but the risks increase drastically through improvement and I'm an expert. And secondly, no, we don't need AI to be successful and make progress. We only do it because of the money. And thirdly, no, the rich do it. You don't decide anything. Do you want your job to be taken away and your existence too?

youtube · AI Harm Incident · 2025-09-06T23:5… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | deontological |
| Policy | ban |
| Emotion | outrage |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
  {"id": "ytc_Ugy5Q7MKqA6LfE93Ra14AaABAg", "responsibility": "developer", "reasoning": "deontological", "policy": "ban", "emotion": "outrage"},
  {"id": "ytc_UgxYZZ5FYFxxtVKwcAh4AaABAg", "responsibility": "user", "reasoning": "virtue", "policy": "industry_self", "emotion": "resignation"},
  {"id": "ytc_UgyQPBk0m6NW92zs0x14AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "liability", "emotion": "fear"},
  {"id": "ytc_Ugz6HILXx3uU4ODHl0h4AaABAg", "responsibility": "developer", "reasoning": "consequentialist", "policy": "regulate", "emotion": "outrage"},
  {"id": "ytc_UgymSaPIrKMBzlj4vLd4AaABAg", "responsibility": "developer", "reasoning": "deontological", "policy": "ban", "emotion": "fear"},
  {"id": "ytc_Ugw5Yc-4qPqPn688pnN4AaABAg", "responsibility": "user", "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_UgwSrf-APjKWRU80wMl4AaABAg", "responsibility": "developer", "reasoning": "deontological", "policy": "ban", "emotion": "outrage"},
  {"id": "ytc_UgzxY85W6BQy-fxaeaV4AaABAg", "responsibility": "developer", "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_Ugxmm7R0bewe9Qa8VKh4AaABAg", "responsibility": "ai_itself", "reasoning": "deontological", "policy": "ban", "emotion": "fear"},
  {"id": "ytc_Ugz34ol7aFwkbUHsUpl4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "approval"}
]
```
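The lookup-by-comment-ID view above can be reproduced directly from a raw response like this one: the model returns a JSON array of per-comment codings, which can be indexed by `id`. A minimal sketch, assuming the output is valid JSON; the variable names and parsing code here are illustrative, not the site's actual implementation, and only the first two entries of the array are reproduced for brevity.

```python
import json

# Raw model output: a JSON array of per-comment codings, as in the
# "Raw LLM Response" block above (first two entries shown).
raw = '''[
  {"id": "ytc_Ugy5Q7MKqA6LfE93Ra14AaABAg", "responsibility": "developer",
   "reasoning": "deontological", "policy": "ban", "emotion": "outrage"},
  {"id": "ytc_UgxYZZ5FYFxxtVKwcAh4AaABAg", "responsibility": "user",
   "reasoning": "virtue", "policy": "industry_self", "emotion": "resignation"}
]'''

# Index the codings by comment ID so any single comment can be looked up.
codings = {row["id"]: row for row in json.loads(raw)}

# The first ID matches the "Coding Result" table above.
coding = codings["ytc_Ugy5Q7MKqA6LfE93Ra14AaABAg"]
print(coding["responsibility"], coding["policy"])  # developer ban
```

Keying on `id` rather than array position means the coding still resolves correctly even if the model returns the comments in a different order than they were sent.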