Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect

- "Being good at art has two ways of achieving it. The first way is the easy one: T…" (ytc_UgwxKhURW…)
- "That's not remotely accurate. I don't care for pushing all this AI stuff myself,…" (ytr_UgyzY7gHB…)
- "I call bs. This guy knew all long the dangers of AI. The minds behind terminator…" (ytc_UgydYne2z…)
- "Google is doing great job to keep AI productive for humen. Remember, there was a…" (ytc_Ugy1qDlV3…)
- "0:15 I'm a truck driver and not concerned at all. I'm in my truck right now and …" (ytc_Ugz_bZgM_…)
- "All those dorks a geeks are trying to kill us with AI . . . Its dangerous. Its l…" (ytc_UgxUhnSHJ…)
- "How can you automatically assume that you can program morality or amorality int…" (ytc_UgwviX4Z_…)
- "\"I went to school with the people that now build these technologies\" Oh, so this…" (ytc_Ugx8AYMJr…)
Comment
You remember: every specialty is on the table if people want so. In democracy, people choose what they want and not. Influencers ride the wave of fear of unemployment and make tons of videos like yours. To make money, not for philanthropy. You take the millions of doctors out there, the families they support etc..and fire them because of AI. I think those millions will react exactly as I would: you do so only over our d**d body!! People were up in debt, did sacrifices, their families did…f**k you and your f**king AI!!! 🤬🤬🤬🤬🤬
youtube · AI Harm Incident · 2025-07-29T08:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | virtue |
| Policy | none |
| Emotion | outrage |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_UgwCGRCuCUT1fWSkq-94AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugz3ZHKX-kQUhXmdhTl4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgyLI4PvUG5T2HzHM-J4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugx6RuLU4vvrTi8xFK94AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgxsFttH8NhrmKI7Fs94AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgwlJJwtDT070Q4fi7N4AaABAg","responsibility":"distributed","reasoning":"virtue","policy":"none","emotion":"outrage"},
  {"id":"ytc_Ugw3SG892grJatou1794AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugy6g6hXhbns37ncdmB4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"mixed"},
  {"id":"ytc_Ugxy41O5WyNnlD1el5B4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugwt0sZAxlPoQ60Rlvp4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"}
]
```
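The lookup-by-comment-ID view above can be reproduced offline: the raw LLM response is a JSON array of per-comment codes, so indexing it by `id` gives direct lookup of any comment's coding. A minimal Python sketch, assuming only that the response parses as shown (the two rows are copied from the response above; variable names are illustrative):

```python
import json

# Raw LLM response: a JSON array of per-comment codes (two rows copied
# from the response above for brevity).
raw_response = """
[
  {"id": "ytc_UgwlJJwtDT070Q4fi7N4AaABAg", "responsibility": "distributed",
   "reasoning": "virtue", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_UgyLI4PvUG5T2HzHM-J4AaABAg", "responsibility": "company",
   "reasoning": "deontological", "policy": "regulate", "emotion": "fear"}
]
"""

# Index the rows by comment ID so any comment's coding is a dict lookup away.
codes_by_id = {row["id"]: row for row in json.loads(raw_response)}

# Look up the coding for the comment shown above.
code = codes_by_id["ytc_UgwlJJwtDT070Q4fi7N4AaABAg"]
print(code["responsibility"], code["emotion"])  # distributed outrage
```

The printed dimensions match the "Coding Result" table for that comment, which is how the table view can be cross-checked against the raw model output.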