Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect

- ytc_Ugis80PWS…: "The video is well made but it leaves out some relatively obvious stuff that woul…"
- ytc_Ugz0TxGiO…: "Robots opening bank accounts does not make them SMARTER THAN HUMANS. No robot ca…"
- ytc_UgwlQ3CAx…: "Honestly I believe a significant number of humans don't care if AI destroys us. …"
- ytc_UgwfpnohY…: "Musk is so despicable. The smirk when he says appendices. What is even the point…"
- ytc_UgwiSiI_E…: "AI = Destroyer of human being employment..once 70 human being lost their job.. …"
- ytc_UgwS1GOHj…: "I hate to tell you but if you are asking this question, it's game over. Yeah, on…"
- ytc_Ugwb8PEJE…: "“The most freakiest AI” poly ai in the corner: *thats me we have no filter*…"
- ytc_UgzbTGmjt…: "HOPE EXIST : In a reality so diverse and rich of options, this resolute surrende…"
Comment

> 1 thing that AI will not be able to do is figure human behavior out. As irrational as most people are, that makes human VERY random, and very difficult to predict. Also, an AI considering blackmail more than likely won;t realize that someone humans JUST DO NOT CARE, and will do anything to tear you down just because. Arsonists just want to watch the world burn...

| Field | Value |
|---|---|
| Platform | youtube |
| Video | AI Harm Incident |
| Posted | 2025-07-26T11:3… |
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | mixed |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
{"id":"ytc_UgyhL5pKNOtpnjlfUNx4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_UgyAe5aaerDpGX0D-r94AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgzXX7uQO_MMeoPO6WB4AaABAg","responsibility":"none","reasoning":"unclear","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgzXMbEsEsHrcUfFohV4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgzXhelodPBhaYNwL314AaABAg","responsibility":"developer","reasoning":"virtue","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgzYY_-YsFSWXWxKlV94AaABAg","responsibility":"developer","reasoning":"deontological","policy":"industry_self","emotion":"approval"},
{"id":"ytc_Ugz4icBbmVvyoAeA1A54AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"unclear","emotion":"resignation"},
{"id":"ytc_UgyuahMIwWcfW_GG-mB4AaABAg","responsibility":"company","reasoning":"virtue","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgwXNCEPWvwGQcgSiv14AaABAg","responsibility":"developer","reasoning":"deontological","policy":"industry_self","emotion":"approval"},
{"id":"ytc_UgxD5luuBNv4f0bRFj54AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"}
]
```
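Raw responses in this shape can be parsed and sanity-checked before the codes are stored. The sketch below is illustrative, not the tool's actual pipeline: the allowed category sets are inferred only from the values visible on this page, so the real codebook may define more.

```python
import json

# Allowed labels per dimension, inferred from the examples on this page
# (assumption: the actual codebook may include additional categories).
ALLOWED = {
    "responsibility": {"distributed", "ai_itself", "none", "developer",
                       "company", "unclear"},
    "reasoning": {"consequentialist", "deontological", "virtue", "unclear"},
    "policy": {"liability", "ban", "industry_self", "unclear"},
    "emotion": {"fear", "indifference", "outrage", "approval",
                "resignation", "mixed"},
}

def validate(raw: str) -> list[dict]:
    """Parse a raw LLM response and reject records with unknown labels."""
    records = json.loads(raw)
    for rec in records:
        if not rec.get("id", "").startswith("ytc_"):
            raise ValueError(f"bad comment id: {rec.get('id')!r}")
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{rec['id']}: unexpected {dim}={rec.get(dim)!r}")
    return records

raw = ('[{"id":"ytc_UgyhL5pKNOtpnjlfUNx4AaABAg",'
       '"responsibility":"distributed","reasoning":"consequentialist",'
       '"policy":"liability","emotion":"fear"}]')
coded = validate(raw)
print(len(coded))  # 1
```

Rejecting unexpected labels early keeps coding errors (a misspelled category, a hallucinated value) out of the results table rather than silently storing them as new categories.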