Raw LLM Responses
Inspect the exact model output for any coded comment, or look one up by comment ID.
Random samples
- Ai Bros Talking about how AI can actually solve a monotonous work that noone in … (ytc_Ugxrs8llC…)
- AI helps junior profiles not just in tech. But in the end, the output often turn… (ytc_Ugwg30dj_…)
- This is scary this kind of robot can be program to perish human. May the Lord Go… (ytc_UgwQ8rzoc…)
- The beginning of the end or the end of the beginning? That is where AI chat is t… (ytc_Ugy481pEX…)
- it cant, but executives dont care about quality, they only care about maximizing… (ytr_Ugx0QH5em…)
- Hey @panko4875, thanks for your comment! And you're right, Marsel Khanov was def… (ytr_UgwMb1ETq…)
- No it’s 19 ok Guys hé did hé for the video by wait meta ai it’s dumb 😂😂😂 🤫… (ytc_UgxgPIZYh…)
- 😂 AI can't even minorly think out of the box enough to do my job "intelligence" … (ytc_Ugzvv6WQb…)
Comment
When people say “AI is going to replace us” they tend to picture scenes out of ‘Terminator’. While that’s not necessarily an impossibility, the reality is so much fucking dumber than fiction.
We’re already starting to see AI replace humans by taking over human jobs.
Then humans will be kicked out onto the streets, starving and dying, because machines replaced all/ most basic manual labor.
I told you it was fucking dumb.
Dumbest of all, it’s a human-caused, self destructive problem, easily fixed by the humans causing it, but they won’t because they’re fucking stupid and greedy.
Source: youtube · AI Harm Incident · 2025-09-12T23:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | fear |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
[
{"id":"ytc_UgwhLjx5jDAq41z0KAN4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugwm5UoiBq9KXbwDx5x4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"resignation"},
{"id":"ytc_UgyMqGLWLGSSdWmT4yt4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"ban","emotion":"outrage"},
{"id":"ytc_Ugyi0Z95my4NJoBH8xp4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugy4tkUqG_DHiyoUWoF4AaABAg","responsibility":"ai_itself","reasoning":"virtue","policy":"unclear","emotion":"approval"},
{"id":"ytc_UgyhVqDlod9I__-A3Yd4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"industry_self","emotion":"indifference"},
{"id":"ytc_UgyPfce0sI6rU2FmF4d4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugy_6XpQmk9-pswQJjZ4AaABAg","responsibility":"user","reasoning":"virtue","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgzJrmiOTQPTTdLrvLR4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgxWFIQbTv_wF7ctOVB4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"}
]
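The raw response above is a JSON array of per-comment codings across the four dimensions shown in the Coding Result table. A minimal sketch of how such a response could be parsed and validated follows; the label sets are only those observed in this sample (the actual codebook may allow more), and the function name `parse_codings` is illustrative, not part of the tool.

```python
import json

# Labels observed in the sample response above; a hypothetical
# full codebook may define additional values per dimension.
ALLOWED = {
    "responsibility": {"ai_itself", "government", "developer", "company",
                       "user", "distributed", "unclear"},
    "reasoning": {"consequentialist", "virtue", "unclear"},
    "policy": {"regulate", "ban", "industry_self", "unclear"},
    "emotion": {"fear", "resignation", "outrage", "indifference",
                "approval", "mixed"},
}

def parse_codings(raw: str) -> dict:
    """Parse a raw LLM response into {comment_id: coding}, rejecting
    any row whose label falls outside the known sets."""
    codings = {}
    for row in json.loads(raw):
        cid = row["id"]
        for dim, allowed in ALLOWED.items():
            if row.get(dim) not in allowed:
                raise ValueError(f"{cid}: unexpected {dim} label {row.get(dim)!r}")
        codings[cid] = {dim: row[dim] for dim in ALLOWED}
    return codings

# Usage with the first row of the sample response:
raw = '''[
  {"id": "ytc_UgwhLjx5jDAq41z0KAN4AaABAg",
   "responsibility": "ai_itself", "reasoning": "consequentialist",
   "policy": "unclear", "emotion": "fear"}
]'''
print(parse_codings(raw)["ytc_UgwhLjx5jDAq41z0KAN4AaABAg"]["emotion"])  # fear
```

Validating against a closed label set at parse time catches the common failure mode where the model invents an off-schema label, rather than letting it propagate silently into the coded table.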