Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- `ytc_UgxCMMpYL…`: "While it's understandable that some artists may feel threatened by the rise of A…"
- `ytc_UgwRZxtRr…`: "Having a perfect partner, of your design, is the dream of many. Could you only i…"
- `ytc_UgxhhjsBQ…`: "The funniest thing about AI art is that it can turn out so weird and wrong, but …"
- `ytc_Ugwb2FRQu…`: "Crime Instigation Program, not Predictive Policing. Sounds like the Kids For Ca…"
- `ytc_UgwKkZko7…`: "Ironically the solution to this is more AI, not less. It's Elons well thought o…"
- `ytr_UgyqKQ4Q2…`: "It's used in fantasy literature as well, which is definitely in the AI's learnin…"
- `ytr_UgxvbMjrP…`: "It's harder than you think. computers easily mistake shadows in certain areas as…"
- `ytc_UgzUlthQO…`: "As a software engineer, I am frustrated how amatuer she is in using AI. AI inves…"
Comment
I can work in IT all I want, warn fucking EVERYBODY around me nearly DAILY that LLMs are glorified dice machines shitting out LETTERS according to statistics containing every text ever read and STRIPPING IT OF CONTEXT - and people still readily use this shit.
I refuse to even use the free version. Fuck that shit, if I can't google it, I can't become an expert or need to order a book older than 2023 (from then on everything could be LLM shite), simple as.
youtube · AI Harm Incident · 2025-11-24T23:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | virtue |
| Policy | industry_self |
| Emotion | outrage |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[{"id":"ytc_Ugx46HsdO5vB3f3on0h4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgwfH-pFbFfS4mB2aDh4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwjngIgVcdaWcn8-aJ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgwORH6fT1daDN0207V4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugxz6f_Kiag-g-7EInp4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgwooL8oW3IFRvo7QXl4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgwZe4AzSx1e5hOwZK94AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"indifference"},
{"id":"ytc_UgyKpQ0-yopz0ZFUWqR4AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgyVOVcdoXAtM06Ro3x4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgwZb5tB5jvL0bdi0YB4AaABAg","responsibility":"user","reasoning":"virtue","policy":"industry_self","emotion":"outrage"}]
```
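A raw response like the one above should be checked before its codes are trusted. Below is a minimal validation sketch in Python. Note the assumptions: the allowed values per dimension are only those observed in this sample (the full codebook may define more), and the `ytc_`/`ytr_` id-prefix check and the function name `validate_response` are illustrative, not part of any documented schema.

```python
import json

# Allowed codes per dimension. ASSUMPTION: these are only the values
# observed in the sample response above; the real codebook may be larger.
ALLOWED = {
    "responsibility": {"company", "user", "ai_itself", "none", "unclear"},
    "reasoning": {"consequentialist", "deontological", "virtue", "mixed", "unclear"},
    "policy": {"regulate", "liability", "ban", "industry_self", "none", "unclear"},
    "emotion": {"outrage", "fear", "approval", "indifference"},
}

def validate_response(raw: str) -> list[dict]:
    """Parse a raw LLM coding response and reject malformed records."""
    records = json.loads(raw)
    for rec in records:
        # Sample ids begin with ytc_ (comments) or ytr_ (replies).
        if not str(rec.get("id", "")).startswith(("ytc_", "ytr_")):
            raise ValueError(f"bad comment id: {rec.get('id')!r}")
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{rec.get('id')}: {dim}={rec.get(dim)!r} not in codebook")
    return records

raw = ('[{"id":"ytc_UgwZb5tB5jvL0bdi0YB4AaABAg","responsibility":"user",'
       '"reasoning":"virtue","policy":"industry_self","emotion":"outrage"}]')
print(len(validate_response(raw)))  # → 1
```

Rejecting a whole batch on one bad record (rather than silently dropping it) makes it easy to re-prompt the model for that batch, which is usually what a coding pipeline wants.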