Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples (click to inspect)
Wait till these new self driving big rigs start murdering people! A.I. will even…
ytc_Ugx1AL00B…
The point is that the creators of algorithms have not been able to rise above th…
rdc_h4mxg61
Ai is trash, it can't do a junior engineers job, it's all a smoke screen to offs…
ytc_UgwBNCy52…
People will get rid of people when ai takes over to much stuff. Imagine a world …
ytc_UgwGp7-Ui…
9:53 careful with terms like dumb when talking about quantized digital mind fram…
ytc_Ugw-_boNT…
AI is just another tool/ weapon used for the powers that be to oppress furthermo…
ytc_Ugz-v4fVD…
2:49 you were talking about laying people off in call centres by using automatio…
ytc_UgzQyNAjv…
Pay the local people more and advertise that they shoot poachers should help the…
rdc_era9oku
Comment
The thing is that AI is a fundamentally unsustainable business model. It costs billions to build one datacenter. Then, everything except the racks and walls have to be replaced in a few years once the GPUs are out of date, which costs billions more. The AI companies have to make literally trillions of dollars within 5 years. Not to mention that AI has not actually created anything of value. Companies have to rehire the employees they replace to fix the AI that keeps fucking up. Customers hate not only dealing with AI bots for things like customer service. They also don't like companies that have replaced humans with bots.
The AI bubble will explode in less than 5 years.
youtube
AI Harm Incident
2025-10-21T00:0…
♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | unclear |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
[
{"id":"ytc_UgygoYW0Zh2V88M1iAJ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgxhMIrfwBpO041nGOp4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugxyc4UzIecoSVDueo94AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"liability","emotion":"approval"},
{"id":"ytc_UgzM4eyuI3tAYYJfDwt4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_UgwU-JnIXtH0kXsD-PZ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugzna7AUbO7Z5NuJ7st4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_UgzpsXV_sg-Rqs47Kld4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_UgxEBqd2iclFyoHrHWR4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwvQzraoZpYTIvDO7B4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"disapproval"},
{"id":"ytc_UgyRRCWJFwV9Ok-s2YF4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"}
]
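A raw response like the one above can be checked before the codes are stored. The sketch below parses the JSON array and drops rows whose dimension values fall outside the sets observed in this page's table and responses; the `ALLOWED` sets are inferred from the data shown here, not from the project's actual codebook, which may define additional values.

```python
import json

# Hypothetical value sets, inferred only from the coding table and the
# raw response shown on this page; the real codebook may allow more.
ALLOWED = {
    "responsibility": {"company", "government", "ai_itself", "distributed", "none"},
    "reasoning": {"consequentialist", "deontological", "unclear"},
    "policy": {"none", "regulate", "liability"},
    "emotion": {"unclear", "resignation", "fear", "approval", "mixed",
                "indifference", "outrage", "disapproval"},
}

def parse_codes(raw: str) -> list[dict]:
    """Parse a raw LLM response and keep only rows with an id and
    known values on every coding dimension."""
    rows = json.loads(raw)
    return [
        row for row in rows
        if row.get("id")
        and all(row.get(dim) in vals for dim, vals in ALLOWED.items())
    ]
```

For example, a row coded `{"responsibility": "company", ...}` survives, while a row with a misspelled or novel value (say `"emotion": "anger"`) is dropped rather than written to the results table.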