Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "13:53 Even though it's just circles on a screen, I swear that chatbot is sweatin…" (ytc_UgwhnPAXU…)
- "Stephanie, did you try and search for your deepfake? it's crazy out there. take …" (ytc_Ugy0lkEFT…)
- "Cool. Conventions have rules, and if you break them, you get kicked out. That’s …" (ytc_Ugz0kWgqd…)
- "All we need is dumb AI. Controlling traffic lights, doing calculations, etc. We …" (ytc_UgzJzMmXQ…)
- "You think drones of tomorrow are your worry? What about the Orwellian dystopia A…" (ytc_UgxHw6_0W…)
- "a more futuristic but realistic one would be insurance companies using ai to dec…" (ytr_UgyJNcmSa…)
- "I’m not sure if I can call myself disabled, just in the fact I am on the high fu…" (ytc_UgyUu5VK5…)
- "Scary, but unfeasible. These small drones simply do not have that capability eve…" (ytc_Ugx5zsptA…)
Comment
AI poisoning 100% works. There is a current issue in the software industry where code assist tools, which were built with neural nets and traditional programming, are being replaced with pure generative models, and the new tools are absolute garbage. When it comes to software, the only available training data necessarily includes a lot of bad and non-functional code, often with obscure issues that an untrained person would not see as a problem until it's too late. I have not seen a piece of generated code that wasn't either something a good code assist tool like IntelliSense could already suggest, blatantly wrong, or carrying subtle issues that would have been problematic later.
It's straight garbage.
youtube · Viral AI Reaction · 2025-04-25T13:5… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | liability |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id":"ytc_UgzW1XpIRm0mFxiDWex4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
  {"id":"ytc_UgyCmCzO5zxtqx-Crt14AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgwTPG0eKGt2mU1jL7R4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgwCAJZvmgeTjNsZZC94AaABAg","responsibility":"user","reasoning":"virtue","policy":"ban","emotion":"outrage"},
  {"id":"ytc_UgznI_x8kIbHU5uqNhZ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgzjEHQzSlW5sQzvdGh4AaABAg","responsibility":"none","reasoning":"contractualist","policy":"none","emotion":"mixed"},
  {"id":"ytc_Ugy7ZUVjsXavscxQPF54AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"disapproval"},
  {"id":"ytc_Ugx_tIlQWcy-HJchm4B4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytc_UgzkQYcziqkaFo95kJ94AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgxkzalEhcb86cMuTNt4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"}
]
```