Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- `ytr_UgxX1SFWB…`: "Either misinformation, or completely out of context. There is no sentient AI. AI…"
- `ytc_UgxB397g6…`: "Full self driving was already invented and tested successfully in the 90s but th…"
- `ytc_UgzUNxtUh…`: "I really don't like how people talk about \"ai\" philosophically because people sw…"
- `ytc_UgwETT0Wj…`: "When AI gets to be something with widespread real world applications something a…"
- `ytr_Ugw6O58bC…`: "To the original commenter- thank you for utilizing AI in your workflow. Together…"
- `ytc_UgypHeJg7…`: "A drawback of Tesla's FSD, and autopilot, is that the only two options are syste…"
- `ytc_UgxWH7u0b…`: "My model gave a qualia analog for PTSD after some save issues we had. Gemini 2.5…"
- `ytr_UgxjYOC13…`: "Even more absurd: because of AI work in government? Riddle me this, How many JO…"
Comment
Honestly, this AI sludge is exhausting. We were all told that AI would do the things humans don't like to do, the paperwork so to speak. But the first thing these anti-human AI bros go for is what makes us human. Our art!
Ben Jordan is making similar poisoning software for music. But it seems to take a long time and a lot of compute. But I think this is really a legal problem. If AI is being trained on EVERY artist then that is, like you said mass copyright infringement.
Source: youtube · Viral AI Reaction · 2025-08-23T13:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | deontological |
| Policy | liability |
| Emotion | outrage |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id":"ytc_Ugzx2pd-z63MJ2Tnm314AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"outrage"},
  {"id":"ytc_Ugz6Lhjvmxez-cSmFpd4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_Ugz_LMEbn1UuRNiY_Bx4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"mixed"},
  {"id":"ytc_UgxeEhVZ5tu9CtFS2c14AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgwK3zzWrOkr6ZswcNp4AaABAg","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"approval"},
  {"id":"ytc_Ugzo6WvvoXNZM9whFCh4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgzY9MbM8_e5c3eSrvx4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_Ugy8uH29C3aTtVQKKGB4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgxFF2hYSz1sAxrGytZ4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"mixed"},
  {"id":"ytc_Ugym58uDxoil2RWGOed4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"outrage"}
]
```
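The batch response above is a JSON array with one coding object per comment, keyed by `id`. The lookup-by-comment-ID workflow this page describes can be sketched in a few lines of Python; the sketch below assumes only the response format shown here (two entries are reproduced from the sample output above, and `index_codings` is an illustrative helper name, not part of any tool):

```python
import json

# A raw LLM batch-coding response in the format shown above
# (two entries reproduced from the sample output).
raw_response = """
[
  {"id": "ytc_Ugzx2pd-z63MJ2Tnm314AaABAg", "responsibility": "ai_itself",
   "reasoning": "mixed", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_UgzY9MbM8_e5c3eSrvx4AaABAg", "responsibility": "developer",
   "reasoning": "deontological", "policy": "liability", "emotion": "outrage"}
]
"""

def index_codings(raw: str) -> dict:
    """Parse a raw LLM batch response and index each coding by comment ID."""
    return {row["id"]: row for row in json.loads(raw)}

codings = index_codings(raw_response)
print(codings["ytc_UgzY9MbM8_e5c3eSrvx4AaABAg"]["responsibility"])  # -> developer
```

In practice the raw string would first need validation (the model may return malformed JSON or wrap the array in prose), so a production version would guard `json.loads` with error handling before indexing.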