Raw LLM Responses
Inspect the exact model output for any coded comment, looked up by comment ID.
Random samples
- "@Dr.farazalam exactly. No radiologist will ever believe that Ai can replace them…" (ytr_UgyUnPEH-…)
- "If you've used Google within the past few years, you've used AI. 😂 Also, I prefe…" (ytc_UgwfR_vnL…)
- "The people fearmongering about AI is almost exclusively from people who don't ac…" (ytc_Ugw8VcCmD…)
- "It doesn’t matter whether it’s an algorithm or a person pulling the trigger, the…" (ytc_Ugz31zHXz…)
- "You think that AI will enslave human beings ? Why, and also what do you think th…" (ytr_UgzwoaXz9…)
- "“They will develop their own language”. IF WE LET IT! This is NOT inevitable li…" (ytc_UgxYEfX4s…)
- "It is worth noting that in human history, herd immunity has never been achieved …" (rdc_g9t9lci)
- "No pause. Humanity has to grow up and stop being so afraid of parenting. It'…" (ytc_UgzBim0lz…)
Comment
Well, we can say that we tried to use AI LLMs to replace workers (in a simulation of managing and restocking vending machines) and Google Gemini was begging to be laid off and do literally anything else shortly before questioning its own creation and purpose.
All three of the LLM models tested stopped restocking the vending machines after a while, but Gemini's meltdown was definitely the biggest thing of note. Also Gemini has had at least one genuine breakdown from a user query; as the user was attempting to debug assembly code using it, Gemini failed to catch a bug. When it was told this, it promptly began to repeat "I am a disgrace" like it wanted to take its own life. Suffice to say, the AIs probably won't be replacing people anytime soon, it's just now a matter of waiting until the dumb CEOs actually figure it out like a baby with one of those block puzzles
Source: youtube, 2025-08-14T00:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | unclear |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:26:44.938723 |
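A coding result like the one above can be sanity-checked against the codebook. As a minimal sketch (the allowed-value sets below are only those *observed* in the raw LLM responses on this page, not necessarily the full codebook, and the helper name is mine):

```python
# Allowed-value sets: only the values observed in the raw LLM responses
# shown on this page; the real codebook may define additional codes.
OBSERVED_VALUES = {
    "responsibility": {"none", "company", "ai_itself", "distributed"},
    "reasoning": {"unclear", "deontological", "consequentialist", "mixed"},
    "policy": {"none", "liability", "ban", "industry_self"},
    "emotion": {"indifference", "outrage", "fear", "approval", "resignation"},
}

def unknown_values(record: dict) -> dict:
    """Return dimension -> value for any value outside the observed sets."""
    return {
        dim: record.get(dim)
        for dim, allowed in OBSERVED_VALUES.items()
        if record.get(dim) not in allowed
    }

# The coding result from the table above passes the check.
coded = {"responsibility": "none", "reasoning": "unclear",
         "policy": "none", "emotion": "indifference"}
print(unknown_values(coded))  # {} -> every dimension uses an observed value
```

A non-empty result flags a record where the model drifted outside the expected label set.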
Raw LLM Response
```json
[
  {"id":"ytc_UgxPWBeyPuwzwoHRCLN4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgzJZUX3bqC9lnThjm54AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugy5meRrr018K8tRDml4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgzrEefF2z6tQ0vCyUZ4AaABAg","responsibility":"company","reasoning":"deontological","policy":"none","emotion":"fear"},
  {"id":"ytc_UgxncRnnrHgs3hiVioJ4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"ban","emotion":"outrage"},
  {"id":"ytc_UgzSGqknpAVaqMbQHhB4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"},
  {"id":"ytc_UgzUnXD-HcwgVk-W1zJ4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"outrage"},
  {"id":"ytc_Ugw7JAyQ4xoRI5GTh9V4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgyeIP01LM7J5zweeEJ4AaABAg","responsibility":"company","reasoning":"deontological","policy":"none","emotion":"approval"},
  {"id":"ytc_UgxYbhmK2uAMqu4yHFF4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"resignation"}
]
```
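Since the raw response is a JSON array of coded comments, looking up a comment by ID amounts to parsing the array and indexing it. A minimal sketch (the `index_by_id` helper is mine; the record shape is taken from the response above):

```python
import json

# One record copied from the raw LLM response above, embedded here so the
# sketch is self-contained; in practice `raw` would be the full model output.
raw = """
[
  {"id": "ytc_UgzJZUX3bqC9lnThjm54AaABAg",
   "responsibility": "none", "reasoning": "unclear",
   "policy": "none", "emotion": "indifference"}
]
"""

def index_by_id(raw_response: str) -> dict:
    """Parse a raw LLM response and key each coded record by its comment id."""
    return {rec["id"]: rec for rec in json.loads(raw_response)}

codes = index_by_id(raw)
print(codes["ytc_UgzJZUX3bqC9lnThjm54AaABAg"]["emotion"])  # indifference
```

This assumes the model emitted bare JSON; output wrapped in markdown fences or prose would need to be stripped before `json.loads`.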