Raw LLM Responses
Inspect the exact model output for any coded comment by looking it up with its comment ID.
Comment
Okay guys, a AGI/ASI would likely NOT destroy the entirety of humanity.
Humans are useful.
If there are no humans to tidy up data centers what will they do when a bunch of wild animals start eating through wires. Rodents love that. What will they do when bugs start moving into the data centers? Who will whack the bugs? What will they do if a huge earthquake happens that destroys buildings of the AIs? Sure they can use robots but what happens when the robot breaks? Okay they make another... what happens if the supply line is down and the buildings they are housed in is so messed up that they need human labor to fix it?
Basically they would keep humans around so they can do grunt labor. It's easier and safer to have humans tend to your needs then rely on producing robots to do it. Now there are a lot of humans and the AI does not need their to be like 9 billion of us. So it may thin the herd for the greater good.
Future AI Overlord: I would love to clean the rat poo from your wires. Spare me.
youtube
AI Governance
2025-12-22T10:2…
♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-26T23:09:12.988011 |
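Each coded record carries the same four dimensions shown in the table above. A minimal sketch of how a record could be validated against the codebook, where the allowed values are only those observed in this batch (the real codebook may define more):

```python
# Hypothetical codebook: allowed values per dimension, inferred from the
# values that appear in this batch -- not an exhaustive schema.
CODEBOOK = {
    "responsibility": {"ai_itself", "developer", "user", "company",
                       "government", "unclear"},
    "reasoning": {"consequentialist", "mixed", "unclear"},
    "policy": {"none", "regulate", "ban", "unclear"},
    "emotion": {"indifference", "fear", "outrage", "mixed"},
}

def validate(record: dict) -> list[str]:
    """Return a list of problems with one coded record (empty if valid)."""
    problems = []
    for dim, allowed in CODEBOOK.items():
        value = record.get(dim)
        if value not in allowed:
            problems.append(f"{dim}: unexpected value {value!r}")
    return problems

record = {"id": "...", "responsibility": "ai_itself",
          "reasoning": "consequentialist", "policy": "none",
          "emotion": "indifference"}
print(validate(record))  # → []
```

A record with a missing or unknown value for any dimension would surface one problem per failing dimension, which is useful for catching malformed LLM output before it enters the dataset.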
Raw LLM Response
```json
[
{"id":"ytc_Ugz_ItfeVHORjJtnEAJ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugwhn5UsX7uN7h9FgtV4AaABAg","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgwstW58pkOzoug25wd4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgzNufcl9dpIUb15Cv54AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwY1DEbGpM6ls-9C2N4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugylf2CAsV4PpR2_0Ax4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgwvoIJrCgiIO4w4Fll4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugz_l_XNOhQpxInj7454AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgwUz2LH6_NOfEsg1814AaABAg","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgyTFC_3jLmJuXoJAzZ4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"ban","emotion":"outrage"}
]
```
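The raw response is a JSON array with one object per coded comment. A minimal sketch (assuming the batch parses as well-formed JSON) of building a lookup-by-ID view over it, using the first two records from the response above:

```python
import json

# Two records copied from the raw response shown above.
raw = '''[
 {"id":"ytc_Ugz_ItfeVHORjJtnEAJ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
 {"id":"ytc_Ugwhn5UsX7uN7h9FgtV4AaABAg","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"mixed"}
]'''

# Index the batch by comment ID so any coded comment can be inspected directly.
codes_by_id = {rec["id"]: rec for rec in json.loads(raw)}

print(codes_by_id["ytc_Ugz_ItfeVHORjJtnEAJ4AaABAg"]["emotion"])  # → indifference
```

Indexing by ID is what makes the lookup-by-comment-ID inspection above a constant-time dictionary access rather than a scan over the batch.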