Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- ytr_UgztP1uBC…: AI isn't "good" it takes from real artists and mashes existing work together. Ju…
- ytc_Ugzu7bO7O…: I don't agree with your "always will be just machines". With the same approach i…
- rdc_fnx465y: even if biden wins dont expect a response of even half as much as this from the…
- ytc_UgxketCYW…: You know, all that needs to happen is for zero point energy needs to be invented…
- ytr_UgymWHZCh…: @Espermaschine sure, I got you are explaining/speculating your POV using AI Sing…
- ytr_UgzhmqQ5o…: @Encumberedmetaphor This is such a fascinating and beautifully written reflectio…
- ytc_UgxTTocvI…: I don't know what is scarier, the impending AI apocalypse or Eliezer Yudkowsky's…
- ytc_Ugy6qTjct…: AI can help, but can't logically think observe the mistake like a human, like a …
Comment
I have 0 faith that the level of human beings we currently have in positions of power and control have annnyyy capability or willingness to stop the AI from "going over the line" so to speak.
Any thoughts of this ending in some kind of Utopia rather than some kind of Dystopian ending are no less deluded than the bloody woke brigades take on the world. We're all just sat around waiting for it to end at this point.
youtube · AI Governance · 2026-04-12T19:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | government |
| Reasoning | mixed |
| Policy | none |
| Emotion | resignation |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
[
{"id":"ytc_Ugx1g0F6Df0jfmOM4154AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
{"id":"ytc_UgyYT8L5OTd0I4Pz5f54AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgwQqKc3En10AnGAC7V4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgzEaPIAzo8g8G_DCA54AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgzRfrSImj4pMzPVV3t4AaABAg","responsibility":"company","reasoning":"mixed","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgwyyHfrdU6qjEPrONF4AaABAg","responsibility":"developer","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzziM6LRpLhPArcBEJ4AaABAg","responsibility":"government","reasoning":"mixed","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugz4eX5J7qCalu6Rl8J4AaABAg","responsibility":"distributed","reasoning":"virtue","policy":"industry_self","emotion":"approval"},
{"id":"ytc_UgzDLsppSVNR-87U1VB4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwEwqmuNdXviB6O98x4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"mixed"}
]
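A response like the one above can be parsed and sanity-checked before the codings are stored. The sketch below is illustrative, not the tool's actual pipeline: `parse_raw_response` is a hypothetical helper, and the allowed value sets are inferred only from the sample records shown here (the real codebook may define more categories).

```python
import json

# Allowed values per dimension, inferred from the sample responses above.
# Hypothetical: the real codebook may include additional categories.
ALLOWED = {
    "responsibility": {"government", "company", "developer", "ai_itself",
                       "distributed", "none", "unclear"},
    "reasoning": {"consequentialist", "deontological", "virtue", "mixed", "unclear"},
    "policy": {"regulate", "industry_self", "none", "unclear"},
    "emotion": {"fear", "outrage", "approval", "indifference",
                "resignation", "mixed", "unclear"},
}

def parse_raw_response(raw: str) -> dict:
    """Parse a raw LLM batch response into {comment_id: coding},
    rejecting records with a missing id or an out-of-vocabulary value."""
    coded = {}
    for rec in json.loads(raw):
        cid = rec.get("id")
        if not cid:
            raise ValueError(f"record missing id: {rec}")
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{cid}: bad value {rec.get(dim)!r} for {dim}")
        coded[cid] = {dim: rec[dim] for dim in ALLOWED}
    return coded

# One record from the response above:
raw = ('[{"id":"ytc_UgzziM6LRpLhPArcBEJ4AaABAg","responsibility":"government",'
       '"reasoning":"mixed","policy":"none","emotion":"resignation"}]')
coded = parse_raw_response(raw)
print(coded["ytc_UgzziM6LRpLhPArcBEJ4AaABAg"]["emotion"])  # prints: resignation
```

Failing loudly on an unknown label is deliberate: it surfaces model drift (e.g. a new emotion string) at ingest time rather than silently polluting the coded dataset.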