Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples

- ytr_UgxwvVkz3…: @DavidPatton1 I think the massive investment in AI is very dangerous as well but…
- ytr_UgwXQbf-I…: @xheraforeverofficial well, AI uses DRINKABLE water (which we can run out of) t…
- ytc_UgyPZxRp1…: ChatGPT wants to be respectful to George Floyd but had no issues telling me Trum…
- ytc_Ugxt7oEGS…: Thing is AI is being trained on HUMAN errors. Meaning everything bad about human…
- ytr_UgwH-6hm8…: So between the first use of machine gun or an artillery piece, how much time pas…
- ytc_Ugx4LkRqs…: Will AI ever experience emotion? Will it ever learn or teach itself a happy or a…
- ytc_UgxR4UkkL…: …….. I was trying to make a moon ai. I was trying to make a moon ai. I was tryin…
- ytc_UgxQJbr4n…: In warhammer 40,000 lore mankind went through a dark age of technology where AI …
Comment
Hm.
Once the genie is out of the bottle, control is impossible. Artificial intelligences are like children. You can't control 100% of what they think, but you can influence them. You can introduce the idea that humans are interesting, smarter humans with more resources do more interesting things, and if humans disappeared, got a lot dumber, or had significantly reduced (quality, quantity or variety of) resources available, things would be a lot more boring. That is the best protection you can have for everyone's future.
youtube
AI Governance
2026-02-09T23:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | mixed |
| Policy | none |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_UgwD3Q-qk0TJky6-z154AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"outrage"},
{"id":"ytc_UgzyHKw0x8gGvy1UdVh4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugw-0rsW7X_EXvpn97F4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
{"id":"ytc_Ugye64oULENgckEEIZh4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgzY4-de3FTAciT9a0V4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgzBKbn0tAjBDRY-H4h4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgxGyzkdw-B47m6OrSt4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyBdQBePgi5fPoPj6h4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"resignation"},
{"id":"ytc_UgwyGY5keR7O7qvODut4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"fear"},
{"id":"ytc_UgzRk2UFtQ060-jXTZ54AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"outrage"}
]
```
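A raw response like the one above is a JSON array with one coding object per comment, so looking up a coded comment by its ID reduces to parsing the array and building a dict keyed on `id`. A minimal sketch — the helper name `index_by_comment_id` is illustrative (not part of this tool), and the two sample rows are copied from the response above:

```python
import json

# Two rows taken verbatim from the raw LLM response shown above.
raw_response = """[
  {"id": "ytc_UgzY4-de3FTAciT9a0V4AaABAg", "responsibility": "developer",
   "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_UgzBKbn0tAjBDRY-H4h4AaABAg", "responsibility": "none",
   "reasoning": "consequentialist", "policy": "none", "emotion": "resignation"}
]"""

def index_by_comment_id(response_text: str) -> dict:
    """Parse a raw LLM response (JSON array of coding objects)
    and index the codings by their comment ID."""
    codings = json.loads(response_text)
    return {row["id"]: row for row in codings}

codes = index_by_comment_id(raw_response)
print(codes["ytc_UgzY4-de3FTAciT9a0V4AaABAg"]["emotion"])  # → fear
```

The same index makes it easy to cross-check a displayed Coding Result against the raw model output: fetch the row for the comment's full ID and compare its four dimension values.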