Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- "I mean... Serious question from some uneducated folk... My gmail account, …" (rdc_n7fx0qe)
- "The problem is Ghibli creators don't get any credited fees for that stolen pictu…" (ytc_UgwnI-hHj…)
- "Im trying to get into the industry andwas finally working for a small indie game…" (ytc_Ugw9I5Op6…)
- "No AI can make good architecturale designs. I know because i tried. It can help …" (ytc_Ugz8pUb6b…)
- "Thank you for saying aloud all my misgivings about this AI mess. I hope it's not…" (ytc_UgzT2i4-Q…)
- "@ConnerArdman combining AI and moving to South East Asia reduced an amount I pa…" (ytr_UgynvGhxR…)
- "It's like storing data in the cloud. It's generally "safe" but it's not best pra…" (rdc_l56zs21)
- "mayvbe wasn't really just the AI but the whole system of beliefs we all are forc…" (ytc_UgyHxwGcJ…)
Comment
One thing is to simulate a virtual reality, which is still far from realistic nowadays, yet completely another thing - to simulate it with 8 billion intelligent and emotional human beings, as well as other complex animals. If it is even going to be possible at all in the future, the energy resources to realize it would be so enormous that probably impossible to get them. Anyway, the only way we could be currently in a simulation is if it exists in the distant future, but why would it then recreate the world from the past instead of these times? Unless it gets to the Matrix setup, where AI is governing the world and we are the energy source, while the past is the only satisfying reality to virtually live in. Hmmm, such a long shot! Still might be possible, but I'd guess 1% chance vs 99%.
youtube · AI Governance · 2025-09-05T08:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | unclear |
| Reasoning | unclear |
| Policy | unclear |
| Emotion | unclear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_UgwVAHvgn6wDPSQhf7N4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugx6fAJqyDQ2SpoQBNZ4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"unclear"},
{"id":"ytc_UgybZIAhKJgX4Bstrnl4AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"unclear","emotion":"approval"},
{"id":"ytc_UgxJQFThJIBK2NBIEb14AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugyh3Rn0OO7LDxgIhul4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"resignation"},
{"id":"ytc_UgwaNe5gy5M8JnS32iN4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"approval"},
{"id":"ytc_Ugw_boPUf27diERaNyd4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugx1qqtHYR-aNdEX5Yh4AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_UgxOH9ebyctRwsjBi854AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugw7UsEhviCNZvpWOs54AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"fear"}
]
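The raw response above is a JSON array of per-comment codes across the four dimensions shown in the Coding Result table (responsibility, reasoning, policy, emotion). A minimal sketch of parsing such a response and indexing codes by comment ID, assuming this array shape; the `"unclear"` fallback mirrors the default seen in the table, and the allowed values are inferred from the sample rather than from a documented schema:

```python
import json

# A small excerpt of the raw LLM response shown above.
raw = '''[
 {"id":"ytc_UgwVAHvgn6wDPSQhf7N4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
 {"id":"ytc_Ugyh3Rn0OO7LDxgIhul4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"resignation"}
]'''

# The four coding dimensions from the Coding Result table.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def index_codes(response_text):
    """Parse the model output and index codes by comment ID.

    Any dimension the model omitted falls back to "unclear",
    matching the default shown in the Coding Result table.
    """
    codes = {}
    for item in json.loads(response_text):
        codes[item["id"]] = {d: item.get(d, "unclear") for d in DIMENSIONS}
    return codes

codes = index_codes(raw)
print(codes["ytc_UgwVAHvgn6wDPSQhf7N4AaABAg"]["policy"])  # → regulate
```

In practice the model's output may not be valid JSON on every call, so a production version would wrap `json.loads` in error handling and re-queue failed batches rather than assume the array always parses.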