Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples:
- "I was trying to have a conversation with ChatGPT regarding what we are seeing th…" (ytc_UgwelLgtn…)
- "The current AI dont reason, based on input they predict the statistically most l…" (ytc_UgwRabpPg…)
- "There was a booth at an “anime festival” that was selling Ai art and tried to cr…" (ytc_UgwsmAmyZ…)
- ">The House doesn't arrest people, the DoJ does. This is incorrect. Recently…" (rdc_f31sfi9)
- "From what I understand AI isn't artificial intelligence... That is it's not Inte…" (ytc_UgysENc2e…)
- "And they are not even AIs , people are getting to deep into LLMs just because it…" (rdc_ohyvv02)
- "Really, AI will produce so much more that no one will have any money to buy. Gen…" (ytc_UgzR3XInP…)
- "AI art is pointless and will never even come close to real art for the singular …" (ytr_Ugzfdy8qo…)
Comment
If simulation theory is true — meaning we’re living inside a simulation — and Dr. Yampolskiy is nearly certain of it, then when we consider the issue of AGI or AI in general, doesn’t that imply we’ve already failed? Or rather, that someone long ago failed to prevent AGI or AI from taking over — since our very existence as a simulation would be the result of that?
youtube · AI Governance · 2025-10-25T11:5… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | unclear |
| Reasoning | unclear |
| Policy | unclear |
| Emotion | unclear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
  {"id": "ytc_Ugyg6U2aA6M7auQodnR4AaABAg", "responsibility": "none", "reasoning": "mixed", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_UgxKQUiPErPeb-aNtVh4AaABAg", "responsibility": "government", "reasoning": "deontological", "policy": "regulate", "emotion": "outrage"},
  {"id": "ytc_Ugyi1GRimpd9fW7aX_B4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytc_Ugw0q8J-LjqfP3vWEat4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_Ugz9TMACLxJaV25b5Hd4AaABAg", "responsibility": "user", "reasoning": "virtue", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_UgzTaQ3KyBMDLOO2eaF4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_UgyJnueEQ8KzVEWoXKF4AaABAg", "responsibility": "developer", "reasoning": "deontological", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_UgxbW2k8OlXDWmGoXP54AaABAg", "responsibility": "ai_itself", "reasoning": "mixed", "policy": "none", "emotion": "fear"},
  {"id": "ytc_UgwElWTxLgiPk9MhxyF4AaABAg", "responsibility": "company", "reasoning": "virtue", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_UgwIz96HpTIk1y1cIA54AaABAg", "responsibility": "distributed", "reasoning": "contractualist", "policy": "regulate", "emotion": "approval"}
]
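The raw response is a JSON array of per-comment codes, one object per comment with the four coded dimensions plus the comment ID. A minimal sketch of how such output can be parsed and indexed for the "look up by comment ID" view, assuming only the Python standard library and the field names shown above (the two-record sample here is excerpted from the response):

```python
import json

# Raw model output: a JSON array of per-comment codes
# (two records from the response above, shown here as a sample).
raw = """[
  {"id": "ytc_Ugyg6U2aA6M7auQodnR4AaABAg", "responsibility": "none",
   "reasoning": "mixed", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_UgxKQUiPErPeb-aNtVh4AaABAg", "responsibility": "government",
   "reasoning": "deontological", "policy": "regulate", "emotion": "outrage"}
]"""

# Index the records by comment ID for O(1) lookup.
codes = {record["id"]: record for record in json.loads(raw)}

# Look up the coded dimensions for one comment by its ID.
code = codes["ytc_Ugyg6U2aA6M7auQodnR4AaABAg"]
print(code["emotion"])  # -> resignation
```

Note that raw model output is not guaranteed to be valid JSON (the response above, for instance, originally closed with a transposed `]}`), so in practice the `json.loads` call would be wrapped in error handling before the batch is accepted.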