Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
I think we are the stepping stone to what is coming next. A huge evolutionary st…
ytc_Ugx6415eU…
Pretty words, I truly enjoyed this video your formating and style is captivating…
ytc_UgywYORfx…
I use Pinterest to find references for a lot of things, and their allowance of A…
ytc_UgzgMqhE6…
I love AI, is fast no? Lots of slop and errors, but it’s a fantastic creative to…
ytc_UgwmwjDHv…
It's gonna be wild to future humans that a car company beta tested their bad sel…
ytc_UgweF-DYE…
I think that if an artificial intelligence actually understood the concept of se…
ytc_UgzWW6tYZ…
The problem with a lot of AI models is that theyre trained on billions of pictur…
ytc_UgyUlA-RN…
Humans going extinct. A natural part of evolution. How do you think space fairin…
ytc_UgxfzIF48…
Comment
**************************************
When we say that we live in a simulation, we are not actually innovating anything. We really live in a world where humans live up to 120 years at best and are replaced by other humans and God forbid, over and over again, so that this is reality itself. With every "new world" that is created, there are new attempts at improvement and preservation based on the "old world" that existed before. Therefore, I do not find any innovation or excitement in the definition that we live in a simulation. This is old news. Even in a world of super artificial intelligence, there will be an alternation and learning between the models that will be perfected and the robots that will be upgraded, the only difference is that they are not organic and the old models can continue to exist alongside the new ones, and this is actually the innovation.
**************************************
youtube
AI Governance
2025-09-13T06:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | unclear |
| Policy | unclear |
| Emotion | mixed |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_UgxitxKROMkcdlg3bCR4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgyLeGgnE2KyCKFkBbR4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgyxslCnAK4O_EWlJ4h4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgxqzLJDL7iYAXDI2I94AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"resignation"},
{"id":"ytc_UgyVje2QakBoF9hwCiB4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"approval"},
{"id":"ytc_UgyJ5E-fdii3L0_3pBN4AaABAg","responsibility":"user","reasoning":"deontological","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgzKw_b7xf0t5qma8z54AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgxAJVRtZd7GLp9z7SJ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxzHwHWrTz1QRlezNt4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgxKefvp6qFhEeb-_td4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"mixed"}
]
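A response like the one above can be checked programmatically before the codes are stored. The sketch below is a minimal validator, assuming the allowed value sets inferred only from the values visible in this sample (there may be additional codes in the full codebook), and assuming comment IDs always carry the `ytc_` prefix seen here:

```python
import json

# Allowed values per dimension, inferred from this sample only --
# not an official codebook.
ALLOWED = {
    "responsibility": {"none", "ai_itself", "user"},
    "reasoning": {"unclear", "consequentialist", "virtue", "deontological"},
    "policy": {"unclear", "none", "regulate"},
    "emotion": {"mixed", "fear", "resignation", "approval", "indifference"},
}


def validate_coding(raw: str) -> list[dict]:
    """Parse a raw LLM response and reject rows with unknown codes."""
    rows = json.loads(raw)
    for row in rows:
        # Every comment ID in this dump starts with the ytc_ prefix.
        if not row.get("id", "").startswith("ytc_"):
            raise ValueError(f"unexpected comment id: {row.get('id')!r}")
        for dim, allowed in ALLOWED.items():
            if row.get(dim) not in allowed:
                raise ValueError(f"{row['id']}: bad {dim}={row.get(dim)!r}")
    return rows


sample = (
    '[{"id":"ytc_UgxitxKROMkcdlg3bCR4AaABAg","responsibility":"none",'
    '"reasoning":"unclear","policy":"unclear","emotion":"mixed"}]'
)
rows = validate_coding(sample)
print(len(rows), rows[0]["emotion"])
```

A malformed row (e.g. an unlisted emotion label) raises `ValueError` rather than silently entering the coded dataset.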