Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- One of the big problems with LLMs is that they are idealogically biased. Just th… (ytc_Ugxw8IeGl…)
- Maybe we shouldnt use Ai for important things. Tech and disruption does bot mean… (ytc_UgxyQUJM0…)
- I feel like AI “art” should be used for an assignment that you got but can’t fin… (ytc_Ugx6cmxAh…)
- Hope this slows Shadiversity down... he has mentioned in the past that he steals… (ytc_Ugw_-DtWi…)
- Its too late , AI is the future. We're screwd. Soon the government will be makin… (ytc_UgxUK5fgw…)
- One plausible answer to this conundrum in the medium term is that we'll have to … (ytc_UgxBp-88Y…)
- If you’re investing in A.I. in 2025 (or the Internet in 1998), you’re funding th… (ytc_UgyeYq5r3…)
- Minute 4:20: “a person would’ve clearly say there is something big in the middle… (ytc_UgxsoxQIT…)
Comment
Great interview. Makes common people think in all the possibilities, but it is hilarious to me the fact that we are saying that something created by us could achieve a collective consciousness and yet most of humanity tend to deny ancestral knowledge form ancient civilizations like the Vedic Rishis, the Kogis from Colombia et all when even quantum physics has shown that everything in the universe is interconnected at a fundamental level with particles and energy interacting in ways that are still not fully understood by our finite knowledge of who we really are. Maybe the idea of an AI taking over our existence is just a part to help us return to this universal consciousness that most of us is trying to deny and even reject because this existence is the only thing we know.
youtube · AI Governance · 2025-06-18T05:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | mixed |
| Policy | none |
| Emotion | mixed |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
{"id":"ytc_UgzqTwvnOjMgwlljrmt4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugwv29xMQzLH6Z08l1t4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugyq7iQQAXJI0ifBzlp4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwbXxnyhrRk2qttsnp4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
{"id":"ytc_UgwR16Rpf_UzMEsKlC94AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgxAMOn0rPGMx7QKVfZ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgwOkmOA8Oge0Mr17a54AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"mixed"},
{"id":"ytc_UgyrKlWT-_EpQXw7XMp4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"resignation"},
{"id":"ytc_UgyXmDpzCPemoRTeQvZ4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugw3t_zrcZuyw6TWDyp4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"mixed"}
]
```
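The raw response above is a JSON array of records keyed by comment ID, one record per coded comment, with the four schema dimensions (responsibility, reasoning, policy, emotion). A minimal sketch of how such a batch could be parsed and looked up by comment ID follows; the function name `index_by_comment_id` and the two-record sample are illustrative assumptions, not part of any documented API.

```python
import json

# Two records copied from the raw response above, used as a small sample batch.
RAW_RESPONSE = """
[
  {"id": "ytc_UgzqTwvnOjMgwlljrmt4AaABAg", "responsibility": "government",
   "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_UgxAMOn0rPGMx7QKVfZ4AaABAg", "responsibility": "none",
   "reasoning": "consequentialist", "policy": "none", "emotion": "resignation"}
]
"""

def index_by_comment_id(raw: str) -> dict:
    """Parse a raw batch response and key each coded record by its comment ID."""
    records = json.loads(raw)
    return {rec["id"]: rec for rec in records}

coded = index_by_comment_id(RAW_RESPONSE)
print(coded["ytc_UgzqTwvnOjMgwlljrmt4AaABAg"]["emotion"])  # fear
```

A dict keyed by ID makes the "Look up by comment ID" operation a single hash lookup rather than a scan over the array.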