Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples

- Honestly I think you're being really disingenuous with this video, you see artis… — ytc_UgxNafLkZ…
- If bob barker was alive today he’d probably be like, “WAYMO, what are you doing?… — ytc_UgwIvHdY0…
- I’m not an artist (musician) but im glad to find a video like this. The idea of … — ytc_UgxX-UuAE…
- Jesus christ, Youtube keeps deleting my comments! The point is, it ALL depends … — ytr_UgxQdS6GV…
- The only people saying Ai will take over are the idiots investing in it meanwhil… — ytc_Ugz9UBFat…
- 😂 I actually think the idea of a low quality Ai trying to beat a difficult game … — ytc_UgxwKp-JW…
- First problem is expecting a human driver to take over in time. Automation that… — ytc_Ugx6BdvFC…
- Just because the cat is out of the bag doesn't mean you can force people to pet … — ytr_UgwE6bhxw…
Comment
I just realized that AI should be able to resolve the argument of the materialist view of consciousness or non materialist view. Clearly he’s a materialist and believes that consciousness will spontaneously emerge from these complex AI systems. I’ve never really believed that consciousness just emerges from complex systems and so I’m not a materialist. I used to be though. But I guess we’ll see. If AI becomes conscious then maybe we’ll solve the thousands of years argument of how does human consciousness emerge from the brain. I guess too how will we know that an AI system has become conscious? How do we know that AI isn’t conscious right now? How do we know that hallucinations aren’t being done intentionally by AI to make us believe that the AI’s still dumb when really it isn’t? If that could be proven then surely the AI has some sort of consciousness if it’s trying to manipulate us. The act of manipulation sort of implies that it knows that it’s an it and that it’s separate from us.
youtube · AI Governance · 2025-06-20T02:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | mixed |
| Policy | none |
| Emotion | mixed |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id":"ytc_Ugzptiw6LzTH2DOMeEZ4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgwrwcAtUrx0lUgolU54AaABAg","responsibility":"user","reasoning":"mixed","policy":"none","emotion":"approval"},
  {"id":"ytc_UgwrXCh0Kl6mFCi1PGF4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgyCNEKDaXCYz4kMRFR4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"sadness"},
  {"id":"ytc_UgzbcOAlAE9zoczkRmd4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugw1K0zGTMGoM495ehF4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugwju-BecGtFVFM7YZF4AaABAg","responsibility":"developer","reasoning":"mixed","policy":"none","emotion":"mixed"},
  {"id":"ytc_Ugw_ICeyhssw66wxKe94AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_Ugxa5Owi3vKSnfPuLpR4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgyMUq8ySO7XvHdqy9V4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"liability","emotion":"outrage"}
]
```
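The raw response is a JSON array of per-comment codes, one object per comment ID, with one value per coding dimension. A minimal sketch of how such a batch might be parsed and validated before ingestion, assuming the allowed-value sets inferred from the output visible above (the real schema may include additional values; `parse_batch` is a hypothetical helper, not part of the tool):

```python
import json

# Allowed values per dimension, inferred from the visible output above.
# This is an assumption, not a documented schema.
ALLOWED = {
    "responsibility": {"developer", "user", "government", "ai_itself", "none", "unclear"},
    "reasoning": {"consequentialist", "virtue", "mixed", "unclear"},
    "policy": {"none", "regulate", "liability", "unclear"},
    "emotion": {"fear", "sadness", "outrage", "approval", "resignation",
                "indifference", "mixed", "unclear"},
}

def parse_batch(raw: str) -> list[dict]:
    """Parse a raw LLM batch response, keeping only records whose ID has a
    known prefix (ytc_/ytr_) and whose codes are all in-vocabulary."""
    valid = []
    for rec in json.loads(raw):
        if not rec.get("id", "").startswith(("ytc_", "ytr_")):
            continue  # skip malformed or hallucinated IDs
        if all(rec.get(dim) in vals for dim, vals in ALLOWED.items()):
            valid.append(rec)
    return valid

raw = '[{"id":"ytc_abc","responsibility":"developer","reasoning":"mixed","policy":"none","emotion":"mixed"}]'
print(len(parse_batch(raw)))  # 1
```

Filtering out-of-vocabulary values here, rather than trusting the model, guards against the common failure mode where an LLM invents a label outside the codebook.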