Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
We would not have enough energy, unless humans stop consuming energy and all is …
ytc_UgxwbxgF-…
All talk in the chat. People can talk all they want, but fear prevents positive …
ytc_Ugz2CPNpl…
They are early design flaws. That's all. There's no sentience inside AI just cau…
ytc_Ugy3tCZTi…
I find it rather humorous that science still doesn't understand the human brain …
ytc_Ugzvns5Vn…
I dabble in AI development, it's really cool what they can do! They also lie so …
ytc_Ugzf3XcDN…
I think this story is wonderful. People are quick to dismiss it, but let’s be re…
ytc_Ugw_yuWVe…
Excellent video as usual. Glad you were well enough to make it. When I looked …
ytc_UgxRzR-fg…
It's our human insight to those ai movies that we can't trust, but in reality ai…
ytc_Ugwv5cPZN…
Comment
The whiteboard—the document signed by the elite—declared that AI development would be limited to a certain generation. But where did that knowledge originate? What insight warned us of the dangers before they unfolded? Even in my own studies, I see countless small failures accumulating. When everything finally collapses, it will be too late. So what is the agenda? Do you know what it is?
youtube
AI Governance
2025-09-20T23:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | mixed |
| Policy | unclear |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_UgxonJ0o8-dbrtdkdsh4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
{"id":"ytc_UgyRsXtfpNqOoyr9oSR4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"ban","emotion":"fear"},
{"id":"ytc_UgwpmRkDXCb0j1eP_mp4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgwzWHGStW4wN0y5a2d4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgzFULFn-tSVAjMqFLB4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgxWlredTPBv8X72vOx4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgyHh_qq7azkqmENeUZ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgzFWy-oGH7XMpvXCAt4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugx6EpyFd3iM5p3bj4V4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgyJLodASMI7R06sFcx4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"unclear","emotion":"fear"}
]
```
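A batch response like the one above can be validated against the codebook before the coded values are stored. The sketch below is a minimal example, assuming the category sets visible in this output (e.g. `distributed`, `consequentialist`, `regulate`, `fear`) are the full set of allowed values; the actual codebook may define more. The function name `validate_batch` is illustrative, not part of the pipeline shown here.

```python
import json

# Allowed values per dimension, inferred from the visible output
# (assumption: the real codebook may include additional categories).
ALLOWED = {
    "responsibility": {"none", "ai_itself", "company", "developer", "distributed"},
    "reasoning": {"consequentialist", "deontological", "mixed", "unclear"},
    "policy": {"regulate", "ban", "liability", "none", "unclear"},
    "emotion": {"approval", "fear", "outrage", "resignation", "indifference"},
}

def validate_batch(raw: str) -> dict:
    """Parse a raw LLM batch response and index valid records by comment ID."""
    records = json.loads(raw)
    coded = {}
    for rec in records:
        cid = rec.get("id", "")
        if not cid.startswith("ytc_"):
            raise ValueError(f"unexpected comment ID: {cid!r}")
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{cid}: bad value for {dim}: {rec.get(dim)!r}")
        # Keep only the four coded dimensions, keyed by comment ID.
        coded[cid] = {dim: rec[dim] for dim in ALLOWED}
    return coded

# Usage with the last record of the batch above:
raw = ('[{"id":"ytc_UgyJLodASMI7R06sFcx4AaABAg","responsibility":"distributed",'
       '"reasoning":"mixed","policy":"unclear","emotion":"fear"}]')
coded = validate_batch(raw)
print(coded["ytc_UgyJLodASMI7R06sFcx4AaABAg"]["emotion"])  # fear
```

Rejecting any record with an out-of-vocabulary label, rather than silently coercing it to `unclear`, makes LLM drift visible as hard errors instead of quietly skewing the coded dataset.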