Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- Maybe, just maybe China ought to 'win' the AI war (which doesn't exist but I dig… (ytc_Ugylm_3Kq…)
- The disgruntled AI users have been getting really volatile with artists who are … (ytc_Ugx_gs1D8…)
- 4:38 I feel your counter argument to "AI is learning the same way a human does" … (ytc_UgyQwI2d4…)
- fun fact: you don't actually hate A.I. because true A.I. doesn't exist yet in a… (ytc_UgwUm-sw9…)
- So elon why don't you stop the hole ai thing and stop sending satelites in space… (ytc_Ugz8cUoXO…)
- i don't think ai will ever be conscious, as any future ai will still work like s… (ytc_UgxbKitwc…)
- The only time when ai is okay to use is when it’s used simply as a tool rather t… (ytc_UgzqSWeKc…)
- So true the final puzzle to AI is to learn to fix and maintain itself then we in… (ytc_Ugw2mTRg-…)
Comment

> So his main risk of AI is the fact that capitalist will get a hold of it. The main risk of AI is that people will stop thinking. They’ll lose their ability to reason they allow a machine to do everything for them. They allow a machine to replace key relationships, such as wives mother’s and fathers. This is already taken place in some corners of the world.

youtube · AI Governance · 2025-06-25T13:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
{"id":"ytc_UgzYM8NwyoCis42zvLl4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgxQbsfODWU5XpzZkgV4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgznGm6NQDj9xm53-Gl4AaABAg","responsibility":"developer","reasoning":"mixed","policy":"industry_self","emotion":"approval"},
{"id":"ytc_UgzlpBzZuWiIY6AbCLR4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgwgSJMILOCvFfFZpF14AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgyN0hefIFYjv2lO7TJ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgzbEYlhdXIQIXsE2kx4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgxH4fAj6jUO6DETUNp4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgwXFVLaymu09bSVgld4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugx_c3fE7uLha_PxKB94AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"outrage"}
]
```
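The raw response above is a JSON array with one record per coded comment, carrying an `id` plus the four dimensions shown in the result table (responsibility, reasoning, policy, emotion). A minimal sketch of how such a batch response could be parsed and shape-checked before use, assuming that array-of-objects format (the function name `parse_batch_response` and the validation rules are illustrative, not part of the tool):

```python
import json

# The four coding dimensions shown in the "Coding Result" table above.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def parse_batch_response(raw: str) -> list[dict]:
    """Parse a raw LLM batch-coding response and check its shape.

    Assumes the model returns a JSON array of objects, each with an
    "id" plus one value per coding dimension, as in the example above.
    """
    records = json.loads(raw)
    if not isinstance(records, list):
        raise ValueError("expected a JSON array of coded comments")
    for rec in records:
        missing = [k for k in ("id", *DIMENSIONS) if k not in rec]
        if missing:
            raise ValueError(f"record {rec.get('id', '?')!r} missing {missing}")
    return records

# One record from the raw response above, used as a smoke test.
raw = '''[
  {"id": "ytc_UgzYM8NwyoCis42zvLl4AaABAg", "responsibility": "company",
   "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"}
]'''
coded = parse_batch_response(raw)
print(coded[0]["policy"])  # → regulate
```

A check like this catches the common failure mode of batch coding, where the model drops a key or returns prose instead of JSON, before malformed records reach the results table.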