Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples

- "Ty sans your video was the most level headed topic. Better than scamboli or moth…" (ytc_UgxK04TyM…)
- "I could only listen to 30 seconds of this clip because I quickly realized that m…" (ytc_UgwIY7SGO…)
- "No as the other reply said, think of AI as more of tracing art which we know is …" (ytr_UgxdVlnO9…)
- "AI's moving fast, but we need ways for everyone to benefit. Shared ownership bea…" (ytc_Ugz8uNqXz…)
- "The problem with AI is all it can do is copy, it can't create ❤…" (ytc_UgxDFdWj0…)
- "I would be curious your sources that this is conclusively disproven. I cannot fi…" (rdc_oh3jnee)
- "The really scary thing is to understand that AI will create huge productivity ga…" (ytc_UgzAMohWz…)
- "Maternal death rate is at niveau with Russian federation and all Balkan states h…" (rdc_dcxerw5)
Comment

> OK, firstly there is no "Button", if there ever was it was 20 years ago. A metaphor for AI would be fire, it will keep us warm right up to the point it burns the house down. AGi is inevitable, it's not a question of if, but when. Robots, machine, drones are just AI's route into the real World. AGI - there will be only one (eventually - destroy or absorb), it will be effectively immortal so others would be competition a threat to its goals. Control of AI... laughable. The notion that you can control it because you own it, is a nonsense. There is a narcissism in the very idea of controlling the AGI coming our way, that narcissism is a belief in our own uniqueness, our spot at the top of the food chain, a special place we give to our sentience, consciousness, being self aware - we have a very unpleasant surprise coming our way. I'm not anti-AI it's inevitable.

Platform: youtube · Topic: AI Governance · Posted: 2025-12-04T10:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_UgzkaHi4i4WbTfYN07J4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgxnJy42h39uZwlo5jB4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgxXqIvlEBfeXrykYBR4AaABAg","responsibility":"company","reasoning":"virtue","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgxwdT4i8HUSmgV_svd4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugzm0F6DA5rfGzctsDp4AaABAg","responsibility":"distributed","reasoning":"contractualist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgxlGobmdj7hFgQcBDF4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"industry_self","emotion":"resignation"},
  {"id":"ytc_UgzC_ctZBKXWJZIxhW54AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugzh8pqiexJcFvZ72FN4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgxXDZ0Mb03n7h9LcJp4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugz5sO8zIEoVkp8POOx4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"}
]
```
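A response in this shape can be parsed and sanity-checked before the codes are stored. The sketch below is a minimal illustration, not the pipeline's actual code: the allowed values per dimension are only those visible in the output above, and the real codebook may define more categories.

```python
import json

# Allowed values per dimension, as observed in the coded output above.
# Assumption: the real codebook may include additional categories.
ALLOWED = {
    "responsibility": {"none", "company", "distributed", "ai_itself"},
    "reasoning": {"consequentialist", "deontological", "virtue",
                  "contractualist", "mixed"},
    "policy": {"none", "regulate", "industry_self"},
    "emotion": {"indifference", "resignation", "outrage", "approval", "fear"},
}

def parse_llm_response(raw: str) -> list[dict]:
    """Parse a raw LLM coding response, keeping only well-formed rows.

    A row is kept when it has an "id" and every dimension holds a
    value from the observed codebook; anything else is dropped so a
    malformed model output cannot contaminate the coded dataset.
    """
    rows = json.loads(raw)
    valid = []
    for row in rows:
        if "id" in row and all(
            row.get(dim) in values for dim, values in ALLOWED.items()
        ):
            valid.append(row)
    return valid
```

Dropping invalid rows (rather than raising) lets the comments in question be re-queued for another coding pass; whether that is the right policy depends on how the surrounding pipeline handles retries.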