Raw LLM Responses
Inspect the exact model output for any coded comment, either by looking it up by comment ID or by picking one of the random samples below.
- "I am against people stealing specific art styles from modern artists, especially…" (ytc_UgxFF2hYS…)
- "The main problem is the people who are developing and FUNDING AI are the people/…" (ytc_UgypRlstq…)
- "I want to understand why he's focused on plumbing. What is it about plumbing spe…" (ytc_Ugyl1Q7qo…)
- "Every single human who is working towards these robots and AI, needs to be put i…" (ytc_UgzDduBkY…)
- "Alex Karp is the co-founder and CEO of Palantir Technologies, a major data analy…" (ytr_Ugyt64cSR…)
- "As a Designer when people tell me AI will replace me. I respond. Good design loo…" (ytc_UgzN7wBoE…)
- "I disagree with LeCun, in the fact that he thinks the alignment problem is an ea…" (ytr_UgzvAE9c8…)
- "\"Blinker\" in the context of being blind to human needs implies a lack of awarene…" (ytr_UgyHpK-Ba…)
Comment
You undirectly said that in that part where you said that only the AI companies or those who supply them grow. But what is not explicitly told in the video is this. Even the board will become obsolete. Because it is quite possible that their product will. I work in a cool SaaS company and we deliver quite complex tooling that integrates to other enterprise level SWs. The reason those companies buy our product is the level of complexity. It is cheaper to pay the monthly fee than develop and maintain this in-house. It's one of the tools that for instance Google will outsource because they have no interest in developing it and they can simply buy it. But if we really experience such a rapid improvement of AI that you show in the video, they wan't have to outsource shit. Because a person responsible for that field within the company will either prompt the AI to build that tooling in days or the existing AI taking care of these operations will do it without any human interaction.
Source: youtube · Video: "Viral AI Reaction" · Posted: 2025-12-02T14:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response

```json
[
{"id":"ytc_UgyE427Lw17mgeTyU-54AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugz-wLqdN6PiuaTmBt94AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugx6N8yUy-k_8CVx4T54AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgxAjBSkipD96u0hI2V4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_UgxECA8K2JqnS63Pehx4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgxSjmDCfYdznWWTVn54AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugw1_4fVH2MJfOCfyL54AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugx3qx1-lchd9hGdgg54AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"mixed"},
{"id":"ytc_UgwNE5foT6sGKYutPzJ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_UgzAWUn3n4AuGlHC_Kx4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"approval"}
]
```
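The raw response is a plain JSON array in which each object carries the comment `id` plus the four coded dimensions (`responsibility`, `reasoning`, `policy`, `emotion`). A minimal sketch of parsing such a batch and looking a record up by comment ID, assuming this exact schema (the single record below is copied from the batch above; the variable names are illustrative, not part of the tool):

```python
import json

# One record copied from the raw LLM response above; a real batch
# would contain the full array.
raw_response = """
[
  {"id": "ytc_Ugz-wLqdN6PiuaTmBt94AaABAg",
   "responsibility": "company", "reasoning": "consequentialist",
   "policy": "none", "emotion": "indifference"}
]
"""

records = json.loads(raw_response)

# Index the batch by comment ID so a single coded comment can be
# pulled out directly, as the lookup feature described above does.
by_id = {rec["id"]: rec for rec in records}

rec = by_id["ytc_Ugz-wLqdN6PiuaTmBt94AaABAg"]
print(rec["responsibility"], rec["emotion"])  # company indifference
```

Indexing into a dict makes repeated lookups O(1), which matters when cross-checking many coded comments against the same batch response.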