Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up a comment by its ID, or browse the random samples below.
- `rdc_ocpj1ya`: "Sora failing doesn't mean AI video won't revolutionize Hollywood — it just means…"
- `ytc_UgwjwZn4s…`: "Art is effort, ai is no effort, thats like a kid working 10 months to get a Play…"
- `ytc_UgxLk9X0R…`: "AI is starting to remind me of the movie Weird Science. Those of us who are old…"
- `ytc_UgyuAenyN…`: "from the outside it really looks like the thing that is at the helm of all of th…"
- `ytc_UgwBv2oGk…`: "The silly thing about AI in the future of creating things is it could lessen the…"
- `ytc_Ugyym-wbL…`: "You known poisoning the AI makes it harder to poison, right? It's called adversa…"
- `ytc_UgyI12O5s…`: "It's half correct and half wrong... AI will obviously consume jobs in probably e…"
- `ytr_UgyvqMci6…`: "Correct. The "people" who complain here are worth substantially less than the ro…"
Comment
It ultimately boils down to , if it gets to the point where it is replacing all the executives - the ultimate endpoint of that is that AI is so good it can run the whole company on its own, and all you would have to do is type "make money" into an AI agent and it would simply do so and you would be set for life. It obviously doesn't take a genius to realise that that simply isn't going to work.
When you consider the reason WHY that isn't going to work, philosophically the answer lies in 'what actually IS money?'. Money is simply a token representing how much you, _as a human_ , are contributing to society relative to every other _human_ on the planet.
If all you have to do to run a company is type "make money" or "start a company" into AI, then every other human on the planet could do that, and you are no more valuable than anybody else, and ergo, your company is valueless.
Source: youtube · Viral AI Reaction · 2025-12-02T14:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | mixed |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_UgyE427Lw17mgeTyU-54AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugz-wLqdN6PiuaTmBt94AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugx6N8yUy-k_8CVx4T54AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgxAjBSkipD96u0hI2V4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_UgxECA8K2JqnS63Pehx4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgxSjmDCfYdznWWTVn54AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugw1_4fVH2MJfOCfyL54AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugx3qx1-lchd9hGdgg54AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"mixed"},
{"id":"ytc_UgwNE5foT6sGKYutPzJ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_UgzAWUn3n4AuGlHC_Kx4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"approval"}
]
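The raw LLM response above is a JSON array of per-comment codes, each entry carrying an `id` plus the four coded dimensions (`responsibility`, `reasoning`, `policy`, `emotion`). A minimal sketch of the "look up by comment ID" step might parse such a response and index it by ID like this (the function name `index_codes` is illustrative, and only the first two entries are reproduced here):

```python
import json

# Raw LLM response: a JSON array of per-comment codes, as shown above
# (truncated to two entries for the example).
raw_response = '''[
{"id":"ytc_UgyE427Lw17mgeTyU-54AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugz-wLqdN6PiuaTmBt94AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"indifference"}
]'''

EXPECTED_KEYS = {"id", "responsibility", "reasoning", "policy", "emotion"}

def index_codes(raw: str) -> dict:
    """Parse a raw LLM coding response and index it by comment ID.

    Entries missing any expected key are skipped, so one malformed
    item does not invalidate the rest of the batch.
    """
    codes = {}
    for item in json.loads(raw):
        if EXPECTED_KEYS <= item.keys():
            # Store the four dimensions, keyed by the comment ID.
            codes[item["id"]] = {k: item[k] for k in EXPECTED_KEYS - {"id"}}
    return codes

codes = index_codes(raw_response)
print(codes["ytc_UgyE427Lw17mgeTyU-54AaABAg"]["emotion"])  # fear
```

Skipping malformed entries rather than raising keeps a single bad item from discarding an otherwise usable batch of codes; a stricter pipeline could log or re-queue the rejected IDs instead.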