Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- ytr_UgzyDuDqA… : "The fact you mentioned a tree does not support any Pro-AI arguements. And bringi…"
- ytc_UgzUIjm28… : "Certain careers in the blue-collar and healthcare industry requires human qualit…"
- ytc_UgxD75ctu… : "Homie basically said Ai is doing a Kage bushin no jitsu, and then training and g…"
- ytc_UgzwT56_9… : "I hope you understand that by making this video, that you just trained AI to do …"
- ytr_Ugz9tHz19… : "@avgjoe5969 No, they both use HD mapping (even elon said so), and they both use …"
- ytc_Ugzq-1Dsk… : "Very intriguing... / 'Self awareness'. / When reached, A. I. is on its own... / Am I…"
- ytc_UgyvNfm6l… : "AIs, specifically LLMs are designed to predict the next word they are going to w…"
- ytc_UgzGMiBlf… : "We are approaching the event horizon of the black hole of AI. May God have mercy…"
Comment
I see three scenarios 😃:
Scenario 1: The rich can do whatever they want, but one thing is certain: supply and demand will continue to exist. Now, extremely efficient machines are helping us, people's purchasing power is increasing enormously, and ways of generating income are dividing between:
- Owner of production (production of tangible and intangible goods or services through complete AI automation),
- Exclusive and intrinsically human jobs (things or services that people will buy simply because they are made by humans) with a few hours of work being enough for you to save enough money to live for months.
People will probably generate income by doing both...
Scenario 2: Skynet kills us all.
Scenario 3 (less likely, but it would be awesome): AIs help us discover an efficient way to make long interstellar journeys, and we arrive at the Age of Great Navigations II.
youtube
Viral AI Reaction
2025-11-25T19:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | approval |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_UgwNMsrFCd7axS1Vs0x4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_UgwPHWaho6sisLpEeEJ4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugw5Zc4vPg2_ZBvToU54AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzJD0Qi6LrIXDmTkEp4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"fear"},
{"id":"ytc_UgyoPNxHynRdta8YQr54AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgzefVTGEPvxZrKK6At4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugydp9RP51P4y4gfRoB4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugxb57r_qKIr3cQjwQ14AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgynwMHmTwOoQgVQDTp4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"unclear"},
{"id":"ytc_UgwBgxv5LPvy-EVLdLx4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]
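Responses like the one above can be checked programmatically before the labels are written back to the coding table. The sketch below is a minimal validator, assuming the schema visible in this dump: each row carries an `id` prefixed `ytc_` or `ytr_` plus the four dimensions (`responsibility`, `reasoning`, `policy`, `emotion`). The allowed label sets are inferred only from the values that appear here; the full codebook may permit more.

```python
import json

# Dimension vocabularies inferred from the values visible in this dump;
# the real codebook may allow additional labels.
ALLOWED = {
    "responsibility": {"none", "government", "company", "developer", "ai_itself", "unclear"},
    "reasoning": {"consequentialist", "mixed", "unclear"},
    "policy": {"none", "regulate", "liability", "unclear"},
    "emotion": {"outrage", "fear", "indifference", "approval", "resignation", "unclear"},
}

def validate_batch(raw: str) -> list[dict]:
    """Parse one raw LLM response (a JSON array) and keep only rows
    whose id prefix and dimension labels all check out."""
    rows = json.loads(raw)
    good = []
    for row in rows:
        # Comment ids in this dump start with ytc_ (comment) or ytr_ (reply).
        if not str(row.get("id", "")).startswith(("ytc_", "ytr_")):
            continue
        # Every dimension must be present and drawn from its vocabulary.
        if all(row.get(dim) in vocab for dim, vocab in ALLOWED.items()):
            good.append(row)
    return good
```

Rows that fail are simply dropped here; in practice they would be queued for re-coding, since an off-vocabulary label usually means the model drifted from the prompt's schema.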