Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
You know, I sort of agree with something this guy said. The person behind the AI…
ytc_Ugw-N27N_…
It won't work because the Tesla car is an AI itself .
However, I do believe tha…
ytr_Ugxie9Ds5…
As an IT innovator building the foundational layers at the origin of the IT emer…
ytc_Ugyz-0zwb…
Many non engineer jobs were gone because of automation, it happened for years. N…
rdc_m6ybls7
When I saw hyper-intelligent - I naturally assumed it was used to describe Elon.…
ytc_Ugwj8q4Ia…
"Hi Ashish, you got the right answer. Kudos.
The contest is over and winners hav…
ytr_Ugynl20Z2…
@SzalonyKucharz I feel you might be over simplifying the quantity and quality of…
ytr_UgxsQhmJh…
It's probably better to be polite if you want this role playing ai to help you !…
ytc_Ugyh5j8--…
Comment
Unrealistic, who will consume the "AI" products? B2B with people becoming irrelevant to economy just means, that most of B2B products aren't needed.
But more importantly, no working people, no retention of knowledge about the jobs, no problems, nothing to solve, no innovation, no needs, collapse of modern world. Thankfully the world is big and many places wont be affected by it at same time, how many citizenships you got?
Another unrealistic thing is, the AI will have to deal with people if the projects cross into human territory (construction, maintenance), but who will respect it?
The entire premise of AI doing human jobs means and becoming economically relevant means that at some point AI will provide jobs to humans, but humans DGAF.
youtube
Viral AI Reaction
2025-11-22T21:5…
♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_UgxcrIvXEk1yOdgo-U14AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugwv1OXSDP6H8B7TPop4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"skepticism"},
{"id":"ytc_UgxwWC3jMRA5szL_l3t4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgxjENX6Rh3x7ehj2u94AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugz3CQg8NEmIV654p8N4AaABAg","responsibility":"government","reasoning":"contractualist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugyhq_NBSp5LvQS3DsJ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugw_ApDXYs7hCrwaSt14AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugx1Cxwn2KKBtoMXOhh4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_UgxCGy87yPcJvS24Jfd4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgyWORiv6Kr25Gs_2fB4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"}
]
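A raw response like the one above is only useful if every row conforms to the coding scheme. Below is a minimal sketch of a validator for such a batch, assuming the label vocabularies visible in this page's examples (the real codebook may define additional categories, and the function name `validate_codes` is illustrative, not part of the tool):

```python
import json

# Allowed values per coding dimension -- ASSUMED from the labels seen in
# the examples on this page; the actual codebook may include more.
SCHEMA = {
    "responsibility": {"none", "ai_itself", "developer", "government"},
    "reasoning": {"consequentialist", "contractualist"},
    "policy": {"none", "regulate"},
    "emotion": {"indifference", "skepticism", "resignation",
                "outrage", "fear", "approval"},
}


def validate_codes(raw: str) -> list[dict]:
    """Parse a raw LLM response and check every row against the schema.

    Raises ValueError on malformed JSON, a missing comment ID, or an
    out-of-vocabulary label, so a bad batch fails loudly instead of
    silently polluting the coded dataset.
    """
    rows = json.loads(raw)
    if not isinstance(rows, list):
        raise ValueError("expected a JSON array of coded comments")
    for row in rows:
        if "id" not in row:
            raise ValueError(f"row missing 'id': {row!r}")
        for dim, allowed in SCHEMA.items():
            value = row.get(dim)
            if value not in allowed:
                raise ValueError(f"{row['id']}: bad {dim}={value!r}")
    return rows


# Hypothetical single-row batch in the same shape as the response above.
raw = ('[{"id":"ytc_x","responsibility":"none",'
       '"reasoning":"consequentialist","policy":"none","emotion":"fear"}]')
print(validate_codes(raw)[0]["emotion"])  # fear
```

Rejecting unknown labels at ingest time is what lets the result table on this page render a fixed set of dimension values with no fallback case.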