Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
Hank Green: I don’t know much about AI. Also Hank Green: making content about AI…
ytc_UgxlbDFp5…
I understand where you're coming from! The dialogue in the video highlights the …
ytr_UgyqyejKh…
I have an answer for you, but many people wont like it, or even understand it. A…
ytc_UgyEMBB5r…
Lmao I can’t believe these guys actually thought about if ai needed rights. Of c…
ytc_UgxBsqmy7…
The irony is that Elon wants regulation of AI by the Govt however Govt agencies …
ytc_UgztRJKKq…
Throw in that foreigners aren't considered Korean and aren't included in the bir…
rdc_ljbfxzu
Here's a genuine comment: "The framing of 'hiding' often misses the more mundan…
ytc_UgwpruNLM…
ill say that to an extent it is technologically impressive what can be done with…
ytc_UgxX5ZQHn…
Comment
I initially considered UBI as the good scenario, but now I think that its not possible unless AI can do ALL professions, otherwise you create a society where some people sit and get paid to live a dream life and others have to work, in which case I honestly don't know who would choose to go and study in order to become one of the few left that have to work.
But let's it happens! Let's say in the future no one has to work and you have infinite wealth. I don't think people can live without goals. Without real achievements. If there is no need to get a job many people most probably will avoid education all together and focus on recreation and petty things, creating an even more self centered, wasteful and shallow society.
I honestly don't see a good scenario in any of this.
It feels more like a metamorphosis stage and humanity is the cocoon.
youtube
Viral AI Reaction
2025-12-02T21:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | mixed |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_Ugz8LHknMGiHcKVyazx4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugx3_8G5cejBw2S2W_h4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_UgwBzAzEFU4oErYZ0r94AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgwlmCP6ULNxQYlb9wN4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_UgyQVJ6wfnbQ-eVGlOF4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugyk3c7h8MFEb0ILaS94AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgyX6sPoloYGXXB5Na14AaABAg","responsibility":"user","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugx2UzCjEDVRfIDhEFx4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"approval"},
{"id":"ytc_Ugy9TvVUScROnXTEY2Z4AaABAg","responsibility":"none","reasoning":"contractualist","policy":"none","emotion":"outrage"},
{"id":"ytc_UgzUfYJ8XPTqYwWQN_V4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"}
]
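The "look up by comment ID" view above can be thought of as a simple ID match against the raw response array. A minimal sketch of that lookup, assuming the raw LLM response is a JSON array of objects with the `id`, `responsibility`, `reasoning`, `policy`, and `emotion` keys shown above; the `lookup` function and the two-row sample are illustrative, not the tool's actual code.

```python
import json

# Two rows copied from the raw LLM response above (illustrative subset).
raw_response = """[
  {"id": "ytc_Ugz8LHknMGiHcKVyazx4AaABAg", "responsibility": "none",
   "reasoning": "unclear", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_Ugyk3c7h8MFEb0ILaS94AaABAg", "responsibility": "company",
   "reasoning": "deontological", "policy": "regulate", "emotion": "fear"}
]"""

def lookup(raw: str, comment_id: str):
    """Return the coded dimensions for one comment ID, or None if absent."""
    return next((row for row in json.loads(raw) if row["id"] == comment_id), None)

coding = lookup(raw_response, "ytc_Ugyk3c7h8MFEb0ILaS94AaABAg")
print(coding["policy"])  # regulate
```

A real inspector would parse the full array once and index it in a dict keyed by ID, rather than scanning per query.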