Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- "AI has the potential to fundamentally reshape humanity’s future in ways that cou…" (`ytc_Ugx2RL1u1…`)
- "Not one person speaks about Recursive Self Improvement? That is what these AI co…" (`ytc_UgwhUW5O9…`)
- "An artist is nobody before his art, he get famous thought his style and art, aft…" (`ytc_Ugx2ragC_…`)
- "I realize (now) that this is an "old" video but I feel like my comment is still …" (`ytc_UgztbUrzu…`)
- "When you said purpled haired teachers you mean liberals yes we dont want them p…" (`ytr_Ugx2wsKtZ…`)
- "Imagine sending that to another earth-like planet with Type 3 civilization livin…" (`ytc_UgyU7cMhh…`)
- "It's easy to place Musk and also Trump.. We have to screen people for cluster B …" (`ytc_UgxoMInyi…`)
- "ive heard the ai bros on twitter complain that artists are gatekeeping :sob: lik…" (`ytc_Ugy3QDXtO…`)
Comment
> Hmm I have Q.
> 1. Will the world have enough money to build AI/Robots to take over every occupation within 5 years?
> 2. If this happens, what will become of humanity? Do they expect humans to not work and sit around all day, where will we get our money from? Or will this be when the UN brings in the universal wage and pays us to do nothing but spend the money on AI businesses?
> 3. You’re then talking about humans living for eternity, but why would we live for eternity if humanity will be taken over by AI, what would be the purpose in this.
> 4. And when this technology runs into problems or the robots breakdown, who will fix these? Or will they build AI to fix AI
> When men play God in never ends well…
youtube · AI Governance · 2025-09-24T09:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | unclear |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_UgxF45ef5FH4SJVZv8N4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgxrgB08CB-9u_0YJ894AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxItMJrzUv2BZvsPLx4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyXzMPkpyrBCvQX-C54AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgwmKUapO1RWDhN98IN4AaABAg","responsibility":"distributed","reasoning":"contractualist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgxcJA6oeeF_tdiSb694AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgxxrpmFLKu2RKA0CHh4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"approval"},
{"id":"ytc_UgxC3kc8gK5Ox4KG6v14AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugwt0iKZIs6AcEKTS0J4AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugzni-49bytdIzs2sfd4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"}
]
```
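The lookup-by-comment-ID step above can be sketched as a small parser over the raw response: load the JSON array, index each record by its `id`, and read off the coded dimensions. This is a minimal sketch assuming the field names shown in the response (`id`, `responsibility`, `reasoning`, `policy`, `emotion`); since raw model output is not guaranteed to be valid JSON, it parses defensively. The `raw_response` literal here is abridged to two of the entries shown above.

```python
import json

# Abridged copy of the raw LLM response shown above (two of the ten entries).
raw_response = '''
[
  {"id": "ytc_UgxF45ef5FH4SJVZv8N4AaABAg", "responsibility": "ai_itself",
   "reasoning": "consequentialist", "policy": "none", "emotion": "approval"},
  {"id": "ytc_Ugwt0iKZIs6AcEKTS0J4AaABAg", "responsibility": "unclear",
   "reasoning": "consequentialist", "policy": "unclear", "emotion": "fear"}
]
'''

def index_codings(raw: str) -> dict[str, dict]:
    """Parse a raw LLM response and index the codings by comment ID.

    Returns an empty dict if the model output is not valid JSON,
    and skips any entries that are missing an "id" field.
    """
    try:
        records = json.loads(raw)
    except json.JSONDecodeError:
        return {}
    return {r["id"]: r for r in records if isinstance(r, dict) and "id" in r}

codings = index_codings(raw_response)
print(codings["ytc_Ugwt0iKZIs6AcEKTS0J4AaABAg"]["emotion"])  # fear
```

Keying on the comment ID makes the coding result joinable back to the comment metadata (source, topic, timestamp) without relying on the order of records in the model's response.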