Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up a comment by ID, or inspect one of the random samples below:

- "Seriously... tech bros are very easily dazzled by things. We've been with this g…" (ytc_Ugw3Yton2…)
- "Could people just be assigned a robot that does their labor and still get paid ?…" (ytr_UgyjW_riY…)
- "the extreme AI push was to push Gen Zs and Alpha to enroll in the Military. 10+ …" (ytc_UgzWtXhNd…)
- "There should be a place on earth where we ban AI and robots. Where human work is…" (ytc_Ugz4cEVi0…)
- ""anit ai is ablest" bullshit say my actually disabled friends. i have the notio…" (ytc_UgykO9Nho…)
- "one of my best friends is an incredible artist, and i ache to see that she has t…" (ytc_UgxojdU_5…)
- "Humans are too greedy for this and can easily be bought and sold good luck with …" (ytc_UgztAZsFO…)
- "i say we make a word for ai "artists" thats can be used as an insult.…" (ytc_Ugz0xQCrQ…)
Comment
I do not understand the debate around superintelligence at all. You are creating something that will be more powerful than any human alive, can improve itself autonomously and can copy itself and infinite number of times. An entity like this is almost definitionally uncontrollable. Its unfathomable to me that anyone would want to create something that would have near total control over every single thing on the planet and potentially far beyond, forever. There is no room for humanity in a future like this. Its suicidal. Everyone I talk to about it either obviously doesn't understand, doesn't believe its possible or deliberately suppresses their feelings about it because they feel powerless and would rather just ignore it until the last possible moment. They both just sit there like "Hmmm what if AI did kill all humans. Would that be a good thing? Lets give pros and cons on both sides of the argument." I feel like I'm taking crazy pills.
youtube · AI Responsibility · 2026-04-21T17:5… · ♥ 46
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | ban |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_UgxdlxNlMG7JZ_g1Q1R4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgwIvG5UG39OHmvfBRt4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugx9yegn2CggmQaxgS94AaABAg","responsibility":"company","reasoning":"virtue","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgwQ3Ihl32s4wHScXAh4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_UgwMC4K_HBc5s_QA8V54AaABAg","responsibility":"government","reasoning":"deontological","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgzZFEbG6SMOW4LvNjl4AaABAg","responsibility":"distributed","reasoning":"virtue","policy":"industry_self","emotion":"resignation"},
{"id":"ytc_Ugwa0_-Wu5d_T7l6jNp4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_UgxQFnAxaik9gdosJJV4AaABAg","responsibility":"government","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_Ugz1jTThpEism_zuPOd4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"regulate","emotion":"mixed"},
{"id":"ytc_UgwKEzwBsfbejQpOcIR4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]
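The raw response is a JSON array with one object per comment: an `id` plus the four coding dimensions shown in the result table above (`responsibility`, `reasoning`, `policy`, `emotion`). A minimal sketch of how such a batch could be parsed and validated, assuming the allowed label sets are exactly those observed in this sample (the real codebook may define more):

```python
import json

# Allowed labels per dimension, inferred from this sample batch only
# (assumption: the actual codebook may include additional labels).
ALLOWED = {
    "responsibility": {"developer", "company", "user", "government", "distributed"},
    "reasoning": {"consequentialist", "deontological", "virtue", "mixed"},
    "policy": {"ban", "regulate", "liability", "industry_self", "none", "unclear"},
    "emotion": {"fear", "outrage", "resignation", "mixed"},
}

def parse_batch(raw: str) -> dict:
    """Parse one raw LLM response and index the codes by comment id.

    Raises ValueError on a missing id, a missing dimension,
    or a label outside the allowed sets.
    """
    coded = {}
    for row in json.loads(raw):
        cid = row.get("id")
        if not cid:
            raise ValueError(f"row missing id: {row!r}")
        for dim, allowed in ALLOWED.items():
            if row.get(dim) not in allowed:
                raise ValueError(f"{cid}: bad {dim}={row.get(dim)!r}")
        coded[cid] = {dim: row[dim] for dim in ALLOWED}
    return coded

# Example: the entry matching the coding result shown above.
raw = ('[{"id":"ytc_Ugwa0_-Wu5d_T7l6jNp4AaABAg",'
       '"responsibility":"developer","reasoning":"consequentialist",'
       '"policy":"ban","emotion":"fear"}]')
codes = parse_batch(raw)
print(codes["ytc_Ugwa0_-Wu5d_T7l6jNp4AaABAg"]["policy"])  # ban
```

Validating eagerly like this surfaces any off-codebook label the model invents at ingest time, rather than letting it silently skew downstream tallies.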