Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples:
- "OMG! Can someone answer this ONE Question. If I use buy commercial rights from a…" (ytc_Ugwaj0KjS…)
- "I wonder what insults the A.I. would come up with if you just gave it xbox live …" (ytr_UgwkbOXUI…)
- "Let's assume that what some claim is true—that most of the jobs currently perfor…" (ytc_UgxpIrXnQ…)
- "As much as I don't think we should be doing any of this for the long run, we're …" (ytc_UgzHhmeNL…)
- "On the flipside, because we know that they are ai-generated, don't they lose all…" (ytc_UgyEh2Jwn…)
- "Calling urself an artist while using ai is like calling urself a chef while only…" (ytc_Ugyo3NDJQ…)
- "Fascinating. Everyone should watch this. I’m not at all a materialist like Hinto…" (ytc_UgxFzJofy…)
- "Thanks for educating us on AI. Safe words are low tech, just like The Club, for…" (ytc_UgywbXgNV…)
Comment
Neil’s Optimism Feels a Bit Outdated
Neil deGrasse Tyson is brilliant and always fun to listen to—but his take that “AI is just another tool like the car or computer” feels dangerously optimistic.
Yes, past tech revolutions replaced jobs and created new ones. But AI isn’t just automation—it’s cognition at scale. And while technology is evolving exponentially, humans aren’t. Most of us aren’t becoming exponentially more creative, imaginative, or adaptive.
Telling every ordinary person to “just be more creative” to survive this shift sounds like asking everyone to be Olympic athletes just to keep their job.
The danger isn’t that AI will replace everyone—it’s that it will replace enough to reshape the fabric of society, while we keep telling ourselves it’s all going to be fine.
youtube · AI Moral Status · 2025-08-02T19:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_Ugzx3cehIRJTdobB30V4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgxvQqgYpSTcjUxNwdB4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugzr_4Mx4l_orioo4Sx4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugxuu1ybIY83-94zH5h4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugwrs60cRfTBY18oqqd4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugz414nl5nZ1-nel3S94AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_UgzKY4a2mku8wUtcg9F4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_Ugxj9fL6RR8qSAjNOHF4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgwKSbolSkvwsFbIxh54AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgzGjooVHvMiVBhXAMB4AaABAg","responsibility":"none","reasoning":"deontological","policy":"liability","emotion":"indifference"}
]
```
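The raw response is a JSON array with one record per comment, keyed by comment ID. A minimal sketch of how such a response might be parsed and validated before the codes are written to the result table (hypothetical code, not the tool's actual implementation; the allowed values per dimension are inferred only from the rows visible above, and the real codebook may define more categories):

```python
import json

# Allowed values per coding dimension -- inferred from the sample rows
# shown above (hypothetical; the real codebook may be larger).
ALLOWED = {
    "responsibility": {"none", "company"},
    "reasoning": {"unclear", "consequentialist", "deontological"},
    "policy": {"none", "unclear", "regulate", "liability"},
    "emotion": {"approval", "fear", "indifference", "mixed", "outrage"},
}

def parse_coded_rows(raw: str) -> list[dict]:
    """Parse a raw LLM response, keeping only rows that carry an id
    and a valid value on every coding dimension."""
    valid = []
    for row in json.loads(raw):
        if not isinstance(row, dict) or "id" not in row:
            continue  # malformed row: skip rather than crash
        if all(row.get(dim) in vals for dim, vals in ALLOWED.items()):
            valid.append(row)
    return valid

# Demo with made-up IDs: the second row uses an unknown value and is dropped.
demo = ('[{"id":"ytc_demo1","responsibility":"none","reasoning":"unclear",'
        '"policy":"none","emotion":"approval"},'
        '{"id":"ytc_demo2","responsibility":"martian"}]')
print(len(parse_coded_rows(demo)))  # 1
```

Validating against a fixed vocabulary like this catches the common failure mode of LLM coders: responses that are syntactically valid JSON but drift outside the codebook's categories.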