Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect

- The best thing about "AI artists" is that they've become obsolete BEFORE real ar… (ytc_UgxXMA6TJ…)
- Then they say it's their art. No buddy maybe the 5 words you typed out are, but … (ytc_Ugzb2ThMB…)
- This raises a question I’ve been wrestling with: is there ever such a thing as ‘… (ytc_UgyJDZwx0…)
- Risk management in AI and technology is indeed complex, especially when politica… (ytr_UgyepHd5d…)
- The only thing ill buy from amazon is a robot so he can do my ebay shopping… (ytc_UgzrA4nGC…)
- It’s “supervised self drive” you sign off that you will stay alert and make corr… (ytc_UgwuQRD2k…)
- i thought it was going to be way more climactic than not having enough doors… (rdc_d3s0j8s)
- The first thing you should understand about AI is that there is no awareness or … (ytc_Ugzyub3C3…)
Comment
I'm sorry, I have a lot of respect for Dr. Tyson, but he has fundamentally not understood why people are building AGI. It is NOT to do all the mundane stuff in your life (although it will be able to do that). The fundamental argument is that creating an AI which is better than us at creating AI will create a feedback loop ending in something that is significantly smarter than humans at all tasks. The data on the Internet is fundamentally not an obstacle when we are now training AI using different methods (especially the reasoning models). Once we have something this powerful, we have "a country of geniuses in a datacenter", smarter than Einstein at physics, smarter than bismarck at politics. The small-mindedness of the answers is mind blowing.
Source: youtube · AI Moral Status · 2025-07-24T14:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | mixed |
| Coded at | 2026-04-26T23:09:12.988011 |
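A coded record like the one above can be sanity-checked against the category values each dimension is allowed to take. This is a minimal sketch of such a validator; the value sets below are only those observed in the sample output on this page, not the project's full codebook:

```python
# Hypothetical validator for one coded record. The allowed values are
# ASSUMED from the sample response on this page, not a full codebook.
OBSERVED_VALUES = {
    "responsibility": {"none", "ai_itself", "user", "developer", "distributed", "company"},
    "reasoning": {"unclear", "deontological", "consequentialist", "contractualist"},
    "policy": {"none", "regulate", "liability"},
    "emotion": {"indifference", "disapproval", "approval", "mixed", "outrage", "resignation"},
}

def validate_record(record: dict) -> list:
    """Return (dimension, value) pairs that fall outside the observed value sets."""
    return [
        (dim, record.get(dim))
        for dim, allowed in OBSERVED_VALUES.items()
        if record.get(dim) not in allowed
    ]

# The record coded above passes cleanly:
print(validate_record({"responsibility": "developer", "reasoning": "consequentialist",
                       "policy": "none", "emotion": "mixed"}))  # []
```

A check like this is useful because LLM coders occasionally emit values outside the requested categories, and silent additions would corrupt downstream tallies.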
Raw LLM Response
[
{"id":"ytc_UgzQPGaWw2oLblu_K494AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzRJ8oGF2CzjRvCL354AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"disapproval"},
{"id":"ytc_UgzKrc3_MLdyyhT8hr94AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgxUCPhWeAt1zGBJVz54AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_UgweOEIWfipmM-CXKql4AaABAg","responsibility":"distributed","reasoning":"contractualist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugzl9IkPR9fV79xkSTh4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgxYicX8_vFODHzoD614AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_UgyWjAIrzOVVmRWx1Qh4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_Ugw0VcjXjnPGnU1S9sh4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgwaMwWH2JAEoz1oCbp4AaABAg","responsibility":"company","reasoning":"contractualist","policy":"regulate","emotion":"mixed"}
]
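The raw response above is a JSON array with one object per coded comment, so supporting the "look up by comment ID" feature is a matter of parsing it and indexing by `id`. A minimal sketch (the two abridged records are copied from the response above):

```python
import json

# Two records abridged from the raw LLM response shown above.
raw_response = '''
[
  {"id": "ytc_UgxUCPhWeAt1zGBJVz54AaABAg", "responsibility": "developer",
   "reasoning": "consequentialist", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_UgweOEIWfipmM-CXKql4AaABAg", "responsibility": "distributed",
   "reasoning": "contractualist", "policy": "regulate", "emotion": "outrage"}
]
'''

def index_by_comment_id(response_text: str) -> dict:
    """Parse a raw LLM coding response and index its records by comment ID."""
    records = json.loads(response_text)
    return {record["id"]: record for record in records}

codes = index_by_comment_id(raw_response)
print(codes["ytc_UgxUCPhWeAt1zGBJVz54AaABAg"]["responsibility"])  # developer
```

Because each record carries its own `id`, responses from many coding batches can be merged into one lookup table without tracking batch boundaries.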