Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up a comment by its ID, or browse the random samples below:

- "These layoffs can buy Tesla AI robots to earn a living for them now, isn't it th…" (ytc_UgxQy6801…)
- "I should start using nightshade on my stuff! (or uploading crappy images into th…" (ytc_Ugydoyubs…)
- "We do not know what contiousness even it. Maybe ChatGPT is contious, maybe it is…" (ytr_Ugx-NEbnu…)
- "I always argue with those who think AI is sentient. But don't forget that the se…" (ytc_Ugzj5zbvT…)
- "As a Software Engineer/Ai Developer he's absolutely right, If the average person…" (ytc_UgxqOXK5J…)
- "Giving the fact that a lot of a.i safety researchers are afraid and quitting, an…" (ytc_Ugz9J8rw2…)
- "These big companies don't care for the people they want the money so of course a…" (ytc_UgxpkfURy…)
- "William Shatner well more informed about AI, he is educated. Mandel show his ign…" (ytc_Ugy9QcCSV…)
Comment
> What is utterly mind boggling is a *majority,* of AI experts, even many of the optimistic ones, concede that AGI/ASI poses at least a 5-10% chance of existential risk.
>
> Now imagine if you were getting on a flight tom & the pilot, aircraft engineers & staff told you there was a 1/4 of 1% chance both of the plane's engines would fail. How many of you would board that flight? Yet when it comes to AI, we are at the gate & can't wait to get board🤦
Source: youtube · Video: AI Moral Status · 2025-10-31T01:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
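Each coded comment is reduced to one value per dimension, so a record can be sanity-checked before use. Below is a minimal validation sketch; the allowed value sets are only the ones observed in this section's sample, not necessarily the full codebook.

```python
# Minimal validation sketch. The value sets are only those observed in this
# section's sample output; the real codebook may define more categories.
ALLOWED = {
    "responsibility": {"none", "company", "ai_itself", "user", "distributed"},
    "reasoning": {"unclear", "consequentialist"},
    "policy": {"unclear", "none", "regulate", "liability"},
    "emotion": {"indifference", "outrage", "mixed", "fear", "approval"},
}

def validate(record: dict) -> list[str]:
    """Return a list of coding problems for one record (empty if clean)."""
    problems = []
    for dim, allowed in ALLOWED.items():
        value = record.get(dim)
        if value not in allowed:
            problems.append(f"{dim}={value!r} is not a known {dim} code")
    return problems
```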
Raw LLM Response
[
{"id":"ytc_Ugys3zvCSrVUTpoR0YJ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgxNC1ldH5Q90GVT_Xh4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgwBq06t9TgsapRGa4x4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgwWA4uKCk7hDjqY6AN4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_UgzPlj5fFqHc5BHFZdR4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgxEZhvJjgO0UamHWLR4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgylstjrWecd55poqEl4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugz-Pp7yEfvy42tViC14AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_UgzzK5jDY2y4Z3PDnnx4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugy0P68mIQEyP_eMyb14AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"outrage"}
]
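The raw response is a JSON array with one object per coded comment, so the look-up-by-ID view above can be reproduced by indexing the array on `id`. A minimal sketch, assuming the array has been saved to a file (`raw_llm_response.json` is a hypothetical name):

```python
import json

# Load the raw LLM response shown above; the filename is a hypothetical
# stand-in for wherever the batch output is stored.
with open("raw_llm_response.json") as f:
    records = json.load(f)

# Index by comment ID so any coded comment can be inspected directly.
by_id = {rec["id"]: rec for rec in records}

rec = by_id["ytc_UgzPlj5fFqHc5BHFZdR4AaABAg"]
print(rec["responsibility"], rec["reasoning"], rec["policy"], rec["emotion"])
# -> distributed consequentialist regulate fear
```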