Raw LLM Responses
Inspect the exact model output for any coded comment, either by looking it up by comment ID or by browsing random samples.
Random samples

- rdc_ohn9gj6: "Many have already said that 2030 is impossible to predict. Consider the state o…"
- ytc_UgzlN4Git…: "My brother is an ai nerd. An ai bro. He has converted my mother. I’m losing the …"
- ytr_Ugy32dCel…: "In a sense, AI art is just rolling a lottery over and over again for content you…"
- ytr_UgwKK2oOA…: "Real artist's \"theft\" is like seeing a cool house, being like \"hell yeah\" and bu…"
- ytr_UgzqiDCGf…: "I’m glad he’s shining light on this subject because we have no one monitoring AI…"
- rdc_egq3d28: "It’s a great platform to stay connected with people I’ve met in life, whether th…"
- ytc_UgznEudkk…: "Real artists can use digital tools, because they still have cultivated the skill…"
- ytc_UgydTerpI…: "I’ve noticed that AI-generated videos don’t do well with written words. The word…"
Comment

> People give way to much credit to these people. A.I is just another tech hype bubble about to burst, crash & burn like so many before it. AGI is a technological pipe dream, we simply don't have the resources and energy capacity necessary to create what is considered true AGI (artificial general intelligence). LLMs are not even real A.I, they are not intelligent let alone self aware or conscious, they're not even true A.I, let alone A.G.I, they're just a shinnier, fancier version of a traditional chatbot. They can help and assist us by condensing information but they will never be the solution to all of humanity's problems or an existencial threat as many seem to believe.

Source: youtube · AI Moral Status · 2026-02-05T20:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
[
{"id":"ytc_Ugx9ZncBH6qIILk0SqB4AaABAg","responsibility":"none","reasoning":"mixed","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgwvBacaHOwnr-xvJJx4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgyqPdOqGpS0ehuHHQB4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzdPtccwq4ZG5KeXmB4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgyLqPXqM27PxnWTIhJ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_UgzuXk2OwGa2ji3s0U14AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
{"id":"ytc_Ugwuxvdcwt3x7Z_C0th4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgzmnbEJYWMLNrhk5D54AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgxGekNxCJJ-UOrg4Cp4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgxsVO_nVag33YlCDxZ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"fear"}
]