Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples:
- `ytc_UgxihtZr_…` — "i hate how these ai companies all make (or try to make) their ai sound human its…"
- `ytc_UgyrkXgbA…` — "Hi everyone, exactly a week ago, we conducted a quiz contest in this video. The …"
- `ytr_Ugz16bWhO…` — "Wikipedia has to be fact checked by multiple people and include a source - Ai ta…"
- `ytc_UgxIYtMKg…` — "This is why AI navigation systems need to use lidar. It even makes navigation in…"
- `ytc_Ugyqr5UNW…` — "Really great explanation of all the problems with AI, I especially liked the way…"
- `ytc_UgyRGLiU1…` — "ia dudes just mad cuz they got no drawing skill (not saying i got some but atlea…"
- `ytc_UgzV57bl3…` — "The thing I hatethe most about IA art, is that rule 34 is flooded with it, how i…"
- `ytc_UgxYzO7ex…` — "I don't understand why YouTube isn't being held liable for hosting advertisement…"
Comment
Robots will never be conscious for fucks sake. They will only be able to make us believe that they are.
They are not organic in any way, they are just code, however you look at it. They do not experience time/life/growth/death, hell they do not EXPERIENCE anything. They learn through symbols/syntax which is completely different to our cellular brain. Our AI is still weak even though apparently strong AI has been around the corner for decades... Most importantly we have no idea what consciousness means... after all it's just a word we use to describe our subjective biological experiences.
I'm getting mad now... how can people be so stupid? THIS IS NOT HOW ANY OF THIS WORKS. MACHINES ARE MACHINES.
Source: youtube · Video: AI Moral Status · Posted: 2017-02-24T00:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | mixed |
| Policy | none |
| Emotion | outrage |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
{"id":"ytc_UgguZkakQ-aSIHgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgiVYzcqK51muHgCoAEC","responsibility":"none","reasoning":"deontological","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgjAwYV7bNsWrngCoAEC","responsibility":"developer","reasoning":"mixed","policy":"regulate","emotion":"mixed"},
{"id":"ytc_UgiM2ON3rg2HuXgCoAEC","responsibility":"none","reasoning":"deontological","policy":"liability","emotion":"approval"},
{"id":"ytc_UgjoNG0qXemj1HgCoAEC","responsibility":"none","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UggnlcXdFfJjMngCoAEC","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"outrage"},
{"id":"ytc_UggQEebRU0W_WngCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_UgjFW1C80PQDVngCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgijAzfQFlDQrngCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_Ugj3wix4Hs8P1ngCoAEC","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"resignation"}
]
```
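Looking up one comment's coding in a batch response like the one above can be sketched as follows. This is a minimal illustration, not the tool's actual implementation: `lookup_coding` is a hypothetical helper, and the `ALLOWED` value sets are inferred only from the samples shown here — the real codebook may define additional categories.

```python
import json

# Allowed values per coding dimension, inferred from the sample
# response above (assumption: the real codebook may include more).
ALLOWED = {
    "responsibility": {"none", "developer", "ai_itself"},
    "reasoning": {"consequentialist", "deontological", "mixed"},
    "policy": {"none", "unclear", "regulate", "liability", "ban"},
    "emotion": {"fear", "mixed", "approval", "outrage", "resignation"},
}

def lookup_coding(raw_response: str, comment_id: str) -> dict:
    """Parse a raw batch response (a JSON array of coding rows)
    and return the row for one comment ID, validating its values."""
    rows = json.loads(raw_response)
    by_id = {row["id"]: row for row in rows}
    row = by_id[comment_id]  # KeyError if the model skipped this comment
    # Flag values outside the known codebook instead of failing silently.
    for dim, allowed in ALLOWED.items():
        if row.get(dim) not in allowed:
            raise ValueError(f"unexpected {dim!r} value: {row.get(dim)!r}")
    return row

raw = ('[{"id":"ytc_x","responsibility":"none",'
       '"reasoning":"mixed","policy":"none","emotion":"outrage"}]')
print(lookup_coding(raw, "ytc_x")["emotion"])  # → outrage
```

Building the `by_id` index once makes repeated lookups cheap, and the validation step catches the common failure mode of LLM coders drifting outside the codebook's vocabulary.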