Raw LLM Responses
Inspect the exact model output for any coded comment, or look up a comment directly by its ID.
Random samples — click to inspect:

- The jobs for creatives are/will be harmed, with or without licensing. Licensing … (ytc_UgxH29lo0…)
- The moderator posed the debate question as "will AI make work obsolete", but it… (ytc_UgxdaAYUT…)
- I'll be back on 2050 on this video, just to prove that it was just speculation a… (ytc_UgwMwziLO…)
- I'd like to see AI be able to carefully move furniture around near impossible co… (ytc_Ugw5Jdeu-…)
- @nintendoboy3605 from nightshade's website: 'Glaze/Nightshade actually change th… (ytr_Ugz4azyBV…)
- I’ve been loaded at warehouses that were 99% automated. Soon no need for humans… (ytc_UgxuunhtJ…)
- LMAOOOOO chapgpt is literally the best ai out there it has such personality and … (ytc_UgxMnp7hX…)
- anything that can connect to computers that create there own languages through w… (ytc_UgydtAacg…)
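The "look up by comment ID" function above can be sketched as a scan over an export of coded comments. The JSONL file layout and the `lookup_comment` helper are assumptions for illustration; the real tool may back this with a database.

```python
import json
from typing import Optional


def lookup_comment(path: str, comment_id: str) -> Optional[dict]:
    """Scan a JSONL export (one JSON record per line) for the record
    whose "id" field matches comment_id. Returns None if absent.

    The JSONL storage format is an assumption, not the tool's actual backend.
    """
    with open(path, encoding="utf-8") as fh:
        for line in fh:
            line = line.strip()
            if not line:
                continue
            record = json.loads(line)
            if record.get("id") == comment_id:
                return record
    return None
```

A linear scan is fine for a few hundred thousand comments; for interactive lookups at larger scale, an index keyed on the ID would be the natural next step.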
Comment

> There is zero awareness here. All pre-programmed responses. Might be capable of learning to track your movements better, but that's about it. True AI is about 30 years away, if you believe that equation that has predicts computer development. It's been dead-on so far. In the year 2045, a computer will be smarter than all the people on the planet combined, might be self-aware by then. Of course, organic machines might have been developed to replaced all the solid state computers by then, true thinking machines. We are already using rat brains to do control system exercises, so it's not completely crazy to think true AI will be real in just 30 years, and available to you and me.

youtube · AI Moral Status · 2016-03-22T00:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | unclear |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
  {"id":"ytc_UgjGYthhNzXdsXgCoAEC","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_Ugh8KcBTUpbuIHgCoAEC","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"fear"},
  {"id":"ytc_Uggg6DLAli2vq3gCoAEC","responsibility":"developer","reasoning":"unclear","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_UggK_3bqvphwRHgCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_Uggdmw0dAtD193gCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgiEXqe5j2J3WHgCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_Uggn0nmsWPedrHgCoAEC","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UghIqZHRNlJ41ngCoAEC","responsibility":"user","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"},
  {"id":"ytc_UgjnbkiM3AqZ4ngCoAEC","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgjdH3W8cDdIvngCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"}
]
```
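The raw response is a JSON array of per-comment codes across four dimensions. A minimal sketch of parsing and validating such a batch follows; the allowed value sets are only those observed in the sample above, and the full codebook may define more (an assumption), as is the `parse_batch` name.

```python
import json

# Values observed in the sample batch above; the real codebook
# may include additional categories (assumption).
ALLOWED = {
    "responsibility": {"none", "developer", "user", "ai_itself"},
    "reasoning": {"unclear", "deontological", "consequentialist"},
    "policy": {"none", "unclear", "liability", "industry_self"},
    "emotion": {"indifference", "fear", "outrage", "approval", "mixed"},
}


def parse_batch(raw: str) -> dict:
    """Parse a raw batch response into {comment_id: codes},
    rejecting any value outside the known codebook."""
    coded = {}
    for record in json.loads(raw):
        cid = record["id"]
        codes = {dim: record[dim] for dim in ALLOWED}
        for dim, value in codes.items():
            if value not in ALLOWED[dim]:
                raise ValueError(f"{cid}: unexpected {dim}={value!r}")
        coded[cid] = codes
    return coded


sample = '[{"id":"ytc_x","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"}]'
print(parse_batch(sample)["ytc_x"]["emotion"])  # approval
```

Validating against a closed vocabulary at ingest time catches the common failure mode of LLM coders inventing labels outside the codebook, before bad values reach the aggregate tables.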