Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
So the AI Agent is for free .... ha ha how much will one cost oyher than a cost …
ytc_UgxVo-BZ0…
What's the other point of view? In fact, an AI lawyer can do a much better job t…
ytc_UgzWvEs78…
Has anyone noticed the blonde lady behind that resembles the Robot Female they'r…
ytc_UgxerzQCX…
They shouldn't use AI at all! Sure, not all AI is bad but still!! Just why man?…
ytc_UgzAcrGLY…
If one observe one person with higher intelligence vs one with deficit of intell…
ytc_UgwjtmZxd…
Watching these a.I robot videos these things kno how to fight an what ive seen i…
ytc_Ugz3nKcJw…
Instead of worshipping the creator, they worship the creature. Same thing when…
ytc_UgzaIoYaI…
What do you think of President Joe Biden running for re election? "Better watch …
ytc_UgzlgOBa6…
Comment
There is an easy way to avoid this question forever. We don't install anything more than the most basic AI in robots that are meant for menial tasks and if AI is necessary then this task can easily be given to an AI that has rights and is powerful enough for supervising and directing a bunch of bots to be an easy task for it. And we can be nice to the few AIs there are while having all of the economic advantages of having automatons to do all the dirty, difficult, dangerous and deary jobs.
| Field | Value |
|---|---|
| Platform | youtube |
| Topic | AI Moral Status |
| Posted at | 2017-02-24T11:4… |
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | industry_self |
| Emotion | approval |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
[
{"id":"ytc_UgjyarnsMmnkGngCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UghTXyshqik943gCoAEC","responsibility":"user","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_UghQsKYsd-Ki_3gCoAEC","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"unclear"},
{"id":"ytc_UgggwBPrVX7wAHgCoAEC","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UggE6SPzi0kvdngCoAEC","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UggXexT-TTeXzXgCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_Ugjl5NS5pTmJcXgCoAEC","responsibility":"developer","reasoning":"contractualist","policy":"liability","emotion":"indifference"},
{"id":"ytc_Ugjc_-iQJM-_LHgCoAEC","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugg75IgfCGwkrXgCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"},
{"id":"ytc_UggJXPMrGWhAjHgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"}
]
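A batch response like the one above can be checked mechanically before the codings are stored. The following is a minimal sketch, assuming the coding dimensions and labels visible in this dump (`responsibility`, `reasoning`, `policy`, `emotion` and their values); the allowed-value sets are inferred from the records shown here and may be incomplete, and `parse_codings` is a hypothetical helper, not part of any existing pipeline.

```python
import json

# Allowed labels per dimension, inferred from the coding table and the
# raw LLM response shown above (assumption: this list may be incomplete).
ALLOWED = {
    "responsibility": {"developer", "user", "government", "none", "unclear"},
    "reasoning": {"consequentialist", "deontological", "contractualist",
                  "mixed", "unclear"},
    "policy": {"ban", "regulate", "liability", "industry_self", "none",
               "unclear"},
    "emotion": {"approval", "outrage", "fear", "indifference",
                "resignation", "mixed", "unclear"},
}

def parse_codings(raw: str) -> list[dict]:
    """Parse a raw JSON array of codings, keeping only rows whose
    values fall inside the allowed label sets for every dimension."""
    rows = json.loads(raw)
    valid = []
    for row in rows:
        if all(row.get(dim) in vals for dim, vals in ALLOWED.items()):
            valid.append(row)
    return valid

# Example with a well-formed row (the id is a placeholder, not a real one).
raw = ('[{"id":"ytc_example","responsibility":"developer",'
       '"reasoning":"consequentialist","policy":"industry_self",'
       '"emotion":"approval"}]')
print(len(parse_codings(raw)))  # 1
```

Rows that fail validation can then be re-queued for recoding rather than silently written with an out-of-schema label.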