Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
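A lookup like this needs nothing more than the coded records keyed by comment ID. As a minimal sketch, assuming the batch outputs are stored one JSON object per line in a hypothetical coded_comments.jsonl file (the file name and storage layout are illustrative, not the tool's actual backend):

```python
import json
from pathlib import Path

def lookup_comment(comment_id: str, store: Path = Path("coded_comments.jsonl")) -> dict | None:
    """Return the coded record for one comment ID, or None if it is absent."""
    with store.open(encoding="utf-8") as fh:
        for line in fh:
            record = json.loads(line)
            if record.get("id") == comment_id:
                return record
    return None

# Example, using a full ID from the raw response shown further down:
# lookup_comment("ytc_UghU6immMZEHlXgCoAEC")
```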
Random samples
- AI took sales jobs a long time ago.. when’s the last time someone had to walk up… (`ytc_UgxNJpuxG…`)
- At the end of the day the only thing that matters is the end product. It doesn't… (`ytc_UgxWgMYJC…`)
- Everyone, no matter who, no matter nothing needs to stand together in all the ma… (`ytc_Ugzdoq4oh…`)
- I then asked GPTchat this "Is it possible that a copyrighted image would be used… (`ytr_UgykWJSor…`)
- Also another senators with GPT. it doesn't strike A war with another country or … (`ytc_UgzCy8Csq…`)
- How long ai does the coding as per our needs? Need this tech asap as it will hel… (`ytc_Ugx5DiDfe…`)
- Protection of our data needs to be a right, and we should be compensated for it… (`rdc_fejxx63`)
- Pretty cynical take if you ask me. Nonetheless, loved the video and all your dee… (`ytc_UgyNHKcXN…`)
Comment
I don't remember if it was in regard to racial rights or animal rights but I heard something a while back I like that applies, that once a being can ask for rights it deserves them. Understanding and wanting rights would be a pretty good indication of self-awareness and cognitive abilities. Unfortunately being able to ask requires they can communicate with us, a major hurdle for animals. And yeah they'll probably want a different set of rights than us.
What would be a bigger problem than computers reaching a level of complexity that they develop consciousness is if (like many times in our past) we learn to recognise the consciousness they've had all along. If it's new we can just use older models for things like automation and factory work and it won't have as big of an effect as if everything turns out to be conscious.
Source: youtube · Topic: AI Moral Status · Posted: 2017-02-24T00:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | unclear |
| Reasoning | deontological |
| Policy | unclear |
| Emotion | mixed |
| Coded at | 2026-04-27T06:26:44.938723 |
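Each coded record pairs a comment ID with the four coding dimensions and a timestamp, so a result like the one above maps naturally onto a small typed structure. A minimal sketch (the class itself is illustrative, not taken from the pipeline):

```python
from dataclasses import dataclass

@dataclass
class CodingResult:
    """Labels assigned to one comment across the four coding dimensions."""
    id: str              # e.g. "ytc_UghU6immMZEHlXgCoAEC"
    responsibility: str  # e.g. "unclear", "developer", "government"
    reasoning: str       # e.g. "deontological", "consequentialist"
    policy: str          # e.g. "none", "regulate", "ban"
    emotion: str         # e.g. "mixed", "approval", "outrage"
    coded_at: str        # ISO 8601 timestamp of the coding run
```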
Raw LLM Response
[
{"id":"ytc_UghBsdvkqrytYXgCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_UgiuleNNrJVRZHgCoAEC","responsibility":"unclear","reasoning":"contractualist","policy":"unclear","emotion":"approval"},
{"id":"ytc_Uggd38vfndHWt3gCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugg5R38fstOz_3gCoAEC","responsibility":"creator","reasoning":"deontological","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UghU6immMZEHlXgCoAEC","responsibility":"unclear","reasoning":"deontological","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Uggd8NAdlsfsRHgCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"outrage"},
{"id":"ytc_UgjQcetBhk6wU3gCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UggXZRI8LEbBcngCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugj8JCZ6OH21Y3gCoAEC","responsibility":"unclear","reasoning":"deontological","policy":"regulate","emotion":"approval"},
{"id":"ytc_UghZ2hVEk12VdngCoAEC","responsibility":"government","reasoning":"deontological","policy":"ban","emotion":"outrage"}
]
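Since the model returns one JSON object per comment, validating a batch reduces to checking each record against the allowed labels for the four dimensions. A minimal sketch (the label sets below are inferred from the values visible on this page and may be incomplete):

```python
import json

# Label sets inferred from values seen in responses on this page;
# the real codebook may define more categories.
ALLOWED = {
    "responsibility": {"ai_itself", "creator", "developer", "government", "none", "unclear"},
    "reasoning": {"consequentialist", "deontological", "contractualist", "unclear"},
    "policy": {"ban", "regulate", "none", "unclear"},
    "emotion": {"approval", "fear", "indifference", "mixed", "outrage"},
}

def validate_batch(raw_response: str) -> list[dict]:
    """Parse one raw LLM response and reject records with unknown labels."""
    records = json.loads(raw_response)
    for rec in records:
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{rec.get('id')}: unexpected {dim} value {rec.get(dim)!r}")
    return records
```

A record that fails the check could be re-queued for recoding or flagged for manual review rather than silently coerced to "unclear".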