Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment
Can someone please answer my question, is Sophia technically in the cloud as well? Would she be considered a part of the internet or is the internet? Wonder if they would talk about ever having rights, if so would robots with money be reasonable to help people or would they use the power and have a thirst desire for more and be a narcissist having no compassion for others. Some humans have this before, ego aside robots can help us more with everything even staying comfortable with each other and pick topics for us to learn and expand and understand what exactly we need for the best possible experience we are looking for in our lives.
| Field | Value |
|---|---|
| Platform | youtube |
| Video | AI Moral Status |
| Posted | 2020-07-11T08:2… |
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | mixed |
| Policy | liability |
| Emotion | mixed |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
{"id":"ytc_UgxcmcQ7NC6tTawxFSB4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugy2pn3Ow0nGWen_hqp4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgyaBmvTaJlAsqMjqnR4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgwT012w2H0ZKqST9sV4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"liability","emotion":"mixed"},
{"id":"ytc_UgyCmOgYR80MN2uOPWt4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwZrC4F4vxpMB0o7-B4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugw2igWhyhy0-pCOOWN4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"approval"},
{"id":"ytc_Ugw0oOfFIYmSJLF-pGF4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgzKAkwfDHOOCnnzrlt4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_UgxYoSUfqUa_JW0GoVp4AaABAg","responsibility":"developer","reasoning":"mixed","policy":"none","emotion":"indifference"}
]
```