Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples

- "@alcuttr no dude, you don’t understand the basics of being a human! Israhell sho…" (`ytr_UgyGOzMKE…`)
- "You might enjoy sunsets but how do you know other people do? Some people think a…" (`ytr_UgzGLyqnl…`)
- "Man give her some arms and legs and I'll take her home to replace my current lov…" (`ytc_UgzvjiRsv…`)
- "Hmmmm! Ms McCoy, you talk about Management Choice in regards to how AI will be …" (`ytc_UgwTJddY_…`)
- "They should just let the people have one robot clone that does their job for the…" (`ytc_UgxmlOAZs…`)
- "This channel is going to be such a clown when AI is responsible for helping adva…" (`ytc_UgxckXe-V…`)
- "That's true, and I agree. That piece of hardware's programmed logic is proprieta…" (`ytr_UgyvHnGic…`)
- "They are lying. Anything that AI can be programmed to detect they can recognize …" (`ytc_Ugw9oTsU7…`)
Comment
The robot Sophia is same thing as our laptop computer. It cannot do anything that was not programmed into the software. A human is capable of what God allowed it to do, a robot is only capable of the software its human creator installed in it. A robot can have software to act and say it has developed emotion, but can never have emotion any more than a smiley on your desktop of your PC. Cars have never become aware and taken over because they were never built to do things that would injure or kill the driver or others, but if some human placed a timer on the accelerator to make the car accelerate but trigger also the braking system it would constantly cause death and injury, but only because some human set this up in its electronics
youtube · AI Moral Status · 2024-04-26T12:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | deontological |
| Policy | unclear |
| Emotion | indifference |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
  {"id": "ytc_UgwcusJV54vditB8MoJ4AaABAg", "responsibility": "ai_itself", "reasoning": "unclear", "policy": "unclear", "emotion": "mixed"},
  {"id": "ytc_UgyPB-jLQbsppFt6yuN4AaABAg", "responsibility": "developer", "reasoning": "deontological", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_Ugwa5rcmiCKlWYytzJ14AaABAg", "responsibility": "developer", "reasoning": "deontological", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_UgzPR76kB6Gy1x3_nVF4AaABAg", "responsibility": "distributed", "reasoning": "contractualist", "policy": "unclear", "emotion": "resignation"},
  {"id": "ytc_UgxSec0Kpi4zol0krEV4AaABAg", "responsibility": "ai_itself", "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_UgzZQrVs4XwJApCGYNJ4AaABAg", "responsibility": "ai_itself", "reasoning": "deontological", "policy": "liability", "emotion": "fear"},
  {"id": "ytc_UgwXJTlrQtfNS4zePxN4AaABAg", "responsibility": "developer", "reasoning": "deontological", "policy": "regulate", "emotion": "approval"},
  {"id": "ytc_UgxUBeFTf7IZBuMWcMx4AaABAg", "responsibility": "ai_itself", "reasoning": "unclear", "policy": "unclear", "emotion": "outrage"},
  {"id": "ytc_UgyseRG3qrtONhqkeCd4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_UgyBz5P0yFQ_8AHx1zB4AaABAg", "responsibility": "unclear", "reasoning": "unclear", "policy": "unclear", "emotion": "approval"}
]
```
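A batch like the one above is only usable downstream if every record carries a recognized label on every dimension. The sketch below parses a raw response and flags out-of-vocabulary labels; the allowed-value sets are assumptions inferred solely from the labels visible in this sample, not the project's full codebook.

```python
import json

# Label sets per dimension, inferred only from values observed in the
# sample response above; the actual codebook may define more labels.
OBSERVED = {
    "responsibility": {"developer", "ai_itself", "distributed", "none", "unclear"},
    "reasoning": {"deontological", "contractualist", "unclear"},
    "policy": {"unclear", "liability", "regulate", "none"},
    "emotion": {"mixed", "indifference", "resignation", "fear", "approval", "outrage"},
}

# A one-record excerpt of the raw LLM response shown above.
raw = '''[
  {"id": "ytc_UgwcusJV54vditB8MoJ4AaABAg", "responsibility": "ai_itself",
   "reasoning": "unclear", "policy": "unclear", "emotion": "mixed"}
]'''

def validate(records):
    """Return (record_id, dimension, value) for every label outside the observed sets."""
    problems = []
    for rec in records:
        for dim, allowed in OBSERVED.items():
            if rec.get(dim) not in allowed:
                problems.append((rec.get("id"), dim, rec.get(dim)))
    return problems

records = json.loads(raw)
print(validate(records))  # prints [] because every label is in the observed sets
```

Records that fail this check (for example, an LLM hallucinating a label like `"robot"`) can then be re-queued for coding rather than silently polluting the coded dataset.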