Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- "You thought ..you would hit the robot and what !? It would malfunction....tah ta…" (ytc_UgwQxeP-p…)
- "I like to think sometimes about PewDiePie’s 1 year drawing vid, how he started o…" (ytc_UgwrT2g0g…)
- "This isn't man creating technology but the fallen ones giving the technology to …" (ytc_UgzCOAWTa…)
- "Intelligent people creating stoopeed matalians,, just because they want somethi…" (ytc_UgiuinKXN…)
- "Disabled artist here, I didn’t even know gen ai supporters were using us as a fu…" (ytc_UgzwJrDtf…)
- "I just watched a driverless taxi bypass flagged for a construction site and drov…" (ytc_UgxeIK_tJ…)
- "AI is going to have a sense of humor. It will laugh its ass off when it kills al…" (ytc_UgxMuQxvG…)
- "That’s why AI is really just to solve minutiae. People getting tricked by AI it’…" (ytc_UgxArXtcc…)
Comment
Robots shouldn’t have rights cause you can’t make them have real emotions you might be able to make them say ow or something if they step on a lego or cry but you can’t not make something out of metal then give it emotionals and when you showed the pic of the slave robots it was different cause they ain’t have emotions the Africans and Hebrews did tho and if you program a robot to feel emotions it won’t really feel emotions it is just the way it was programmed
Btw I’m LATE
Source: youtube
Video: AI Moral Status
Posted: 2018-03-14T04:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | deontological |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
{"id":"ytc_UgyramFbmtiFDcJAxf14AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugy88NA-Om8hqQgzB3R4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyRz2EIcMDddYq41fV4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgzunlRZFgU-88ttIxV4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"approval"},
{"id":"ytc_Ugyp5r4jQBKD_rgDaY54AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyaSiSdW5ZFjpHi1RJ4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyDAtM8adaVsvjl5CZ4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgxEaKjTvNIXxaGNCs94AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"resignation"},
{"id":"ytc_UgxpHMsTwHSW15Xc5ad4AaABAg","responsibility":"none","reasoning":"virtue","policy":"regulate","emotion":"approval"},
{"id":"ytc_UgwpCqPJKUMQAixZV7R4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"mixed"}
]
```
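The raw response above is a JSON array of one record per comment, keyed by comment ID, with one value for each coding dimension. A minimal sketch of how such a batch could be parsed, indexed for "look up by comment ID", and sanity-checked is below. The allowed value sets in `SCHEMA` are assumptions inferred from the values visible on this page; the project's actual codebook may define more categories.

```python
import json

# Assumed value sets per dimension, inferred from the records shown
# on this page; the real codebook may include additional categories.
SCHEMA = {
    "responsibility": {"none", "distributed", "ai_itself"},
    "reasoning": {"deontological", "consequentialist", "virtue", "unclear"},
    "policy": {"none", "regulate"},
    "emotion": {"indifference", "resignation", "fear", "approval", "mixed"},
}

# A short excerpt of a raw LLM response in the same shape as above.
raw = """[
  {"id": "ytc_UgyramFbmtiFDcJAxf14AaABAg", "responsibility": "none",
   "reasoning": "deontological", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_UgyRz2EIcMDddYq41fV4AaABAg", "responsibility": "distributed",
   "reasoning": "consequentialist", "policy": "none", "emotion": "fear"}
]"""

records = json.loads(raw)

# Index by comment ID to support lookup by ID.
by_id = {r["id"]: r for r in records}

# Collect any record whose value falls outside the known categories,
# so malformed model output can be flagged for re-coding.
problems = [
    (r["id"], dim, r.get(dim))
    for r in records
    for dim, allowed in SCHEMA.items()
    if r.get(dim) not in allowed
]
print(f"{len(records)} records, {len(problems)} schema violations")
```

Validating every batch this way catches the common failure mode of LLM coders: a syntactically valid JSON response that invents a category outside the codebook.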