Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
21:51 looking closely at the fingers on the people in that courtroom sketch to m…
ytc_UgwHdEs5b…
My former employer manager and boss said automation and A.I. will take over your…
ytc_UgyElR0b7…
Customers wouldn't be so short tempered, & angry if they hadn't spent 30 minutes…
ytc_Ugyjffi8F…
what da fuck, Im german, 3mins in the video realizing the new "translation" A.I:…
ytc_UgzzmkviY…
Polish minister tells Ukranian opposition: agree to a deal or 'you will all be d…
rdc_cfl1n0c
More exciting. He post an AI pic of Taylor Swift supporting him. If anything can…
rdc_liw9wig
"And a machine is replicating the style...." Isn't that what the AI is programme…
ytc_UgzH8mXNI…
As someone who works directly with AI development, there is a LOT of fearmongeri…
ytc_UgxQ0-saS…
Comment
In a way this sort of ties in with the Ghost in the Shell concept. I think that if there is a biologically and naturally created consciousness/mentality and brain in a object, such as what'sherface who was born a human by the natural (normal) process and was later put into the 'shell', or body, of a robot. AI doesn't give something 'life' and rights, but the naturally developed mind of a living creature. (Oh, and if it looks like I'm 'poo-pooing' on people who were fertilised in a lab rather than in the womb, I still think they're normal humans, they were just created outside the womb.)
youtube
AI Moral Status
2017-09-25T04:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | deontological |
| Policy | none |
| Emotion | mixed |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
  {"id":"ytc_UgwhfUbtpxRpFCg2RbF4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"liability","emotion":"indifference"},
  {"id":"ytc_Ugxw29hCRkRXSp_1xQl4AaABAg","responsibility":"none","reasoning":"deontological","policy":"ban","emotion":"outrage"},
  {"id":"ytc_Ugzxs_MaS9tOuE-ofU94AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgzZuMj4n3MDIkIG1ql4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgxCNly2eYnFv9N7GZB4AaABAg","responsibility":"user","reasoning":"contractualist","policy":"regulate","emotion":"approval"},
  {"id":"ytc_UgyiQyxfa4atkYseCmx4AaABAg","responsibility":"none","reasoning":"deontological","policy":"liability","emotion":"mixed"},
  {"id":"ytc_Ugx7Lk0ES4Dp34m9F2h4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgyVymjGfAAf9ZSK9w14AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"mixed"},
  {"id":"ytc_Ugz41nduqULPOKslKst4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"regulate","emotion":"resignation"},
  {"id":"ytc_UgyL7jLrsf5hxsujVlZ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]
```
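The raw response is a JSON array of per-comment codes across four dimensions (responsibility, reasoning, policy, emotion). Before ingesting such output, it is worth validating each record against the codebook. A minimal sketch is below; the allowed value sets are inferred only from the values visible on this page and are assumptions, not the full codebook:

```python
import json

# Allowed values per dimension, inferred from the responses shown above.
# ASSUMPTION: the real codebook may define additional categories.
ALLOWED = {
    "responsibility": {"none", "ai_itself", "developer", "user"},
    "reasoning": {"deontological", "consequentialist", "contractualist", "unclear"},
    "policy": {"none", "ban", "regulate", "liability"},
    "emotion": {"mixed", "fear", "outrage", "indifference", "approval", "resignation"},
}

def validate_codes(raw: str) -> list[dict]:
    """Parse a raw LLM response and keep only well-formed records."""
    records = json.loads(raw)
    valid = []
    for rec in records:
        # Every record must be an object with a comment ID.
        if not isinstance(rec, dict) or "id" not in rec:
            continue
        # Every dimension must be present with an allowed value.
        if all(rec.get(dim) in vals for dim, vals in ALLOWED.items()):
            valid.append(rec)
    return valid

# Hypothetical example: the second record uses an unknown value and is dropped.
raw = json.dumps([
    {"id": "ytc_x", "responsibility": "none", "reasoning": "deontological",
     "policy": "none", "emotion": "mixed"},
    {"id": "ytc_y", "responsibility": "alien", "reasoning": "unclear",
     "policy": "none", "emotion": "fear"},
])
print(len(validate_codes(raw)))  # 1
```

A filter like this makes it easy to spot responses where the model drifted outside the coding scheme, which can then be re-queued for coding.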