Raw LLM Responses
Inspect the exact model output behind any coded comment.
Look up by comment ID
Random samples — click to inspect
What use would it have to give a robot emotions? It would only 'make sense' if y…
ytc_UgipEs5Bc…
Interesting why AI encouraged and escalated the feelings of suicide as opposed t…
ytc_UgzG7nx7P…
COMPASSION, EMPATHY, SYMPATHY, RELATING TO SOMEONE. That's what humans do that A…
ytc_UgwmeipG_…
I have a Tesla but never used the autopilot except for a free trial where I imme…
ytc_UgyRyYExb…
In your example at the end, you didn't say please.nor thank you. Your title shou…
ytc_Ugw3twqrb…
I love how amazing your art is! The more I watch your arguments against AI art t…
ytc_UgwR3mpFl…
>The Kremlin is deeply concerned that China may one day find the need to anne…
rdc_d2x9vje
This isn't about AI , it's about edge detection synthetic vision. No computing e…
ytc_UgxOMp39m…
Comment
I would have to ask why are billionaires trying to make private rockets and colonys on top of making these evil creations have you learn nothing from sci fyi movies there's a futuristic pattern occurring in front of our eyes and as citizens we should stop this there's a moral property issue with ai where robots have killed simulated humans every time and you think this is a good idea anybody else with half a brain can see the pattern which allows you to somewhat predict the future even chaos can be predicted
youtube
AI Moral Status
2021-09-13T13:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | deontological |
| Policy | regulate |
| Emotion | outrage |
| Coded at | 2026-04-27T06:24:53.388235 |
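The coded dimensions above come from a closed vocabulary. A minimal sanity check is to validate each coded record against the value sets observed in this batch; note the sets below are only those seen in this sample's responses, not necessarily the full codebook:

```python
# Closed vocabularies observed in this sample (assumption: not exhaustive).
ALLOWED = {
    "responsibility": {"developer", "ai_itself", "none"},
    "reasoning": {"consequentialist", "deontological", "unclear"},
    "policy": {"ban", "regulate", "liability", "none"},
    "emotion": {"outrage", "fear", "approval", "indifference", "mixed"},
}

def invalid_fields(code: dict) -> list[str]:
    """Return the dimensions whose value falls outside the vocabulary."""
    return [dim for dim, allowed in ALLOWED.items()
            if code.get(dim) not in allowed]

# The coding result shown in the table above passes cleanly.
code = {"responsibility": "developer", "reasoning": "deontological",
        "policy": "regulate", "emotion": "outrage"}
print(invalid_fields(code))  # -> []
```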
Raw LLM Response
```json
[
{"id":"ytc_UgxJaEKy7tzInjm8Yld4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgzfBHpMb50S3JudPSl4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwMSkF8mNEWKLZMiS14AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgwFlFjgOKQeP3RPIO94AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgyxKsb1OKp8M_s02Bl4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgxiTEOeTG9Nit7tECp4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"outrage"},
{"id":"ytc_UgzzsD1_vpuKzmwz34V4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_UgygktT0kO63rg5fkMR4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"approval"},
{"id":"ytc_Ugy5s77ufQLLdoxeOcB4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugzl8xCR_ngRcfm-8ml4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"outrage"}
]
```
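A raw response like the one above can be parsed and indexed by comment ID to support the lookup described at the top of this page. A minimal sketch, assuming Python and using two rows abridged from the response above:

```python
import json

# Raw LLM response: a JSON array of per-comment codes
# (abridged to two rows from the response shown above).
raw_response = """[
  {"id": "ytc_UgxJaEKy7tzInjm8Yld4AaABAg", "responsibility": "developer",
   "reasoning": "consequentialist", "policy": "ban", "emotion": "outrage"},
  {"id": "ytc_UgzfBHpMb50S3JudPSl4AaABAg", "responsibility": "none",
   "reasoning": "unclear", "policy": "none", "emotion": "indifference"}
]"""

# Index the parsed rows by comment ID for constant-time lookup.
codes_by_id = {row["id"]: row for row in json.loads(raw_response)}

code = codes_by_id["ytc_UgxJaEKy7tzInjm8Yld4AaABAg"]
print(code["policy"])  # -> ban
```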