Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment
realistically, we dont want robots so advanced that they are programmed to be able to question things. it might be fun and interesting to see how far we can get with AI technology but there is literally no point in having it in every machine. who and why would anyone want say, a forklift robot that only lifts heavy objects if it gets a raise? or a nuclear reactor control computer that wants to see the employee health plan? before you know it, youre just making very expensive humans. *all we have to do is not make AI this advanced.*
either way, good video and im glad you got round to robots.
| Field | Value |
|---|---|
| Platform | youtube |
| Video | AI Moral Status |
| Posted | 2017-02-23T16:1… |
| Likes | 283 |
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | industry_self |
| Emotion | fear |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
[
{"id":"ytc_UghaD-5ZxaeiFHgCoAEC","responsibility":"ai_itself","reasoning":"deontological","policy":"liability","emotion":"approval"},
{"id":"ytc_UgjxCutHJJTNAHgCoAEC","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UghrVsZWbl000XgCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"outrage"},
{"id":"ytc_UggyWjVGG2TWQHgCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"industry_self","emotion":"fear"},
{"id":"ytc_UgjjNbr57AKOtngCoAEC","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgiEIF1_NIDjCngCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgicYfYblhiTRngCoAEC","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgjXXsfNK0XwjXgCoAEC","responsibility":"ai_itself","reasoning":"deontological","policy":"liability","emotion":"approval"},
{"id":"ytc_UgiiuYeq49lLEXgCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"industry_self","emotion":"indifference"},
{"id":"ytc_UggqMCeyDBik1HgCoAEC","responsibility":"ai_itself","reasoning":"deontological","policy":"liability","emotion":"approval"}
]
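The coding result above is derived from the model's raw JSON array by matching on the comment ID. A minimal sketch of that lookup, assuming the response is a JSON list of objects keyed by `id` (the `index_codings` helper is illustrative, not part of the actual pipeline):

```python
import json

# Truncated stand-in for a raw LLM response like the one shown above;
# field names mirror the coding dimensions in the result table.
RAW_RESPONSE = """
[
  {"id": "ytc_UggyWjVGG2TWQHgCoAEC", "responsibility": "developer",
   "reasoning": "consequentialist", "policy": "industry_self", "emotion": "fear"}
]
"""

def index_codings(raw: str) -> dict:
    """Parse the model's JSON array and index each coding by comment ID."""
    return {row["id"]: row for row in json.loads(raw)}

codings = index_codings(RAW_RESPONSE)
coding = codings["ytc_UggyWjVGG2TWQHgCoAEC"]
print(coding["emotion"])  # fear
```

In practice the parse should be wrapped in error handling, since the model may return malformed JSON or omit an ID that was requested.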