Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "I hate how ai prompters immediately go to doomer logic. Like, I don’t pay for co…" (ytc_Ugy1GF04k…)
- "@dodobo8175 think it through. How big of a difference do you think it will make?…" (ytr_UgzzWedFp…)
- "CEO of the most biggest AI company today tells us that the whole world has to us…" (ytc_Ugx0mDHNZ…)
- "UBI may be introduced but it will stand for Universal BILLIONAIRE Income. If AI …" (ytc_Ugx05-PFr…)
- "@xjood805 Ok, so would you scream at people who use AI to generate images of yo…" (ytr_UgzIuD0ay…)
- "Teaching kids actual skills theyre going to use in daily life. Yeah i like that …" (ytc_UgzABQwBu…)
- "I'm currently a university student studying computer science to work in cybersec…" (ytc_UgwYCQmzr…)
- "Another U-Turn because of woke political correctness. Facial recognition is desi…" (ytc_UgyBPbDSD…)
Comment
> This video says it nicely that we need to program them with emotions in order for them to be emotional. And then it goes on some rant about 'what if this, what if that' ignoring entirelly what it correctly stated earlier. Also, we do not need to program them to become self aware. They are a simulation. They do exactly what we program it to do, and guess what, if we don't like something, it can be removed with some reprogramming, and we can do that as many times we like, and its an exact science, no guesswork needed like in every damn hollywood movie!
>
> Ok, so I'll just sit back and wait for the usual slurry of replies from people who haven't written a line of code in their life about what they think A.I. is, and how some Gizmodo article told them it works. This applies to people doing thoretical IT too which has nothing to do with real life xD
Source: youtube · AI Moral Status · 2017-02-24T09:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | industry_self |
| Emotion | indifference |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
[
{"id":"ytc_Ugh0c4l23P6EYHgCoAEC","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgizmdfK6BHeengCoAEC","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"mixed"},
{"id":"ytc_UgiS9-lmbu6FW3gCoAEC","responsibility":"developer","reasoning":"mixed","policy":"regulate","emotion":"fear"},
{"id":"ytc_Uggg7_XeDnLEkXgCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"industry_self","emotion":"indifference"},
{"id":"ytc_UgjlRCoviv8l7XgCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"},
{"id":"ytc_UgiAi7l2Sx79l3gCoAEC","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgiM-TwLKWJZ13gCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"industry_self","emotion":"indifference"},
{"id":"ytc_UghE_QrjN0MWgHgCoAEC","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugi_n0NFADJiGngCoAEC","responsibility":"none","reasoning":"deontological","policy":"unclear","emotion":"outrage"},
{"id":"ytc_Uggf753UlzgQ93gCoAEC","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"}
]
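The raw response above is a JSON array, one object per comment, carrying the comment `id` plus the four coding dimensions shown in the table (responsibility, reasoning, policy, emotion). A minimal sketch of how such a response could be parsed and validated before it is stored — note the allowed vocabularies below are only inferred from the values visible in this sample, and `parse_llm_response` is a hypothetical helper, not part of any tool shown here:

```python
import json

# Allowed values per dimension, inferred from the sample response above;
# the real codebook may define additional categories (assumption).
ALLOWED = {
    "responsibility": {"developer", "company", "ai_itself", "none", "unclear"},
    "reasoning": {"consequentialist", "deontological", "mixed", "unclear"},
    "policy": {"regulate", "liability", "industry_self", "none", "unclear"},
    "emotion": {"indifference", "mixed", "fear", "approval", "outrage"},
}

def parse_llm_response(raw: str) -> dict:
    """Parse a raw LLM response (a JSON array of coded rows) into
    {comment_id: codes}, rejecting rows with a missing id, a missing
    dimension, or an out-of-vocabulary value."""
    coded = {}
    for row in json.loads(raw):
        cid = row.get("id")
        if not cid:
            raise ValueError(f"row missing comment id: {row}")
        for dim, allowed in ALLOWED.items():
            if row.get(dim) not in allowed:
                raise ValueError(f"{cid}: bad {dim!r} value {row.get(dim)!r}")
        coded[cid] = {dim: row[dim] for dim in ALLOWED}
    return coded

# Usage with a single-row response (hypothetical id "ytc_X"):
raw = ('[{"id":"ytc_X","responsibility":"developer",'
       '"reasoning":"consequentialist","policy":"industry_self",'
       '"emotion":"indifference"}]')
print(parse_llm_response(raw)["ytc_X"]["policy"])  # industry_self
```

Validating against a fixed vocabulary at parse time is what makes the coded dimensions in the result table trustworthy: any hallucinated category in the model output fails loudly instead of being silently stored.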