Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples

- ytc_Ugwx7E44n… : "So they’re just gonna ignore that the majority of modern day animation is done d…"
- ytc_Ugz2p_WmY… : "Hyper realistic ? Wow Can’t wait for the hyper hyper hyper hyper hyper hyper rea…"
- ytc_UgwLgrIf6… : "They cannot reach what God created. To me, it is just child's play.…"
- ytc_UgwJoIX1c… : "Are human beings REALLY THIS STUPID ❓Really ❓🤣the robot DOESNT FEEL ANYTHING 🤣 …"
- ytc_UgxhubDFY… : "God has encoded balance into the universe, a sort of tug of war. The only way …"
- ytc_UgxJP7cXr… : "This looks like AI itself and I agree with most people on this thread who say th…"
- ytc_UgxF705mE… : "I think it would be so hilarious if the AI these alt-right tech bros are birthin…"
- ytr_UgzupCLOy… : "@vincentnightray750 i know ads are necessary but at LEAST they could be somethin…"
Comment
I'm sad that this video did not address the relationship between "pleasure and pain" and the current state of machine learning "success and failure conditions" It can be argued that they are the same thing. And this escapes much of the moral implications. The solution is to provide appropriate success and failure mechanisms such that it has either the ability to do well at its task, or to get better with experience. When we get into generalized AI, the question becomes about whether or not a task is appropriate for that AI. To become what they did, were we forced to give it success & failure conditions that would consider a life of, say, pattern recognition to spot camouflaged military units as continuing to match those conditions; or is it able to adapt without pain, to having that be a new source of satisfaction.
But this is months old and no one will read this.
Source: youtube · AI Moral Status · 2017-07-19T01:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | mixed |
| Policy | none |
| Emotion | resignation |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
[
{"id":"ytc_Ugi_XeI1C9dhtXgCoAEC","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugh96lbc5i0kzngCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UghAJgk8mN00NngCoAEC","responsibility":"developer","reasoning":"mixed","policy":"none","emotion":"resignation"},
{"id":"ytc_UgjwEFRszNf0QHgCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgjyM7IW-z036XgCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugg50stavv1EhXgCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgiFsTnbAJ1XYXgCoAEC","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UgiPcEJpPeNspHgCoAEC","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
{"id":"ytc_Uggi6_Zio_diSHgCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugh89vWR8MHAungCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"resignation"}
]
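The raw response above is a JSON array in which each record carries a comment ID plus the four coding dimensions shown in the table (responsibility, reasoning, policy, emotion). A minimal sketch of how such a response could be parsed and queried by comment ID follows; the function and variable names are illustrative assumptions, not part of the original pipeline, and `raw_response` is trimmed to a single record from the output above.

```python
import json

# One record copied from the raw LLM response above; the real pipeline
# would receive the full array. (Illustrative sketch, not the original code.)
raw_response = """
[
  {"id": "ytc_UghAJgk8mN00NngCoAEC", "responsibility": "developer",
   "reasoning": "mixed", "policy": "none", "emotion": "resignation"}
]
"""

# The four coding dimensions from the "Coding Result" table.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")


def index_codings(raw: str) -> dict:
    """Parse the JSON array and index each coding record by comment ID,
    rejecting records that are missing any expected dimension."""
    records = json.loads(raw)
    by_id = {}
    for rec in records:
        missing = [d for d in DIMENSIONS if d not in rec]
        if missing:
            raise ValueError(f"{rec.get('id')}: missing {missing}")
        by_id[rec["id"]] = rec
    return by_id


codings = index_codings(raw_response)
# Look up by comment ID, as the inspector above does.
print(codings["ytc_UghAJgk8mN00NngCoAEC"]["emotion"])  # resignation
```

The ID-keyed dictionary mirrors the "Look up by comment ID" feature: once the response is indexed, retrieving any coded comment is a single dictionary access.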