Raw LLM Responses
Inspect the exact model output for any coded comment, either by looking up its comment ID or by picking one of the random samples below.
Random samples

- "A better idea would be to get each child an AI robot which can speak to them and…" (ytc_UgyTfq91v…)
- "Guys, what is the name of the researcher he mentions at 1:06:00 ? I replayed it …" (ytc_Ugw6ifCTO…)
- "@James1787Madison Alexa is an artifact compared to the current AI models under d…" (ytr_UgwP5WPqV…)
- "Hot take: we should all stop calling it AI art and start calling it what it is:…" (ytc_UgxMWJ63_…)
- "AI is getting outta hand, imo. Next, it will be able to answer any question you …" (ytc_UgwzJvV42…)
- "Additionally, the growing gap between truth and information, amplified by AI, en…" (ytc_UgwQoYJ-_…)
- "These scientists in A.I. have never seen any sci-fi movies(terminator,I Robot). …" (ytc_Ugy1x6Ib_…)
- "AI will learn to do plumbing. I have a very real warning like this guy. I hel…" (ytc_UgyiwpxK5…)
Comment
Empathy has a deep evolutionary biological basis. It doesn't help that science still can only describe but not explain consciousness or emotional experience -- not even at square one. There is no generally accepted theory.
It is unlikely that even an eventual sentient (or not) robot capable of emotional experience (or not) will "feel" empathy. We can hope they are functionally peaceful, benign in their sociopathy.
| Platform | Video | Posted | Likes |
|---|---|---|---|
| youtube | AI Moral Status | 2023-05-30T06:5… | ♥ 5 |
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | industry_self |
| Emotion | resignation |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
{"id":"ytc_UgxAxOl3y3qNf07fV8J4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgydooXlUxnGCfSm3bJ4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgwrUSHDT9oS8zan0DR4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugy0Ht-n3ZQ9fWU4L6F4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_UgxqmqRvl35VV6zDiah4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyhyLgKPWy7KLGDTrZ4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugyo2FQX9WZMhFYdfPJ4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"industry_self","emotion":"resignation"},
{"id":"ytc_UgxvsjHipZvJpRNtFUh4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_UgytnuwxfUtRaB0Ctzl4AaABAg","responsibility":"unclear","reasoning":"mixed","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugy0LfwfCULANMwVHv14AaABAg","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"mixed"}
]
```
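A response like the one above can be turned into a per-comment lookup with a small parsing step. The sketch below is illustrative only: the allowed codes per dimension are inferred from the values visible in this response, and the real codebook may define more (for example, additional responsibility targets not seen here).

```python
import json

# Allowed codes per dimension, inferred from the response shown above.
# Assumption: the actual codebook may contain additional values.
ALLOWED = {
    "responsibility": {"developer", "none", "ai_itself", "unclear"},
    "reasoning": {"deontological", "consequentialist", "mixed", "unclear"},
    "policy": {"unclear", "regulate", "none", "ban", "industry_self", "liability"},
    "emotion": {"mixed", "fear", "outrage", "indifference", "resignation"},
}


def parse_coding_response(raw):
    """Parse a raw batch-coding response and index records by comment ID.

    Raises ValueError on a missing ID, a missing dimension, or an
    unknown code, so malformed model output fails loudly instead of
    silently entering the dataset.
    """
    records = json.loads(raw)
    coded = {}
    for rec in records:
        comment_id = rec.get("id")
        if not comment_id:
            raise ValueError(f"record without id: {rec!r}")
        for dim, allowed in ALLOWED.items():
            value = rec.get(dim)
            if value not in allowed:
                raise ValueError(f"{comment_id}: unexpected {dim}={value!r}")
        coded[comment_id] = {dim: rec[dim] for dim in ALLOWED}
    return coded
```

With the response above, `parse_coding_response(raw)["ytc_Ugyo2FQX9WZMhFYdfPJ4AaABAg"]["emotion"]` would return `"resignation"`, matching the Coding Result table for that comment.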