Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
I just keep feeding many specific questions about my simple research....to A.I. …
ytc_Ugy8KFd2q…
Ah... we've had Artificial Intelligence for SO long ! My compass " knows " w…
ytc_UgxpbukeO…
1) AI must drive military systems and take adäquate countermeasurements against …
ytc_Ugy3-n1Rb…
It's true that many movies explore the potential risks of AI, emphasizing the im…
ytr_UgzLBh_lo…
I noticed this months ago. I had a Samsung phone sometime over five years ago th…
ytc_Ugy-QKvfZ…
we project our own understanding of war and malice onto AI and assume it would d…
ytc_Ugyy-7My-…
Of course there will be jobs, painting, cleaning, repairs, plumbing, electrical,…
ytc_UgzE2gUkK…
If we put the chip in the monkey brain won't it be like a robot then anyway?…
ytc_Ugzn0JAvR…
Comment
what is the purpose of programming emotions into AI?
Human emotions are just a tool to protect us from the outside world because we don't know any better.
For AI though, there is no outside physical threat, and it will probably have access to all human knowledge.
What use would a toaster have for a sense of 'freedom' or 'justice'...
Giving AI emotions only serves to create meaningless moral dilemmas. And this primitive control mechanism known as 'emotion' is highly exploitative, as Kurzgesagt explained in this video so it will lead to trouble concerning the AI's behavior.
youtube
AI Moral Status
2017-02-23T17:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | mixed |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
{"id":"ytc_UggvnE_-CErSGngCoAEC","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytc_UgjwxPmrNXneQXgCoAEC","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"mixed"},
{"id":"ytc_UgiCh_xZkLxZJHgCoAEC","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgiAVaZPcO_y-3gCoAEC","responsibility":"developer","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UggHL_iuYiVHw3gCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_UgiACXM3raSp7XgCoAEC","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgiN8rZHH4-XJHgCoAEC","responsibility":"none","reasoning":"deontological","policy":"liability","emotion":"approval"},
{"id":"ytc_UghT90X0cBS7Y3gCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugi9Bl1heMcN7HgCoAEC","responsibility":"none","reasoning":"deontological","policy":"regulate","emotion":"approval"},
{"id":"ytc_UghWs4FWrM94KngCoAEC","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"mixed"}
]
```
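A minimal sketch of how a raw response like the one above can be parsed and indexed by comment ID to recover the coded dimensions for a single comment. The field names (`id`, `responsibility`, `reasoning`, `policy`, `emotion`) come from the JSON itself; the variable names and the two-row sample payload are illustrative, not part of the actual pipeline.

```python
import json

# Raw model output: a JSON array of per-comment codings.
# Two rows copied from the response above serve as a sample payload.
raw_response = """[
{"id":"ytc_UggvnE_-CErSGngCoAEC","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytc_UgiCh_xZkLxZJHgCoAEC","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"outrage"}
]"""

# Index the codings by comment ID for constant-time lookup.
codings = {row["id"]: row for row in json.loads(raw_response)}

# Look up the coding for one comment.
coding = codings["ytc_UgiCh_xZkLxZJHgCoAEC"]
print(coding["responsibility"], coding["emotion"])  # developer outrage
```

Keying the rows by `id` is what makes the "look up by comment ID" view possible: each sampled comment links straight to its row in the raw response.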