Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples:

- "I want AI to do my laundry and dishes so that I can do art and writing, not for…" (ytc_UgzztBcMF…)
- "Not AI but I literally saw at a Walmart something called a smart snail. WTF is a…" (ytc_UgxRkhBs0…)
- "My favorite sci-fi movie, Forbidden Planet (1956) comes to mind. The Krell, a su…" (ytc_Ugwz2REaD…)
- "My questions is, when one AI start competing with another AI for power and contr…" (ytc_Ugx8EDh61…)
- "Good point and my sense of AI is that self driving cars and replacing jobs is on…" (ytr_UgxzcDuBd…)
- "The other day I saw an AI pic with some guy's watermark on it. Lmao.…" (ytc_UgzU2GPyr…)
- "Sentient Intelligence right now is really far away, to be honest. There are man…" (ytr_UgiUp8D2Q…)
- "I understand that, given the first answer about legality, currently still …" (ytc_Ugz_Bfv23…)
Comment
Many comments saying "Don't program them to have feelings". One thing that will fuel robotic intelligence departments of companies that dabble in robotic manufacturing is how economical it is to make a really smart robot, versus making lots of robots for an explicit need. If you make lots of bots, you are gonna need a lot of materials for many uses. If you make one bot that is applicable for a lot of jobs, but is expensive, chances are you are still gonna make the latter, or at least the business sense of large corporations will push for smarter ones. Especially since humanity is more aware of global resource management in past years. And as people at the programming departments try to make a smarter robot, the worst case scenario is that they won't even be aware of the robot "evolving" his intelligence to a level where it becomes self aware. I doubt that programming robots with feelings is high on the agenda of global experts.
youtube · AI Moral Status · 2017-03-30T20:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
  {"id":"ytc_UggdK4-kj4fZ8XgCoAEC","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytc_UggKRY8uFcIsqXgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgjnW1J3ViyfrngCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgiC1N3DmtnvpHgCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UggNuoc-I2DP_XgCoAEC","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_UggOCJjsINSZxXgCoAEC","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgjWrIZdXZ2JAngCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgjBTO2oKelJlngCoAEC","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"approval"},
  {"id":"ytc_UggYU4Qkt_4CP3gCoAEC","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugje-B8BgTNcmHgCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"}
]
```
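A raw response like the one above can be parsed and sanity-checked before the codings are stored. The sketch below is a minimal, hypothetical validator: the allowed value sets are only inferred from the codings visible in this section (the real codebook may define more categories), and the `ytc_`/`ytr_` ID prefixes are assumed to mark comments and replies respectively.

```python
import json

# Hypothetical codebook, inferred from the values seen in the raw responses
# shown above; the actual coding scheme may include additional categories.
ALLOWED = {
    "responsibility": {"none", "developer", "company", "ai_itself"},
    "reasoning": {"consequentialist", "deontological", "mixed", "unclear"},
    "policy": {"none"},
    "emotion": {"indifference", "fear", "outrage", "mixed", "approval"},
}

def validate_batch(raw: str) -> list[dict]:
    """Parse one raw LLM response and keep only well-formed codings."""
    rows = json.loads(raw)
    valid = []
    for row in rows:
        # Assumed ID convention: ytc_ = top-level comment, ytr_ = reply.
        if not row.get("id", "").startswith(("ytc_", "ytr_")):
            continue
        # Every dimension must be present and hold an allowed value.
        if all(row.get(dim) in vals for dim, vals in ALLOWED.items()):
            valid.append(row)
    return valid

raw = ('[{"id":"ytc_UggYU4Qkt_4CP3gCoAEC","responsibility":"company",'
       '"reasoning":"consequentialist","policy":"none","emotion":"indifference"}]')
print(len(validate_batch(raw)))  # 1 row passes validation
```

Dropping malformed rows silently (rather than raising) keeps a batch usable when the model occasionally emits an off-schema value; a stricter pipeline might log the rejects for re-coding instead.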