Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples — click to inspect:

- ytr_UgzAn_F4R…: "This is different. Not only are all of those human creations, they all take WAY …"
- rdc_o7cbon6: "The end of the world is going to come when an AI misreads an insurance sale-pitc…"
- ytc_UgzWLbPBq…: "I figured out how to fix the ai problem, we all stop doing stuff, the ai is trai…"
- ytc_Ugy9llLXK…: "Climate change is the biggest hoax ever created by criminals who made billions f…"
- ytr_UgwCaxoDL…: "@noogler7949 exactly and in countries with so much population the effects will …"
- ytc_UgwKQhcBL…: "Good job i robot just around the corner another skynet issue well hope im dead b…"
- ytc_UgxklLBex…: "I’ve never spoken out-loud to myself. Is that a normal thing people do? Let me g…"
- ytc_Ugzdy794p…: "There is also the downstream impact. When AI competes with its data source, huma…"
Comment

> Dairo Leon: Animals are born with the purpose of survival. It's the most basic instinct. Eat, crap, multiply. Intelligent, conscious robots would likely not be programmed for any one particular purpose because such qualities are not needed for a machine to run a single block of code over and over. It would be a waste of money, and quite stupid on behalf of the creator to give said machine the power to revolt by giving it the ability to say, "I don't want to." And also consider that the upbringing of such creations would likely stem from human laziness. Wanting to have something else do out jobs for us. Of course an intelligent robot would eventually get tired of the human slob nature and act in some way (be it positive or negative) to bring change.

youtube · AI Moral Status · 2018-08-17T09:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id":"ytr_Ugy5DaglRDrjepjKOnJ4AaABAg.8k1oL2OsUM38k2HC6qNsvX","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytr_Ugy5DaglRDrjepjKOnJ4AaABAg.8k1oL2OsUM38k4OBeUHzbB","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytr_UgwjhyzmVzw3QHNLkD14AaABAg.8j681g-6pH08jn55lmFzj1","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytr_UgwE9-rJ1xtbiQ98yed4AaABAg.8j4hm3hL45W8pEGW2nchfa","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytr_UgxbgNKJMW57e2gSy1B4AaABAg.8ihGfuOesc58knTYVLHAQL","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytr_UgxbgNKJMW57e2gSy1B4AaABAg.8ihGfuOesc58lo7PgtEaz8","responsibility":"none","reasoning":"consequentialist","policy":"unclear","emotion":"approval"},
  {"id":"ytr_Ugz9OA74hhHCgKiOpxN4AaABAg.8iVAN3Or4Ih8m1tk-hBAP4","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
  {"id":"ytr_Ugz9OA74hhHCgKiOpxN4AaABAg.8iVAN3Or4Ih8mEl_BjDksp","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytr_UgyGfDw1xgN5DCKJA9l4AaABAg.8i9wEYQPVNJ8j2YTahlrOK","responsibility":"none","reasoning":"deontological","policy":"liability","emotion":"approval"},
  {"id":"ytr_UgyaN6sJhihdnnlYSdd4AaABAg.8hoGdZ8UW1D8j2Z8JsIZMh","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"approval"}
]
```
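A minimal sketch of how a raw batch response like the one above could be parsed and validated before storing coded values. The allowed category sets are an assumption inferred only from the values visible on this page (the actual codebook may define more categories), and `parse_batch` is a hypothetical helper, not part of the tool:

```python
import json

# Allowed values per coding dimension. ASSUMPTION: inferred from the
# sample responses shown above; the real codebook may differ.
ALLOWED = {
    "responsibility": {"developer", "ai_itself", "none"},
    "reasoning": {"consequentialist", "deontological", "unclear"},
    "policy": {"ban", "regulate", "liability", "none", "unclear"},
    "emotion": {"indifference", "mixed", "approval", "fear"},
}

def parse_batch(raw: str) -> list[dict]:
    """Parse one raw LLM response (a JSON array of coded comments),
    rejecting any record with a missing id or an unknown dimension value."""
    records = json.loads(raw)
    coded = []
    for rec in records:
        if "id" not in rec:
            raise ValueError(f"record without comment id: {rec!r}")
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{rec['id']}: bad {dim}={rec.get(dim)!r}")
        coded.append(rec)
    return coded

# Hypothetical single-record batch for illustration.
raw = ('[{"id":"ytr_example","responsibility":"developer",'
       '"reasoning":"consequentialist","policy":"none","emotion":"indifference"}]')
print(parse_batch(raw)[0]["responsibility"])  # developer
```

Failing fast on unknown values catches both model drift and prompt changes before bad codes reach the results table.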