Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples

- "So if 99% of the people will be unemployed, who will buy stuff that AI robots ma…" (ytc_UgxB6qoFw…)
- "This could be now. It's simply a matter of teaching the model to work is all. It…" (ytc_UgyJ00Kdl…)
- "I've been surmising this. When the big platforms we know are terminally fucked u…" (rdc_le63qly)
- "Everyone is pro-AI, until it's their turn to be replaced. PS: there are also AI…" (ytc_UgwqX401S…)
- "Haha, good point! It's always wise to have a plan for safety with AI. In our liv…" (ytr_Ugz9AtCUU…)
- "Bro, ive worked since I was 15, I'm 41 now, Ive done everything from constructio…" (ytc_UgzlEWrG5…)
- "Ai and its profits needs to be owned by the people and not private corporations…" (ytc_Ugz5MiyBT…)
- "One day if you slave hard enough at the only opportunity left on earth you might…" (ytc_UgwlIWAgX…)
Comment
While I agree with those being legitimate and scary concerns, autonomous robots will not just remove morality from the equation. They will also remove emotion, greed, prejudice, and human error. They will follow protocol instead. If we install A.I. with a set of correct instructions on how to react in any given situation, then they will be the best trained officers this world has ever seen and will be incorruptible unless hacked. That will be the thing we need to defend against once a proper set of protocols and parameters are installed into the A.I.'s thought construct. I also believe they will negate the need for lethal interjection in the line of duty. They will not be "alive" so they can take the risk while employing non-lethal measures in the field.
Source: youtube · 2015-07-30T14:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | approval |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_UgiG1VbD93Hl9ngCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugjmk0vQ39_GpXgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_UgjlP3MMVlkjBHgCoAEC","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UggD7tYfVbQtU3gCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_Ughm6vEeTLi9RXgCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugiw0vwfohKCq3gCoAEC","responsibility":"user","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgjrD7whMK2ahXgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgjXz5wvV6sOe3gCoAEC","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"resignation"},
{"id":"ytc_UggRoh03TKwiPXgCoAEC","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_UggS5-aiz9SI73gCoAEC","responsibility":"none","reasoning":"deontological","policy":"industry_self","emotion":"approval"}
]
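The lookup-by-comment-ID view above can be reproduced from a raw response like this one. A minimal sketch, assuming the model returns a JSON array of records with the four coded dimensions shown; the variable names and the two sample records (taken from the batch above) are illustrative, not part of the pipeline itself:

```python
import json

# A raw LLM response is a JSON array of coding records, one per comment,
# each carrying the dimensions shown in the table above.
raw_response = """
[
  {"id": "ytc_UgjrD7whMK2ahXgCoAEC", "responsibility": "none",
   "reasoning": "consequentialist", "policy": "none", "emotion": "approval"},
  {"id": "ytc_Ugiw0vwfohKCq3gCoAEC", "responsibility": "user",
   "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"}
]
"""

# Index the batch by comment ID so any single comment's coding
# can be looked up directly.
codings = {rec["id"]: rec for rec in json.loads(raw_response)}

print(codings["ytc_UgjrD7whMK2ahXgCoAEC"]["emotion"])  # → approval
```

Building the dict once per batch makes each subsequent lookup O(1), which matters when cross-referencing many coded comments against their raw model output.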