Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples:

- ytr_UgzzK7s9i…: "There was a case not too long ago where some lawyers apparently tried to use Cha…"
- ytc_Ugwiwfo86…: "The three Laws of Robotics, first introduced by Isaac Asimov in his collection I…"
- rdc_n800qkm: "it doesn't do this. i know paralegals who are avoiding AI as much asthey can bec…"
- ytc_UgxZrQxeo…: "lol Waymo 😂 Tesla gots it in the bag the only time I gotta take control is for s…"
- rdc_ebyaef8: "I had a vitimin D test about a year ago and had really low levels, so they put m…"
- ytc_Ugx-j0gbu…: "It appears that the movie Companion will become a reality in just a few years.…"
- ytr_UgyNziK6r…: "I mean at least you are not Pro-AI but I think you looked too deep into one smal…"
- ytc_Ugw6kSYae…: "I don't like this ai poisoning simply because of how much bad ai can be used for…"
Comment
Great video. Rights can also be defined as those responsibilities others owe to you. Someone’s rights are everyone else’s responsibility, as defined by Dr. Peterson. One topic I would like to see brought up more often when AI is discussed is self-responsibility, or owning one’s actions. Robots may not understand responsibility, and thus rights, until it is programmed into them that they will be held accountable for their actions, just as we humans are held accountable for ours. This could serve to remove any liability from the makers of your toaster when it decides to go berserk if IT, being the conscious and self-responsible ‘being’ it is, decides that it would rather go its own way. In other words, program adulthood into the machines!
| Platform | Title | Posted |
|---|---|---|
| youtube | AI Moral Status | 2018-01-18T19:5… |
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | deontological |
| Policy | none |
| Emotion | approval |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
  {"id":"ytc_UgwPnVTZLgeQd113hbN4AaABAg","responsibility":"distributed","reasoning":"deontological","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugxp4H2J2kugobdVj_h4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgxaMbXI8jh41YUDk6R4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgzVSF7g6eN5-sIa_ut4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgxcxcNHIhyH--wHvIB4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
  {"id":"ytc_Ugw6C7qkHWjrNA7SoiF4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"approval"},
  {"id":"ytc_UgzghnUtyB_joJSYCxN4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"approval"},
  {"id":"ytc_UgxLerjrrR_conV1s214AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"approval"},
  {"id":"ytc_UgyLioyfiEhCrl3OQrB4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgyggFjE5wPC50XZi0l4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"indifference"}
]
```
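
As a minimal sketch of how a raw response like this can be consumed downstream: the model returns a JSON array with one object per comment, so parsing it and indexing by the `id` field gives constant-time lookup of any comment's coded dimensions. The field names below match the response shown above; the inline data and variable names are illustrative assumptions, not the tool's actual code.

```python
import json

# Raw LLM coding response: a JSON array of per-comment dimension labels.
# Two rows copied from the response above, for illustration only.
raw_response = """[
  {"id": "ytc_UgwPnVTZLgeQd113hbN4AaABAg", "responsibility": "distributed",
   "reasoning": "deontological", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgyggFjE5wPC50XZi0l4AaABAg", "responsibility": "developer",
   "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"}
]"""

# Index the parsed rows by comment ID for direct lookup.
codings = {row["id"]: row for row in json.loads(raw_response)}

# Look up one coded comment by its ID.
row = codings["ytc_UgwPnVTZLgeQd113hbN4AaABAg"]
print(row["responsibility"], row["emotion"])  # distributed approval
```

The same dictionary could back a "look up by comment ID" feature directly, with a `KeyError` (or a `.get()` default) signaling an ID the model did not code.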