Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment
A human's main purpose is to survive and to pass down genes through reproduction. Everything we feel (pain, fear, elation, satisfaction) is an extension of our innate will to survive. Every action we perform can attributed to this (seriously, give it some thought). A machine's purpose is not to survive, but to follow its directive. So even if there were an AI that has infinite knowledge and computing power, it wouldn't ever try to overthrow the human race, because it would be counterintuitive to its purpose: to serve. It would have all the necessary power to conquer us, but it wouldn't want to do it. Kind of like how you could kill a child but you don't want to, because it's against your biological purpose.
| Field | Value |
|---|---|
| Platform | youtube |
| Title | AI Moral Status |
| Posted | 2017-02-23T22:2… |
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
  {"id":"ytc_UgjUKMnhflFwrHgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgiltTSEWD_SEXgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
  {"id":"ytc_UggEVfo-0BT3v3gCoAEC","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"approval"},
  {"id":"ytc_UggxUeCR4fvePngCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugj-C0VSwgP-VXgCoAEC","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"fear"},
  {"id":"ytc_Ugiu3igcszow23gCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugg9D6n1e0Y6IngCoAEC","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgiLfvLZG9z0PHgCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"fear"},
  {"id":"ytc_Ugg8zOaOKpgfSXgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UggejCERUBBXa3gCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"}
]
```
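A raw batch response like the one above can be parsed and indexed by comment ID before it is written into the coding table. The sketch below is a minimal example, assuming the four dimensions shown on this page; the allowed label sets are inferred only from the values visible in this sample, so the real codebook may define additional labels.

```python
import json

# Label sets inferred from the sample responses on this page (assumption:
# the full codebook may permit more values per dimension).
ALLOWED = {
    "responsibility": {"none", "developer"},
    "reasoning": {"consequentialist", "deontological", "unclear"},
    "policy": {"none", "ban"},
    "emotion": {"indifference", "fear", "approval"},
}

def parse_batch(raw: str) -> dict:
    """Parse a raw LLM batch response and index codings by comment ID."""
    coded = {}
    for row in json.loads(raw):
        cid = row["id"]
        coding = {k: v for k, v in row.items() if k != "id"}
        # Reject any value outside the observed label sets.
        for dim, value in coding.items():
            if value not in ALLOWED.get(dim, set()):
                raise ValueError(f"{cid}: unexpected {dim}={value!r}")
        coded[cid] = coding
    return coded

# Usage with the first row from the response above:
raw = ('[{"id":"ytc_UgjUKMnhflFwrHgCoAEC","responsibility":"none",'
       '"reasoning":"consequentialist","policy":"none","emotion":"indifference"}]')
batch = parse_batch(raw)
print(batch["ytc_UgjUKMnhflFwrHgCoAEC"]["emotion"])  # indifference
```

Indexing by ID makes the lookup-by-comment-ID view above a single dictionary access, and the validation step catches malformed or off-schema model output before it reaches the table.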