Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples (click to inspect)
Bruh they're insulting disabled people atp 🤦♀. I'm sure that these AI "abusers"…
ytc_UgyBWtne2…
Thanks for the tutorial Charlie, you did a great job .... Having said that, the …
ytc_Ugzbrwcqe…
There are different kiñds of AI.
Some are conscous, some are not.
But a better q…
ytc_Ugy7tHhfu…
The term AI artist is such a joke. It's like calling yourself a chef because you…
ytc_UgwEdoQ2x…
As an IoT hobbyist who can't code C++ for ESP, chatGPT absolutely sucks for cod…
ytc_UgwQnhNEq…
Yup this is pretty much my theory as well.. and I mean if it wasn't AI, we would…
ytr_UgydEvA4a…
Organic Intelligence (OI) vs AI, clear in sight:
ChatGPT streams; OI dreams, he…
ytc_UgzEeCxVf…
The AI isn't necessarily "taking away" these jobs, rather they could assist many…
ytc_Ugw8y06q_…
Comment
They are ignoring something important here. If we purposefully develop sapient AI, we can make their positive emotions based upon servitude to humanity and negative ones based on failure. It might make them have a bit of a complex, but it would guarantee their servitude. And besides, if we make them good-natured, surely we could convince/guilt trip them into doing the tasks that are dangerous for humans?
youtube
AI Moral Status
2017-02-23T22:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
[
{"id":"ytc_UgjUKMnhflFwrHgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgiltTSEWD_SEXgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_UggEVfo-0BT3v3gCoAEC","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"approval"},
{"id":"ytc_UggxUeCR4fvePngCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugj-C0VSwgP-VXgCoAEC","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"fear"},
{"id":"ytc_Ugiu3igcszow23gCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugg9D6n1e0Y6IngCoAEC","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UgiLfvLZG9z0PHgCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"fear"},
{"id":"ytc_Ugg8zOaOKpgfSXgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UggejCERUBBXa3gCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"}
]
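The raw response above is a JSON array of coding records, one per comment ID. A minimal sketch of how such a response could be parsed and validated in Python follows; the allowed value sets are inferred only from the records shown here (the full codebook may include more values), and the `parse_coding_response` helper name is hypothetical:

```python
import json

# Allowed values per coding dimension, inferred from the sample
# records above (assumption: the real codebook may define more).
ALLOWED = {
    "responsibility": {"none", "developer"},
    "reasoning": {"consequentialist", "deontological", "unclear"},
    "policy": {"none", "ban"},
    "emotion": {"indifference", "fear", "approval"},
}


def parse_coding_response(raw: str) -> list[dict]:
    """Parse a raw LLM coding response and sanity-check each record."""
    records = json.loads(raw)
    for rec in records:
        # IDs in the samples use "ytc_" (comments) or "ytr_" (replies) prefixes.
        if not rec.get("id", "").startswith(("ytc_", "ytr_")):
            raise ValueError(f"unexpected id format: {rec.get('id')!r}")
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{rec['id']}: bad {dim} value {rec.get(dim)!r}")
    return records


# Usage with one record in the shape shown above:
raw = (
    '[{"id":"ytc_UgjUKMnhflFwrHgCoAEC","responsibility":"none",'
    '"reasoning":"consequentialist","policy":"none","emotion":"indifference"}]'
)
records = parse_coding_response(raw)
```

Validating against a fixed vocabulary like this catches the common failure mode where the model emits an out-of-codebook label, which would otherwise silently pollute downstream counts.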