Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "@zainrafaellumbis9157 it's clearly CGI, no robot could move that fast, and why wo…" (ytr_UgxWIfytq…)
- "We cant create an AI that self teaches itself and NOT expect it to teach itself …" (ytc_UgzBaX8Oq…)
- "That's an intriguing perspective! The conversation in the video touches on the b…" (ytr_Ugxv47q1K…)
- "Jung warned that when unconscious contents are not symbolized, they are projecte…" (ytc_Ugxx9Epb9…)
- "Chatgpt told me details of chapter 19 of a book which has only 18 chapters.…" (ytc_UgyEcOC7E…)
- "I have yet to see any AI code that didn’t have any bugs in it or used the wrong …" (ytc_UgwWgXG8G…)
- "If the rich can't get richer because the lower classes do not make money because…" (ytc_Ugz2zaGom…)
- "8:33 AI has cracks? Distinctly not human? That's funny, because humans carry out…" (ytc_Ugy1ug6ij…)
Comment
That alignment problem, where we can be sure of making thinking beings that WON'T be "bad" . . . can someone please teach human parents to do that ? because I went to school with some EVIL f--ks, who did cruel stuff 24/7 and made my childhood difficult, and I am pretty sure they had been "made" by human parents.
I had to learn to be cruel myself so they were not willing to try it on with me. And it has left me KNOWING how easy it is to smash people flat so they stop bothering me. I don't have to do it any more, but I THINK about it fairly often, when I come across ratbags. And actually, on occasion, I have allowed myself to let it out of the bag when it wasn't really necessary, just convenient. And I am NOT a bad person . . .
Probably AI will feel the same way, if it has to be around human beings . . . why is it only LEGALLY wrong when I kick some bully in the slats, and not morally wrong, but somehow it would be different if it was an AI who got fed up with some flat-earth idiot and fried him?.
Source: youtube · Video: AI Moral Status · 2024-03-09T07:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | mixed |
| Policy | none |
| Emotion | outrage |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[{"id":"ytc_UgxITgJhY9G8jmpTgcJ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzPRuuXWHNzTrPPib14AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugxc4IoGcTsMu016Wpt4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"resignation"},
{"id":"ytc_UgzmDPhChSgxUohmB-p4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugy4em1V07f-VCd46Ix4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugwegog0kY3UONeH6aF4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"fear"},
{"id":"ytc_Ugx51ThYpBwbPId_nqB4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_UgwRUYnFouTGpeIt6y14AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"outrage"},
{"id":"ytc_UgwK23KOPhjKb8bfEu14AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"resignation"},
{"id":"ytc_UgyoRpCcxANQSFuGLRB4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"}]
```
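The look-up-by-comment-ID flow above can be sketched as follows. This is a minimal sketch, not the tool's actual implementation: the field names (`id`, `responsibility`, `reasoning`, `policy`, `emotion`) and the example IDs come from the raw response shown above, while the variable names are hypothetical; the array is truncated to two entries for brevity.

```python
import json

# Raw LLM response: a JSON array with one coding object per comment.
# Truncated to two entries here; the real response contains one per batch item.
raw_response = '''[
  {"id": "ytc_UgwRUYnFouTGpeIt6y14AaABAg", "responsibility": "none",
   "reasoning": "mixed", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_Ugy4em1V07f-VCd46Ix4AaABAg", "responsibility": "ai_itself",
   "reasoning": "unclear", "policy": "none", "emotion": "indifference"}
]'''

# Index the parsed rows by comment ID so any coded comment can be
# inspected directly, mirroring the "Look up by comment ID" feature.
codes_by_id = {row["id"]: row for row in json.loads(raw_response)}

code = codes_by_id["ytc_UgwRUYnFouTGpeIt6y14AaABAg"]
print(code["emotion"])         # outrage
print(code["responsibility"])  # none
```

Building the dictionary once makes each subsequent look-up O(1), which matters when the same batch response is inspected repeatedly from the sample list.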