Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- ytr_Ugxy0XCzE…: WTF are you talking about? There are not 3 humans in support of each Waymo car o…
- ytc_Ugy64zH0q…: If you compare the neutral networks in lmms to humans brains we dont realy under…
- ytc_UgxxVXC9y…: Funny isn’t it, he gets to declare a warning about his fellow elites activities …
- ytc_UgyEsVyKb…: when you run the AI on ICP the internet computer protocol there are no hacks.…
- ytc_Ugw-q6x7R…: Elon getting the Theil face puffiness…minus the sheen…or with less sheen. Also…p…
- ytc_UgxC83mLO…: So, no video of the cab of the "exit-to-exit" trucks? Drivers hop on at the exit…
- ytc_UgwOquxPq…: AI artists need to be sent to some room without any internet,,js paper and penci…
- ytc_UgxfGU5AT…: New age AI , welcome to the 25th century.. send them out.. but I do think a con…
Comment
The male robot speak of singularity in 2029 a year before the 2030 Agenda. Google definition: "The term singularity describes the moment when a civilization changes so much that its rules and technologies are incomprehensible to previous generations. Think of it as a point-of-no-return in history." I guess this is where humans go when we focus on IQ rather than emotional intelligence. Home school yr kids!!
- Platform: youtube
- Video: AI Moral Status
- Posted: 2020-01-18T22:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | mixed |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:59.937377 |
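Each coded comment carries four dimensions: responsibility, reasoning, policy, and emotion. A minimal validation sketch is shown below; note that the allowed-value sets only contain the values visible on this page, since the full codebook is not shown here, so treat them as assumptions rather than the complete schema.

```python
# Allowed values per dimension, as observed on this page only.
# ASSUMPTION: the full codebook may define additional categories.
ALLOWED = {
    "responsibility": {"developer", "ai_itself", "unclear"},
    "reasoning": {"consequentialist", "deontological", "contractualist",
                  "mixed", "unclear"},
    "policy": {"ban", "regulate", "liability", "unclear"},
    "emotion": {"fear", "outrage", "mixed", "indifference"},
}

def validate(row: dict) -> list:
    """Return the names of dimensions whose value is missing or unobserved."""
    return [dim for dim, ok in ALLOWED.items() if row.get(dim) not in ok]

# The coding result from the table above, as a flat record.
record = {"responsibility": "developer", "reasoning": "mixed",
          "policy": "regulate", "emotion": "fear"}
print(validate(record))  # → []
```

A record with a value outside the observed sets (or a missing key) would come back with that dimension flagged, which makes it easy to spot malformed model output before it enters the dataset.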
Raw LLM Response
```json
[
  {"id":"ytc_UgynbDnht02zSLzdHiB4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
  {"id":"ytc_UgwTc3t9TGKmwQ9HR9B4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"ban","emotion":"outrage"},
  {"id":"ytc_UgwVgJSbF92NP3NIedJ4AaABAg","responsibility":"developer","reasoning":"mixed","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugyegn1nnKPSz9gC9y54AaABAg","responsibility":"developer","reasoning":"contractualist","policy":"liability","emotion":"mixed"},
  {"id":"ytc_UgxvBixxUQ-aGb7qsLd4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgzRHOWsGmqqX3ugWql4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"ban","emotion":"outrage"},
  {"id":"ytc_UgyXdspCGfVmp_WD3pZ4AaABAg","responsibility":"developer","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_Ugy-n_lUlLne4ZJRUzB4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgxWra8vv9yV_MzAx214AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgwFuGXGv9cGcriJxNd4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"}
]
```
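Since the raw model response is a JSON array keyed by comment ID, looking up any coded comment reduces to parsing the array and building an ID index. A minimal sketch, using an abbreviated two-row response in place of the full batch above (the IDs shown are copied from that output):

```python
import json

# An abbreviated raw model response of the shape shown above:
# a JSON array of coded comments, one object per comment ID.
raw_response = """
[
  {"id": "ytc_UgwVgJSbF92NP3NIedJ4AaABAg", "responsibility": "developer",
   "reasoning": "mixed", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_UgwFuGXGv9cGcriJxNd4AaABAg", "responsibility": "unclear",
   "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"}
]
"""

# Index by comment ID so any coded comment can be fetched directly,
# mirroring the per-comment inspection this panel provides.
coded = {row["id"]: row for row in json.loads(raw_response)}

row = coded["ytc_UgwVgJSbF92NP3NIedJ4AaABAg"]
print(row["policy"])  # → regulate
```

If the model ever returns malformed JSON, `json.loads` raises `json.JSONDecodeError`, which is a natural place to catch and log bad batches before coding results are stored.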