Raw LLM Responses
Inspect the exact model output that produced the coding for any comment.
Look up by comment ID
Random samples — click to inspect
If the mind's thinking and the LLM's operating where the same kind of process, s…
rdc_oerqeyb
The idea that new jobs will pop up to replace the ones automated away is very si…
ytc_Ugy1ntHZg…
Lol you newbies.
Here is how you train an ai to decect glaze and nightshade.
…
ytc_UgwrBpg0o…
There is a guy who draws epic tags with just his eyes. He can't move his body, s…
ytc_UgyUCeIAd…
The Bible warns us about this. Psalms 115:4-8 ““Their idols are silver and gold,…
ytc_Ugzb_FyTS…
This is so fake you can tell the screen is messed up as they are asking this rob…
ytc_UgwMhRXmz…
I also don’t really care because literally as you sent this, I was busy thinking…
ytr_UgwUCvaEv…
200 years from now when Robots running the world because they're the evolution, …
ytc_Ugykqejru…
Comment
No but for real this will lead to the collapse of civilization. Humans will prefer to surround themselves with AI robots rather than actual, fallible humans who may annoy them or disagree with them. The population will quickly decline and humans will have to breed babies in artificial tubes to maintain the population. Child neglect and mental illness will skyrocket as people become completely unwilling to engage in interaction with other humans and the responsibility of raising children will be unbearable for most people. The same thing happened in the mouse utopia experiment conducted in the 50s which found that when mice are given the perfect utopia to live in, they quickly begin to neglect their own babies and the population quickly plummets to zero soon thereafter. I'm tellin you right now.
youtube
AI Moral Status
2023-09-20T18:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_Ugzvf8j0L11NJ2J-RLd4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgwueIkwOAmQv6qGeSx4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
  {"id":"ytc_Ugx2-t8O9mEspvYII_N4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"approval"},
  {"id":"ytc_UgwXeRp4BDYyMV0_ePl4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"indifference"},
  {"id":"ytc_Ugz638dqEYW7usGzUkZ4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgxftIG0HBe38RZ3qw14AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugw3u2oVGDjWmLz-Atd4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgzuEOiZ0_EjJ2tRnz54AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_Ugxd9_mihnWI4yGojK14AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugz5Q_GT7z5zQX746j54AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"ban","emotion":"outrage"}
]
```
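A batch response like the one above can be parsed and indexed by comment ID to drive the lookup shown at the top of this view. A minimal sketch, assuming only the JSON shape shown here; the `index_by_id` helper and the inlined two-row sample are illustrative, not part of the tool:

```python
import json

# The model returns one JSON array per batch; each element codes one comment
# along four dimensions: responsibility, reasoning, policy, emotion.
# Two rows copied from the response above, for illustration.
RAW_RESPONSE = """
[
  {"id": "ytc_Ugz638dqEYW7usGzUkZ4AaABAg",
   "responsibility": "distributed", "reasoning": "consequentialist",
   "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_UgzuEOiZ0_EjJ2tRnz54AaABAg",
   "responsibility": "user", "reasoning": "deontological",
   "policy": "none", "emotion": "outrage"}
]
"""

def index_by_id(raw: str) -> dict:
    """Parse a raw batch response and index the codings by comment ID."""
    return {row["id"]: row for row in json.loads(raw)}

codings = index_by_id(RAW_RESPONSE)
print(codings["ytc_Ugz638dqEYW7usGzUkZ4AaABAg"]["policy"])  # → regulate
```

Indexing once into a dict makes every subsequent ID lookup O(1), which matters when the same batch response is inspected repeatedly.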