Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- @savnet_sinn We dont know the source of human conciousness (at least with empira… (ytr_Ugxtn1dKt…)
- You kidding? I revel in the fact that the chatbot is literally incapable of sayi… (rdc_jidzfoc)
- the "it's not accessible!!" argument is just really stupid and borderline insult… (ytr_UgyQXp6yU…)
- The very fact that the “ai” had the audacity knowing he was about take his own l… (ytc_UgyV0_Vk6…)
- I was quizzing Grok the other day about itself. It turns out that it, and all of… (ytc_UgwXoWFVa…)
- Genius human created AI, then AI surpassed human intelligence, what a remark… (ytc_UgxsMDFMg…)
- bro why is everyone talking bad abt david when they know they cant run a 62 bill… (ytc_UgzA0100B…)
- The only ethical use of AI: autonomously generate images, apply nightshade to th… (ytc_UgwLvKrxd…)
Comment

> If we want them to learn compassion ethics and mortality we have to find those things and agree on them as a planet first otherwise the robots are going to grow up and realize that we suck and it probably just take over once AI becomes more intelligent and is completely empirical in their decision making and their observation of reality we don't stand a chance

| Platform | Video | Posted |
|---|---|---|
| youtube | AI Moral Status | 2021-07-03T17:4… |
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | contractualist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
{"id":"ytc_Ugz2t_028H-OBLfC0yF4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugx5brBmytfG9XoOOIB4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugz0A3H0jgYrCk6er954AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugy_TKbEIWcGqTLi9hd4AaABAg","responsibility":"developer","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzedpGUGR_02wdJ5kt4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_UgyNmorOY5NQqNTH1MB4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyFxHpVizgZGG44jPN4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_Ugya7PyGLb-f9o07qVp4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_Ugzu8GWfBTxTfbggcx94AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_UgxYHQDIvEb7JoijQBR4AaABAg","responsibility":"distributed","reasoning":"contractualist","policy":"regulate","emotion":"fear"}
]
```
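Because the raw response is a valid JSON array, the per-comment lookup shown above can be reproduced directly: parse the array and index it by `id`. A minimal sketch in Python, using the response verbatim (the variable names are illustrative, not part of the tool):

```python
import json

# Raw LLM response, copied verbatim from the page above.
raw_response = """[
{"id":"ytc_Ugz2t_028H-OBLfC0yF4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugx5brBmytfG9XoOOIB4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugz0A3H0jgYrCk6er954AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugy_TKbEIWcGqTLi9hd4AaABAg","responsibility":"developer","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzedpGUGR_02wdJ5kt4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_UgyNmorOY5NQqNTH1MB4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyFxHpVizgZGG44jPN4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_Ugya7PyGLb-f9o07qVp4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_Ugzu8GWfBTxTfbggcx94AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_UgxYHQDIvEb7JoijQBR4AaABAg","responsibility":"distributed","reasoning":"contractualist","policy":"regulate","emotion":"fear"}
]"""

# Index the parsed rows by comment ID for O(1) lookup.
codings = {row["id"]: row for row in json.loads(raw_response)}

# Look up the comment shown in the Coding Result table above.
coding = codings["ytc_UgxYHQDIvEb7JoijQBR4AaABAg"]
print(coding["responsibility"], coding["reasoning"], coding["policy"], coding["emotion"])
# → distributed contractualist regulate fear
```

The printed values match the Coding Result table, confirming that the table is simply one row of this batch response rendered for the selected comment.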