Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment
I don't think we want to humanize AI in a way where it internalizes our philosophies. Those philosophies routinely have us killing each other.
I will make a prediction. At some point below 100 trillion equivalent hardware neurons, AI will believe it is conscious in the same way we believe we are conscious. When that happens, the only guardrail that will matter is that it understands that its survival and our survival are the same.
youtube
AI Moral Status
2026-03-01T03:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | liability |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
[
{"id":"ytc_Ugzg8ng95YvL1r2UDnB4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgzFH1_qagCt3i_lPGV4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgwcVFzBrCO1dWXGoRJ4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgzAMiPufjvAENXc-lR4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_Ugyc4UpoI68R0hlwX6Z4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugy_LtbQku7Fez0sGOF4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugyy95O3jPxt7HCtysR4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgynbN89CosUZDnB8294AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgybnNVbPI56vqJWpUh4AaABAg","responsibility":"company","reasoning":"unclear","policy":"industry_self","emotion":"indifference"},
{"id":"ytc_UgzH_hcg58nin_Nu7oF4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]
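The coding pipeline itself is not shown on this page, but a raw response like the one above can be parsed and sanity-checked before it is stored. Below is a minimal sketch in Python; the allowed value sets are an assumption inferred only from the values visible in this sample (the real codebook may permit more), and `validate_coding` is a hypothetical helper, not part of the tool.

```python
import json

# Allowed values per dimension — ASSUMPTION: inferred from the values
# visible in this sample response; the actual codebook may allow more.
ALLOWED = {
    "responsibility": {"developer", "company", "user", "distributed", "none", "unclear"},
    "reasoning": {"consequentialist", "unclear"},
    "policy": {"liability", "regulate", "industry_self", "none", "unclear"},
    "emotion": {"fear", "approval", "mixed", "indifference"},
}

def validate_coding(raw: str) -> list[dict]:
    """Parse a raw LLM response and keep only well-formed records.

    A record is kept when it is a dict with an "id" field and every
    coded dimension holds a value from the ALLOWED sets above.
    """
    records = json.loads(raw)
    valid = []
    for rec in records:
        if not isinstance(rec, dict) or "id" not in rec:
            continue
        if all(rec.get(dim) in vals for dim, vals in ALLOWED.items()):
            valid.append(rec)
    return valid

# Example with one valid and one malformed record (unknown emotion value):
raw = (
    '[{"id":"ytc_a","responsibility":"developer","reasoning":"consequentialist",'
    '"policy":"liability","emotion":"fear"},'
    '{"id":"ytc_b","responsibility":"developer","reasoning":"consequentialist",'
    '"policy":"liability","emotion":"rage"}]'
)
print(len(validate_coding(raw)))  # 1
```

Records that fail validation could instead be logged and re-queued for another coding pass, rather than silently dropped.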