Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- "That isn't a theory. That is the plot of some books. Roko's Basilisk. Basic…" (rdc_jfatjcw)
- "If the entire world benefits from AI then why does the US have to carry the cost…" (ytc_UgyFg5tF8…)
- "transformer models don't do anything unless you prompt them. You have to put an …" (ytc_Ugyj2E-nv…)
- "Don't waste your time talking to chat bots, which use ungodly amounts of energy …" (ytc_UgxpsrCah…)
- "Yep, i actually already had the case, not with ChatGPT, but with DeepSeek, who h…" (ytr_Ugyhxu8Mx…)
- "You didnt tell the ai to add contrast. Give the AI the parameters and it will fu…" (ytc_UgyfGY5Rk…)
- "Plot twist: This entire TED Talk was created with generative AI trained on TED T…" (ytc_UgyM53xDe…)
- "Full agreement on the opt-in/opt-out framing. That should be the standard. Part…" (ytc_Ugznxb4eZ…)
Comment
This is cool I want a robot now. It's some fascinating technology and those who create it have some good knowledge of what they're doing, I wouldn't know where to begin. Humanities creativity has come a long way for sure since the beginning here on Earth. I'm 26 and it would be so cool to see in my lifetime this technology get better to the point that they seem super realistic and hard to identify as robots at first because of how human they act lol I can't wait to see what advancements they make next. :)
Source: youtube · AI Moral Status · 2021-04-28T07:2… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | unclear |
| Policy | unclear |
| Emotion | approval |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id":"ytc_UgxmntLem7UmQ5AzytZ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgzUqjbOVuqK1PvENDJ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"approval"},
  {"id":"ytc_Ugz8dTGH2ut45pX_aTl4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_Ugy8PKtMQFrxxV3mus54AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_Ugz70bIM0Z4H2aRFpxN4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_UgxBI5spL-YLdzI4_Td4AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_Ugyhqg1t8nwlTV3URix4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgxQsqHq-JMSvagE8C94AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_Ugzx9n9VoosoV2UQ4SF4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"approval"},
  {"id":"ytc_UgzBCwCDB68TbTtcyhJ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"}
]
```
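The raw response is a JSON array with one object per coded comment, each carrying the four dimensions shown in the table above. A minimal sketch of how such a batch could be validated and tallied downstream, assuming only the category values visible in this sample (the full codebook may define more; `ALLOWED` and `validate_batch` are hypothetical names, not part of the actual pipeline):

```python
import json
from collections import Counter

# Allowed values per dimension, inferred solely from the sample batch above.
# This is an assumption: the real codebook may include additional categories.
ALLOWED = {
    "responsibility": {"none", "ai_itself", "developer", "unclear"},
    "reasoning": {"consequentialist", "deontological", "unclear"},
    "policy": {"unclear"},
    "emotion": {"indifference", "approval", "fear", "outrage", "mixed"},
}

def validate_batch(raw: str) -> Counter:
    """Parse one raw LLM response and tally emotions, raising on unknown codes."""
    rows = json.loads(raw)
    for row in rows:
        for dim, allowed in ALLOWED.items():
            if row[dim] not in allowed:
                raise ValueError(f"{row['id']}: unexpected {dim}={row[dim]!r}")
    return Counter(row["emotion"] for row in rows)
```

Rejecting the whole batch on a single out-of-codebook value is a deliberate choice here: silently dropping a row would skew the emotion counts without any visible signal.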