Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect:

- "Ok, then I will just let myself get taught from Teachers, how to u…" — ytr_Ugz_nZBEK…
- "> Now us as a society, are supposed to rely solely on the information provide…" — rdc_n7ohy8x
- "The cop definitely knows it’s a self driving car, he’s probably hoping the camer…" — ytc_Ugz6UGrwI…
- "Not AI but I literally saw at a Walmart something called a smart snail. WTF is a…" — ytc_UgxRkhBs0…
- "So long as we don't AI-automate control over the physical breaker switches and p…" — ytc_UgzPmD-LJ…
- "Eh, I'm still not sure whether this is a good idea. Or if they'll even succeed f…" — ytr_UgxjC_WkD…
- "Others - ahhh i got mini heart attack in last clip / Legends -chil bro it is ai g…" — ytc_UgwqW7T7g…
- "I agree with Zielen. If we develop AI, we could pack that AI into a robot body a…" — ytr_Ugh4g1uNg…
Comment

> Suchir Balaji (November 21, 1998 – November 26, 2024) was an American artificial intelligence researcher who was found dead one month after accusing OpenAI, his former employer, of violating United States copyright law. His death drew widespread attention because of his purported whistleblower status and claims of foul play made by his parents and others.

Source: youtube · AI Moral Status · 2025-11-24T19:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | unclear |
| Policy | liability |
| Emotion | unclear |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
{"id":"ytc_UgwAbR8g9MlxrYktYa14AaABAg","responsibility":"company","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwEpyV7WitLIdW0F3Z4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"ban","emotion":"fear"},
{"id":"ytc_UgxHkFArr9UaugXTwGV4AaABAg","responsibility":"company","reasoning":"unclear","policy":"liability","emotion":"unclear"},
{"id":"ytc_UgzRa4hzfWrF8TWe2Wl4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgwMI_9j6Ut_Lq_viJN4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgwN8H8nj2zPniEJoXl4AaABAg","responsibility":"user","reasoning":"deontological","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgwnGmj3OLjW-GlrpS54AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_Ugz9KjvwZw9a8IgTLP94AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgzRwI56s4x5sRNNsJZ4AaABAg","responsibility":"company","reasoning":"virtue","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugw6tmMN58l_LEXGyKx4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"ban","emotion":"outrage"}
]
```
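The raw response is a JSON array with one coding record per comment, each carrying the four dimensions shown in the table above (`responsibility`, `reasoning`, `policy`, `emotion`). A minimal sketch of the "look up by comment ID" step might parse that array and index it by `id`; the variable names here are illustrative, not taken from any actual tooling:

```python
import json

# Two records copied verbatim from the raw LLM response above.
raw = """[
{"id":"ytc_UgwAbR8g9MlxrYktYa14AaABAg","responsibility":"company","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxHkFArr9UaugXTwGV4AaABAg","responsibility":"company","reasoning":"unclear","policy":"liability","emotion":"unclear"}
]"""

records = json.loads(raw)

# Index by comment ID so a single comment's coding can be inspected directly.
by_id = {rec["id"]: rec for rec in records}

code = by_id["ytc_UgxHkFArr9UaugXTwGV4AaABAg"]
print(code["responsibility"], code["policy"])  # company liability
```

The same index also makes it easy to spot IDs the model skipped: compare `by_id.keys()` against the set of comment IDs that were sent in the batch.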