Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples (truncated previews, with comment IDs):
- "Screw Mark Zuckerturd, Google's Evil Team, Jack Twitter, Silicon Valley Jackhole…" (ytc_Ugxzf_kTt…)
- "These creators are too full of their own stuff. Llm ai plateaued at gpt 4 , not …" (ytc_UgxrucNbe…)
- "some recent research, such as work discussed by political scientist Ryan Burge a…" (rdc_o9u6lk2)
- "Valid points, but I think some facts have been twisted to fit this particular na…" (ytc_UgwQZ5Pbv…)
- "Self driving trucks are about to take truckers jobs so everyone in that career n…" (ytc_UgwpJxfeR…)
- "@brown7180 completely agree, and also ai is just really bad for the environment …" (ytr_UgzgSvB-S…)
- "@zacbuyco3429 Citation needed, peridot, IMO / For e xample, it seems foolish to s…" (ytr_UgyRKFeec…)
- "Thank you for another great interview! I've been a subscriber for a while and lo…" (ytc_UgwF2wwuN…)
Comment

> most excellent! probably out of my price range, but I sure think that would be fun to have one… One that can respond and tell jokes… Did you know that even Alexa can tell jokes… Kids like that a lot but more than a robot I’d like a driverless car so I’m gonna have to save my pennies that’s for sure❤q

youtube · AI Moral Status · 2025-10-29T06:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | unclear |
| Policy | none |
| Emotion | approval |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
  {"id":"ytc_UgyFlK4-_bBUE1MyfNt4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgwAjV90fdAi98T4vox4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
  {"id":"ytc_UgxpyIW55rEqSHMiMaZ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_UgyCDgwxNWJY6UNKPw14AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
  {"id":"ytc_Ugz_c_tq5xSNba9qXNp4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
  {"id":"ytc_Ugz0ZempszscfWXwy9B4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytc_UgyRIxS0SKDZ8yK3ywd4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_UgyvKWyi4x0kBhqWNHh4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwbPxTWJRecIor6hBt4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgxnwsohVQo9zyAz0Jh4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"fear"}
]
```
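Responses in this shape can be checked before they are stored: each record should carry an `id` plus one value per coding dimension. The sketch below is a minimal validator, assuming the allowed values are exactly those seen in the sample above (the real coding scheme may include labels that do not appear here), and `validate_batch` is an illustrative helper name, not part of any tool shown on this page.

```python
import json

# Vocabularies inferred from the sample output above; the actual coding
# scheme may define additional values not observed in this batch.
ALLOWED = {
    "responsibility": {"none", "developer", "ai_itself", "unclear"},
    "reasoning": {"consequentialist", "deontological", "unclear"},
    "policy": {"none", "ban", "regulate", "liability", "unclear"},
    "emotion": {"approval", "fear", "outrage", "indifference", "unclear"},
}

def validate_batch(raw: str) -> list[dict]:
    """Parse a raw LLM batch response and check every coded record.

    Returns the parsed records, or raises ValueError on the first record
    that is missing an id or uses an unknown label for a dimension.
    """
    records = json.loads(raw)
    for rec in records:
        if "id" not in rec:
            raise ValueError(f"record missing 'id': {rec!r}")
        for dim, allowed in ALLOWED.items():
            value = rec.get(dim)
            if value not in allowed:
                raise ValueError(f"{rec['id']}: unexpected {dim}={value!r}")
    return records
```

A record that passes comes back unchanged; one with an out-of-vocabulary label (say `"emotion": "joy"`) raises immediately, which makes silent schema drift in the model output easy to catch.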