Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "I sometimes wonder about this, that would I prefer AI/robot doctors and nurses s…" (`ytc_UgyE9j2bh…`)
- "Let me clear things up for you: OpenAI boss Sam Altman, Mark Zuckerberg, and Bil…" (`ytc_UgyAYArak…`)
- "no , AI won't replace anyone , anytime soon. I work in the field, it is all bull…" (`ytc_UgyI4_Gt_…`)
- "This is just a symptom, not a cause. The fact that people in those situations sw…" (`rdc_nk454or`)
- "All useful content. Props to Google for watermarking their AI generated content.…" (`ytc_UgzMW866n…`)
- "Just another tech person wanting more money to go towards AI....The problem with…" (`ytc_UgycrsZJO…`)
- "@missstripedsocks The way their eyes are always wide and bulging and their teet…" (`ytr_UgzS4X7eP…`)
- "Hold up Is it me or ai making those black kids eat cats or dogs 💀💀💀💀💀 (not tryin…" (`ytc_Ugy0fT5ZJ…`)
Comment
I don't really care about trucking. I don't think we should be worried about people losing their jobs. I think we should be more concerned about how we're going to take that money from the rich people. I also think that it is insane that we are allowing them to test their vehicles on our government-funded highways. I'm sure this is going to end very poorly when people start dying. The other thing I'm very not concerned about is that this is a problem that is likely to only affect the South at least outside of winter. You cannot have self-driving cars and win our environments. They can't get any self-driving cars to work. Why are we trusting it to the type of vehicles that are significantly more dangerous than cars?
youtube · AI Jobs · 2025-08-01T20:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | mixed |
| Policy | liability |
| Emotion | outrage |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id": "ytc_Ugzc-6_BAacPHbQD9yl4AaABAg", "responsibility": "company", "reasoning": "deontological", "policy": "ban", "emotion": "outrage"},
  {"id": "ytc_UgwxlBoPpOw5vEwfTlN4AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "regulate", "emotion": "approval"},
  {"id": "ytc_UgyTNnZMk5iflQOJpT14AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgyhNXSImaSHLpo_rLV4AaABAg", "responsibility": "user", "reasoning": "unclear", "policy": "unclear", "emotion": "unclear"},
  {"id": "ytc_UgxvPNZO593onP-t6sR4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "ban", "emotion": "fear"},
  {"id": "ytc_UgzG4BTlmNozQBn_hoV4AaABAg", "responsibility": "company", "reasoning": "virtue", "policy": "regulate", "emotion": "outrage"},
  {"id": "ytc_UgxTNGxziT2qeo2lBNp4AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "liability", "emotion": "approval"},
  {"id": "ytc_UgwKboBOORBvFRd7JwB4AaABAg", "responsibility": "distributed", "reasoning": "contractualist", "policy": "regulate", "emotion": "resignation"},
  {"id": "ytc_Ugx7fwuQMBFaPZ1l2tR4AaABAg", "responsibility": "developer", "reasoning": "deontological", "policy": "regulate", "emotion": "outrage"},
  {"id": "ytc_UgwemIAWVpFaL6ooaI54AaABAg", "responsibility": "company", "reasoning": "mixed", "policy": "liability", "emotion": "outrage"}
]
```
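The look-up-by-comment-ID step can be sketched as follows. This is a minimal sketch, not the tool's actual implementation: `index_by_comment_id` is a hypothetical helper name, and the field names (`id`, `responsibility`, `reasoning`, `policy`, `emotion`) are taken from the raw response shown above.

```python
import json

def index_by_comment_id(raw_response: str) -> dict:
    """Parse a raw batch-coding response (a JSON array of per-comment
    objects, as returned by the model) and index the entries by comment ID.

    Hypothetical helper; assumes each object carries an "id" field as in
    the response shown above.
    """
    entries = json.loads(raw_response)
    return {entry["id"]: entry for entry in entries}

# Example with two entries in the same shape as the response above.
raw = '''[
  {"id": "ytc_UgxTNGxziT2qeo2lBNp4AaABAg", "responsibility": "company",
   "reasoning": "consequentialist", "policy": "liability", "emotion": "approval"},
  {"id": "ytc_UgwKboBOORBvFRd7JwB4AaABAg", "responsibility": "distributed",
   "reasoning": "contractualist", "policy": "regulate", "emotion": "resignation"}
]'''

coded = index_by_comment_id(raw)
print(coded["ytc_UgxTNGxziT2qeo2lBNp4AaABAg"]["policy"])  # liability
```

Indexing once into a dict makes each subsequent ID lookup O(1), which matters when inspecting many coded comments against a large batch response.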