Raw LLM Responses
Inspect the exact model output for any coded comment, or look up a response by comment ID.

Random samples:
- `ytc_UgyfLfU9p…`: this was such a stupid conversation. just replace the word "practically" with "c…
- `ytr_UgyvHnGic…`: I have no idea how the legal standing of that is, but there's a difference with …
- `ytr_UgxVxjwpU…`: I love how once an AI kills something people are like you should be at the wheel…
- `ytc_UgzwiFSAO…`: LOL -- When Greed is your Creed, pretty words come easily -- this is about stopp…
- `ytc_Ugxi46G8t…`: I honestly dont understand why people hate ai art / It dont steal art it yous trai…
- `ytr_Ugxw3wDEF…`: @jakemccoy that would require humans to manually categorise a vast quantity of t…
- `ytc_Ugz1M11qY…`: AI will never be able to do everything a human being can. Let alone better. It c…
- `rdc_oi3tj63`: Seriously, stop. If you use AI this much and m…
Comment
Let me get this straight.
They are all for driverless trucks that cannot account for the biggest unknowns on the road, ie human drivers and we know people are going to die and probably pretty quickly if they go out without the human waiting in the driver's seat.
Yet they pull over a trucker with brown skin and subjectively decide they are not proficient at English even though they passed the test, and put them out of service.
Gross!
Source: youtube · AI Jobs · 2025-05-28T20:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | government |
| Reasoning | deontological |
| Policy | regulate |
| Emotion | outrage |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id": "ytc_UgxHQ5ejoBVnaU8nAHt4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgwgY-RsgM02TN4jsOF4AaABAg", "responsibility": "company", "reasoning": "deontological", "policy": "regulate", "emotion": "outrage"},
  {"id": "ytc_UgxcQyZ43LF2uS64xzF4AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "liability", "emotion": "outrage"},
  {"id": "ytc_UgwlyqdmmgadM0nuWdN4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "ban", "emotion": "fear"},
  {"id": "ytc_UgyOiNtAcaXHIzT-ZCh4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "ban", "emotion": "fear"},
  {"id": "ytc_UgznIxqg1bLTPA5SMyZ4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgwQQ-OLKkI4wJQx1J94AaABAg", "responsibility": "government", "reasoning": "deontological", "policy": "regulate", "emotion": "outrage"},
  {"id": "ytc_UgyVBQ82fcf6nBkPpVR4AaABAg", "responsibility": "ai_itself", "reasoning": "mixed", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgwQeGgnTH1rhWh1GGl4AaABAg", "responsibility": "company", "reasoning": "deontological", "policy": "regulate", "emotion": "outrage"},
  {"id": "ytc_UgxRtapauDH68WUaPbp4AaABAg", "responsibility": "user", "reasoning": "consequentialist", "policy": "none", "emotion": "fear"}
]
```
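A raw response like the one above can be parsed and indexed by comment ID to support lookups. The sketch below is a hypothetical helper, not part of the tool itself; the allowed label values are inferred only from the examples on this page, not from an authoritative schema.

```python
import json

# Allowed values per dimension, inferred from the coding results shown on
# this page (an assumption, not a confirmed schema).
ALLOWED = {
    "responsibility": {"government", "company", "user", "ai_itself", "none"},
    "reasoning": {"deontological", "consequentialist", "mixed"},
    "policy": {"regulate", "liability", "ban", "none"},
    "emotion": {"outrage", "fear", "approval", "indifference"},
}

def index_response(raw: str) -> dict:
    """Parse a raw LLM response (a JSON array of records) and return a
    {comment_id: labels} mapping, dropping any record whose labels fall
    outside the allowed value sets."""
    coded = {}
    for record in json.loads(raw):
        labels = {dim: record.get(dim) for dim in ALLOWED}
        if all(labels[dim] in ALLOWED[dim] for dim in ALLOWED):
            coded[record["id"]] = labels
    return coded

raw = ('[{"id":"ytc_UgxHQ5ejoBVnaU8nAHt4AaABAg","responsibility":"ai_itself",'
       '"reasoning":"consequentialist","policy":"none","emotion":"indifference"}]')
print(index_response(raw)["ytc_UgxHQ5ejoBVnaU8nAHt4AaABAg"]["emotion"])  # indifference
```

Validating against the allowed sets before indexing means a malformed or hallucinated label is silently dropped rather than stored; a stricter variant could raise on such records instead.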