Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- “I’m concerned about this redraw AI art trend. Are we basically teaching AI how t…” (ytc_UgzXYf4tb…)
- “He wanted to use Ai for the strikes on Iran. Company said No. Thats why Hes so a…” (ytc_UgwmD-UGG…)
- “so other women deserve to get deepfake into porn without consent because some ot…” (ytr_UgwpbbM4O…)
- “Yeah, they a laying people off to pay for the billions they are spending on AI h…” (ytc_UgxEHMyiz…)
- “This is absurd , why are they wasting time debating when theres nothing to debat…” (ytc_Ugw0eXvk9…)
- “*It freaks people out that one of the top AI leaders says AGI will cause a 2 wee…” (ytc_UgzPro5SQ…)
- “@ yeah I get that mate but it’s weird that you said ‘I just looked at the finger…” (ytr_UgxD2nLxa…)
- “Hard to tell about the AI....I'm sure it's memorized everything on the planet...…” (ytc_Ugxlr0bEF…)
Comment

> Well here we go self-driving 18-wheelers. The identity is do they know what humans are across the street. Too many things he go back in a hurry. Who will be held liable for a truck running over a human. Animals starting out in front of the see my trucks such as deer children. Too many things could go back. Without a human behind the wheel. I would say this is probably the most dangerous thing on the planet. I believe if one of the automatic robot trucks kill one person. Those three guys should be locked up in the penitentiary for their entire life for murder. And so should the people who permitted self-driving trucks on the road.

youtube · AI Jobs · 2026-02-03T14:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | liability |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_UgyabJZEk9foAeztmZF4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
  {"id":"ytc_UgwAgdTJaAUUN2YxS1x4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgytcEgFb_4YnXC9kcB4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgzX3OTEGq6942DninZ4AaABAg","responsibility":"company","reasoning":"deontological","policy":"ban","emotion":"outrage"},
  {"id":"ytc_UgzPSbLa_iQ6lniebPh4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgzWzS7arKTQS4g9JD54AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugy5_QEz7u3GaBUc0oV4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwcZ8Z1K1zCZrIAw2p4AaABAg","responsibility":"company","reasoning":"deontological","policy":"ban","emotion":"outrage"},
  {"id":"ytc_Ugw36xDRUKJL2E0LV0h4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytc_UgxgxSBuQYqbDiYZkW54AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"}
]
```
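A response like the one above can be parsed and checked before the codings are stored. The sketch below is a minimal illustration, not the tool's actual implementation: it assumes the response is a JSON array of objects keyed by `id`, and the allowed values per dimension are inferred only from the examples shown here (the real codebook may define more).

```python
import json

# Hypothetical per-dimension vocabularies, inferred from the coded
# examples above; the actual codebook may allow additional values.
VOCAB = {
    "responsibility": {"company", "ai_itself", "none"},
    "reasoning": {"consequentialist", "deontological"},
    "policy": {"regulate", "ban", "liability", "none"},
    "emotion": {"approval", "indifference", "outrage", "fear"},
}

def parse_response(raw: str) -> dict:
    """Parse a raw LLM response and index valid codings by comment ID."""
    items = json.loads(raw)
    coded = {}
    for item in items:
        cid = item.get("id")
        if not cid:
            continue  # skip rows the model emitted without a comment ID
        for dim, allowed in VOCAB.items():
            if item.get(dim) not in allowed:
                raise ValueError(f"{cid}: unexpected {dim} value {item.get(dim)!r}")
        coded[cid] = {dim: item[dim] for dim in VOCAB}
    return coded

# Example: the row that matches the coding-result table above.
raw = ('[{"id":"ytc_Ugw36xDRUKJL2E0LV0h4AaABAg",'
       '"responsibility":"ai_itself","reasoning":"consequentialist",'
       '"policy":"liability","emotion":"fear"}]')
coded = parse_response(raw)
```

Indexing by ID is what makes the "look up by comment ID" view cheap: once validated, `coded["ytc_Ugw36xDRUKJL2E0LV0h4AaABAg"]` returns that comment's four dimension values directly.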