Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples (click to inspect)
I think everything you put in your phone, computers etc goes to A.I. it isn't ju…
ytc_UgyvVT8Gk…
Who will be able to buy a robot...dave and janet?? earning minimum wage with 2 k…
ytc_UgwUwK9nS…
Um, why didn't the tesla merge left for the first emergency vehicle on the shoul…
ytc_UgxBnWcUi…
Hi Bernie, I’m a lifelong conservative but I have a newfound appreciation of you…
ytc_Ugw8x5n4w…
While Tesla interrupts all of them and asks: "Have you all thought about what AI…
ytr_Ugx3G7pzO…
Please actually think or do your research before writing something so mindless a…
ytr_UgzxWvcHE…
People who don’t talk to AI models politely are the same people who abandon trol…
ytc_UgyELGINJ…
@laurentiuvladutmanea how exactly do you know that the current trajectory of AI…
ytr_UgzREamaL…
Comment
My main concern is when/if there are 10% - 20% of self driving cars on the road, how will pedestrians and other road users work out if a car is self driving or not, and how do you make "eye contact" with the "driver" of a self driving car. I think during the transition years towards 90%+ self driving cars things will be very messy. Also, as a fire fighter who has driven over 300 times lights and sirens in a big red truck, I wonder two things. First, how will self driving cars manage emergency services vehicles driving under lights and sirens. And eventually, will we ever get to a time when new firefighter recruits will have to learn how to drive for the first time before they then learn how to drive code 1? Because there will never be self driving fire engines. At least not in my lifetime.
youtube
2025-10-01T11:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
[
{"id":"ytc_UgzvGcjUsUro21ZwU494AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgyNfoAP2Nf4eEHjbUt4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgwdapuSd3U5ncy5KxZ4AaABAg","responsibility":"company","reasoning":"mixed","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgybpTt6P7Xz2TlO17p4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugy5m4aYonipIo0-SCJ4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgwqxGvgvlsglg9HW2Z4AaABAg","responsibility":"company","reasoning":"mixed","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgxXLWx8Pv43I-OYv7Z4AaABAg","responsibility":"company","reasoning":"mixed","policy":"unclear","emotion":"outrage"},
{"id":"ytc_Ugyh97W5XCMyTSFdn0J4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugy8UKcC3TYyQ0w3Uz54AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"approval"},
{"id":"ytc_Ugzu6FOpHYQdkjFhvx54AaABAg","responsibility":"user","reasoning":"deontological","policy":"industry_self","emotion":"indifference"}
]
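The raw response above is a JSON array of per-comment codes, one object per comment ID. A minimal sketch of parsing and indexing such a response in Python (the `ALLOWED` label sets below are inferred from the values visible on this page, not a confirmed codebook, so they may be incomplete):

```python
import json

# Allowed labels per dimension, inferred from the codes shown on this page.
# Assumption: the real codebook may define additional categories.
ALLOWED = {
    "responsibility": {"company", "distributed", "ai_itself", "user", "unclear"},
    "reasoning": {"consequentialist", "deontological", "mixed", "unclear"},
    "policy": {"regulate", "liability", "ban", "industry_self", "unclear"},
    "emotion": {"fear", "outrage", "approval", "indifference", "unclear"},
}

def index_codes(raw: str) -> dict:
    """Parse a raw LLM response (a JSON array of coded comments) and
    return a dict keyed by comment ID, rejecting unknown labels."""
    coded = {}
    for row in json.loads(raw):
        cid = row["id"]
        for dim, allowed in ALLOWED.items():
            if row[dim] not in allowed:
                raise ValueError(f"{cid}: unexpected {dim} value {row[dim]!r}")
        coded[cid] = {dim: row[dim] for dim in ALLOWED}
    return coded

# One row taken from the raw response above.
raw = '''[
  {"id": "ytc_UgyNfoAP2Nf4eEHjbUt4AaABAg",
   "responsibility": "distributed", "reasoning": "consequentialist",
   "policy": "regulate", "emotion": "fear"}
]'''
codes = index_codes(raw)
print(codes["ytc_UgyNfoAP2Nf4eEHjbUt4AaABAg"]["policy"])  # regulate
```

Validating against a fixed label set at parse time catches the occasional off-codebook value an LLM emits before it silently pollutes the coded dataset.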