Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples (truncated previews, with comment IDs):

- `ytr_UgzMHYj6V…` — "Im not even an artist, im just a consumer like all of the rest, but even so, I w…"
- `ytc_UgxJq66Dr…` — "I mean if AI takes over and kills us all, could you really blame it? 😂…"
- `ytc_UgzVA-J9t…` — "As soon as I hear the calculator argument, instantly skip the vid. Dude , AI isn…"
- `ytc_UgyFQhzWk…` — "No , AI cannot be smarter than us , because life it’s not just about logic and w…"
- `ytr_Ugw9H2E3A…` — "Imagine being so uncreative and dumb that you can't even make good art with Ai…"
- `ytc_UgzB0vocg…` — "So let me get this straight. We’ve had Assistants on our phone for 16 years and …"
- `ytc_Ugyh045VV…` — "Now everyone will be a slave no matter what color you are lmao way to go smart h…"
- `ytc_UgxzRI-kp…` — "So happy the guy doesn't have a huge ego and make dress the robot the way he dre…"
Comment
The "Level 3" title is a bit of a marketing trick. While Mercedes takes liability, they only do it because the "box" they allow the car to drive in is tiny. It’s like saying a train is "better" than a car because the driver can sleep—true, but the train can only go where the tracks are.
Mercedes DRIVE PILOT is basically "Traffic Jam Pilot." It won't work at night, it won't work in the rain, it won't work in tunnels, and it can't navigate a single turn or exit. If the traffic speeds up past 95 km/h (or 60 km/h in many areas), the "self-driving" ends.
Tesla FSD is a general-purpose driving AI. It handles the suburban turns, the roundabouts, and the high-speed merges that the Mercedes system isn't even allowed to try. Mercedes has built a very safe, very expensive "slow-motion" cruise control, while Tesla is building a system that actually understands how to navigate the world.
youtube
2026-04-19T14:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | mixed |
| Coded at | 2026-04-27T06:24:53.388235 |
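The four dimensions in the table above recur in every record of the raw LLM response. A minimal validation sketch for one coding record; note the allowed value sets here are inferred only from the records visible on this page, not from the full codebook:

```python
# Dimension -> values observed in the sample codings on this page.
# (Inferred from the visible records; the actual codebook may be larger.)
OBSERVED_VALUES = {
    "responsibility": {"none", "company", "government", "user"},
    "reasoning": {"unclear", "deontological", "consequentialist", "virtue"},
    "policy": {"none", "regulate", "liability"},
    "emotion": {"indifference", "outrage", "mixed", "approval", "fear"},
}

def check_record(rec: dict) -> list[str]:
    """Return a list of problems with one coding record (empty if clean)."""
    problems = []
    if "id" not in rec:
        problems.append("missing id")
    for dim, allowed in OBSERVED_VALUES.items():
        value = rec.get(dim)
        if value not in allowed:
            problems.append(f"{dim}={value!r} not in observed values")
    return problems

# The last record from the raw response shown below.
rec = {"id": "ytc_UgwUGlyXn6Mh2T14CJp4AaABAg", "responsibility": "company",
       "reasoning": "consequentialist", "policy": "liability", "emotion": "fear"}
print(check_record(rec))  # -> []
```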
Raw LLM Response
[
{"id":"ytc_UgxyqvoNmTPYulYaK5Z4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxDqoekMFy-jDF4TBR4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgyxB3riyliF2Fc6oNZ4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"mixed"},
{"id":"ytc_UgwD617gNaS-e9kRnhF4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_Ugw3ka2l-ZTe5wwdiK54AaABAg","responsibility":"government","reasoning":"deontological","policy":"regulate","emotion":"mixed"},
{"id":"ytc_UgyeSOLBytSpW2Rb9oN4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugz-rhMnHLd0YxJ6QrJ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgyKLrXZLtKT_furlnB4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgxGcgVwQ3wm_9iUAvF4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"mixed"},
{"id":"ytc_UgwUGlyXn6Mh2T14CJp4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"fear"}
]
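The raw response is a JSON array with one coding record per comment, which supports the "look up by comment ID" view directly. A minimal sketch, assuming the response text is available as a string (two records from the array above are inlined for illustration):

```python
import json

# Raw LLM response: a JSON array of per-comment coding records.
raw_response = '''
[
  {"id": "ytc_UgxyqvoNmTPYulYaK5Z4AaABAg", "responsibility": "none",
   "reasoning": "unclear", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgyxB3riyliF2Fc6oNZ4AaABAg", "responsibility": "company",
   "reasoning": "consequentialist", "policy": "regulate", "emotion": "mixed"}
]
'''

# Index the records by comment ID for constant-time lookup.
codings = {rec["id"]: rec for rec in json.loads(raw_response)}

rec = codings["ytc_UgyxB3riyliF2Fc6oNZ4AaABAg"]
print(rec["policy"])   # -> regulate
print(rec["emotion"])  # -> mixed
```

Duplicate IDs in the array would silently keep only the last record under this scheme; a stricter loader could assert uniqueness before building the index.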