Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
No Azure for apartheid ! Microsoft’s A.I is being used by Israel for target bomb…
ytc_UgxzQEIvO…
Best part is how they ask a random 80 year old what they think should be done ab…
rdc_k20owoa
What future do humans even want though? As a US resident, looking at the world a…
ytc_UgyeIif_i…
This all just makes me think we could be in a simulation as AI wants to live lik…
ytc_UgzvWeyAS…
AI is bad, yet this guy is capitalizing off of AI. And he thinks it won't eat h…
ytc_UgwtL6RAm…
If you work in a job that requires efficiency and repetition that follows a logi…
ytc_UgyRZlecs…
You'd feel less heartbroken (assuming you're not just exaggerating for effect) i…
ytr_UgxmN_9cd…
I believe they will not harm us. The human will use to harm humans. It's nonsens…
ytc_UgzDeC5s-…
Comment
We bought a Tesla a few weeks ago, and with it came a 30-day trial of FSD. Since we want to know whether we're going to keep FSD, we've been using it a lot. And even though it is automation, there's still quite a big learning curve with it. I know that sounds weird: why do you need to learn how to use a self-driving car? For me, it's like I have to learn how to "think" like the computer driving the car. And as I thought about that over these past few weeks, it's not so much a learning curve as it is me wanting to know psychologically how the car is going to act so I can be safe; it's predictability that I want. I want to know that FSD is going to work and keep us safe.

The only time I've had issues with FSD is when the road, usually an intersection, is atypical and it's a little confusing for the computer, much less any human being, to know exactly where to merge and cross safely. One-way streets are another example. You're not allowed to switch lanes or merge at an intersection, but close to me there's a one-way street that spreads out into multiple lanes, and sometimes FSD has trouble knowing which lane to pick. And of course this intersection has very poor driving lines, many of them faded. When I first moved to this area, I had trouble with that intersection because I wasn't familiar with it, and that's how it is with FSD.

Other issues come up when I summon my Tesla. Again, parking lots can be confusing places. If the car is parked at an angle and I'm behind it, the car needs to go down the one-way "aisle" and then loop around, if you follow me, to get to my location.

Overall, FSD is amazing. When you do buy a Tesla, they go over the quarterly data comparing crashes with FSD and crashes without FSD, and hands-down, the crash numbers without FSD are much, much higher.
youtube
2025-08-28T15:1…
♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | virtue |
| Policy | unclear |
| Emotion | indifference |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_UgyJttEQW7Smz-N-MQp4AaABAg","responsibility":"user","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgwvS3lpzXOz7gyoGuZ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgxynCozCXwThmJbM2F4AaABAg","responsibility":"user","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_Ugw12f3Rxi-bhkZZ7Pt4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugw6VLW-vMv-eiyXF5x4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"indifference"},
{"id":"ytc_UgywY5Uyw0H3Ik-raKJ4AaABAg","responsibility":"media","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgxPDJwg1Wv1FyyHRQ14AaABAg","responsibility":"user","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgyyZghA6c5pR-9L2m94AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgwZtDWfCXkEbYL5upF4AaABAg","responsibility":"user","reasoning":"virtue","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgxMKMGa_P57D2rR7i54AaABAg","responsibility":"user","reasoning":"deontological","policy":"liability","emotion":"fear"}
]
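The raw response above is a JSON array of per-comment codes, one object per comment ID, with one value for each coding dimension. A minimal sketch of how such a batch might be validated before it is stored, assuming the allowed value sets inferred from the values visible on this page (the project's actual codebook may define more categories):

```python
import json

# Allowed values per coding dimension -- inferred from the responses shown
# on this page, not taken from the project's codebook (assumption).
SCHEMA = {
    "responsibility": {"user", "company", "ai_itself", "media", "unclear"},
    "reasoning": {"deontological", "consequentialist", "virtue", "unclear"},
    "policy": {"ban", "regulate", "liability", "unclear"},
    "emotion": {"outrage", "fear", "indifference", "mixed", "unclear"},
}

def validate_coding(raw: str) -> list:
    """Parse a raw LLM response and check every record against SCHEMA.

    Returns the parsed records; raises ValueError if any record is
    missing an id or uses a value outside the allowed set.
    """
    records = json.loads(raw)
    for i, rec in enumerate(records):
        if "id" not in rec:
            raise ValueError(f"record {i}: missing comment id")
        for dim, allowed in SCHEMA.items():
            value = rec.get(dim)
            if value not in allowed:
                raise ValueError(
                    f"record {i} ({rec['id']}): bad {dim} value {value!r}"
                )
    return records

# Hypothetical one-record batch in the same shape as the response above.
raw = ('[{"id":"ytc_x","responsibility":"user","reasoning":"virtue",'
       '"policy":"unclear","emotion":"indifference"}]')
print(len(validate_coding(raw)))  # 1
```

A check like this catches the most common failure mode of batch coding, where the model drifts from the requested label set (e.g. emitting "anger" instead of "outrage"), before the bad record reaches the results table.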