Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
Me: As much as I love AI I hope companies didn’t use much on it I want a hybrid!…
ytc_UgxztiooN…
I am frightened at the proposition of "talking" to Jay Gatsby: he is a character…
ytc_UgzvOVi_D…
Imo the AI isn't doing anything normal humans don't already do, but I understand…
ytc_UgxDk1YfU…
In the end everything is a chain, and if millions of people lose their jobs, then also…
ytc_UgzwTYyDj…
Because they are encryption keys. If you had a single place to register names fo…
rdc_f4zdgq6
Autonomous weapons have been around since WW2, they are called missiles. Movies …
ytc_Ugw1-aTee…
What I want to know is if Robert McDaniel was compensated in any way for being s…
ytc_Ugwj1sv8W…
This is not how software engineers use llms in their workflow. They use it as a …
ytc_UgyzUJHbN…
Comment
It would be useful for some additional context information. I haven't driven an electric vehicle nor specifically a Tesla, so I don't know how the steering operates. Can FSD steer the car without the steering wheel turning? Is it drive-by-wire or is it a mechanical linkage? If you hit a rough section of road, would you feel the steering wheel feedback and have to fight it? Do we know just what that torque is measuring? Is it external torque, or does FSD also apply torque to the steering wheel to effect self driving steering? I must admit, either way, whatever caused the disengagement, it does seem scary that it can be so sudden and catastrophic. Even if it was a knee knocking the steering wheel when going over a bump, if that's enough to disengage FSD like that, then that is a problem with FSD!
youtube
2025-06-01T20:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | unclear |
| Reasoning | unclear |
| Policy | unclear |
| Emotion | indifference |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_UgzrWJstsJEmFO71eMF4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgxHEAKhAO7x8sTXnEF4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugw77GzS8Ksbma-vm_V4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugwe6cOnlNzpecTXGHZ4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgxMrn7d5ofrk9IS5pJ4AaABAg","responsibility":"user","reasoning":"mixed","policy":"none","emotion":"approval"},
{"id":"ytc_UgwmFMrPlU3lDSzydQB4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_UgxOjh51CmaJsTLlpxN4AaABAg","responsibility":"unclear","reasoning":"mixed","policy":"industry_self","emotion":"resignation"},
{"id":"ytc_UgwJLaUt8S2J2BKzTi54AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgwOoqoNlYCMCCvsWq14AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"mixed"},
{"id":"ytc_UgxUxe-Sls7XYXTDvEN4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"}
]
```