Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- Lol ... China being regulated ... They will use gen AI control their population… (ytc_UgxNMNo4T…)
- Aren't robotics and AI great? Now get out in the fields and start picking, you s… (ytc_UgzHRDrGI…)
- Chatgpt can write code and AI is making a lot of coders redundant. Why should a… (ytc_UgxOhIuF2…)
- Self driving cars can act like the roots of trees in the future meaning that sto… (ytc_Ugxic792s…)
- False Dichotomy [ fawls dahy-kot-uh-mee ] Phonetic (Standard) IPA noun… (ytc_UgzTx6QwO…)
- Advent of AI is the "Beginning of Human Oblivion"......AI is nothing but human s… (ytc_UgylTqTdx…)
- America: a nation ruled by lawyers. China: a nation ruled by engineers. Place yo… (ytc_UgznBi8Hc…)
- Without a soul a machine can't surpass humans, we have brain and that is the mos… (ytc_UgxMmEld-…)
Comment
3:10 Does FSD have single point of failure in steering wheel torque sensor? If the torque sensor simply failed and started to show "user is applying heavy torque towards the left", could that alone explain the whole crash? (Compare this to Boeing 737 MAX which automatically rapidly guided the plane towards the ground when a single AoA sensor failed.)
See 4:28 for an example, the steering wheel torque starts to increase with zero change in steering wheel position. I would consider this as an example of sensor failure.
The accelerator pedal and brake pedal typically have dual sensors to avoid single point of failure but how about the torque sensor of Tesla steering wheel?
youtube
2025-06-02T18:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | deontological |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
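A coded record like the one above can be checked against the category vocabulary before it is stored. The sketch below is a minimal validator; the vocabularies are inferred only from values visible in this batch, so the full codebook may define additional categories.

```python
# Category vocabularies inferred from the values visible in this batch;
# the real codebook may include more categories (assumption).
CODEBOOK = {
    "responsibility": {"developer", "company", "user", "ai_itself", "distributed", "none"},
    "reasoning": {"deontological", "consequentialist", "virtue", "unclear"},
    "policy": {"regulate", "liability", "industry_self", "none"},
    "emotion": {"fear", "outrage", "approval", "indifference", "resignation", "mixed"},
}

def validate(record: dict) -> list:
    """Return (dimension, value) pairs that fall outside the codebook."""
    return [(dim, record.get(dim))
            for dim in CODEBOOK
            if record.get(dim) not in CODEBOOK[dim]]

# The coding result shown in the table above:
record = {"responsibility": "developer", "reasoning": "deontological",
          "policy": "regulate", "emotion": "fear"}
print(validate(record))  # [] — every value is in the codebook
```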
Raw LLM Response
[
{"id":"ytc_UgyavX6Egk-rS_jacl14AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgxtV3hpG-wjCaYofSh4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgyvDgzX8rmfrFKW6114AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_Ugybfoa_WO7lTOUiWBh4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugx8f_LaJ0YYqr_DrJZ4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"industry_self","emotion":"resignation"},
{"id":"ytc_Ugz8l773WT9wDfrBuLR4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgxID07JbIE3t7bK5Ql4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"mixed"},
{"id":"ytc_UgyIESFDiXfLMKPRh894AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwKaGEFLWJUP7DRQkx4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgzykjV9gVzkg5rGoG14AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"}
]
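The raw response is a JSON array keyed by comment ID, which makes lookup by comment ID straightforward once the batch is indexed. A minimal sketch, assuming the model always returns the four dimension keys shown above (the two records in `raw_response` are copied verbatim from this batch):

```python
import json

# Two records copied verbatim from the raw LLM response above.
raw_response = '''
[
 {"id":"ytc_Ugz8l773WT9wDfrBuLR4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"fear"},
 {"id":"ytc_UgyvDgzX8rmfrFKW6114AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"}
]
'''

DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def index_by_id(response_text: str) -> dict:
    """Parse the model output and index coded records by comment ID.

    Raises ValueError on a record missing the id or any dimension, so
    malformed model output fails loudly instead of being stored silently.
    """
    coded = {}
    for rec in json.loads(response_text):
        if "id" not in rec or any(dim not in rec for dim in DIMENSIONS):
            raise ValueError(f"malformed record: {rec}")
        coded[rec["id"]] = {dim: rec[dim] for dim in DIMENSIONS}
    return coded

coded = index_by_id(raw_response)
print(coded["ytc_Ugz8l773WT9wDfrBuLR4AaABAg"]["emotion"])  # fear
```

Failing loudly on a malformed record is deliberate: a batch where the model dropped a key should be re-run, not half-ingested.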