Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples

- `ytc_UgzQy4yTf…` — "There will be wars like you have never seen before AI will be destroyed and supe…"
- `ytc_Ugy7osQwG…` — "AI will take the easiest jobs first. My advice is develop a skill that makes you…"
- `ytc_Ugz7zwhyR…` — "It seems to cause the similar fear when the last industrial revolution was comin…"
- `ytr_Ugwh0S6Bc…` — "AI learned from humans... you aren't a trillionaire... so why exactly would it d…"
- `ytc_Ugy7Dq3E9…` — "In a room full of programmers, if there’s 15 and with AI here, we don’t really n…"
- `ytc_Ugwmu-16F…` — "I think they should allow copyrighting AI art. Except it should go to the creato…"
- `ytc_UgzE-FwzX…` — "Ya I agree with the concept of not understanding your codebase if you leverage c…"
- `ytc_UgxKu23J4…` — "I would say AI is the highest psyop by elites to gain control and power over hum…"
Comment
For anyone familiar with Tesla’s R&D from its AI Team (all public on X) this is one of the poorest researched pieces of WSJ. Why would such a piece be published without the mentioning of the FSD (supervised) program and progress in FSD v13 in particular? The quoted expert seems misinformed or completely biased when talking about computer vision.
Autopilot (supervised) is a better lane keeping assistant in line with its specs. The name has always been unfortunate, but how much more should a lane keeping assistant do to warn its user 90 (!!) times to get back onto the wheel. Very similar story to the Apple engineer who tragically died while playing a mobile game.
Source: youtube · Category: AI Harm Incident · Posted: 2024-12-26T15:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | unclear |
| Policy | none |
| Emotion | mixed |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id":"ytc_Ugyzze_I3dyELNAMNfZ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgyAPAB2qH3hbKGCsL94AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgzZNsn0jyFh6PIUS1d4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgzYUwsjav54WDZzkC14AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgxaMOd7Xcuw45OVHZV4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_UgxoZO6iSf-Tidqye8R4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgzwaIyrvLsTp9k3t_p4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgwAcUNaTMCNzdgpEOZ4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugx_r0bYTcaU-Ht2N-J4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgzaWf0129tbl0cDUE14AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"outrage"}
]
```
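A batch response like the one above can be parsed and sanity-checked before it is written back to the coding table. The sketch below is illustrative only: the allowed vocabularies are inferred from the values visible in this sample, not taken from the project's actual codebook, and `parse_coded_batch` is a hypothetical helper name.

```python
import json

# Allowed values per dimension. ASSUMPTION: inferred from values seen in this
# sample output; the real codebook may define more (or different) codes.
ALLOWED = {
    "responsibility": {"none", "company", "user"},
    "reasoning": {"unclear", "deontological", "consequentialist"},
    "policy": {"none", "liability", "regulate"},
    "emotion": {"indifference", "outrage", "mixed", "approval", "fear"},
}

def parse_coded_batch(raw: str) -> list[dict]:
    """Parse one raw LLM response and keep only well-formed records.

    A record is kept if its id looks like a YouTube comment/reply id
    (ytc_/ytr_ prefix) and every dimension holds an in-vocabulary code.
    """
    records = json.loads(raw)
    valid = []
    for rec in records:
        if not rec.get("id", "").startswith(("ytc_", "ytr_")):
            continue  # malformed or missing comment id
        if all(rec.get(dim) in vals for dim, vals in ALLOWED.items()):
            valid.append(rec)
    return valid

# Example: one valid record and one with an out-of-vocabulary code.
raw = json.dumps([
    {"id": "ytc_example1", "responsibility": "none", "reasoning": "unclear",
     "policy": "none", "emotion": "indifference"},
    {"id": "ytc_example2", "responsibility": "alien", "reasoning": "unclear",
     "policy": "none", "emotion": "fear"},
])
print(len(parse_coded_batch(raw)))  # 1
```

Rejected records can then be queued for a retry prompt or manual review rather than silently entering the dataset.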