Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "Copilot has been invaluable for working through Penrose’s Spinors and Space-Time…" (`ytc_UgxOBpPx4…`)
- "I treat AI like a pet. It's not human but it's "sentient enough" for me to form …" (`rdc_mzyehu7`)
- "Did we really need AI? Apart from science, it seems to be creating more problems…" (`ytc_UgyCaz_Md…`)
- "If AI were in good hands, I might be for it but I've only seen maybe 5 good thin…" (`ytc_Ugy1tFv_h…`)
- "@drackar like, at least look for true arguments, and not something thats utterly…" (`ytr_Ugx6la1sm…`)
- "I don't know. Yes, on the face of it you could argue that no driver means no ass…" (`ytc_UgxuQN2gm…`)
- "saw a bunch of those tools trending but Winston AI still manages to detect most …" (`ytc_Ugxs7ZJrt…`)
- "Parsing the 5 C’s that David Sacks lays out, four of them don’t require federal …" (`ytc_Ugx1DpZs1…`)
Comment
It is a lot of fun to watch Tesla fans still coping by saying, «I’m so excited to se where this software (which is far behind the competition) will be three to four years from now». Despite the fact that these cars were supposed to be fully self driving «around this time next year» according to Elon for almost a decade.
It makes me wonder, how long will they still believe in this. How many years until they, just like me lose hope and faith in the project.
Also Waymo uses AI too and giving a five star review to a ride where the car phantom braked and could’ve caused a crash is absolutely wild.
The cybercab using hardware 5 is pretty crazy given that they claimed model 3 with hardware 3 would be able to be fully self driving when the software was finished all the way back in 2017. They clearly had no idea what it would demand for the hardware to be good enough.
youtube
2025-06-26T18:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | outrage |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id":"ytc_Ugze26baVQ0V2adqF1B4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgyVva0FGOcAX0-B5Nt4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_Ugycp641n2x2bJu3tdJ4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytc_Ugzsjx8OVYrfhvmgxCh4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_Ugy3aSNI2m2VlytZYtB4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_Ugz-o5uJDwp8PeDAguF4AaABAg","responsibility":"user","reasoning":"deontological","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgxIgDkuabsuicChKmF4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgzHj2wfBjfxV0Tdx9B4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"approval"},
  {"id":"ytc_Ugy2DmuKv-2XmbSZZtB4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
  {"id":"ytc_Ugz94v7rpQQs2oiUZHl4AaABAg","responsibility":"user","reasoning":"deontological","policy":"unclear","emotion":"mixed"}
]
```
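The raw response is a JSON array with one object per coded comment, keyed by comment ID and carrying the four dimensions from the table above (responsibility, reasoning, policy, emotion). A minimal sketch of the "look up by comment ID" step, assuming the response has exactly this shape (the two rows below are copied from the array above for illustration):

```python
import json

# A fragment of the raw LLM response shown above (two rows, same schema).
raw_llm_response = """
[
  {"id":"ytc_UgyVva0FGOcAX0-B5Nt4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_Ugz-o5uJDwp8PeDAguF4AaABAg","responsibility":"user","reasoning":"deontological","policy":"unclear","emotion":"mixed"}
]
"""

# Index the array by comment ID so any coded comment can be inspected directly.
codes_by_id = {row["id"]: row for row in json.loads(raw_llm_response)}

code = codes_by_id["ytc_UgyVva0FGOcAX0-B5Nt4AaABAg"]
print(code["emotion"])  # → outrage
```

In practice the full array would be loaded from the stored model output rather than inlined; the dictionary index then gives constant-time lookup for whichever comment is being inspected.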