Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
| Comment excerpt | Comment ID |
|---|---|
| “THE FINAL INVENTION LES MAKE ARTIFICIAL INTELLIGENCE THINK FOR THEM SELVES WUT … | ytc_Ugya8xCwh… |
| @MucepheiTheConductor Ok so this didn't happen. Nice try though. Anyone who unde… | ytr_UgzcCqdeB… |
| Umm... that's not whats happening here and this video is years old...when AI was… | ytc_Ugzumfsyb… |
| Nah all computers are glitchy and they wouldn’t trust a robot unless it’s proven… | ytc_UgzH-1lPX… |
| humans crave tangibility. And that is the one thing that AI can not give us, Get… | ytc_UgwR_l218… |
| I will say this, i use an Ai chat bot app called Chai and used it for like a mon… | ytc_UgyiLQVMe… |
| @user-ib6hd5kx1xname Thanks for the comment! It's me, your friendly neighborhood… | ytr_Ugzg7AFVm… |
| Except none of this is free market. There are taxes, bailouts, regulations, bure… | rdc_fn5lxqo |
Comment
_Its called FSD Full Self Driving_
This video is exclusively about Autopilot, not FSD. You are correct that Autopilot is just TACC, plus lane centering; this is why it takes much more monitoring that FSD, and this plus the fact that several times as many Teslas have Autopilot as have FSD is why there have been dozens of fatalities under Autopilot, but only two in 2+ *billion* miles of Teslas driving with FSD engaged. But even with dozens of fatalities, Autopilot still saves more lives than it costs; if this weren't the case, NHTSA would have shut it down ages ago.
youtube · AI Harm Incident · 2025-01-06T03:1… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
{"id":"ytr_UgzfUS0xtiBfuFvfCmB4AaABAg.AD7SBJJ_dIkADwxsQ8WC5f","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"indifference"},
{"id":"ytr_UgweVnr66i0YqUdRuzR4AaABAg.AD0vAaKzzeSAD7YcDmBR5N","responsibility":"company","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"},
{"id":"ytr_UgxURpG2BxlDiPhda_F4AaABAg.ACwUdEX21NxACwVQlWlV-B","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytr_UgxURpG2BxlDiPhda_F4AaABAg.ACwUdEX21NxACwr20fZ4Ng","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytr_Ugwzp58DTN1IBX_8ERB4AaABAg.ACw-TLF_wW2ACwVlsbWevm","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytr_Ugwzp58DTN1IBX_8ERB4AaABAg.ACw-TLF_wW2ACx7dccGkCS","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytr_UgxGdm19dqdWYCXBTRh4AaABAg.ACv87E-W_HcACvwaeRMegZ","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytr_UgxjZse9UwquSPXxBa94AaABAg.ACu1wJvl8CpACuTbc5UP1I","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytr_UgzkI3oblNzQq9goUhl4AaABAg.ACsco95dWTEACsxtfyLlyp","responsibility":"company","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
{"id":"ytr_UgzgJcQ1Wv6HfExc3c54AaABAg.ACqYngCWwk5ACr-o_BUue2","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"resignation"}
]
```
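A minimal sketch of how a raw LLM response in the shape above could be parsed and validated, then keyed by comment ID for the "look up by comment ID" workflow. The allowed value sets are inferred from the samples shown on this page, not from an authoritative codebook, and the `parse_codings` helper name is hypothetical.

```python
import json

# Allowed values per coding dimension, inferred from samples on this page
# (assumption, not an official schema).
ALLOWED = {
    "responsibility": {"none", "company", "developer", "user"},
    "reasoning": {"consequentialist", "deontological"},
    "policy": {"none", "liability", "industry_self", "regulate", "ban"},
    "emotion": {"indifference", "approval", "fear", "outrage", "resignation"},
}

def parse_codings(raw: str) -> dict:
    """Parse a raw LLM response (a JSON array of coded comments) into a
    dict keyed by comment ID, rejecting any out-of-vocabulary value."""
    coded = {}
    for rec in json.loads(raw):
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{rec['id']}: bad {dim!r} value {rec.get(dim)!r}")
        coded[rec["id"]] = {dim: rec[dim] for dim in ALLOWED}
    return coded

# Usage with a hypothetical record in the same shape as the response above.
raw = ('[{"id":"ytr_example","responsibility":"none",'
       '"reasoning":"consequentialist","policy":"none",'
       '"emotion":"indifference"}]')
coded = parse_codings(raw)
print(coded["ytr_example"]["emotion"])  # indifference
```

Keying by ID makes the per-comment lookup a constant-time dict access, which matches how the inspector retrieves a single coded comment.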