Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples (click to inspect)
- a ai more inteligent than humans i guess that be posible in the year 2025… (`ytc_Ugw9bdt9T…`)
- Conflating robotics and AI is confusing. They are two grossly different things. … (`ytc_Ugy95JFpa…`)
- Well since yall live under rocks and are not well versed with AI in general, her… (`ytc_Ugx_PuuN8…`)
- AI is a lightsaber, but some of you are still trying to use it like a butter kni… (`ytc_UgwDV-eUk…`)
- Unfortunately, "residual" and "error" are often used interchangeably in statisti… (`ytr_UgzaZQogf…`)
- @lepidoptera9337 Your joke was implying I'm stupid because I said AI don't eat. … (`ytr_UgzaaAe59…`)
- Every single surveillance technique is in place at comparable extent in USA and … (`ytc_Ugw4aWlTU…`)
- Ai is very good at making derivative worse versions of things actual artist has … (`ytc_UgxdgGA4u…`)
Comment

> Man, this feels like the laziest hit piece I have ever watched. A few points:
>
> 1. the human, if paying attention themselves can prevent all or most accidents
> 2. Autopilot is NOT FSD. One is Cruise control with a few bells, the other is a developing level 2/3 autonomous driving. Properly defining and understanding your terms helps lend credibility to your journalistic pursuit.
> 3. To not seem bias, compare the data across all auto makers with Autopilot-like driving assistance
> 4. finally confirmation bias, if you only look for cases of autopilot going wrong, you will miss the 1000 plus times it went right. Fords crash everyday, you don't see anyone asking for a ban on Ford cars.
youtube · AI Harm Incident · 2024-12-14T01:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | deontological |
| Policy | none |
| Emotion | outrage |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id": "ytc_UgxO7Yy5thLrxDjjDnB4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgzeSpRh3eCXBvP_rZB4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_Ugwm5s3Lm3hB_8dJdVh4AaABAg", "responsibility": "none", "reasoning": "deontological", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_Ugx-NmIdlBtl_38jXFp4AaABAg", "responsibility": "company", "reasoning": "deontological", "policy": "ban", "emotion": "fear"},
  {"id": "ytc_UgxOg2O0Bqjo4GEYU_N4AaABAg", "responsibility": "user", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgxEr5U12jviP2Q9lAB4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_Ugy2KoCVow2wqdNid094AaABAg", "responsibility": "company", "reasoning": "deontological", "policy": "liability", "emotion": "outrage"},
  {"id": "ytc_Ugx0FAravm_KUWK8-Hl4AaABAg", "responsibility": "developer", "reasoning": "deontological", "policy": "liability", "emotion": "outrage"},
  {"id": "ytc_UgzD7sZ_kL4tx_g88iF4AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_UgyGXuqufy4OiabyZjF4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"}
]
```
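The "look up by comment ID" step above amounts to parsing the raw LLM response and indexing it by `id`. A minimal sketch of that lookup, assuming the response is a JSON array shaped like the one shown here (the `lookup` helper name is hypothetical, not part of the tool; the two sample entries are copied from the response above):

```python
import json

# Two entries copied verbatim from the raw LLM response shown above.
RAW_RESPONSE = """[
  {"id": "ytc_UgxO7Yy5thLrxDjjDnB4AaABAg", "responsibility": "none",
   "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_Ugx-NmIdlBtl_38jXFp4AaABAg", "responsibility": "company",
   "reasoning": "deontological", "policy": "ban", "emotion": "fear"}
]"""

def lookup(raw: str, comment_id: str):
    """Return the coding dict for one comment ID, or None if absent."""
    # Index the parsed array by comment ID for O(1) lookup.
    codings = {entry["id"]: entry for entry in json.loads(raw)}
    return codings.get(comment_id)

coding = lookup(RAW_RESPONSE, "ytc_Ugx-NmIdlBtl_38jXFp4AaABAg")
print(coding["emotion"])  # prints "fear"
```

In practice the tool presumably renders these per-dimension values (responsibility, reasoning, policy, emotion) into the Coding Result table shown above.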