Raw LLM Responses
Inspect the exact model output for any coded comment, either by looking it up by comment ID or by browsing the random samples below.

Random samples
- `ytc_UgzlCKttp…`: Notice we aren't trying to make male bots. Ladies, y'all need to get your crap …
- `rdc_iodc04x`: You don’t think anything (not necessarily everything) that goes on in our brains…
- `ytc_UgwHFVS84…`: Honestly this argument is weak. From the moment that the entire AI base is dev…
- `ytc_UgyxM8oVa…`: You may find it fun to poke at AI Albeta, but I'll have you know you will be eat…
- `ytc_Ugyup2n3Q…`: So will an owner operator be contracted to babysit the driverless trucks followi…
- `ytc_UgxAveG1j…`: The issue here is AI can only combine information in mock patterns that exist. T…
- `ytc_Ugxve2CUI…`: imagine a battlefield, where AI marks the target and asks the targeted solder "Y…
- `ytc_Ugx2jAfSU…`: Its our jobs to automate everything , we will therefore work towards that, there…
Comment
In its present form, Autopilot takes the person responsible for operating the vehicle, and makes them a monitor of the system rather than a driver. Humans are notoriously bad at monitoring tasks. So if self driving car systems are to exist, they should be responsible for the entire task, making the person a passenger with the same lack of legal liability as a passenger would have.
youtube · AI Harm Incident · 2025-08-16T04:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | contractualist |
| Policy | regulate |
| Emotion | mixed |
| Coded at | 2026-04-26T23:09:12.988011 |
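A coded row like the one above can be checked against the coding scheme before it is accepted. The sketch below uses only the label values observed on this page (the full codebook may define additional values, so `ALLOWED` is an assumption, not the authoritative schema):

```python
# Allowed values per dimension, inferred from the responses shown on this
# page -- the real codebook may permit more labels than these.
ALLOWED = {
    "responsibility": {"company", "user", "none", "distributed"},
    "reasoning": {"deontological", "consequentialist", "virtue", "contractualist"},
    "policy": {"liability", "regulate", "ban", "none"},
    "emotion": {"outrage", "sadness", "indifference", "approval", "mixed"},
}

def is_valid(code: dict) -> bool:
    """True if every dimension is present and carries an allowed value."""
    return all(code.get(dim) in vals for dim, vals in ALLOWED.items())

# The coding result from the table above passes the check.
row = {"responsibility": "distributed", "reasoning": "contractualist",
       "policy": "regulate", "emotion": "mixed"}
```

A row that is missing a dimension, or that carries a label outside the observed sets, fails the check and can be flagged for re-coding.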
Raw LLM Response
[
{"id":"ytc_Ugxoil0R2Yz_u6Bidsh4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgzixCdZJf_b2MHtdhl4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"regulate","emotion":"sadness"},
{"id":"ytc_Ugy9ui0_R7Mm-07boJx4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgwDcIQCxHFxKg3E5694AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugw_47p-YFVOI89zSGN4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugw3ZsGkLuQit1ZMARt4AaABAg","responsibility":"user","reasoning":"virtue","policy":"ban","emotion":"outrage"},
{"id":"ytc_Ugxre77aaFSu2w4UnJN4AaABAg","responsibility":"user","reasoning":"virtue","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgzPPanZucDAuSozccN4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwoBcvqYoGR-FxwetZ4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgxME6KarRHKXrxQhYx4AaABAg","responsibility":"distributed","reasoning":"contractualist","policy":"regulate","emotion":"mixed"}
]
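The "look up by comment ID" step above can be sketched directly against this response: the raw LLM output is a JSON array of per-comment codes, so indexing it by `id` gives constant-time lookups. Two rows from the response are reproduced here; variable names are illustrative, not the tool's actual code:

```python
import json

# Raw LLM response excerpt: a JSON array of per-comment codes.
raw_response = """[
 {"id":"ytc_Ugxoil0R2Yz_u6Bidsh4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
 {"id":"ytc_UgxME6KarRHKXrxQhYx4AaABAg","responsibility":"distributed","reasoning":"contractualist","policy":"regulate","emotion":"mixed"}
]"""

# Index the parsed rows by comment ID for direct lookup.
codes_by_id = {row["id"]: row for row in json.loads(raw_response)}

code = codes_by_id["ytc_UgxME6KarRHKXrxQhYx4AaABAg"]
print(code["responsibility"], code["policy"])  # distributed regulate
```

The same index also makes it easy to cross-check that every sampled comment ID actually appears in the model's response before displaying a coding result.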