Raw LLM Responses
Inspect the exact model output for any coded comment. Look it up by comment ID, or pick one of the random samples below.
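For bulk inspection outside this page, a minimal lookup sketch in Python is shown below. It assumes, purely for illustration, that raw responses are archived one JSON array per line in a `raw_responses.jsonl` file; the file name and layout are hypothetical, not the project's actual storage.

```python
import json
from pathlib import Path

def find_raw_response(comment_id: str, store: Path = Path("raw_responses.jsonl")):
    """Return (batch, entry) for the first raw batch that coded `comment_id`.

    Assumes each line of `store` is one raw model response: a JSON array of
    per-comment codings shaped like the example at the bottom of this page.
    """
    with store.open(encoding="utf-8") as fh:
        for line in fh:
            batch = json.loads(line)
            for entry in batch:
                if entry.get("id") == comment_id:
                    return batch, entry
    return None, None

# The ID below is the comment inspected further down this page.
batch, entry = find_raw_response("ytc_UgwJAVBinUtvxjiTIWR4AaABAg")
if entry:
    print(entry["responsibility"], entry["emotion"])  # company outrage
```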
Random samples
- "The AI 'artists' Are mad... And that's the end of the sentence. They're always m…" (ytc_UgwGUt8fS…)
- "Arguing about AI is difficult not because people don't understand how AI works, …" (rdc_mzwz8yq)
- "Some day. Man will go to war with robots. Than their will be a movement of human…" (ytc_UgwiXTX6L…)
- "You know what I think the biblical book revelation may have revealed what A.i. a…" (ytc_Ugx8MEhNI…)
- "You’re still using it and that will still train it. You shouldn’t generate AI ar…" (ytr_Ugz0OlgIq…)
- "Maybe it kills us maybe it doesn't, but they will gamble with our lives to make …" (ytc_UgxJJYCzV…)
- "Nobody's work is being stolen in any case. In fact, most AI tools CANNOT recreat…" (ytr_UgyDrNEwO…)
- "Funny how I heard about all of this from 3 different sources but they can only c…" (ytc_UgyL3wwn-…)
Comment
> I think the whole "not marketing AUTO pilot" thing should get enforced a lot more. It's a marketing trick and really not a true statement of what the end user driver will think it is. Buyers are sold the notion of complete automation without the need for any driver input for hours at a time, when the reality is the driver needs to be alert, watching the road and be willing to take over from the car at any moment. Legally and ethically that is what should be happening. Every other car maker will enforce hands on or near the steering wheel when in cruise control, even if the car is semi intelligent and able to adapt itself to curving laneways or potential collisions without driver input for safety reasons, which is basically what Telsa is offering. Either get Telsa to enforce the driver touching the steering wheel ever 30 seconds or less, without some funky viral video work around, or just stop with the buzz words. Preferably both.
youtube · AI Harm Incident · 2022-09-04T03:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | deontological |
| Policy | regulate |
| Emotion | outrage |
| Coded at | 2026-04-27T06:24:59.937377 |
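These four dimensions map naturally onto a small typed record. The sketch below is a hypothetical Python model of the codebook, not the project's actual schema; the label sets list only the values visible on this page (in the table above and the raw response below), so the real codebook may define more.

```python
from dataclasses import dataclass
from datetime import datetime

# Label sets inferred from this page; the real codebook may be larger.
RESPONSIBILITY = {"company", "none", "ai_itself", "distributed"}
REASONING = {"deontological", "consequentialist", "unclear"}
POLICY = {"regulate", "liability", "none"}
EMOTION = {"outrage", "approval", "indifference", "fear", "mixed"}

@dataclass
class CodingResult:
    comment_id: str
    responsibility: str
    reasoning: str
    policy: str
    emotion: str
    coded_at: datetime

    def validate(self) -> None:
        """Raise if any dimension falls outside the known label sets."""
        assert self.responsibility in RESPONSIBILITY
        assert self.reasoning in REASONING
        assert self.policy in POLICY
        assert self.emotion in EMOTION
```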
Raw LLM Response
```json
[
{"id":"ytc_UgyO1_FlZgc7zGEtvQZ4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
{"id":"ytc_UgyfIOXETd9sQUahzfh4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugy_TvcXG6hYfTSwO7p4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgxkvCeiCNT01fCODR14AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgwnarXl0MyRYDBsFg14AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgwJAVBinUtvxjiTIWR4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugys9JRs1nf8MiPHxJR4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_UgydBCva0h09ainfQ9J4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_UgxY3cnI2ChFduxI4t14AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwWHhWFRjz8FklDBGJ4AaABAg","responsibility":"distributed","reasoning":"deontological","policy":"liability","emotion":"outrage"}
]
```
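Each raw response is a JSON array with one object per coded comment, keyed by comment ID. Below is a minimal parsing-and-validation sketch, assuming only the five keys seen above; the strict-key check is illustrative, not the project's actual pipeline.

```python
import json

EXPECTED_KEYS = {"id", "responsibility", "reasoning", "policy", "emotion"}

def parse_batch(raw: str) -> list[dict]:
    """Parse one raw LLM response, rejecting malformed entries."""
    batch = json.loads(raw)
    if not isinstance(batch, list):
        raise ValueError("expected a JSON array of codings")
    for entry in batch:
        missing = EXPECTED_KEYS - entry.keys()
        if missing:
            raise ValueError(f"{entry.get('id', '?')}: missing keys {missing}")
    return batch
```

Applied to the response above, `parse_batch` would return the ten codings unchanged, since every entry carries all five expected keys.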