Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples (click to inspect):

- "AI was supposed to make our lives easier and help us but instead it just took ov…" (ytc_UgwqVmzzb…)
- "@daydreamer8373 You do understand that Tesla vehicles were involved in 53.9% of …" (ytr_UgwdIVEgU…)
- "I use ai to entertain myself. I train models and Loras just for myself. I litera…" (ytc_UgxC1ry-C…)
- "The real issue is that you can't even tell them not to use the destroy the world…" (ytc_UgzxYQRVA…)
- "People make a lot of assumptions. I'm guessing AI hosting has pretty rigid req…" (ytc_Ugx-gRFdV…)
- "The top ten % of the world that was born with the gifts of looks and intelligenc…" (ytc_UgzWU0XIK…)
- "If I were to open an account on Weibo chat or some Chinese Reddit. Would I autom…" (ytc_UgwtvFbw_…)
- "Great approach by OpenAI to fully cooperate for a licensing/approval model. They…" (ytc_UgwDTIPSN…)
Comment
> Would an airline manufacturer be allowed to test and develop their aircraft and new automated technology while flying domestic routes with paying passengers on board?
> Then why is tesla allowed to use public roads as a test bed for their immature, dangerous, unfit for purpose "auto pilot" and "FSD" ????
> Why is tesla allowed to use paying customers AND the general public, who have not given their consent, as beta testers for a dangerous technology implementation?
> tesla should be made to cease and desist, to shut down "auto pilot" and "FSD" immediately. And it should be held liable for damages in accidents they caused.
youtube · AI Harm Incident · 2024-12-16T10:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | deontological |
| Policy | regulate |
| Emotion | outrage |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id":"ytc_Ugw5u5HtLY08NTVVfgt4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgyMaurJKyDiIbNascp4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgxQtIE6uIkffGcwFUF4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugx-0uxc3Wu-BwH7Fed4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
  {"id":"ytc_UgzUe291MA0JNbhSmqx4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"fear"},
  {"id":"ytc_Ugyk1BNOnsfe05bx2dp4AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgzzlB6E3CAoi4migsJ4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"approval"},
  {"id":"ytc_UgyqkZj5-iP5QoGVEK14AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytc_Ugwb12XEGOL00GaWhCR4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgxCAXJSWP6FM0ttC294AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"resignation"}
]
```
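A raw response like the one above has to be parsed and checked before the codes enter the dataset, since the model can emit malformed JSON or out-of-scheme values. A minimal sketch, with the allowed values per dimension inferred only from this sample output (the real coding scheme may include more categories; the function name is hypothetical):

```python
import json

# Allowed values per dimension, inferred from the sample response above.
# Assumption: the actual codebook may define additional categories.
SCHEMA = {
    "responsibility": {"user", "company", "distributed", "unclear"},
    "reasoning": {"deontological", "consequentialist", "unclear"},
    "policy": {"none", "ban", "liability", "regulate", "unclear"},
    "emotion": {"indifference", "approval", "outrage", "fear", "resignation"},
}

def validate_response(raw: str) -> dict[str, dict]:
    """Parse a raw LLM response and index valid records by comment ID.

    Records missing an ID or a dimension, or using a value outside the
    scheme, are skipped so a bad model output never corrupts the dataset.
    """
    coded = {}
    for record in json.loads(raw):
        comment_id = record.get("id")
        if not comment_id:
            continue
        if all(record.get(dim) in allowed for dim, allowed in SCHEMA.items()):
            coded[comment_id] = {dim: record[dim] for dim in SCHEMA}
    return coded
```

Indexing by ID is also what makes the "look up by comment ID" view above cheap: each coded record is a single dict lookup away.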