Raw LLM Responses
Inspect the exact model output for any coded comment; comments can be looked up by their comment ID.
Random samples

- `ytc_UgjSLngEy…`: "Well then you program it to not care for such things as itself. We've already re…"
- `rdc_oh6gmc6`: "AI is already helping regular people nowadays, just look up all the uses that AI…"
- `ytc_UgxUCE-bI…`: "I swear AI artist tend to forget that even if someone was naturally talented or …"
- `ytc_UgyK3SQ5k…`: "How would copyright protection even work for ai art?? Is no one allowed to use t…"
- `ytc_Ugy0C_HA_…`: "Lol, they're not pausing anything. The US is not going to give up being ahead in…"
- `ytc_Ugy_D6_wv…`: "I swear, the fact they make it a double negative, \"blocking AI regulation\" is in…"
- `ytc_UgzyQiQqu…`: "I never believed AI would kill us all, for one reason. Computers are blank slat…"
- `ytc_UgxX_zSqx…`: "This isn't limited to language use, either. A paper came out last year showing t…"
Comment

> FSD is a scam. The technology doesn't work and it puts yourself and everyone else at risk. The evidence is pretty clear that most drivers are better than Tesla autopilot. The part that REALLY bothers me though is that it forces everyone else on the road to take part in the beta test. Those things run over pedestrians and bicyclists that did NOT consent to be part of Tesla's move fast and break things approach.

| Field | Value |
|---|---|
| Platform | youtube |
| Topic | AI Harm Incident |
| Posted | 2024-12-16T12:1… |
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | ban |
| Emotion | outrage |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
[
{"id":"ytc_Ugw5u5HtLY08NTVVfgt4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyMaurJKyDiIbNascp4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxQtIE6uIkffGcwFUF4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"approval"},
{"id":"ytc_Ugx-0uxc3Wu-BwH7Fed4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgzUe291MA0JNbhSmqx4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugyk1BNOnsfe05bx2dp4AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzzlB6E3CAoi4migsJ4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"approval"},
{"id":"ytc_UgyqkZj5-iP5QoGVEK14AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_Ugwb12XEGOL00GaWhCR4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgxCAXJSWP6FM0ttC294AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"resignation"}
]
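The raw response above is a JSON array with one object per coded comment, each carrying the four coding dimensions shown in the Coding Result table. A minimal sketch of how such a response could be parsed and indexed by comment ID (the `parse_codings` helper and the skip-malformed-rows policy are illustrative assumptions, not the tool's actual implementation; the sample rows are taken from the response above):

```python
import json

# Example raw model output: a JSON array with one coding per comment.
# Rows copied from the raw response shown above.
raw_response = """
[
 {"id":"ytc_Ugx-0uxc3Wu-BwH7Fed4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
 {"id":"ytc_UgyqkZj5-iP5QoGVEK14AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"fear"}
]
"""

# The four coding dimensions from the Coding Result table.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def parse_codings(text):
    """Parse a raw LLM response and index codings by comment ID.

    Hypothetical helper: rows missing the ID or any coding dimension
    are skipped, since a model may emit malformed entries.
    """
    by_id = {}
    for row in json.loads(text):
        if not isinstance(row, dict) or "id" not in row:
            continue
        if not all(dim in row for dim in DIMENSIONS):
            continue
        by_id[row["id"]] = {dim: row[dim] for dim in DIMENSIONS}
    return by_id

codings = parse_codings(raw_response)
print(codings["ytc_Ugx-0uxc3Wu-BwH7Fed4AaABAg"]["policy"])  # -> ban
```

Indexing by ID is what makes the "look up by comment ID" view cheap: one parse of the response, then constant-time lookups per comment.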