Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "It’s quite the arrogant double standard when we insist that we must prove that A…" (ytc_Ugz8UHHUJ…)
- "I do not care if an art piece is ai generated or not as I will pay for it if it …" (ytc_UgwT-0Rrp…)
- "Carbon emissions? You're worried about carbon emissions? Foolish human, you sh…" (ytc_UgzKOHIQr…)
- "I just asked ChatGPT before I started watching this video and it brought up this…" (ytc_UgyJqws38…)
- "As a programmer, I'm more concerned about the quality of the code that AI produc…" (ytc_UgzjxAymY…)
- "Also you can't ever fire Ava because she's part of a multi-billion dollar contra…" (ytc_Ugz_jOVI1…)
- "When AI takes over how will we pay for food and shelter? WiFi bills? Electricity…" (ytc_UgxFCRfOc…)
- "Thanks for your enthusiasm! It's exciting to think about the potential of AI bec…" (ytr_Ugx5wxRHb…)
Comment
In that case first talked about the Tesla driver aid should have disengage once acceleration was manually press by driver. That didn’t happen so Tesla is at fault.!!! Elon see safety as holding back profits but he calls it innovation. The software system could easily be updated to disengage upon any drivers actions but that would go against Elon speeches and company advertisements. Drivers would rightfully complain if the system kept disengaging so it’s a clear decision not to disengage the self driving aid.!!! Tesla incompetence has/is causing deaths. The camera only approach to driverless vehicles isn’t safe enough and extra scanners are required to be safe but Elon isn’t a man that admits when he wrong.!!! Ultimately it was cost that made him go with cameras only Not safety.!!!
youtube · AI Harm Incident · 2025-08-16T09:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | deontological |
| Policy | liability |
| Emotion | outrage |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_UgzY9pmELx64ghMC_wV4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzLoINY9kFe03x5mZl4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgyN1EBnNyibJ7O40DB4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugz_jWCDbuSaRQBAPBh4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgwkZkk1SpBCdpTvToZ4AaABAg","responsibility":"company","reasoning":"unclear","policy":"ban","emotion":"indifference"},
{"id":"ytc_Ugz6chf2timwKrLqckJ4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UgzRD7zfqtNLq9FR8ux4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugz8dm3JWe9ATDoddep4AaABAg","responsibility":"company","reasoning":"unclear","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgzHwNntf2AVGseRYtp4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxRV6RhMObFzKd5x5t4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"outrage"}
]
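The raw response above is a JSON array in which each element codes one comment along the four dimensions shown in the coding-result table. A minimal sketch of how such a batch response could be parsed and sanity-checked is below; the helper name `parse_raw_response` is hypothetical, and the allowed vocabularies are inferred only from the values visible in this page's samples (the actual codebook may define more categories).

```python
import json

# Allowed values per coding dimension, inferred from the samples on this
# page; assumption, not the project's authoritative codebook.
ALLOWED = {
    "responsibility": {"none", "company", "user", "distributed"},
    "reasoning": {"consequentialist", "deontological", "virtue", "unclear"},
    "policy": {"none", "regulate", "liability", "ban"},
    "emotion": {"indifference", "outrage", "resignation", "fear"},
}

def parse_raw_response(raw: str) -> list[dict]:
    """Parse a raw LLM batch response and keep only rows whose codes
    fall inside the expected vocabularies; flag the rest."""
    rows = json.loads(raw)
    valid = []
    for row in rows:
        bad = {dim for dim, ok in ALLOWED.items() if row.get(dim) not in ok}
        if bad:
            # Unexpected or missing code: report instead of silently keeping.
            print(f"{row.get('id', '?')}: unexpected value(s) for {sorted(bad)}")
        else:
            valid.append(row)
    return valid
```

Validating at parse time is what makes an inspector page like this useful: a model that drifts outside the codebook (a misspelled category, a missing dimension) is caught per comment ID rather than contaminating the coded dataset.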