Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples

- "AI is the modern day slave without the ethical issues. Another thing is it uses …" (ytc_UgyF7oD9p…)
- "As a non artist, I mostly use these programs for fun, such as getting funny imag…" (ytc_UgypP4ZbE…)
- "You created an artificial intelligence called Grok. Your Grok went on a rant abo…" (ytc_UgxJ6Vv-T…)
- "I've used chatgpt to code, and it's still bad at coding it gives spaghetti code …" (ytc_UgzDMXj1i…)
- "they're talking about the higher error rates for certain groups, so your point a…" (ytr_Ugxp5a8kK…)
- "AI will improve to the point that we cannot tell it isn't real. Then what?…" (ytc_UgyRLPVnm…)
- "We are developing game and our artists are massively using AI. Actually we would…" (ytc_UgyOTPNRN…)
- "7:09 I saw something very similar on YouTube and when I called it out for being …" (ytc_UgybTbE_X…)
Comment
"Autopilot" not lane keep assist
"Full Self Driving" not assisted navigation
It's not just that Teslas give a false sense of security by operating completely fine and then suddenly taking a severe action that can be too quick to counter; it's that they're also advertised in a way that makes drivers think the car knows what it's doing, and that it's as capable as they are at driving.
I know there are warnings about needing to stay alert, but once you get comfortable with the technology, your attention is still reduced, even though you're technically watching and reading what's happening.
Elon Musk constantly talks about how his cars are safer than a human driver and how he's going to make driverless taxis, and even though he pushes it back another year every year, it's still what he's saying.
Stop marketing it as the car doing anything by itself and start making people realize how un-humanlike it can be.
Source: youtube · Topic: AI Harm Incident · 2024-12-14T06:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | deontological |
| Policy | regulate |
| Emotion | outrage |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
[
{"id":"ytc_UgxC-iDt5UtHNFqW3794AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwV1UdqIxGX1EgM0i54AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_UgyX7mh_NtiT0pF9wQx4AaABAg","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgwC9HvADXR4N3F7zgp4AaABAg","responsibility":"user","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugz9Tm7Jb1jmH5KB2PB4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgzxehnkNb7E6xnNhJB4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgwvO7ugcavddqY9rvp4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"},
{"id":"ytc_Ugw7gjfOOEbNLrZ4Hbd4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgxYAX32t4lj0wZ_bfN4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugw2XIfX3uCSzJzQO-t4AaABAg","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"indifference"}
]
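The raw response above is a JSON array with one record per coded comment, each carrying the four coding dimensions shown in the result table. A minimal sketch of how such a response could be parsed and validated, assuming the allowed category sets are exactly the values visible in this response (the full codebook may define more; `parse_coding_response` is a hypothetical helper, not part of any tool shown here):

```python
import json

# Allowed codes per dimension, inferred from the values visible in this
# raw response; the real codebook may contain additional categories.
ALLOWED = {
    "responsibility": {"user", "company", "developer", "unclear"},
    "reasoning": {"consequentialist", "deontological", "mixed", "unclear"},
    "policy": {"none", "liability", "regulate", "industry_self", "unclear"},
    "emotion": {"indifference", "fear", "mixed", "outrage", "approval"},
}


def parse_coding_response(raw: str) -> list[dict]:
    """Parse one raw LLM coding response and validate every record."""
    records = json.loads(raw)
    for rec in records:
        # IDs in this dataset start with ytc_ (comments) or ytr_ (replies).
        if not rec.get("id", "").startswith(("ytc_", "ytr_")):
            raise ValueError(f"unexpected comment id: {rec.get('id')!r}")
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{rec['id']}: bad {dim} value {rec.get(dim)!r}")
    return records


raw = (
    '[{"id":"ytc_UgxC-iDt5UtHNFqW3794AaABAg","responsibility":"user",'
    '"reasoning":"consequentialist","policy":"none","emotion":"indifference"}]'
)
print(parse_coding_response(raw)[0]["emotion"])  # indifference
```

Validating against a fixed category set catches the common failure mode where the model invents an off-codebook label; rejecting the whole batch makes it easy to re-prompt rather than silently storing bad codes.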