Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- ytc_UgwU-UGI8… — "This is not good anymore the artificial intelligence is growing every day, the r…"
- ytc_UgxMQWrXm… — "AI, LLMs, are parrots that repeat knowledge that is publicly available. The mome…"
- ytc_Ugx4-vzQ3… — "Thank you so much for this Vid. Not just for the poisoning, but also because you…"
- ytc_UgyXIvndZ… — "Thanks for these videos on fighting back on AI. It gives me so much hope for the…"
- ytr_UgwkpTT3_… — "@massimilianodelrosso5495 And yet there are countless thefts with just humans do…"
- ytc_UgxzGTmGD… — "AI is like zio jews at first harmless and in need of help then turns into a plag…"
- ytc_UgyRzlQGk… — "Wow this story has really enlightened me on how AI works. So grateful this mom s…"
- ytr_Ugy4psobZ… — "Maybe AI can't handle Enterprise apps now but it will eventually. In 2022 It cou…"
Comment
This is a well-thought-out and even-handed discussion of Tesla’s autonomous driving mode. I have had two Teslas. My first was a Model 3 with autonomous driving mode. I hated the autonomous driving mode on the freeway. I hated it even more in urban areas. The problem is that you *think* the Tesla will make the correct decision, but you can’t be sure, and a mistake could be catastrophic. So instead of driving the car, I was monitoring the car. To me, that was a greater cognitive load than just driving the car, something I’ve been doing for decades. I now have a Model Y without the autonomous mode package. Love the car, am glad autonomous mode is gone. And I agree, a totally misnamed feature.
youtube
AI Harm Incident
2022-09-05T14:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | mixed |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id":"ytc_UgxRApHhC7pNHbTXwLd4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgzXHwffMfIHJ-zeKmR4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"approval"},
  {"id":"ytc_UgyVUMUGeiE6E9Kee5R4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgxwLZQhhmXpc-R-sA54AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgzEJLQTrd8jTfm5Akx4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
  {"id":"ytc_Ugw1eFdXBnSi9_a9Y-p4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"mixed"},
  {"id":"ytc_UgzefrXgA4Ay-Me45394AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgzwZsNy-lXfX3oalQB4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"resignation"},
  {"id":"ytc_Ugy3B5SncS9EK2I4Sld4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgxnHTajOhIDFNl8ztB4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"fear"}
]
```
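A raw response like the one above can be turned into a per-comment lookup table with a few lines of standard-library code. This is a minimal sketch, not the tool's actual implementation: `index_codings` is a hypothetical helper, and the field names (`id`, `responsibility`, `reasoning`, `policy`, `emotion`) are taken from the JSON shown above.

```python
import json

# A shortened raw batch response, in the same shape as the JSON above.
RAW_RESPONSE = """
[
  {"id":"ytc_UgxRApHhC7pNHbTXwLd4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgxnHTajOhIDFNl8ztB4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"fear"}
]
"""

def index_codings(raw: str) -> dict:
    """Parse the model's JSON array and key each coding by its comment id."""
    rows = json.loads(raw)
    return {row["id"]: {k: v for k, v in row.items() if k != "id"} for row in rows}

codings = index_codings(RAW_RESPONSE)
# Look up one coded comment by ID, as the dashboard's "Look up by comment ID" does.
print(codings["ytc_UgxnHTajOhIDFNl8ztB4AaABAg"]["emotion"])  # fear
```

In practice the model output may not be valid JSON on every call, so a real pipeline would wrap `json.loads` in error handling and log unparseable responses rather than assume this happy path.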