## Raw LLM Responses

Inspect the exact model output for any coded comment. Entries can be looked up by comment ID, or drawn from the random samples below.

### Random samples
- `ytc_Ugw2_svn6…`: We should get more " Replacing all CEOs with AI is cheaper " ads going…
- `ytr_UgxIrMQwN…`: Stop using AI. It’s taking way more jobs than immigrants. Amazon just fired 30,0…
- `ytc_UgxPMjNo4…`: Also it’s an echo chamber… ur not gonna b challenged or pushed to become a bette…
- `rdc_jg7e97c`: Not for code, it's not. GPT-4 is legitimately going to challenge a lot of what w…
- `ytr_UgyW6wdZu…`: Exactly. All forms of AI will be better: They don't need vacations, sick pay, r…
- `ytc_UgzsihdIa…`: This is heartbreaking. It sounds like during those long 5 hours, the AI slowly s…
- `ytc_UgzcpKBM5…`: What if you could create a god-like AI on a chip which's transplanted into your …
- `ytc_UgzJGjiEc…`: I told the AI super intelligence to minimize human suffering and it nuked the ea…
### Comment

Platform: youtube · Category: AI Harm Incident · Posted: 2024-12-13T23:3…

> I am shocked to see a venerable newspaper, like the Wall Street Journal be so irresponsible in their reporting as to use data that is years and years old. In the world of technology a year or two can make an enormous difference let alone four years. Version 13, which is now the new full self driving software for Tesla is not even close to the same software that was used in 2016 2019 and 2021. It’s a real shame that you’re putting this information out here and skewing our social opinions in the wrong direction. Humans have an absolutely horrible record of driving and they need to be aided and replaced by artificial intelligence. Just search the stats to prove it. For anyone who is interested in the truth, Google Tesla FSD version 13 to see autonomous driving and where we are at in December 2024.
### Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | deontological |
| Policy | none |
| Emotion | outrage |
| Coded at | 2026-04-27T06:24:59.937377 |
### Raw LLM Response

```json
[
  {"id":"ytc_UgzIIPIbOT8lC9DaRut4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgwIoboltUEFrTh6rEF4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgxkR3ro8PA95i-GNjZ4AaABAg","responsibility":"company","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgxB9qSsC4fPPRdVlf54AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_Ugy2OHKXtD0Zvy372Vh4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgwhUBPWA5efS3_W5E14AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugz8_nBj-Erj0lGw6gF4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgyQs99yqY1vA4rGFBZ4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_Ugxn8xAFQpZ6xqTEllt4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
  {"id":"ytc_Ugzb9znGLEQ5fjI-Akl4AaABAg","responsibility":"company","reasoning":"deontological","policy":"none","emotion":"outrage"}
]
```
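The raw response is a JSON array of per-comment records, one record per coded comment, each carrying the four coding dimensions. A minimal validation sketch in Python is shown below; note that the allowed values are inferred only from the sample output on this page, and the full codebook may define additional categories (an assumption, not confirmed by the source):

```python
import json

# Allowed values per dimension, inferred from the sample response above.
# Assumption: the real codebook may include categories not seen here.
SCHEMA = {
    "responsibility": {"user", "company", "distributed", "none"},
    "reasoning": {"consequentialist", "deontological", "mixed", "unclear"},
    "policy": {"none", "regulate"},
    "emotion": {"outrage", "indifference", "resignation", "approval", "mixed"},
}


def validate_response(raw: str) -> list[dict]:
    """Parse a raw LLM response and check every record against SCHEMA.

    Raises ValueError on a malformed record so bad codings are caught
    before they reach the results table.
    """
    records = json.loads(raw)
    for rec in records:
        if "id" not in rec:
            raise ValueError(f"record missing comment id: {rec}")
        for dim, allowed in SCHEMA.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{rec['id']}: bad {dim!r} value {rec.get(dim)!r}")
    return records


# Example with one record from the response above:
raw = (
    '[{"id":"ytc_UgzIIPIbOT8lC9DaRut4AaABAg","responsibility":"user",'
    '"reasoning":"consequentialist","policy":"none","emotion":"resignation"}]'
)
print(validate_response(raw)[0]["emotion"])  # resignation
```

Rejecting the whole response on any out-of-schema value keeps a single malformed or hallucinated label from silently entering the coded dataset.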