Raw LLM Responses

Inspect the exact model output for any coded comment. Look up a comment by its ID, or pick one of the random samples below.

Random samples:
- `ytc_UgwouQSDD…`: Employer: “Can you develop an AI system for us? Our budget is €1000.” Programmer…
- `rdc_j1yv6no`: I came here to say this! They're a publicly traded company, they could release L…
- `ytc_Ugwx7juPp…`: Robots and hardware is so hard, that's why Mercedes started ripped them out rece…
- `ytc_UgwywFrDW…`: It’s not bc I want to protect the jobs of drivers — though I do worry about eve…
- `ytc_UgzDrUUHn…`: Well, we (humans) are the dinosaurs now...we're on the way out...one day (in the…
- `ytr_UgySfxozj…`: Hey. They want to discourage you. They want to distract you. Try this-- set a ti…
- `ytc_UgzQ1h3W8…`: The next “step” will be integrating super intelligence into human brains. We can…
- `ytc_UgxAir85v…`: I think that the real reason is that the training data is generally better when …
Comment

> Apart from the fact that Elon Musk has _always_ been a narcissistic con-man and has zero engineering degrees, everyone should've known the "autopilot" featuure was useless and dangerous YEARS ago back when, shortly the Tesla cars finally rolled out (after years of only been promised and hyped), a driverless Tesla during a promotion event ran into a journalist without even attempting to brake. And it was revealed that the "person recognition and avoidance software" was basically a DLC that _was not default-installed in the car but had to be ordered and paid for extra,_ like WTF?

youtube · AI Harm Incident · 2025-08-21T12:5… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | virtue |
| Policy | ban |
| Emotion | outrage |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response

```json
[{"id":"ytc_UgyTLKSafrn4zPpZ-Dl4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
 {"id":"ytc_UgxuQD8imFwWDmsoCc54AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"outrage"},
 {"id":"ytc_Ugy6Qt8MoKo04nb6SUR4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
 {"id":"ytc_UgzkRnF-VVM1r7IYV214AaABAg","responsibility":"company","reasoning":"virtue","policy":"ban","emotion":"outrage"},
 {"id":"ytc_Ugyfq8fJT8LwzZRUJ7F4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
 {"id":"ytc_UgyUVM_ZksaPMWSoiYV4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"regulate","emotion":"fear"},
 {"id":"ytc_Ugym3XSbgjlEFoovVuZ4AaABAg","responsibility":"company","reasoning":"mixed","policy":"liability","emotion":"mixed"},
 {"id":"ytc_Ugzze9fmnuzv-ctekxh4AaABAg","responsibility":"distributed","reasoning":"contractualist","policy":"liability","emotion":"approval"},
 {"id":"ytc_UgwP3GYKOK3kJ1gL9Lx4AaABAg","responsibility":"company","reasoning":"virtue","policy":"regulate","emotion":"outrage"},
 {"id":"ytc_UgzA_HfhxAJ5zJpDJTR4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"outrage"}]
```
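A raw response like the one above is a JSON array of per-comment codes across four dimensions (responsibility, reasoning, policy, emotion). Below is a minimal sketch of how such a response could be parsed and sanity-checked before storing the codes. The allowed value sets are only those *observed* in the sample above; the project's actual codebook may permit more values, and `parse_llm_response` is a hypothetical helper, not part of the tool shown.

```python
import json

# Dimension vocabularies observed in the sample response above.
# Assumption: the real codebook may allow additional values.
ALLOWED = {
    "responsibility": {"company", "user", "distributed"},
    "reasoning": {"consequentialist", "deontological", "virtue", "contractualist", "mixed"},
    "policy": {"none", "regulate", "ban", "liability"},
    "emotion": {"indifference", "outrage", "fear", "approval", "mixed"},
}

def parse_llm_response(raw: str) -> dict:
    """Parse a raw model response into {comment_id: codes}, rejecting malformed rows."""
    rows = json.loads(raw)
    coded = {}
    for row in rows:
        comment_id = row.get("id")
        if not comment_id:
            raise ValueError(f"row missing 'id': {row!r}")
        for dim, allowed in ALLOWED.items():
            if row.get(dim) not in allowed:
                raise ValueError(f"{comment_id}: unexpected {dim} value {row.get(dim)!r}")
        coded[comment_id] = {dim: row[dim] for dim in ALLOWED}
    return coded

# Usage with a single-row response (hypothetical comment ID):
raw = '[{"id":"ytc_X","responsibility":"company","reasoning":"virtue","policy":"ban","emotion":"outrage"}]'
print(parse_llm_response(raw)["ytc_X"]["policy"])  # ban
```

Validating against an explicit vocabulary catches the most common failure mode with LLM coders: a plausible-looking but off-codebook label that would silently corrupt downstream counts.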