Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect

- ytc_Ugxvv7ckJ…: It's very clear that The Ai lacks something fundamentally human. The renders in …
- ytc_Ugxc2-G7D…: Nothing will be human until it can get drunk. Those eyes, that neck, those reser…
- ytr_Ugw2-wtuW…: Oh no, bad artists are being ignored, holy crap, the horror. Pfft, lol. When you…
- rdc_kq6y57s: Yeah llm ain't going anywhere cats out of the bag. The void would be filled inst…
- ytc_Ugwsv4bsl…: I am re-reading Blindsight by Peter Watts again and this conversation is the sit…
- rdc_j4x2x8e: Almost all your fears are legitimate. But not #5: "Humans suffer without a purp…
- ytc_UgwpRtuGp…: Techbro npcs calling themselves AI artists is like pulling grains of wheat from …
- ytr_UgyX376j9…: If I force you to create a bomb, who will be liable? Me, right? You aren't the c…
Comment

> "Driverless" driving is being rolled out MUCH too early if you ask me. The technology still has a long ways to go. Will it ever be 100% safe? No. But, when it finally *is* ready - with actual testing being done on closed courses, NOT on public streets (DUH!) - then I have no doubt that fatalities from car accidents will drop drastically, especially as more and more vehicles start using these _rigorously tested_ level 4 and 5 autonomous technologies.

Source: youtube, 2018-03-22T02:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
  {"id":"ytc_UgxX5YFJI2vgy_Zyhyt4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
  {"id":"ytc_Ugwy27zHitEZBoXDeJ54AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugx98EBWYallMqPPMvh4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_UgxiKooY4rKZoh0SSRt4AaABAg","responsibility":"distributed","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_Ugx5G_XUW7ns7-yMxrR4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytc_UgxTD9zju23ZKhg72Ax4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugzj9kIO3ZzulRYXqOd4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_Ugwj5yEuavPRmLqR-4R4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgzdJSUrC_0JQqSHhj94AaABAg","responsibility":"developer","reasoning":"deontological","policy":"ban","emotion":"fear"},
  {"id":"ytc_UgxlsnAQvcFMG9H9p5F4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"outrage"}
]
```
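A raw batch response like the one above needs to be parsed and sanity-checked before its values are trusted in the coding-result table. The sketch below shows one minimal way to do that in Python, assuming the label sets observed in this sample are the allowed ones; the real codebook may define additional values, and `validate_batch` is a hypothetical helper, not part of this tool.

```python
import json

# Allowed labels per dimension, inferred from the sample batch above
# (an assumption -- the actual codebook may permit more values).
ALLOWED = {
    "responsibility": {"company", "developer", "user", "ai_itself", "distributed", "none"},
    "reasoning": {"consequentialist", "deontological", "unclear"},
    "policy": {"regulate", "liability", "ban", "none"},
    "emotion": {"fear", "outrage", "approval", "resignation"},
}

def validate_batch(raw: str) -> list[dict]:
    """Parse a raw LLM response and keep only well-formed records.

    A record is kept when it has an "id" and every coded dimension
    carries one of the allowed labels.
    """
    records = json.loads(raw)
    valid = []
    for rec in records:
        if "id" not in rec:
            continue
        if all(rec.get(dim) in labels for dim, labels in ALLOWED.items()):
            valid.append(rec)
    return valid

raw = ('[{"id":"ytc_example","responsibility":"company",'
       '"reasoning":"consequentialist","policy":"regulate","emotion":"fear"}]')
print(len(validate_batch(raw)))  # prints 1
```

Records that fail validation are silently dropped here; a production pipeline would more likely log them for re-coding, since a missing ID means the coded values cannot be joined back to the source comment.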