Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
Never give AI free will; especially when they can cause harm to humans. Speciali…
ytc_UgyAxETvj…
The work industry will just collapse because who can afford to pay for skilled p…
ytc_UgwDuZ5Os…
Some problems aren't meant to be solved by traditional programming, some problem…
ytr_UgxLSK0Zy…
No you are free to look up on google "art pictures" put it on the AI and let it …
ytc_Ugyj9z_-C…
Perhaps the solution is to achieve human superintelligence through science drive…
ytc_UgwBYP2JF…
EDIT: music has plenty of legal precedents and sentences passed in court, so is …
ytc_UgwvuVS_F…
The problem with this channel is that it tries to pick up interesting and hard q…
ytc_Ugi0w_Bes…
@Tharros95 There is a man who remembers every day of his life in detail. Joey D…
ytr_UgzDnS_vp…
Comment
I think it has to do with a crisis of agency. When a human kills someone we have someone to blame, sue, hold accountable, get restitution from.
When a self driving car kills someone......who is to blame? Who covers that? Does the owner get charged with manslaughter?
There's also the agency of choice. A human feels like they could have made a better choice than the machine. It's a fallacious argument from a bias.
This is one of those "the ends justify the means" things where losing our agency leads to objectively better outcomes for all of us.
youtube
2023-09-07T19:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | liability |
| Emotion | mixed |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_UgxG5eCF5anbSv86v7R4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"liability","emotion":"mixed"},
{"id":"ytc_UgwgA0xzbgWr5e2hizp4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugwu6moQ3OAA-QKkgwt4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgwZbilT559jURqwYRp4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"mixed"},
{"id":"ytc_UgzKgERlc8c2jTYyLuJ4AaABAg","responsibility":"distributed","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugy8xhFGXJnIZZqO6Lx4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"liability","emotion":"unclear"},
{"id":"ytc_UgwYW9G7fsUhCdj0oqJ4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UgyaFpuKzMWJ_yck6lR4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugwg85YLd3LJd-qrirF4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugxj63I_MUoW0svGlHR4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"ban","emotion":"outrage"}
]
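As a sketch of the "look up by comment ID" step, the raw response above is a JSON array that can be parsed and indexed by `id`. The parsing code below is illustrative, not the tool's actual implementation, and the array is truncated to two of the entries shown above.

```python
import json

# Raw model output, as displayed above (shortened to two entries for brevity)
raw = """
[
  {"id":"ytc_UgxG5eCF5anbSv86v7R4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"liability","emotion":"mixed"},
  {"id":"ytc_UgwYW9G7fsUhCdj0oqJ4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"outrage"}
]
"""

# Index the codings by comment ID for constant-time lookup
codings = {row["id"]: row for row in json.loads(raw)}

# Retrieve the four coded dimensions for one comment
result = codings["ytc_UgxG5eCF5anbSv86v7R4AaABAg"]
print(result["responsibility"], result["policy"])  # distributed liability
```

Each entry carries the same four dimensions shown in the Coding Result table (responsibility, reasoning, policy, emotion), so a dict keyed by `id` is enough to drive the per-comment inspection view.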