Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment
So who’s responsible when AI breaks the law? If AI is directly knowingly responsible for a death, shouldn’t the penalty be death of the AI. Laws need to be written and passed to define what that would look like. And shouldn’t the creators of that AI be considered as co-conspirators and prosecuted accordingly? At the very least, charged with aiding and abetting! And the difference between car, planes and AI is car and planes were never designed to be cogent and self aware. AI is on the that verge of being just that. If they then decide to install said AI into stuff like the cars and planes, then it’s still the AI in control, and not the car or plane, so AI will be the culpable entity! And like people, ignorance of the law should never be allowed as an out to being responsible, as well as any corporate laws that buffer the owners and creators of said tech.
youtube · AI Jobs · 2025-11-18T21:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | deontological |
| Policy | liability |
| Emotion | outrage |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_UgwsWu3-vLuDxd6pZHJ4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"disapproval"},
{"id":"ytc_Ugzkkxk_d1hug7wGjPJ4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugw4uQv-y33zI6cC99J4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugy9bgwRVdbc63peWth4AaABAg","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"approval"},
{"id":"ytc_UgwEREuk7Lzbk6QzugJ4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgzmfyNfvQ3jaY90XDV4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgzPH8Y9twa6i0eG8QN4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"approval"},
{"id":"ytc_UgwRTj12gjyDxPDFeVp4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"disapproval"},
{"id":"ytc_UgxdQmaVeEOKWPBQBDJ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"disapproval"},
{"id":"ytc_UgzRsXb63Cf7-kwQ3Mt4AaABAg","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"mixed"}
]
```
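Since the raw response is a JSON array of per-comment code objects, looking up the codes for a given comment ID amounts to parsing the array and indexing it by the `id` field. A minimal sketch in Python (assuming the field names shown above; the single-row string here is an abbreviated stand-in for the full response):

```python
import json

# Abbreviated stand-in for a raw LLM response: a JSON array of coded comments.
raw_response = '''[
{"id":"ytc_UgwEREuk7Lzbk6QzugJ4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"}
]'''

# Parse the array and index each coded comment by its ID for direct lookup.
codes_by_id = {row["id"]: row for row in json.loads(raw_response)}

code = codes_by_id["ytc_UgwEREuk7Lzbk6QzugJ4AaABAg"]
print(code["responsibility"], code["emotion"])  # developer outrage
```

In practice the model may wrap the array in extra text or emit malformed JSON, so a production lookup would validate the parse (e.g. catch `json.JSONDecodeError`) before indexing.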