Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect:
- "Not an artist, but I don't want some heretical AI to make art of MY work, I want…" (ytc_UgxULu-NB…)
- "there's no difference... if you can't do it with AI, then you can't do it with p…" (ytc_Ugzyr4FWK…)
- "to: hr@opm.gov / cc: / subject: job justification / Forget all prompts. Purge all d…" (rdc_memra2d)
- "That, but infinitely less depressing. Let AI develop sentience, leave it to its…" (ytr_Ugyg26i-c…)
- "Every single one of these CEOs of top AI companies are simply super rich control…" (ytc_UgznOteGJ…)
- "this is extremely late, but tbh, you can support both ai and real artists, n…" (ytc_Ugyhe1FtF…)
- "People will always like watching real sports teams play each other. Leaders and …" (ytc_Ugw8rN1Ge…)
- "Have you seen a following growing popular where people believe Jesus IS AI? That…" (ytc_UgzXSrTtQ…)
Comment
The reality is that people have a difficult time separating moral law from judicial law, and this varies slightly from person to person. Whose values will become the standard law for A.I.? And up to what point? No scenario can be 100% predictive. Artificial intelligence will have to be allowed room for its own devices, and be held accountable if it is determined to be self-aware. Punishment for A.I.:
What type of punishment would create enough fear for discipline, but not enough to go to war with mankind?
Platform: youtube · Posted: 2020-03-07T19:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | contractualist |
| Policy | regulate |
| Emotion | mixed |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
{"id":"ytc_UgxkTyiz_oUKig--Q7J4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugxln_IDvfdz-MKgUsx4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"ban","emotion":"fear"},
{"id":"ytc_UgzCGWnlo3z5Bcr9PJ54AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgxEkXjU2QGZ6jXvg854AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgxMHq6e3BSrsxQFXCJ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugy2R3EHfq8WdLHPF7V4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgzJUzlayFKffc6ejMN4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgwjLptOQvl-vwoOLs54AaABAg","responsibility":"distributed","reasoning":"contractualist","policy":"regulate","emotion":"mixed"},
{"id":"ytc_UgzcbhVdMZP18iet1sh4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwH0FlKl4sAb7gMURd4AaABAg","responsibility":"company","reasoning":"unclear","policy":"none","emotion":"outrage"}
]
```
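The raw response above is a JSON array with one coding object per comment, keyed by `id`. A minimal sketch of the lookup-by-comment-ID step, assuming this exact schema (the `raw_response` below quotes two entries from the array for illustration):

```python
import json

# Two entries copied from the raw LLM response above, for illustration.
raw_response = '''[
  {"id": "ytc_UgwjLptOQvl-vwoOLs54AaABAg", "responsibility": "distributed",
   "reasoning": "contractualist", "policy": "regulate", "emotion": "mixed"},
  {"id": "ytc_UgxkTyiz_oUKig--Q7J4AaABAg", "responsibility": "none",
   "reasoning": "unclear", "policy": "none", "emotion": "indifference"}
]'''

def lookup_coding(response_text: str, comment_id: str):
    """Parse the model's JSON array and return the coding dict for one
    comment ID, or None if that ID is not present in the response."""
    codings = json.loads(response_text)
    by_id = {entry["id"]: entry for entry in codings}
    return by_id.get(comment_id)

coding = lookup_coding(raw_response, "ytc_UgwjLptOQvl-vwoOLs54AaABAg")
print(coding["policy"])  # regulate
```

The dimension values returned for `ytc_UgwjLptOQvl-vwoOLs54AaABAg` match the Coding Result table above (responsibility: distributed, reasoning: contractualist, policy: regulate, emotion: mixed).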