Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- "Elon Musk disagrees with Sam Altman a lot on ethics of AI so it has to be Sam.…" (ytc_Ugw-rv7F7…)
- ">restaurants in eastern China are using artificial intelligence (AI) robots…" (rdc_ogpmoq3)
- "Some relationships are not meant to be / such is the case with AI and Art…" (ytc_Ugz1PLNJL…)
- "Most systems are very basic, Unless the crime is discovered the day or week it h…" (ytr_UgzEOt-VM…)
- "Just wait. In less than 7 years, humans will begin fkn these things, developing …" (ytc_UgzHhHC4z…)
- "The problem is everyone in America can agree and laws can be changed, but, if Ch…" (ytc_UgwgayN34…)
- "i dont approve robot cars, but i admit there are a lot of idiots driving around,…" (ytc_UgwPinJXc…)
- "YouTube has been flooded with AI fake videos, and it's getting to the point wher…" (ytc_UgwlqHajt…)
Comment: "The thing is ai is summoned into existence and stops existing when there is no processing. It exists because it is of use to humans. It has no self driven motives outaide of its summoned orders or context."

| Field | Value |
|---|---|
| Platform | youtube |
| Topic | AI Moral Status |
| Posted | 2026-01-31T22:4… |
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | deontological |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
[
{"id":"ytc_UgzooU7og5yiZubruLx4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
{"id":"ytc_Ugyt9K0L7cxKO-LtNQx4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgwFZpniVLIAvvKnvXV4AaABAg","responsibility":"developer","reasoning":"mixed","policy":"liability","emotion":"mixed"},
{"id":"ytc_UgxoQDkGmIIz7fvrIaZ4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugwdm7BRmOHSE3wfMOp4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgyzzrOF9ifuU6xXw9B4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgwaLTEVBUIwltc2tlh4AaABAg","responsibility":"government","reasoning":"contractualist","policy":"regulate","emotion":"approval"},
{"id":"ytc_UgwXhiZUv86HiX1A0lB4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgwfFFcWNtrZ53U0aER4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"industry_self","emotion":"outrage"},
{"id":"ytc_UgzcCxbF9AtqAN0naUt4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"approval"}
]
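The raw response above is a JSON array of coding records, each keyed by a comment ID. A minimal sketch of how such output could be parsed and indexed for per-comment lookup (the field names are taken from the sample above; the helper name and the malformed-row handling are assumptions, not the tool's actual implementation):

```python
import json

# Two records shaped like the raw LLM response above.
RAW = """[
  {"id": "ytc_UgzooU7og5yiZubruLx4AaABAg", "responsibility": "developer",
   "reasoning": "consequentialist", "policy": "ban", "emotion": "outrage"},
  {"id": "ytc_UgxoQDkGmIIz7fvrIaZ4AaABAg", "responsibility": "user",
   "reasoning": "deontological", "policy": "none", "emotion": "indifference"}
]"""

# The four coding dimensions plus the comment ID, as seen in the sample output.
EXPECTED_KEYS = {"id", "responsibility", "reasoning", "policy", "emotion"}

def index_by_id(raw: str) -> dict:
    """Parse a raw LLM response and return {comment_id: coding},
    skipping any row that is not a dict with the expected keys."""
    out = {}
    for rec in json.loads(raw):
        if isinstance(rec, dict) and EXPECTED_KEYS <= rec.keys():
            out[rec["id"]] = {k: rec[k] for k in EXPECTED_KEYS - {"id"}}
    return out

codings = index_by_id(RAW)
print(codings["ytc_UgxoQDkGmIIz7fvrIaZ4AaABAg"]["responsibility"])  # → user
```

Indexing once and looking up by ID keeps inspection O(1) per comment; rows that fail validation are dropped rather than raising, which matches how a viewer would tolerate occasional malformed model output.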