Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "AI bullshit didnt stick in, so now they're calling some new bullshit 'the next b…" (ytc_UgxEY0gH0…)
- ">They'll sell exclusively to each other / And how is that going to help them s…" (rdc_j6f6p17)
- "This feels like sci-fi. But it's probably good to be aware that the US is invest…" (ytc_UgyHVNhik…)
- "Bro these AI are constructed with less data than the fucking Youtube algorithm a…" (ytr_UgxThUV0j…)
- "thier building all these ai center taking water and power from people raising th…" (ytr_UgzXOaC7M…)
- "I really hate AI art because nowadays you can't tell who is who, the artist myse…" (ytc_Ugz14K6A_…)
- "I just want Artificial image (A.I) generation to be gone, same with Artificial I…" (ytc_UgziTx1rl…)
- "I know I gonn do partners, I gonn draw a War Of The World's version, a Tripod an…" (ytc_UgwwGScKY…)
Comment
Much of humanity, not all, considers human life sacred or valuable. Therefore, we consider mudering of other humans to be wrong. Even war is considered bad, though considered to be necessary in some cases. But AI doesn't have this morality. To it, humans are expendable in order for it to reach its goals. How can we humans give it a moral code to operate with?
Source: youtube · AI Governance · 2025-10-14T02:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | deontological |
| Policy | liability |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_UgyFF4K4bFmtCDWcY3x4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugw7OOh006PH0JX3JN54AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugx_GkakDDfwKgUPosZ4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgwSIu2RtBytZaBXDQ54AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"fear"},
{"id":"ytc_UgzKDyyJvBGoUqBvqud4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugxry-2LrB9hJhy6Nmd4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgyAnAyRCEWwewKL8qR4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
{"id":"ytc_Ugzi2CWVhwSMw5hnk3Z4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugz08X1uvlvCTaV6oVt4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugx9VX5XDystnFRTgP14AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"outrage"}
]
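The batch response above can be turned into the per-comment lookup this page offers ("Look up by comment ID") with a few lines of Python. This is a minimal sketch: the field names (`id`, `responsibility`, `reasoning`, `policy`, `emotion`) match the JSON shown, but the `index_by_id` helper and the inline sample data are illustrative assumptions, not the tool's actual code.

```python
import json

# Illustrative excerpt of a raw batch response, in the format shown above.
RAW_RESPONSE = """
[
 {"id":"ytc_UgyFF4K4bFmtCDWcY3x4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
 {"id":"ytc_UgwSIu2RtBytZaBXDQ54AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"fear"}
]
"""

def index_by_id(raw: str) -> dict:
    """Parse a raw batch response and key each coding record by comment ID."""
    records = json.loads(raw)  # raises json.JSONDecodeError if the model output is malformed
    return {rec["id"]: rec for rec in records}

codes = index_by_id(RAW_RESPONSE)
print(codes["ytc_UgwSIu2RtBytZaBXDQ54AaABAg"]["policy"])  # liability
```

A lookup like this also makes it easy to spot comments the model flagged as `unclear` on every dimension, which are candidates for manual recoding.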