Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- As a truckie I wouldn’t feel safe putting my family on the road with self drivin… (ytc_UgxNZZyN4…)
- I'm not into AI. Never have been. We should be fostering HUMAN not artificial li… (ytc_UgwYJhffe…)
- The bigger companies grow the less productive they become. AI as a tool has noth… (ytc_Ugws0TYsg…)
- 1997 movie stoker “what are you 2 doing it could be dangerous come back”. 2025… (ytc_Ugzh5uDyR…)
- I said thia when it first came out. The biggest danger of AI is that dumbass peo… (ytc_UgyA5_Ldg…)
- “As an AI language model, I can’t condone the drinking of Pepsi. It is important… (rdc_jg84jef)
- I knew where this was going but decided to watch the whole video to give the vid… (ytc_UgzBxA-H-…)
- @artistsanomalous7369 Yes but no, the people who hated photography were less… (ytr_UgxwCOP_S…)
Comment
I remember reading the first Dune book, long time ago, and there was something similar to what it is mentioning here.
I might be wrong, but something like an AI getting too powerful and they shut it down and use engineered humans to do calculations or something.
I might be wrong, if any Dune fans can enlighten me if I remember correctly.
P.S have a wonderful day whoever reads this
youtube · AI Moral Status · 2025-12-11T08:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | unclear |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:26:44.938723 |
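
The four dimensions in this table correspond to one record of the batch response shown below. A minimal sketch of that record type, assuming Python; the category sets list only the values observed in this section, and the actual codebook may define more:

```python
from dataclasses import dataclass

# Category values observed in this sample; the full codebook may define more.
RESPONSIBILITY = {"none", "ai_itself", "developer"}
REASONING = {"unclear", "deontological", "consequentialist", "virtue", "mixed"}
POLICY = {"none", "ban", "liability"}
EMOTION = {"indifference", "outrage", "fear", "approval", "mixed"}


@dataclass
class CodedComment:
    """One coded comment: an ID plus the four dimensions in the table above."""
    id: str              # platform-prefixed ID, e.g. "ytc_...", "ytr_...", "rdc_..."
    responsibility: str
    reasoning: str
    policy: str
    emotion: str

    def is_valid(self) -> bool:
        # True only if every dimension uses a value from the sample codebook.
        return (
            self.responsibility in RESPONSIBILITY
            and self.reasoning in REASONING
            and self.policy in POLICY
            and self.emotion in EMOTION
        )
```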
Raw LLM Response
```json
[
{"id":"ytc_UgyAvV2Vqvbq_enMXf54AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UgyNsJR20LNqm4E3SHZ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"ban","emotion":"fear"},
{"id":"ytc_Ugx9Gvwpw-E_USS2FXB4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzrqOgieZ95CWqOO594AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"approval"},
{"id":"ytc_UgzjNQErUTycyR_gasB4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugwou43ZE3fr9tnOp2t4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgydFOukhScKKYun1Dl4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugw-4I6m9BGuDBWSOU14AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"mixed"},
{"id":"ytc_UgylGxOc1DlcSdWHPER4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgyWpU0iEtSVC_C2J8B4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"}
]
```
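
The "Look up by comment ID" feature above resolves a full comment ID to its record in a response like this one. A minimal sketch of that lookup, assuming Python and that the raw response is a valid JSON array; the function and variable names are illustrative, not the tool's actual API:

```python
import json


def index_raw_response(raw: str) -> dict[str, dict]:
    """Parse a raw LLM batch response (a JSON array of coded comments)
    and index the records by comment ID for direct lookup."""
    return {record["id"]: record for record in json.loads(raw)}


# Usage: paste a raw response, then look up one comment by its full ID.
raw = """[
  {"id": "ytc_UgyNsJR20LNqm4E3SHZ4AaABAg", "responsibility": "none",
   "reasoning": "unclear", "policy": "ban", "emotion": "fear"}
]"""
coded = index_raw_response(raw)
print(coded["ytc_UgyNsJR20LNqm4E3SHZ4AaABAg"]["emotion"])  # -> fear
```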