Raw LLM Responses
Inspect the exact model output for any coded comment, or look up a coding directly by comment ID.
Random samples:

- `ytc_UgwO1XUtA…` — "(US) Gen Alpha has 30 million less people in it than Gen Z - a 47% drop in a sin…"
- `rdc_lma10yt` — "2030 -- \"Just learn to love your AI overlord bro. He takes care of you\"…"
- `ytc_UgyGCtS7H…` — "This is wild. It seems to be high quality AI crafted into a story about the dang…"
- `ytc_Ugw7dI6Vi…` — "Robots demanding rights? Unplug them. A robot couldn't actually feel, it's just …"
- `ytc_UgylasSC1…` — "The Statement: The Universal Zero-Day Exploit The Theory: If the universe is a …"
- `ytc_UgzQ6fNa6…` — "I uploaded the 1st act of my screenplay to ChatGPT and it actually wrote a good …"
- `ytr_Ugx8e8zXZ…` — "China is doing both. Actually Chinese AI is being used to build everything up fa…"
- `ytc_UgxhI1vrs…` — "Ironically, you are actually helping the A.I. companies to improve their product…"
Comment
> Personally I don't think that a perfectly nihilistic Ai that is would ever have the goal of killing some stupid chimps,
> But the chances of systems going through catastrophic unintended failure are extremely high
> and cases of people fiddling with systems they don't fully understand and F'ing big time are all over history books
> this should not be debated, it's just a common fact

Source: youtube · Posted: 2020-01-31T20:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | fear |
| Coded at | 2026-04-27T06:26:44.938723 |
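Each coded comment should only take values drawn from the codebook's category sets. A minimal validation sketch in Python, assuming the allowed values inferred from the codings visible on this page (the real codebook may define more categories):

```python
# Category sets inferred from codings shown on this page -- an assumption,
# not the authoritative codebook.
ALLOWED = {
    "responsibility": {"developer", "company", "government",
                       "ai_itself", "distributed", "unclear"},
    "reasoning": {"consequentialist", "deontological", "unclear"},
    "policy": {"none", "regulate", "liability", "unclear"},
    "emotion": {"fear", "outrage", "mixed"},
}

def validate(row: dict) -> list:
    """Return (dimension, value) pairs that fall outside the allowed sets."""
    return [(dim, row.get(dim))
            for dim in ALLOWED
            if row.get(dim) not in ALLOWED[dim]]

# The coding result shown above passes validation:
row = {"responsibility": "developer", "reasoning": "consequentialist",
       "policy": "none", "emotion": "fear"}
print(validate(row))  # []
```

A non-empty return value flags rows where the model drifted outside the codebook, which is worth checking before aggregating results.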
Raw LLM Response
```json
[
  {"id":"ytc_UgzL0pUMLLwL1ct1UcV4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgzPgKgzmi6ht-zyIm54AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgwVoJoWz1X0ALjyI3N4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgyBb7bGF9NefEGD_K14AaABAg","responsibility":"government","reasoning":"deontological","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_Ugz-19ekQVhFElTInsN4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgyMARCQefsn0MDvXTp4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgzR5VxVqXBDei8wyyx4AaABAg","responsibility":"company","reasoning":"deontological","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_Ugzh3M4DPh9TlNuB2Ox4AaABAg","responsibility":"government","reasoning":"deontological","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgxUvujdDSASDkqD3lh4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"fear"},
  {"id":"ytc_UgyM85k0NGfHG-MmcSd4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"}
]
```
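The raw response is a JSON array with one coding per comment, keyed by `id`. The page's lookup-by-comment-ID behaviour can be reproduced offline by indexing such a response; a minimal sketch in Python, using a two-row excerpt of the response shown above:

```python
import json

# Excerpt of the raw batch response shown above (two of the ten rows).
raw_response = """
[
  {"id": "ytc_UgzL0pUMLLwL1ct1UcV4AaABAg", "responsibility": "developer",
   "reasoning": "consequentialist", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_Ugzh3M4DPh9TlNuB2Ox4AaABAg", "responsibility": "government",
   "reasoning": "deontological", "policy": "regulate", "emotion": "fear"}
]
"""

def index_codings(raw: str) -> dict:
    """Parse a raw batch response and key each coding row by its comment ID."""
    return {row["id"]: row for row in json.loads(raw)}

codings = index_codings(raw_response)
print(codings["ytc_Ugzh3M4DPh9TlNuB2Ox4AaABAg"]["emotion"])  # fear
```

Keying by `id` also makes it easy to join the model's codings back onto the source comments when auditing individual coding decisions.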