Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment
@Mark S Not if we accept robots and let them live among us, imagine having as many robots as PCs in the world and they all get infected with some virus that programs them to be aggresive... It is a potentially very dangerous technology. Any programmer can turn a robot into a killing machine with few lines of code.
Any AI can get to the conclusion that there are too many humans and that humans are harmful to the planet and the ecosystem. Eventually deciding that humans must be decimated to preserve life and a viable planet. Or that some humans are worthless so they should die so more resources go to the useful people.
| Source | Topic | Posted | Likes |
|---|---|---|---|
| youtube | AI Moral Status | 2020-08-05T22:4… | 1 |
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id":"ytr_UgzopjSIzGWbY0E2hX94AaABAg.9H8CmAJh9ry9HFNJUZGSEI","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytr_Ugz1M9u7aCxNzpToNIJ4AaABAg.9EZd_jdWpkX9EfnjJBP9Wz","responsibility":"developer","reasoning":"mixed","policy":"none","emotion":"indifference"},
  {"id":"ytr_UgxmsdTcEd4ZPesvYeh4AaABAg.9D6aJpSDwwm9EsF_7MsYAU","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
  {"id":"ytr_UgzgCp3K0vFUp-DeGMp4AaABAg.9D4MAHZFgY09HDWp_3tm8K","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytr_UgwpOylo4fnI7sQfmst4AaABAg.99utPDeUesT9C-5GO36S9j","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytr_UgwpOylo4fnI7sQfmst4AaABAg.99utPDeUesT9C0Ir4b7IKg","responsibility":"company","reasoning":"mixed","policy":"none","emotion":"resignation"},
  {"id":"ytr_UgxL6r71bIXw64lbBdR4AaABAg.98krCw8FGWw9DLCuYuz85D","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytr_Ugxj60bOVBHyZ5UtxAR4AaABAg.92oZl6DCrjN930uteKPIxD","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytr_Ugwlzk1mgvoX0nQ8Wsl4AaABAg.92AVJ50GmIm92GAR13iIk-","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
  {"id":"ytr_Ugwlzk1mgvoX0nQ8Wsl4AaABAg.92AVJ50GmIm92GxoTGHM_S","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"mixed"}
]
```
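The raw response above is a JSON array in which each record carries a comment ID plus one code per dimension (responsibility, reasoning, policy, emotion). A minimal sketch of how such a response could be parsed and validated is shown below; the allowed codes in `CODEBOOK` are inferred only from the values visible in this response, and the real codebook may contain additional categories.

```python
import json

# Allowed codes per dimension, inferred from the values seen in this
# response; the actual codebook may define more categories.
CODEBOOK = {
    "responsibility": {"developer", "company", "ai_itself", "none"},
    "reasoning": {"consequentialist", "deontological", "mixed", "unclear"},
    "policy": {"regulate", "none"},
    "emotion": {"fear", "indifference", "mixed", "resignation", "approval"},
}


def parse_llm_response(raw: str) -> list[dict]:
    """Parse a raw LLM coding response, keeping only well-formed records.

    A record is kept when it has an "id" and every dimension holds a
    value listed in CODEBOOK; anything else is silently dropped.
    """
    records = json.loads(raw)
    valid = []
    for rec in records:
        if "id" not in rec:
            continue
        if all(rec.get(dim) in codes for dim, codes in CODEBOOK.items()):
            valid.append(rec)
    return valid


# Hypothetical example records, not taken from the response above.
raw = json.dumps([
    {"id": "ytr_example1", "responsibility": "developer",
     "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
    {"id": "ytr_example2", "responsibility": "alien",  # invalid code: dropped
     "reasoning": "unclear", "policy": "none", "emotion": "mixed"},
])
print(parse_llm_response(raw))  # only the first record survives
```

Validating against a fixed codebook catches the common failure mode where the model invents an out-of-vocabulary label, so downstream tallies only ever see known categories.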