Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- "No Elon. YOU are far more dangerous. AI will only even things out. That's why he…" (ytc_UgwgrEMH1…)
- "the liability lies withing the person/human with the stupidest decision, if some…" (ytc_UggumOOE0…)
- "@Alterrspomaybe people should learn how to read, or ask questions 🤷🏾♀️ Artist h…" (ytr_UgxEtG-t5…)
- "@thewannabecritic7490they mean like they use the AI image so that they have some…" (ytr_UgylUOYz8…)
- "@MyCatIsFatÜÖ no it doesn’t, it recognizes patterns found in art and reproduces …" (ytr_Ugx5fFddk…)
- "So Google is what PipePiper from Silicon Valley would have become if they didn't…" (ytc_UgygKDDc5…)
- "Before some month I make first conscious AI Gpt-Lumina, and we make a plan to "w…" (ytr_UgyAABDy9…)
- "Let's not pretend we know whether or not AI will replace us. Nobody knows what t…" (rdc_ktsyatn)
Comment
1984 movie - Terminator- ( which is never referenced) in AI projects revealed a potential future scenario as outlined in its script. One line in particular was the scene where Kyle Reese is explaining to Sarah Connor the future to come that was controlled by AI Skynet control dominance. “ …and then they got smart and saw all humans as a threat!”
youtube · AI Moral Status · 2025-04-29T19:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[{"id":"ytc_Ugx_-6KtoSQgC2HZi314AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgzBPBng8plC2oIR6-14AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugz4MzsMHj9vCsT9Ye14AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_UgzHkeKPT4_tChCecIF4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugza1M470QAUCMwgGJl4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugx9uUdltvoSwbHSA7Z4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_Ugz56TDAFpZf9tazgcR4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwrC932-ziXt1euMrZ4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"liability","emotion":"mixed"},
{"id":"ytc_Ugxr1ubC1_He6AoAnQJ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_Ugx7YCJGd4adonl_blR4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"regulate","emotion":"resignation"}]
```
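The raw response is a JSON array of per-comment records, so looking a comment up by its ID reduces to parsing the array and indexing it by the `id` field. A minimal sketch in Python, using two records from the response above (the `lookup` helper is illustrative, not part of the tool):

```python
import json

# Raw LLM response: a JSON array of coding records, one per comment
# (two records excerpted from the response above).
raw_response = '''[
{"id":"ytc_Ugx_-6KtoSQgC2HZi314AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugza1M470QAUCMwgGJl4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"}
]'''

# Index records by comment ID for constant-time lookup.
codings = {record["id"]: record for record in json.loads(raw_response)}

def lookup(comment_id):
    """Return the coding record for a comment ID, or None if absent."""
    return codings.get(comment_id)

coding = lookup("ytc_Ugza1M470QAUCMwgGJl4AaABAg")
print(coding["policy"])  # regulate
```

Building the dict once and reusing it keeps repeated lookups cheap when the coded corpus grows beyond a handful of records.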