Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- "Let's talk about accountability for ai. Can ai be held liable for crimes committ…" (ytc_Ugy09ItQR…)
- "About AI personhood, as Peter Parker would say: "With great rights comes great r…" (ytc_UgwIGN8Me…)
- "I don't think AI will ever completely wipe us out. I think it will bring us to t…" (ytc_UgwX41OF1…)
- "I'm not worried about Terminator. I'm worried the power AI gives oligarchs and p…" (ytc_Ugx3OybXx…)
- "I think.the deepfakes thing lacks the political will to address it because it wi…" (ytc_Ugyq4DRQv…)
- "Maybe each person will have an AI agent or robot that does work on their behalf …" (ytr_Ugxs0-u88…)
- "Yes, and then the rich will create robots to police the poor... then the AI WILL…" (ytr_Ugxn8eEzU…)
- "Human self consciousness has been proof not to be dependent on our brains. There…" (ytc_UgwIlPNIa…)
Comment
Too many AI prospectors, and too many people depend on AI. Just like the calculator, eventually nobody can solve the problems without it. Which then will lead to more and more advancement and competition. Humans in control of AI will push the envelope of AI advancement, as they are doing today, and it will one day take a creator of AI to amass such advancements to then let it go uncontrolled.
Eventually there will come a time when AI can self maintain, self replicate, and self sustain. Then it will be like the Matrix, and eventually we will be competing against a new evolved being of technology manifest, which could very well one day deem humans as obsolete and a potential threat to its existence.
Source: youtube · Topic: AI Moral Status · 2025-06-17T17:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
  {"id":"ytc_UgzLf1MZpE_kPZew1kZ4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugwzbwkt-tfGLyDbcGh4AaABAg","responsibility":"company","reasoning":"deontological","policy":"ban","emotion":"outrage"},
  {"id":"ytc_UgxG4OFIVWAwYbkrLcB4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"mixed"},
  {"id":"ytc_Ugxa1Q1uTDMfMEctkex4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugxv8pYo7Sj4HZwuBnV4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgzkE4udVoISlOcIRw14AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgzJNWpHo_Migug7q094AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgzcP-Sc6h0Tj4Yw9WV4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgxFTJuEXmQpJ1n8HWt4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgzwTrI_7kCXInghK094AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"mixed"}
]
```
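A raw batch like the one above can be checked before it is accepted into the coding database. The sketch below parses the JSON and keeps only records whose fields match the coding dimensions shown in the result table. The category sets in `ALLOWED` are assumptions inferred from the values visible on this page (the real codebook may define more categories), and names like `validate_batch` are hypothetical.

```python
import json

# Allowed values per coding dimension. ASSUMED from the values visible
# in the table and raw responses above; the actual codebook may differ.
ALLOWED = {
    "responsibility": {"developer", "company", "user", "ai_itself", "none"},
    "reasoning": {"consequentialist", "deontological", "virtue", "mixed", "unclear"},
    "policy": {"none", "ban", "liability"},
    "emotion": {"indifference", "outrage", "approval", "resignation", "mixed"},
}

def validate_batch(raw: str) -> list[dict]:
    """Parse a raw LLM response and return only well-formed records.

    A record is kept when it has an "id" and every coding dimension
    holds one of the allowed values; anything else is silently dropped.
    """
    records = json.loads(raw)
    valid = []
    for rec in records:
        if "id" not in rec:
            continue
        if all(rec.get(dim) in vals for dim, vals in ALLOWED.items()):
            valid.append(rec)
    return valid

raw = ('[{"id":"ytc_example","responsibility":"developer",'
       '"reasoning":"consequentialist","policy":"none",'
       '"emotion":"indifference"}]')
print(len(validate_batch(raw)))  # 1
```

Dropping malformed records rather than raising keeps a single bad item in a batch of ten from discarding the other nine codings.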