Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "Wow, what a completely dishonest hit piece. Like can't even represent what his "…" (ytc_UgwLMa-o_…)
- "Self-driving cars are just the tech bro version of trains. They're trying to rei…" (ytc_UgyN1h87C…)
- "as another scripter please shut up, there are many scripters out here who are ag…" (ytr_Ugx5y3yQB…)
- "Never underestimate the will of humanity to survive. There will be rebellion and…" (ytc_Ugz-Lszmu…)
- "You're vastly overestimating the complexity of AI. It is absolutely not intellig…" (ytc_Ugz1WL2QA…)
- "I just don't understand why AI is going to be malevolent. Also, we currently do …" (ytc_UgylRaeZc…)
- "It is both good and bad depending on who runs it. I am an AI developer. And a pa…" (ytc_UgzIT6Vfo…)
- "@benderthefourth3445 The problem is people's ignorance of how beneficial AI is.…" (ytr_UgwJ62VZb…)
Comment
Psychotic people are going to be psychotic. AI is a statistical mirror of ourselves - if you understand it from the base you will know this. LLMs are empty shells and need to be filled with something. We fill it with the human experience, but LLMs don’t understand the human condition. They only model the outputs of the human condition. They don’t feel, interpret, desire, suffer, or reflect.
They simply predict what a human would or could say next. LLMs cannot, and will not ever, replace a medical doctor. They are not designed for that.

In I, Robot, Sonny is an example of AGI, and THAT could one day replace some things humans do today. When Sonny talks about dreaming, that is telling. LLMs don't dream, but AGI could. LLMs are the terminator - they don't care, they don't think, they just do what they are told. Let that sink in for a bit.

Skynet is like VIKI in I, Robot, which took a directive and applied it in a manner that was self-supporting, since that's how it was trained. Since we don't follow ANY of the laws of robotics, we are already on a slippery slope. Asimov understood this principle before anybody even thought LLMs were possible. The first law is clearly violated by current military research, and the rest stand on that pillar. That's why something like Skynet or VIKI could be possible today with LLMs if we do not take the necessary precautions.

Asimov's greatest warning was about losing our personal control over tasks when we automate those tasks - exactly what LLMs are designed to do today. Asimov was warning us against LLMs, or at least against whatever concept in his mind was equivalent in nature.
youtube
AI Harm Incident
2025-11-24T23:0…
♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | deontological |
| Policy | none |
| Emotion | mixed |
| Coded at | 2026-04-27T06:24:59.937377 |
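The table above shows one coded record, with one value per dimension. A minimal validation sketch in Python, assuming the value sets assembled from the responses visible on this page (the full codebook is not shown here, so these sets are an assumption and may be incomplete):

```python
# Allowed values per coding dimension, collected from the values that
# appear in the raw responses on this page; the real codebook may allow more.
OBSERVED_VALUES = {
    "responsibility": {"none", "user", "ai_itself", "company", "developer", "distributed"},
    "reasoning": {"unclear", "consequentialist", "virtue", "deontological", "contractualist"},
    "policy": {"none", "liability"},
    "emotion": {"approval", "indifference", "resignation", "mixed", "outrage"},
}

def validate(record: dict) -> list[str]:
    """Return a list of problems with a coded record (empty if it looks OK)."""
    problems = []
    for dim, allowed in OBSERVED_VALUES.items():
        value = record.get(dim)
        if value is None:
            problems.append(f"missing dimension: {dim}")
        elif value not in allowed:
            problems.append(f"unexpected {dim} value: {value!r}")
    return problems

# The record from the table above passes; a malformed one does not.
ok = validate({"responsibility": "ai_itself", "reasoning": "deontological",
               "policy": "none", "emotion": "mixed"})
bad = validate({"responsibility": "ai_itself", "reasoning": "deontological",
                "policy": "none"})
print(ok)   # []
```

A check like this is useful between the raw LLM response and the database, since coders occasionally emit values outside the codebook.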
Raw LLM Response
[
{"id":"ytc_Ugz5LaNm7X3RDPpiXMB4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgyhH8I5ritVhzHhEWx4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwcAtNJ-bSgbGAIzf14AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"resignation"},
{"id":"ytc_UgxxNuiGv7CKrFC92Eh4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"mixed"},
{"id":"ytc_UgyKSY54hfpkg0Rns4B4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_UgzOgxT20DMSyRiL__l4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_UgxdMq6ObQ1Q_LHQh1Z4AaABAg","responsibility":"distributed","reasoning":"virtue","policy":"none","emotion":"approval"},
{"id":"ytc_UgziQkgRUpiHOBUQdkd4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugx4F-J-jgpgrx0cu054AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugw_9Ai-JSWtTeEB5HB4AaABAg","responsibility":"company","reasoning":"contractualist","policy":"liability","emotion":"outrage"}
]
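The raw response above is a JSON array with one record per comment, keyed by comment ID. A minimal sketch of parsing such a response and indexing it by ID for lookup, assuming Python; the field names come from the response itself, and `index_codes` is a hypothetical helper, not part of any shown pipeline:

```python
import json

# Dimensions the coder emits for each comment (names taken from the
# raw response above).
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def index_codes(raw_response: str) -> dict[str, dict[str, str]]:
    """Parse a raw LLM coding response and index records by comment ID."""
    records = json.loads(raw_response)
    indexed = {}
    for rec in records:
        # Keep only the expected coding dimensions, dropping anything extra
        # the model may have added.
        indexed[rec["id"]] = {dim: rec[dim] for dim in DIMENSIONS}
    return indexed

raw = '''[
  {"id": "ytc_UgxxNuiGv7CKrFC92Eh4AaABAg", "responsibility": "ai_itself",
   "reasoning": "deontological", "policy": "none", "emotion": "mixed"}
]'''
codes = index_codes(raw)
print(codes["ytc_UgxxNuiGv7CKrFC92Eh4AaABAg"]["emotion"])  # mixed
```

Indexing by ID is what makes the "Look up by comment ID" view above possible: each displayed record is just one entry of this mapping.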