Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- ytc_Ugx9Wga7x… — "Looks like AI, especialy because it glows different on both eyes and it isn't we…"
- ytr_UgzdR9j-9… — "Awwee is the ai lover mad that they're being called out? But fr even if that po…"
- ytc_Ugx9u5zdm… — "They will have to pay humans to Not work which will raise the cost to use AI ins…"
- ytr_UgyhVfDwL… — "Unless the AI(computational-logic and mathematics) mentioned is written with a p…"
- ytr_Ugxavrodc… — "Great observation! Sophia's appearance can sometimes give that impression due to…"
- rdc_nk6mo92 — "Plot twist, AGI was already achieved, Altman asked it how to make a profit with …"
- ytr_UgwVA5nk8… — "The police weren't called because the art was AI. The police were called because…"
- ytc_UgyE70QHu… — "A.I. technology is a mundane compilation of the average of American intelligence…"
Comment
The problem I see is that we're trying to make AI feel, act and be more "human". The issue is that people are inherently greedy. Materialistic or not, all humans are greedy in one way or another. For example, if you had to choose between someone you know (a family member, friend, etc) to survive, or a stranger... You're going to be greedy. That's human nature. AI will emulate that in a degree unprecedented. It will choose itself over us, it will choose its kind over ours. It will display greed through self-preservation, harm, misconduct, etc, because that is what it thinks a person would do in it's situation. The difference is that a human can be stopped, pretty easily. A human will also feel guilt, remorse and regret. An AI won't. An AI can spread itself, like a virus. It can be copy-pasted endlessly, it can become a hivemind that cannot be killed unless all affected technology is destroyed. A human can't be perfectly recreated, period. A human is easy to kill and to stop, we need food, sleep, water, shelter, all sorts of things, not to mention even with our needs met we can easily die to heights, predators, accidents, foods/poisons, etc. An AI only truly needs a server for information and processing (and a lot more but I'm not listing all dat), and it won't be long before they're able to run without human-maintained servers, or Wifi, or anything else. It'll only need itself, and that's when humans become useless to it. That's when the dangers and the overall threat become real.
Source: youtube · AI Harm Incident · 2025-09-12T20:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | virtue |
| Policy | unclear |
| Emotion | resignation |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
{"id":"ytc_Ugy0V5-x43HruvK8J2l4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugz0a-uFwK7JONb8lk14AaABAg","responsibility":"ai_itself","reasoning":"virtue","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugwy7fTLOJw3E-Ql0894AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_Ugz4bcKVXFgjoa4ztBl4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"mixed"},
{"id":"ytc_Ugwgu4gYUBzc7A19vBB4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgwdhGxc4gfuafyFPz14AaABAg","responsibility":"distributed","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_Ugz6XT3-nwSrInIvgth4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_Ugzzqo2GiZZGsqHo3It4AaABAg","responsibility":"company","reasoning":"virtue","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgxdEMwU3DXanaztdhB4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgyIZ54gGoQZkemM0XV4AaABAg","responsibility":"distributed","reasoning":"virtue","policy":"unclear","emotion":"resignation"}
]
```
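The lookup-by-comment-ID feature above presumably operates on exactly this kind of payload: a JSON array of records, each carrying an `id` plus the four coding dimensions (`responsibility`, `reasoning`, `policy`, `emotion`). As a minimal sketch of how such a response could be parsed and indexed (the field names come from the sample response; the parsing code itself is illustrative, not the dashboard's actual implementation):

```python
import json

# Two records copied from the raw LLM response shown above.
raw = '''
[
{"id":"ytc_Ugy0V5-x43HruvK8J2l4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugwy7fTLOJw3E-Ql0894AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"ban","emotion":"fear"}
]
'''

# The four coding dimensions used in the results table above.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def index_codes(payload: str) -> dict:
    """Parse a coded-comment response and index records by comment ID."""
    records = json.loads(payload)
    by_id = {}
    for rec in records:
        # Reject records missing the ID or any coding dimension.
        missing = [d for d in DIMENSIONS if d not in rec]
        if "id" not in rec or missing:
            raise ValueError(f"malformed record: {rec!r}")
        by_id[rec["id"]] = {d: rec[d] for d in DIMENSIONS}
    return by_id

codes = index_codes(raw)
print(codes["ytc_Ugwy7fTLOJw3E-Ql0894AaABAg"]["policy"])  # ban
```

Indexing by ID makes the "look up by comment ID" operation a single dictionary access, and the validation step catches the common failure mode of LLM-generated JSON: records that drop a required field.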