Raw LLM Responses
Inspect the exact model output for any coded comment: look up a record by its comment ID, or browse the random samples below.
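As a minimal sketch of how such a lookup might work, assuming the raw batch responses are archived as JSON arrays on disk (the `raw_responses/` directory and one-file-per-batch layout are assumptions here, not the tool's actual storage):

```python
import json
from pathlib import Path

# Assumed layout: each LLM batch response saved as a JSON array of coded
# rows, one file per batch, under raw_responses/ (hypothetical path).
RAW_DIR = Path("raw_responses")

def find_coded_comment(comment_id: str) -> dict | None:
    """Return the coded row for a comment ID, or None if it was never coded."""
    for batch_file in RAW_DIR.glob("*.json"):
        for row in json.loads(batch_file.read_text()):
            if row.get("id") == comment_id:
                return row
    return None

# Example: look up the first comment coded in the batch shown below.
print(find_coded_comment("ytc_UgzXkTplztvIshMi7kd4AaABAg"))
```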
Random samples
- We need laws that prevent self driving cars unless the manufacturers are held ac… (ytc_UgwqXxKw8…)
- They could, and they will; that's why you should have regulated AI years ago. In… (ytc_UgxSQMzyv…)
- They will still require a college degree, even if you are just doing some menial… (ytc_UgwzgXGA4…)
- Checked Gemini AI. In the year 1900 there was about 21 million horses in the US,… (ytc_UgxNRe5pJ…)
- Copyright for commecial use of "AI" art? Sure. But only if you give the original… (ytc_UgwtXchcs…)
- Buddy, I thought the same thing! :D It really seems similar to NFT. Many peopl… (ytr_UgwV3EPDm…)
- Alternatively, concentrate on making a really realistic robot, and a voiceover a… (ytc_UgxxCFSwE…)
- AI is based on memories, historical data. No data, no AI. AI must be an auxiliar… (ytc_UgwV3N8eY…)
Comment
In my opinion, humans are not inherently evil, nor are we inherently good. We're naturally neutral. Unfortunately, some people have found that acting in their own best interest is often in the worst interest of everyone else. Which unfortunately snowballs into more people acting like that to stay afloat. Which again, unfortunately snowballs into those people being used to feed A.I., which A.I. then uses those values to reach their goals. This doesn't cause an inherently evil A.I., this creates one that completes it's goal by whatever means necessary. That's the problem.
Platform: youtube
Topic: AI Harm Incident
Posted: 2025-09-08T12:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | virtue |
| Policy | unclear |
| Emotion | resignation |
| Coded at | 2026-04-27T06:26:44.938723 |
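For downstream analysis it helps to check that every coded row uses only known values for each dimension. A minimal validation sketch, assuming the four dimensions shown in the table above; the value sets below are only those visible in this page's sample, and the full codebook likely defines more:

```python
# Dimension vocabularies observed in this sample batch; the real codebook
# may define additional values (assumption).
SCHEMA = {
    "responsibility": {"distributed", "user", "ai_itself", "company", "none", "unclear"},
    "reasoning": {"virtue", "consequentialist", "deontological", "unclear"},
    "policy": {"ban", "regulate", "liability", "industry_self", "none", "unclear"},
    "emotion": {"fear", "outrage", "approval", "indifference", "resignation"},
}

def validate_row(row: dict) -> list[str]:
    """Return a list of problems with one coded row (empty list = valid)."""
    problems = []
    for dim, allowed in SCHEMA.items():
        value = row.get(dim)
        if value is None:
            problems.append(f"missing dimension: {dim}")
        elif value not in allowed:
            problems.append(f"unexpected {dim} value: {value!r}")
    return problems
```

Running `validate_row` over every row in the batch below returns an empty list for each, since all of its values fall inside the observed vocabularies.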
Raw LLM Response
```json
[
{"id":"ytc_UgzXkTplztvIshMi7kd4AaABAg","responsibility":"distributed","reasoning":"virtue","policy":"unclear","emotion":"resignation"},
{"id":"ytc_Ugw2TbQLSfe8aGJpkHd4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgzYMX8AIxXUOvDUV-N4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"fear"},
{"id":"ytc_UgygAiNgU7aAYr44rwJ4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgxGBxynFkne5lZsEOh4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgwPWWQOEoETwf8GWGZ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugyx9mOhbva7GpKbRQN4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"ban","emotion":"fear"},
{"id":"ytc_UgxbEYGBdkJ6zBp9kKB4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
{"id":"ytc_UgybKov30aMR3kH-49h4AaABAg","responsibility":"user","reasoning":"virtue","policy":"industry_self","emotion":"approval"},
{"id":"ytc_UgyU0OOIGMbTzmUQfMx4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"}
]
```
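This batch happens to be clean JSON, but raw model output is not guaranteed to be. A small, defensive parsing sketch that strips an optional markdown code fence before decoding (the fence-wrapping failure mode is an assumption about typical LLM behavior, not something observed in this batch):

```python
import json
import re

def parse_llm_batch(raw_text: str) -> list[dict]:
    """Parse a raw LLM batch response into a list of coded rows.

    Assumes the model may wrap the JSON array in a ```json fence;
    strip any such fence before decoding.
    """
    cleaned = re.sub(r"^```(?:json)?\s*|\s*```$", "", raw_text.strip())
    rows = json.loads(cleaned)
    if not isinstance(rows, list):
        raise ValueError("expected a JSON array of coded rows")
    return rows
```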