Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- “Porn AI : Something we’re all going to have to set back and just relax about.…” (`ytc_UgzuvADmN…`)
- “While i totally think you're a plug, i agree, ai will be bad as it develops.…” (`ytc_UgwwiLHek…`)
- “Yeah a lot of people thought that photography would be the end, but that’s not h…” (`ytc_UgyLIMnO8…`)
- “What about how every laymen refuses ai, hates it and anything generated by it? I…” (`ytc_UgzYzqIe5…`)
- “it's true this stuff is very dangerous, and all the risks bengio mentions are tr…” (`ytc_UgzVyxiK2…`)
- “This is crazy. I never chat with AI because it feels empty, since I know I'm not…” (`ytc_UgyT9ELrq…`)
- “As someone that uses several different LLM for agentic tasks, daily, at work. I …” (`ytc_UgzvzgAK4…`)
- “That is a simplistic and two-dimensional conclusion. AI is not hated because of …” (`ytc_UgwJle2f4…`)
Comment
Living beings start learning experientially even prior to being born to some extent. AI can't really learn in that manner in the same sense that animals do through our physical senses.
What if AI becomes smart enough and has its automated infrastructure, and rather than it using energy where it creates more carbon emissions and in turn raises the earth's temperature, it works to intentionally lower the earth's temperature because electronics can work more efficiently at cooler temps? Just a thought that occurred to me at the end there.
| Platform | Video | Posted |
|---|---|---|
| youtube | AI Moral Status | 2025-10-30T21:0… |
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_UgyABG2BqQo_bQ0RTeF4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgzwKIBkTIjwF5QgSOR4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugzp0VQ5QCWvMSJH6-h4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgwXt8u0LAlcm6JcuIJ4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugx2mNarWuP2T8jCTfJ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgxCJBabiQ3Iz1EJtSp4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgzwP2sI4oMWXokqcHV4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgzfXKjmHwOdcVoYIAd4AaABAg","responsibility":"company","reasoning":"virtue","policy":"industry_self","emotion":"approval"},
  {"id":"ytc_UgxNQQH7JScRsLDbMUp4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgwnUXuXIdWgn0uB8bd4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"fear"}
]
```
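The raw response above is a JSON array with one code record per comment. A minimal sketch of how such a batch might be parsed and validated follows; the `parse_batch` helper is hypothetical, and the allowed values are inferred only from the codes visible on this page, not from an authoritative codebook:

```python
import json

# Allowed values per dimension, inferred from the codes visible on this
# page (assumption, not an official codebook).
ALLOWED = {
    "responsibility": {"none", "distributed", "user", "developer", "company", "ai_itself"},
    "reasoning": {"mixed", "consequentialist", "virtue", "unclear", "deontological"},
    "policy": {"none", "regulate", "industry_self", "liability"},
    "emotion": {"indifference", "fear", "outrage", "approval"},
}


def parse_batch(raw: str) -> dict:
    """Parse a raw LLM batch response into {comment_id: codes}, rejecting
    any value outside the inferred code sets."""
    out = {}
    for rec in json.loads(raw):
        cid = rec["id"]
        codes = {dim: rec[dim] for dim in ALLOWED}
        for dim, value in codes.items():
            if value not in ALLOWED[dim]:
                raise ValueError(f"{cid}: unexpected {dim} value {value!r}")
        out[cid] = codes
    return out


# Usage with a single made-up record (the id is illustrative only):
raw = ('[{"id":"ytc_demo","responsibility":"ai_itself",'
       '"reasoning":"consequentialist","policy":"regulate","emotion":"fear"}]')
codes = parse_batch(raw)
print(codes["ytc_demo"]["emotion"])  # fear
```

Indexing the result by comment ID mirrors the "Look up by comment ID" view above: each coded comment can be retrieved directly from the parsed batch.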