Raw LLM Responses
Inspect the exact model output for any coded comment, or look one up directly by comment ID.
Comment
speaking as a robotics researcher (who started at MIT in the 90's) I really like this discussion. You guys are talking about what very few other people are: the fatal flaw with current "AI" is that it is grown rather than engineered, that makes it dangerous, like a virus. Period, end of discussion. That means it needs to be kept off of machines that can interact with the real world, especially robots. If my work ever sees the light of day I will ask Nate to be an advisor.
youtube · AI Moral Status · 2025-11-11T10:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_UgyD_vVgK4lU66Lr9q54AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgzC5ci0oXYUvBqFe1B4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgzZQjSzkiOzmnrTb454AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgziVby8mv9JCe3Ii9R4AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_Ugz5vty5u3LBNGmPlqh4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"fear"},
  {"id":"ytc_UgzTgAPXXot1H7fSba14AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"industry_self","emotion":"resignation"},
  {"id":"ytc_Ugz9aRh5H-dWDzkCLvV4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"liability","emotion":"approval"},
  {"id":"ytc_Ugy-YPCOCebMWJ9NcuZ4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugy86aQ-y1DSo4yqC294AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_Ugx_cFH_A9RtIjRcBJJ4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"resignation"}
]
```
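The lookup-by-comment-ID workflow described above can be sketched in a few lines: parse the raw LLM response (a JSON array of per-comment coding objects, in the same shape as the Raw LLM Response shown here) and index it by the `id` field. The helper name `index_by_comment_id` is hypothetical, not part of any tool shown above.

```python
import json

# Example raw model output: a JSON array of coding objects, one per
# comment, in the same shape as the Raw LLM Response above (two rows
# reproduced here for brevity).
raw_response = """[
  {"id":"ytc_UgyD_vVgK4lU66Lr9q54AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgzC5ci0oXYUvBqFe1B4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"indifference"}
]"""

def index_by_comment_id(raw: str) -> dict:
    """Parse the raw LLM response and key each coding by its comment ID.

    Hypothetical helper for illustration; assumes the response is a
    well-formed JSON array whose objects each carry an "id" field.
    """
    return {row["id"]: row for row in json.loads(raw)}

codings = index_by_comment_id(raw_response)
coding = codings["ytc_UgyD_vVgK4lU66Lr9q54AaABAg"]
print(coding["responsibility"], coding["emotion"])  # prints: developer fear
```

In practice a batch response may also contain IDs that were never requested, or omit some that were; checking the returned key set against the requested IDs before trusting the codings is a cheap safeguard.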