Raw LLM Responses
Inspect the exact model output for any coded comment; entries can be looked up by comment ID.
Comment

> As an AI researcher I don’t buy the AGI/ASI being existential threat arguments. The current progress has already slow down significantly after sucking all the content on the internet. I would argue that it is impossible to have an AI that takes off exponentially because the fundamental limitation is interaction with the physical world which is not exponential. Many AI researchers don’t believe ASI is realistic within 10 yrs eg. Andrej Kaparthy. Most “experts” that talk about AGI are the CEOs not the researchers. All the AGI/ASI being an existential threats boiled down to one argument “what if we created god?” The entire thesis assume a god can be made, which seems a little silly to me.

Source: youtube · Video: AI Moral Status · Posted: 2025-10-30T19:4… · ♥ 32
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response

```json
[
  {"id":"ytc_Ugyrn1U0GWIUR3wKpN54AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"resignation"},
  {"id":"ytc_Ugx9mKKvDWvpyFX0XHB4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgzyzfSzKFM0QPVG2v94AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgxAVu36ayojNyba9PF4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_Ugwd0SooxExdRlgxCrd4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
  {"id":"ytc_Ugx-cojJ_S3c-LnZatB4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugxq0yS22fQI8RihVXt4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugx6yz-9PRGRbExhxIl4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"mixed"},
  {"id":"ytc_UgypwbOe83rLmRCVlC14AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgzDJ9nbM8mnU_mg1ih4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"}
]
```
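A raw response of this shape — a JSON array of per-comment records, each with an `id` plus the four coding dimensions — can be parsed and indexed by comment ID for lookup. A minimal Python sketch, assuming exactly that structure (the `raw_response` string below is a two-entry excerpt from the batch above; `index_by_id` is an illustrative helper, not part of the tool):

```python
import json

# Raw LLM response: a JSON array of per-comment coding records.
# Two entries excerpted from the batch shown above.
raw_response = '''
[
  {"id": "ytc_Ugyrn1U0GWIUR3wKpN54AaABAg", "responsibility": "none",
   "reasoning": "virtue", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_UgxAVu36ayojNyba9PF4AaABAg", "responsibility": "developer",
   "reasoning": "deontological", "policy": "liability", "emotion": "outrage"}
]
'''

# The four coding dimensions shown in the result table above.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def index_by_id(response_text):
    """Parse a batch response and build a comment-ID -> coding-record map,
    skipping any record missing the id or one of the coding dimensions."""
    records = json.loads(response_text)
    indexed = {}
    for rec in records:
        if "id" in rec and all(dim in rec for dim in DIMENSIONS):
            indexed[rec["id"]] = {dim: rec[dim] for dim in DIMENSIONS}
    return indexed

codings = index_by_id(raw_response)
print(codings["ytc_UgxAVu36ayojNyba9PF4AaABAg"]["policy"])  # liability
```

Filtering on the expected keys before indexing guards against the common failure mode of batch-coded LLM output: a record that drops or renames a field mid-array.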