Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "Um, how is this relevant? An air-conditioner will never be capable of designing …" (ytr_Ugz7uwC_X…)
- "Elon helped found OpenAi? good job R*tard another dogshit move from the shitking…" (ytc_UgxGY2fw6…)
- "Surprised it's Boston, kid seems like he's learning capitalism not Boston's reli…" (ytc_UgzXNFnnk…)
- "As the paper that apple published and showed LLM’s have almost zero reasoning. O…" (ytc_UgxTQZzqU…)
- "Commenters should educate themselves before commenting. AI is our next big chal…" (ytc_UgztMZ-K1…)
- "Lets think this through: all jobs replaced with AI... to what purpose? To make m…" (ytc_Ugx3HC2mp…)
- "AS per my exp 2 years with C,L,V0 AI tools all are waest time, As per your requ…" (ytc_UgxY1Ng08…)
- "So is one reason bills are so high. Ai requires ridiculous amounts of rss to mai…" (ytc_UgzNB9xAJ…)
Comment
Neil’s probably right that AGI isn’t around the corner (and now the scaling law has failed), and AI today is still pretty narrow. However, consider China—40 million, of the 200 million urban workers are in precarious gig jobs, as technology and automation have transformed the landscape, especially in manufacturing. The disruption has started! Even if AI is still a narrow thing, it’s shaking up people’s lives big time. So, ok, don’t freak out about an AI takeover just yet. Still, seriously, we need to get our act together and figure out how to help those who are actually being or will be affected by this now.
youtube
AI Moral Status
2025-09-26T17:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[{"id":"ytc_UgwP7llphkOQwLmzMex4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgzOqtsgTBKLnh6gO9l4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgywIOUOYLnviRivsYd4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgxBVuaXfKXF2clQiFJ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgzrW-rqNsIEvqf5j_Z4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgzuUtI7zyoyStgL8IR4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzqH8w4OwVCWtC6liF4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgznPwf24gjDEV0sJbN4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugz_Yqg3gID2gcjjPkF4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgwjBiXKPRmgf1Qg5MJ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"}]
```
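A raw response like the one above can be turned into a per-comment lookup with a small parser. This is a minimal sketch, not the tool's actual implementation: the allowed value sets below are inferred only from the codes visible in this sample (the real codebook may include more), and `parse_coding` is a hypothetical helper name.

```python
import json

# Allowed codes per dimension, inferred from the sample response above;
# the full codebook may contain values not seen in this sample (assumption).
ALLOWED = {
    "responsibility": {"none", "ai_itself", "developer"},
    "reasoning": {"unclear", "consequentialist", "deontological", "virtue"},
    "policy": {"unclear", "none"},
    "emotion": {"indifference", "approval", "fear", "outrage"},
}

def parse_coding(raw: str) -> dict:
    """Parse a raw LLM response (a JSON array of coded comments) into a
    dict keyed by comment ID, validating every dimension's value."""
    coded = {}
    for row in json.loads(raw):
        cid = row["id"]
        for dim, allowed in ALLOWED.items():
            value = row.get(dim)
            if value not in allowed:
                raise ValueError(f"{cid}: unexpected {dim}={value!r}")
        coded[cid] = {dim: row[dim] for dim in ALLOWED}
    return coded

# Hypothetical one-row response, shaped like the sample above.
raw = ('[{"id":"ytc_x","responsibility":"none","reasoning":"consequentialist",'
       '"policy":"unclear","emotion":"fear"}]')
print(parse_coding(raw)["ytc_x"]["emotion"])  # fear
```

Validating against an explicit allow-list is what lets the "Look up by comment ID" view trust the coded dimensions: a malformed or hallucinated code fails loudly at parse time rather than silently appearing in the table.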