Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
To jump on a point about "things that large language models do that we can't code by hand"... I wrote a chatbot a long time ago that could make "jokes" and seemed to "understand parts of humour". It could also do haikus. Didn't take a whole datacenter either. Iunno, it's really tempting to attribute any LLM joke to the delusions of pattern-seeking animals projecting humour onto what amounts to rolling the dice on a big lookup table of words... Sure, sometimes it says something surprising and people will perceive that as "funny"... It's also really tempting to comment on the efficiency question. It takes so much electricity and space and microchips to make a computer that *kind of* seems like it's talking, and comparatively little fuel and resources to, you know, create and maintain human brains that currently do the job *better*. Not to say that strong AI or superintelligence is completely impossible, but I do wonder if a bunch of digital switches is maybe not the right design for the machine that could actually do it... Especially as the richest men in the world seem to have no interest in keeping up the infrastructure and international cooperation and all the rest that will be required to actually achieve this by brute forcing it with digital electronics.
youtube AI Moral Status 2025-11-01T19:1…
Coding Result
Dimension       Value
Responsibility  developer
Reasoning       mixed
Policy          none
Emotion         mixed
Coded at        2026-04-26T23:09:12.988011
Raw LLM Response
[
  {"id":"ytc_UgwqDZPwS0sJhzustSl4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
  {"id":"ytc_UgwfVIgjc9RUVbtK2Yx4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgwBkZ0RB2dzvKO0Wc54AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgyRx8kIRspv6bsRE4J4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgxDgzcIUZXZuAzgHSR4AaABAg","responsibility":"developer","reasoning":"mixed","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgzYdePcFg5OXhfaaV14AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugz8uz5IUjT5JCw33wF4AaABAg","responsibility":"none","reasoning":"none","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugy3sz23nrUfIxdlCFJ4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwnojViKzl0G8CMj794AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgyH4hSorqWq8zxU7AN4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"}
]
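The raw response above is a JSON array of per-comment records, one per batched comment, keyed by comment `id`. A minimal sketch of how the coded dimensions for one comment could be looked up from such a response (the schema and the `id` values are taken verbatim from the dump above; the variable names are illustrative, not part of any tool):

```python
import json

# A two-record excerpt of the raw model output shown above, schema as-is.
raw = '''[
  {"id":"ytc_UgwqDZPwS0sJhzustSl4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
  {"id":"ytc_UgxDgzcIUZXZuAzgHSR4AaABAg","responsibility":"developer","reasoning":"mixed","policy":"none","emotion":"mixed"}
]'''

records = json.loads(raw)
# Index the batch by comment id so one comment's codes can be retrieved directly.
by_id = {record["id"]: record for record in records}

# The comment inspected on this page corresponds to this id in the batch;
# its codes match the Coding Result table (developer / mixed / none / mixed).
codes = by_id["ytc_UgxDgzcIUZXZuAzgHSR4AaABAg"]
print(codes["responsibility"], codes["reasoning"], codes["policy"], codes["emotion"])
```

Running this prints `developer mixed none mixed`, matching the table above.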