Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
My issue with LLMs (I refuse to call them AI) is that I simply don't see many use-cases for them. The best use-case I've seen is iNaturalist, which uses these programs to assist in identifying organisms that you take photos, video, or audio of. Which is, not gonna lie, pretty amazing. It also has a citizen-science aspect, where people can identify the organisms, which helps ground-truth the system. Is that worth $40 billion a year? I think so--I think understanding our biosphere is one of the most important things we can do--but let's be real, most people do this as a hobby. My son calls it the "Real-life Pokemon Go thing". Incredibly useful, but like all basic science the utility is something our great-grandchildren will appreciate, not us. Every other use-case is either a complete scam (like click-bate websites or helping kids cheat on homework), or doing something that's trivial. Using the electricity of a small town and billions of dollars in order to tweak some text, or to take notes for you, hardly seems a good trade-off. The entire bubble is built on "This can do anything!" But it's up to the end-users to figure out what to do with it, and while there are some areas where it's helped, for the most part the end-users simply can't figure out what problems this technology is trying to solve.
youtube AI Responsibility 2025-10-08T13:4…
Coding Result
Dimension        Value
Responsibility   none
Reasoning        consequentialist
Policy           none
Emotion          approval
Coded at         2026-04-26T23:09:12.988011
Raw LLM Response
[ {"id":"ytc_Ugz1-etPHv2U81NFkRl4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"fear"}, {"id":"ytc_Ugww8Zvo2LkTow_ePWx4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"outrage"}, {"id":"ytc_UgwLaiunGrswlLGcyMV4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"outrage"}, {"id":"ytc_UgzL7bOrCqrgsTg6ieR4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"indifference"}, {"id":"ytc_UgwH99xCfh9G0zvFxYZ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"}, {"id":"ytc_UgzfqOQHLl3SqLTmpUR4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"}, {"id":"ytc_UgzilKWeU49zOpQ8AfV4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"mixed"}, {"id":"ytc_UgwuDHkQGVVhUnqP1Jt4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"fear"}, {"id":"ytc_UgwvRQuVgKpuHI4UrmV4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"}, {"id":"ytc_UgxEaR9GCkCfesFdFn54AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"} ]