Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
It's fairly ridiculous to compare AI to the computer in terms of impact. From the very moment the computer was created, the literal first mechanical computer ever made by human hands, it not only had a task to do, but it was a crucial task (the fucking enigma code) AND it was a vital part of it. Then it was instrumental in the creation of nuclear weapons. And on and on the examples go. From the moment the mechanical computer went from idea to reality, the world changed, and it has continued to expand in use cases on a yearly basis ever since, to the point that, as Neil mentions, *everything* is a goddamn computer now. Your FRIDGE has a computer, your toilet might, I think there's even some toothbrushes with software in them at this point. Compare this to the AI we usually talk about. (Primarily LLMs, the only ones we ever conceive of as a precursor for AGI) Billions upon billions upon billions spent, half a decade has passed and it is integrated into every fucking online service and new gadget. Not a singular use case. Not one thing we can point to and say "This was possible due to the AI". Comparing this obscenely expensive toy positively to the single most revolutionary technological development since the steam engine is... I mean it's borderline insulting to humanity as a species.
youtube AI Moral Status 2025-07-27T18:1…
Coding Result
Dimension       Value
--------------  --------------------------
Responsibility  none
Reasoning       consequentialist
Policy          none
Emotion         indifference
Coded at        2026-04-26T23:09:12.988011
Raw LLM Response
[
  {"id": "ytc_UgwiJDGjR9PDsERQJpl4AaABAg", "responsibility": "distributed", "reasoning": "consequentialist", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_Ugyi-YIDYNx1jBbmUV14AaABAg", "responsibility": "none", "reasoning": "mixed", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgxP3Jgtg-zcgky_Vcp4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_UgwHGsYL1KxWiaFHwpR4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_Ugy11q0K8t6o3ZQHr794AaABAg", "responsibility": "company", "reasoning": "deontological", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_Ugy8ZBfo9ZrSKocT1pp4AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "liability", "emotion": "fear"},
  {"id": "ytc_Ugzp87n4ek_Ch-Rs6XZ4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgxPcTDtlbVWcNpDvQV4AaABAg", "responsibility": "none", "reasoning": "mixed", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_UgxfKYBhIQ6LVGbpbH14AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_UgyV8m1BFiYs3mF-r0J4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "approval"}
]
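A minimal sketch of how a raw response like the one above could be parsed back into per-comment codings, assuming the model output is a valid JSON array with the `id`, `responsibility`, `reasoning`, `policy`, and `emotion` fields shown (only two entries are excerpted here for brevity; variable names are illustrative, not part of the tool):

```python
import json

# Two entries excerpted verbatim from the raw response above.
raw = '''[
  {"id": "ytc_UgwHGsYL1KxWiaFHwpR4AaABAg", "responsibility": "none",
   "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_Ugy11q0K8t6o3ZQHr794AaABAg", "responsibility": "company",
   "reasoning": "deontological", "policy": "regulate", "emotion": "fear"}
]'''

# Index the codings by comment id so any coded comment can be looked up.
codings = {row["id"]: row for row in json.loads(raw)}

print(codings["ytc_UgwHGsYL1KxWiaFHwpR4AaABAg"]["emotion"])  # indifference
```

The lookup by `id` is what lets a table like "Coding Result" above be reconstructed for a single comment out of the batch response.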