Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
Look to nature. Evolution had only one tool: trial and error. It had no theories, no equations, no foresight, yet it produced eyesight, flight, and intelligence. Proving that no knowledge is out of reach of trial and error. Theory doesn’t discover, it rationalizes. Every “great discovery” was stumbled upon, and then a theory was wrapped around it after the fact. Researchers mistake luck for skill, then spend decades floundering, trying to reproduce what nature gave them for free. The problem with trial and error isn’t inefficiency, it’s poor record keeping. Nature loses most of its lessons. Humans do too. But give trial and error perfect memory, an information system that never forgets and it becomes unstoppable. Theory is not the friend of knowledge. At best it is an afterthought; at worst, it blinds us. The only method that has ever truly worked is trial and error. And if you need proof beyond nature — look at AI. The most advanced technology humanity has today wasn’t built from theory. It was built by trial and error at scale: billions of experiments, endless adjustments, and memory systems that never forget what worked. That’s why AI exists. And here’s the kicker: the theory hasn’t even caught up. No one fully understands why AI works so well or what’s really happening under the hood. Attempts to “explain” it are still trial and error themselves. Discovery came first — theory is still chasing it.
youtube AI Moral Status 2025-08-19T10:4…
Coding Result
Dimension       Value
Responsibility  none
Reasoning       mixed
Policy          unclear
Emotion         indifference
Coded at        2026-04-27T06:26:44.938723
Raw LLM Response
[
  {"id": "ytr_UgyvRAJi0SlhhfyiACt4AaABAg.AM6eFG2l09HANtwAY-gdI-", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "outrage"},
  {"id": "ytr_UgzUHaWkkoAscvN9wZV4AaABAg.AM-aqBqeD1qAM-fURPhk_x", "responsibility": "none", "reasoning": "mixed", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytr_UgzZW3yEgubENJG08JN4AaABAg.ALvHrvSWWMRALvIjJ1-Qoi", "responsibility": "company", "reasoning": "deontological", "policy": "regulate", "emotion": "outrage"},
  {"id": "ytr_UgyWduU1tl922bw940t4AaABAg.ALu82K6MWC9AMym62gjtdP", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "resignation"},
  {"id": "ytr_UgzbqS3LDiQHy6gw08d4AaABAg.ALtaMI6IzVTAT2InRcvM_5", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytr_UgwWLvvfaeT9tqi9YGl4AaABAg.ALqwPwyYN1EALqxmtOy3XS", "responsibility": "user", "reasoning": "mixed", "policy": "unclear", "emotion": "mixed"},
  {"id": "ytr_Ugyk_cC2LeCR5VpIWll4AaABAg.ALqOQ7puH0OAMOI2ccpUmW", "responsibility": "none", "reasoning": "consequentialist", "policy": "unclear", "emotion": "mixed"},
  {"id": "ytr_Ugyk_cC2LeCR5VpIWll4AaABAg.ALqOQ7puH0OAMUNMxQcZDO", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "approval"},
  {"id": "ytr_UgxilnsaM4HyOLCaDDl4AaABAg.AL_F2r9LMEnALyPcovSmCG", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "unclear", "emotion": "fear"},
  {"id": "ytr_Ugx30HbkDOYHJs9R94h4AaABAg.ALYrgk8n42xALZ2dkF5dwl", "responsibility": "none", "reasoning": "mixed", "policy": "unclear", "emotion": "mixed"}
]
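A raw response like the one above can be turned back into per-comment codes with a small parser. The sketch below is a minimal illustration, not the pipeline's actual code; the allowed value sets are assumptions inferred from the values visible in this response (the real codebook may include more categories), and rows with values outside those sets are skipped rather than kept.

```python
import json

# Allowed values per dimension, inferred from the response shown above
# (an assumption -- the real codebook may define additional categories).
ALLOWED = {
    "responsibility": {"none", "company", "user", "ai_itself"},
    "reasoning": {"unclear", "mixed", "deontological", "consequentialist"},
    "policy": {"unclear", "regulate"},
    "emotion": {"outrage", "indifference", "resignation", "mixed", "approval", "fear"},
}

def parse_codes(raw: str) -> dict:
    """Parse a raw LLM response (JSON array) into {comment_id: codes}.

    Rows missing a dimension or carrying an unrecognized value are skipped,
    so downstream aggregation only sees valid codes.
    """
    coded = {}
    for row in json.loads(raw):
        codes = {dim: row.get(dim) for dim in ALLOWED}
        if all(codes[dim] in ALLOWED[dim] for dim in ALLOWED):
            coded[row["id"]] = codes
    return coded

# Hypothetical single-row response for illustration.
raw = '[{"id": "ytr_example", "responsibility": "none", "reasoning": "mixed", "policy": "unclear", "emotion": "indifference"}]'
print(parse_codes(raw)["ytr_example"]["emotion"])  # prints: indifference
```

Validating against an explicit value set is what catches the occasional off-schema label the model invents, instead of letting it silently pollute the coded dataset.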