Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
Both extremely annoying. Penrose who couldn't explain the theorem with a simple example and the naive reporter that lives in fantasy land about AI. I work with AIs every day. They do NOT understand what they are spewing out, they just correlate it with their database. The moment you get even a micron away from their database, they fail to solve the problem.
youtube AI Moral Status 2025-07-25T13:1…
Coding Result
Dimension       Value
Responsibility  developer
Reasoning       consequentialist
Policy          industry_self
Emotion         outrage
Coded at        2026-04-26T23:09:12.988011
Raw LLM Response
[
  {"id":"ytc_UgzkwM8M6W2irz4kyWt4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgzTZiNYyBK3rTl38Tp4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"industry_self","emotion":"outrage"},
  {"id":"ytc_Ugwze3CRRzIL4bWBL2F4AaABAg","responsibility":"none","reasoning":"mixed","policy":"unclear","emotion":"approval"},
  {"id":"ytc_UgwZqSDS_VUnDhxgJIJ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgyQc3peygqy6tvD-aF4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
  {"id":"ytc_UgxfZUHq3I7VrYh6cfB4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgxvSm1l9wOSrUhroDJ4AaABAg","responsibility":"none","reasoning":"mixed","policy":"unclear","emotion":"approval"},
  {"id":"ytc_UgzQFjH2aOdHdzSV44J4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"approval"},
  {"id":"ytc_UgwLIFZNy3XvBJlbDv94AaABAg","responsibility":"none","reasoning":"mixed","policy":"regulate","emotion":"approval"},
  {"id":"ytc_UgyolKK5qD0Qw3EpNg14AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"resignation"}
]
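A minimal sketch of how the raw response above could be inspected programmatically, assuming the model output is a valid JSON array of per-comment codes with the fields shown (`id`, `responsibility`, `reasoning`, `policy`, `emotion`). The helper `code_for` is hypothetical, not part of any pipeline described here; the two sample objects are copied from the raw response.

```python
import json

# Two objects copied verbatim from the raw LLM response above.
raw = '''[
 {"id":"ytc_UgzkwM8M6W2irz4kyWt4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
 {"id":"ytc_UgzTZiNYyBK3rTl38Tp4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"industry_self","emotion":"outrage"}
]'''

def code_for(raw_response: str, comment_id: str):
    """Return the coded dimensions for one comment id, or None if absent."""
    for row in json.loads(raw_response):
        if row.get("id") == comment_id:
            return row
    return None

code = code_for(raw, "ytc_UgzTZiNYyBK3rTl38Tp4AaABAg")
print(code["responsibility"], code["emotion"])  # developer outrage
```

Looking rows up by `id` (rather than by position) tolerates the model returning codes in a different order than the comments were submitted.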