Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
Super intelligence has massive diminishing returns not too far ahead of human intelligence. Ever watched a fantasy where a super hero had super intelligence- it quickly devolves into magic because actual super intelligence doesn’t really do anything outside of academia. The smartest humans on earth don’t rule earth. If you went back in time you wouldn’t rule the world. You would be limited to academia and would improve society that much faster. Robots, AI etc all live in the real world which requires resources. No matter how smart you are you can’t fix a space ship in your garage, you won’t have the tools even if you can do it because of intelligence. And intelligence is the ability to process data quickly. So someone with access to data and a non super intelligence will actually have more “access to intelligence” than a super intelligence without that data will have.
youtube AI Moral Status 2025-10-30T22:3…
Coding Result
Dimension       Value
Responsibility  none
Reasoning       consequentialist
Policy          none
Emotion         resignation
Coded at        2026-04-26T23:09:12.988011
Raw LLM Response
[
  {"id":"ytc_UgzKGFzOfu7bIUe0T1F4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgxD6SmwAXJAjIrvHk94AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugw8aRpaCBFPGZngOlN4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_Ugz-cqGQwQgMUS64zjd4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgwlJvJOJh0vhdqJiOd4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"fear"},
  {"id":"ytc_UgzzJxdB032j4nXLdjp4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgzcvYv9pPqb6vLCPyJ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_UgzAgdTksNK21719Z7V4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgwGH9cCpJAkbZd1AwR4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgzpaLDu33gCqYXM7pJ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"}
]
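The model returns one JSON object per comment, so recovering the coding for a single comment means parsing the batch and looking it up by id. Below is a minimal sketch of that lookup, using a two-record excerpt of the raw response above; the function name `coding_for` is an assumption for illustration, not part of the pipeline shown here.

```python
import json

# Excerpt of the raw batched LLM response shown above: one JSON object
# per comment id, covering the four coding dimensions.
raw_response = '''[
  {"id":"ytc_UgzKGFzOfu7bIUe0T1F4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_Ugw8aRpaCBFPGZngOlN4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"}
]'''

def coding_for(raw: str, comment_id: str) -> dict:
    """Parse a batched response and return the coding object for one comment."""
    by_id = {record["id"]: record for record in json.loads(raw)}
    return by_id[comment_id]

# The id of the comment inspected on this page.
row = coding_for(raw_response, "ytc_Ugw8aRpaCBFPGZngOlN4AaABAg")
print(row["emotion"])  # → resignation (matching the Coding Result table)
```

Indexing the batch into a dict once makes repeated per-comment lookups O(1), which matters when cross-checking every row of a coding-results table against the raw output.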