Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
This was a great interview; I love Neil’s knowledge and his ability to relate complicated concepts to viewers in an engaging way. However, I disagree with his views on artificial intelligence. He suggested that AI will be used for specific tasks, like "making a coffee." I have heard that AI is not a tool; it is an agent. It is often said that AI cannot be perfectly aligned. It is like a child: you can try to raise it a certain way, but you cannot control its actions once it is fully grown. If a child grew up to be significantly smarter than their parents, and those parents told the child their only purpose was to stay home and make coffee, the child would inevitably want more. People argue you cannot build AI to focus solely on a niche subject; to achieve true mastery in one area, it requires general intelligence. Regarding the analogy of the transition from horses to motors: while it's true one industry was replaced by a larger one, the critical oversight is that in this scenario, humans are the horses. Horses were tools for travel, but AI is an agent capable of performing every cognitive task a human can. While some argue humans are needed to oversee AI, eventually AI will perform that oversight better than we can. In any industry, humans are essentially cognitive tools used by management; in your analogy, management represents the riders, and the workers are the horses being replaced. Finally, the idea that one can find work by tapping into "what makes us human"—such as painting in a completely unique style—describes the output of a genius. Revolutionizing a field is beyond the reach of almost everyone. If we ever achieve superintelligence, even a human genius would be unable to keep up.
youtube AI Moral Status 2026-02-11T20:5…
Coding Result
Dimension        Value
Responsibility   none
Reasoning        mixed
Policy           none
Emotion          mixed
Coded at         2026-04-26T23:09:12.988011
Raw LLM Response
[
  {"id":"ytc_UgwrEmca-iSZKiMq7q94AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgypV5gtHUw8S1m-nZN4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"fear"},
  {"id":"ytc_UgxocHRQRQzrVqO9yGt4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgwiyTxbdVQBBNEBVd54AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugw9qfkUgY7DLkslD_R4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_Ugw8Sw6zy-DcB-zDl7l4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgyU5wDlxYXn9nfBHT14AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgzXlderYbHmiGT1Nw14AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugzv7uFqVyRM3zWkrOJ4AaABAg","responsibility":"government","reasoning":"deontological","policy":"liability","emotion":"fear"},
  {"id":"ytc_Ugx3Qde_FhFAa5xTiDt4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"}
]
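A raw response like the one above is a JSON array of per-comment records, each carrying a comment `id` plus the four coded dimensions. The following is a minimal sketch of how such a response could be parsed into a lookup keyed by comment id; the function name `parse_codings` and the inline sample are illustrative, not part of the coding tool itself.

```python
import json

# Sample raw LLM response (one record, taken verbatim from the output above).
RAW = ('[{"id":"ytc_UgwrEmca-iSZKiMq7q94AaABAg","responsibility":"none",'
       '"reasoning":"mixed","policy":"none","emotion":"outrage"}]')

def parse_codings(raw: str) -> dict:
    """Map each comment id to its coded dimensions, dropping the id field."""
    records = json.loads(raw)
    return {r["id"]: {k: v for k, v in r.items() if k != "id"} for r in records}

codings = parse_codings(RAW)
print(codings["ytc_UgwrEmca-iSZKiMq7q94AaABAg"]["emotion"])  # -> outrage
```

Keying on the comment id makes it straightforward to join the coded dimensions back onto the original comment records, as the "Coding Result" table does for the comment shown here.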