Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
Respect for Sir Roger Penrose, but I think his argument about AI and consciousness has logical flaws. He assumes intelligence requires consciousness without proving it - many systems show intelligent behavior without consciousness (like ant colonies organizing complex structures, immune systems adapting to threats). His definition of consciousness uses circular reasoning when saying "understanding requires consciousness." The Gödel theorem example doesn't actually prove consciousness is necessary for understanding mathematics. He's essentially confusing intelligence (problem-solving ability) with human-like understanding. It's like saying cars aren't "real transportation" because they don't have horses (just as horses were once the only form of "real" transportation)
Source: YouTube · Video: AI Moral Status · 2025-04-30T06:0… · ♥ 14
Coding Result
Dimension        Value
Responsibility   none
Reasoning        consequentialist
Policy           none
Emotion          indifference
Coded at         2026-04-26T23:09:12.988011
Raw LLM Response
[
  {"id": "ytc_Ugxlg-5UOm9vl96i6IB4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_Ugxw2Oxwav0EowsDsl14AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgyWuXciZp25XxRaLKx4AaABAg", "responsibility": "none", "reasoning": "deontological", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgwARcp23wfD6hn6Mt14AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgzZd1kEP1B3vi3kjdB4AaABAg", "responsibility": "none", "reasoning": "deontological", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_UgyHa7pj52iHoqkDYbt4AaABAg", "responsibility": "none", "reasoning": "mixed", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgxtFNiUOKdzz6EohZt4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_Ugyq-5aMNXrCsz-dXTd4AaABAg", "responsibility": "none", "reasoning": "deontological", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgyRDNXTTF4BzXEfZPN4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgyaRzbajUyZrMxYESh4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"}
]
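A raw response like the one above can be turned back into per-comment codes with a small parsing step. This is a minimal sketch, not the tool's actual implementation: the field names ("id", "responsibility", "reasoning", "policy", "emotion") are taken from the JSON shown here, while the function name `parse_codes` is hypothetical.

```python
import json

def parse_codes(raw: str) -> dict:
    """Map each comment id to its coded dimensions.

    Assumes the LLM returned a JSON array of objects, each with an
    "id" field plus one field per coding dimension, as shown above.
    """
    records = json.loads(raw)
    # Keep everything except the id as that comment's code dictionary.
    return {r["id"]: {k: v for k, v in r.items() if k != "id"} for r in records}

# Example with one record copied from the raw response above:
raw = (
    '[{"id":"ytc_Ugxlg-5UOm9vl96i6IB4AaABAg",'
    '"responsibility":"none","reasoning":"consequentialist",'
    '"policy":"none","emotion":"indifference"}]'
)
codes = parse_codes(raw)
print(codes["ytc_Ugxlg-5UOm9vl96i6IB4AaABAg"]["reasoning"])  # consequentialist
```

Keying by comment id makes it easy to join the codes back to the original comments, which is how the "Coding Result" table for a single comment could be produced.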