Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
To be fair to the interviewer, Penrose doesn't explain Godels incompleteness in the most intuitive sense. He should just say "any system guided by rules will have things that cannot be proven by sed rules, even if AI" and leave it at that. The self reference through this guy for a loop.
Source: youtube · Video: AI Moral Status · 2025-12-06T13:0…
Coding Result
Dimension      | Value
-------------- | --------------------------
Responsibility | none
Reasoning      | deontological
Policy         | unclear
Emotion        | indifference
Coded at       | 2026-04-26T23:09:12.988011
Raw LLM Response
[
  {"id": "ytc_UgwFddKvcNqVqDiZ_aR4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "approval"},
  {"id": "ytc_Ugwp2AlJg0-v0dDAcC14AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "approval"},
  {"id": "ytc_UgxNdpIDHZiVEC-LqTB4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_UgxCoQdQBuKfz3ZjH094AaABAg", "responsibility": "developer", "reasoning": "deontological", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_UgyAhmQ1yTF20ZYfuaR4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgypS1GFokEYnSKLtBl4AaABAg", "responsibility": "none", "reasoning": "deontological", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_UgwYlVVip5e9lo3ONrx4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "approval"},
  {"id": "ytc_Ugztx6AOG5HvmjWYYzl4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgyTG4UtRvgNx4_1lQh4AaABAg", "responsibility": "none", "reasoning": "virtue", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_UgythnpBsnZEmUTtqDd4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "liability", "emotion": "fear"}
]
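A raw response in this shape can be indexed back into per-comment coding records for inspection. A minimal sketch in Python, assuming only the field names visible in the JSON above (the `codes_by_id` helper is hypothetical, and the sample is truncated to two of the ten records for brevity):

```python
import json

# Raw LLM response: a JSON array of per-comment coding records,
# truncated here to two entries from the dump above.
raw = '''[
  {"id": "ytc_UgwFddKvcNqVqDiZ_aR4AaABAg", "responsibility": "none",
   "reasoning": "unclear", "policy": "unclear", "emotion": "approval"},
  {"id": "ytc_UgythnpBsnZEmUTtqDd4AaABAg", "responsibility": "ai_itself",
   "reasoning": "consequentialist", "policy": "liability", "emotion": "fear"}
]'''

def codes_by_id(raw_response: str) -> dict:
    """Index the array of coding records by comment id (hypothetical helper)."""
    return {rec["id"]: rec for rec in json.loads(raw_response)}

codes = codes_by_id(raw)
# Look up the coded dimensions for one comment.
print(codes["ytc_UgythnpBsnZEmUTtqDd4AaABAg"]["emotion"])  # fear
```

With the full ten-record array, the same lookup returns the four coded dimensions for any comment id in the dump.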