Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
why doesn't it just ask "what's the resolution of your system?" as in what is the minimum distance your hand can accurately travel? That's where the conundrum ends. This isn't really a philosophical paradox but rather playing with a LLM that is heavily regulated to be polite and agreeable in responses rather than being a sharp debater.
youtube 2025-07-01T18:2…
Coding Result
Dimension       Value
Responsibility  ai_itself
Reasoning       consequentialist
Policy          none
Emotion         mixed
Coded at        2026-04-27T06:24:59.937377
Raw LLM Response
[
  {"id": "ytc_UgxipzfwLHqapEdjmst4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgwiMOanJD5rUMNIzuh4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_UgycbOD3FPatBFb9S7l4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_UgyMnJ1JU16IEUwLghx4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "frustration"},
  {"id": "ytc_UgzRMRcxc6aODcJW4qt4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_UgydLE68ACqCYiPWOnp4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgzTFNBbr1QZMVMKoed4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_UgyAFgP3XSk1VAL1kVl4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "disapproval"},
  {"id": "ytc_UgwWJpPDaRECRiLIRuF4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_Ugxh-haHvsCOZAY4swV4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "approval"}
]
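A minimal sketch of how a raw response like this can be parsed back into per-comment codes. The field names (`id`, `responsibility`, `reasoning`, `policy`, `emotion`) come from the JSON above; the surrounding pipeline and the idea of indexing records by comment id are assumptions for illustration, and only an excerpt of the response is inlined here.

```python
import json

# Excerpt of the raw LLM response above (two of the ten records).
raw = (
    '[{"id":"ytc_UgxipzfwLHqapEdjmst4AaABAg","responsibility":"none",'
    '"reasoning":"unclear","policy":"none","emotion":"indifference"},'
    '{"id":"ytc_UgwiMOanJD5rUMNIzuh4AaABAg","responsibility":"ai_itself",'
    '"reasoning":"consequentialist","policy":"none","emotion":"outrage"}]'
)

# Index the coding records by comment id for direct lookup.
codes = {rec["id"]: rec for rec in json.loads(raw)}

# Look up the coding result for one comment.
rec = codes["ytc_UgwiMOanJD5rUMNIzuh4AaABAg"]
print(rec["responsibility"], rec["emotion"])  # → ai_itself outrage
```

Keying on the comment id lets each coded dimension be joined back to the original comment text when rendering a detail view like this one.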