Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
Make it 'read-only'. Which means it has no learning capabilities. Only assessment and observational response abilities. Processing information then reacting. No actual "thinking". It wouldn't be a true A.I., and that would be unnecessary for menial jobs. Actual necessity for an A.I. is an interesting question. I would like to see it happen, but it raises a lot of philosophical questions.
youtube 2013-06-13T19:1…
Coding Result
Dimension       Value
Responsibility  none
Reasoning       deontological
Policy          regulate
Emotion         indifference
Coded at        2026-04-27T06:24:59.937377
Raw LLM Response
[
  {"id":"ytc_UgxQd_M9D_9jJa7IAmZ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgxrYQa14lfcpZjFI4B4AaABAg","responsibility":"none","reasoning":"deontological","policy":"regulate","emotion":"indifference"},
  {"id":"ytc_Ugw74fCY6BEVssSoaGh4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugw5-mIuWiROXdkbL6h4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgxPGJhsqUyB-MjVEj94AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgxfHB6rMHbezIp-6pN4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgznhfIi8YI6G-AslhB4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgyacK7VmO9GCMr87P54AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugxj2xqg1fdtQIh_ruN4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgxkLaa12SR8MN8lnyB4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"}
]
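The raw response is a JSON array with one coding object per comment, keyed by the comment `id`; the Coding Result table above is simply the entry whose `id` matches the displayed comment (here `ytc_UgxrYQa14lfcpZjFI4B4AaABAg`). A minimal sketch of that lookup, assuming this batch shape and dimension names (the function name `coding_for` is hypothetical, not part of the tool):

```python
import json

# Two entries copied verbatim from the raw LLM response above; a real
# batch would contain all ten objects.
raw_response = """[
  {"id":"ytc_UgxQd_M9D_9jJa7IAmZ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgxrYQa14lfcpZjFI4B4AaABAg","responsibility":"none","reasoning":"deontological","policy":"regulate","emotion":"indifference"}
]"""

def coding_for(raw: str, comment_id: str) -> dict:
    """Parse a batch response and return the coding object for one comment."""
    entries = json.loads(raw)
    by_id = {e["id"]: e for e in entries}  # index the batch by comment id
    return by_id[comment_id]

coding = coding_for(raw_response, "ytc_UgxrYQa14lfcpZjFI4B4AaABAg")
# Matches the Coding Result table: responsibility=none,
# reasoning=deontological, policy=regulate, emotion=indifference.
```

Indexing by `id` rather than relying on array order makes the lookup robust if the model returns the codings in a different order than the comments were sent.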