Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
No need to panic. This is just an AIML chatbot based on files that are about 20 years old. The "DO YOU WANT TO xxx" is from default.aiml and just echoes back what the user says to it:

<category>
  <pattern>DO YOU WANT TO *</pattern>
  <template>OK, I will <person/>.</template>
</category>

The full default.aiml file is here: http://www.alicebot.org/aiml/aaa/Default.aiml
YouTube | AI Moral Status | 2017-11-09T23:1…
Coding Result
Dimension      | Value
---------------|---------------------------
Responsibility | none
Reasoning      | unclear
Policy         | none
Emotion        | indifference
Coded at       | 2026-04-27T06:24:53.388235
Raw LLM Response
[
  {"id": "ytc_UgzcCVfkPIIRFQe_mHh4AaABAg", "responsibility": "none",      "reasoning": "unclear",          "policy": "none",     "emotion": "indifference"},
  {"id": "ytc_UgwBEVKljie5-VUuamd4AaABAg", "responsibility": "developer", "reasoning": "consequentialist", "policy": "ban",      "emotion": "fear"},
  {"id": "ytc_Ugzmx7xzhv98Ukl-WkJ4AaABAg", "responsibility": "ai_itself", "reasoning": "unclear",          "policy": "ban",      "emotion": "fear"},
  {"id": "ytc_UgyPMirsdm8B3ii100x4AaABAg", "responsibility": "developer", "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_UgxNfeJVqKNBTqteVhp4AaABAg", "responsibility": "company",   "reasoning": "consequentialist", "policy": "ban",      "emotion": "fear"},
  {"id": "ytc_UgxF72ys2iE_I6JChVx4AaABAg", "responsibility": "none",      "reasoning": "unclear",          "policy": "none",     "emotion": "approval"},
  {"id": "ytc_UgwOo_rVObM4A6RvBBx4AaABAg", "responsibility": "none",      "reasoning": "unclear",          "policy": "none",     "emotion": "indifference"},
  {"id": "ytc_Ugy8rxqyPWAfZkZm8pZ4AaABAg", "responsibility": "none",      "reasoning": "unclear",          "policy": "none",     "emotion": "indifference"},
  {"id": "ytc_UgzHnIGqGkDXOXt412Z4AaABAg", "responsibility": "none",      "reasoning": "unclear",          "policy": "none",     "emotion": "indifference"},
  {"id": "ytc_UgyIHViz-n6n-oqb-uN4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "unclear",  "emotion": "fear"}
]
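To inspect the exact model output for a given coded comment, the raw response can be parsed as a JSON array keyed by comment id. A minimal sketch (the `coding_for` helper name and the two-row sample are illustrative, not part of the pipeline):

```python
import json

# Excerpt of the raw LLM response above: a JSON array of per-comment codings.
raw = """[
  {"id": "ytc_UgzcCVfkPIIRFQe_mHh4AaABAg", "responsibility": "none",
   "reasoning": "unclear", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgyIHViz-n6n-oqb-uN4AaABAg", "responsibility": "ai_itself",
   "reasoning": "consequentialist", "policy": "unclear", "emotion": "fear"}
]"""

# Index the codings by comment id for direct lookup.
codings = {row["id"]: row for row in json.loads(raw)}

def coding_for(comment_id):
    """Return the coded dimensions for one comment id, or None if absent."""
    return codings.get(comment_id)

print(coding_for("ytc_UgzcCVfkPIIRFQe_mHh4AaABAg"))
```

Looking up `"ytc_UgzcCVfkPIIRFQe_mHh4AaABAg"` reproduces the dimension table shown above (responsibility = none, emotion = indifference); an unknown id returns None rather than raising.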