Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
What you're referring to, basically, is an "if-then" statement in a program where if "this" happens then to "that". That is nowhere near what AI is. AI learning and then acting is very little different than human learning and then acting. The difference is that AI can consume and retain massive volumes of information without the forget factor that plagues humans. If a human is never fed information then that human knows nothing. How is that different than AI? We're in trouble. We're in, I'm afraid, to a point that is beyond control. There are too many AI companies that that are intent on making it as human-like as possible. I foresee android type humanoids in the not too distant future. Imagine Commander Data on Star Trek: The Next Generation. It is not longer that far fetched.
youtube 2025-11-05T14:3…
Coding Result
Dimension       Value
Responsibility  none
Reasoning       deontological
Policy          none
Emotion         approval
Coded at        2026-04-27T06:26:44.938723
Raw LLM Response
[
  {"id":"ytr_UgyJCjlzDXVkki_H0l94AaABAg.APD34c8nvwgAPEjM_Q2pOc","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"outrage"},
  {"id":"ytr_UgxOMpiStRoJFd6ud_x4AaABAg.APBiYDPLqv0AQYQ39sKHzp","responsibility":"ai_itself","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytr_UgxGk3eUyXZzYzuOCmB4AaABAg.APB3ID2jiSJAPB4ttESHcz","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"fear"},
  {"id":"ytr_UgxGk3eUyXZzYzuOCmB4AaABAg.APB3ID2jiSJAPB5E2wn5Cs","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},
  {"id":"ytr_UgyM2RDRwn_bNosixvJ4AaABAg.AP7BQPZcUiIAP8vhicJZcB","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"approval"},
  {"id":"ytr_UgzuhiM4yhVXLxMsFah4AaABAg.AP3Uy_fAp8hAQOy3YNPvNX","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytr_UgzuhiM4yhVXLxMsFah4AaABAg.AP3Uy_fAp8hAQVrvArh-eG","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytr_UgzuhiM4yhVXLxMsFah4AaABAg.AP3Uy_fAp8hAS6GePvGNo7","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytr_UgwHb1W3D8aOzKfRsZd4AaABAg.AP1ETgIovEWAP1NEvBazTL","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytr_UgympXp9dR1xCDRRCgp4AaABAg.ADak7zyHHDMADr8XV2fBer","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"}
]
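A minimal sketch of how one might inspect such a raw response and pull out the coding for a single comment, assuming only the JSON-array shape shown above. The `lookup_coding` helper and the `OBSERVED` value sets are illustrative, not part of the actual pipeline; the sets list only the values that appear in this one response and are unlikely to be the full codebook.

```python
import json

# Dimension values observed in this raw response only; the real codebook
# may define more categories (assumption: these sets are not exhaustive).
OBSERVED = {
    "responsibility": {"none", "ai_itself", "developer", "user"},
    "reasoning": {"deontological", "consequentialist", "mixed", "unclear"},
    "policy": {"none", "liability", "unclear"},
    "emotion": {"outrage", "fear", "indifference", "approval", "resignation"},
}

def lookup_coding(raw_response, comment_id):
    """Parse the model's JSON array and return the record for one comment id."""
    for rec in json.loads(raw_response):
        if rec.get("id") == comment_id:
            return rec
    return None

# Usage, with one record copied from the response above:
raw = ('[{"id":"ytr_UgyM2RDRwn_bNosixvJ4AaABAg.AP7BQPZcUiIAP8vhicJZcB",'
       '"responsibility":"none","reasoning":"deontological",'
       '"policy":"none","emotion":"approval"}]')
rec = lookup_coding(raw, "ytr_UgyM2RDRwn_bNosixvJ4AaABAg.AP7BQPZcUiIAP8vhicJZcB")
# Sanity-check that every coded value is one we have seen before.
ok = all(rec[dim] in values for dim, values in OBSERVED.items())
print(rec["emotion"], ok)  # -> approval True
```

Matching the returned record against the dimension table above (deontological / approval) is a quick way to confirm the displayed coding really came from this raw output.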