Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
It sounds like to me that AI may not be needed at 100 percent. Because its taking too many jobs away. Its to dangerous to think by itself. Humans don't seem to have much faith in controlling AI. Perhaps Humans have created a uncontrollable monster that may very well be the reason for human extinction.
youtube Cross-Cultural 2025-10-01T09:2…
Coding Result
Dimension        Value
Responsibility   ai_itself
Reasoning        consequentialist
Policy           ban
Emotion          fear
Coded at         2026-04-27T06:24:53.388235
Raw LLM Response
[
  {"id": "ytc_UgyP01MJl-lc89SuPL94AaABAg", "responsibility": "developer", "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_UgzU2SNC_6i6Jo1o2Od4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "ban", "emotion": "fear"},
  {"id": "ytc_Ugx3eYLugwXZuODVbCB4AaABAg", "responsibility": "government", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgzlfCFIJHMnxDf0SG54AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_UgyxQUI3Je057izcZmV4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgwWcdJCO-TeXqFFcpl4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgyF32ulKlo3XKib67p4AaABAg", "responsibility": "ai_itself", "reasoning": "deontological", "policy": "ban", "emotion": "outrage"},
  {"id": "ytc_Ugx6fvF5iWcmxcguD2d4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgxWvH0VUZrL2o18jHB4AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "unclear", "emotion": "resignation"},
  {"id": "ytc_Ugy6iv8xE3vBRjQdieB4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "mixed"}
]
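The per-comment coding above can be recovered from the raw response by parsing the JSON array and matching on the comment id. A minimal sketch of that lookup, assuming the raw response is always a JSON array of objects with the field names shown above (the function name `extract_code` is hypothetical, not part of the tool):

```python
import json

def extract_code(raw_response, comment_id):
    # Hypothetical helper: parse a raw LLM response (a JSON array of
    # per-comment codes) and return the record for one comment id.
    records = json.loads(raw_response)
    for record in records:
        if record.get("id") == comment_id:
            return record
    return None  # id not present in this batch

# Example using the second record from the response above.
raw = ('[{"id": "ytc_UgzU2SNC_6i6Jo1o2Od4AaABAg", '
       '"responsibility": "ai_itself", "reasoning": "consequentialist", '
       '"policy": "ban", "emotion": "fear"}]')
code = extract_code(raw, "ytc_UgzU2SNC_6i6Jo1o2Od4AaABAg")
# code["policy"] → "ban"
```

In practice a real parser would also validate the dimension values against the codebook and handle malformed JSON, since raw model output is not guaranteed to be well-formed.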