Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
Well, no, it can't. We do not have AI. We have algorithm-based models we like to call AI, but this "artificial intelligence" is just like an "artificial plant": superficially it might look like a plant but actually it has nothing to do with a plant, just as AI has nothing to do with intelligence. The state AI is in is the maximum we are able to reach; nobody on this planet has the slightest clue how we should get from where we are to a reasoning AI, much less to an AGI. Even though the video calls ChatGPT a reasoning AI, that simply isn't correct. ChatGPT isn't able to reason, it just emulates the behavior that makes the user believe it does. We don't know how to create a reasoning AI and we most probably never will. The only thing we do know is that the way we are creating AIs today is not capable of creating a reasoning AI, so we are basically at square 1 in that regard.
youtube AI Governance 2023-11-07T22:0… ♥ 86
Coding Result
Dimension        Value
Responsibility   none
Reasoning        deontological
Policy           none
Emotion          indifference
Coded at         2026-04-27T06:26:44.938723
Raw LLM Response
[
  {"id": "ytr_UgxzKGalQSK67lWN78l4AaABAg.9wlWhylI3f49wmg1MP2sjI", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytr_UgzIdMdkN-t-S5wmQeN4AaABAg.9wjpXcETXAA9wpbV6AU66y", "responsibility": "none", "reasoning": "deontological", "policy": "none", "emotion": "indifference"},
  {"id": "ytr_UgzIdMdkN-t-S5wmQeN4AaABAg.9wjpXcETXAA9wpxkhW-Ow_", "responsibility": "none", "reasoning": "deontological", "policy": "none", "emotion": "indifference"},
  {"id": "ytr_UgzIdMdkN-t-S5wmQeN4AaABAg.9wjpXcETXAA9wqFfI-E84i", "responsibility": "ai_itself", "reasoning": "unclear", "policy": "none", "emotion": "mixed"},
  {"id": "ytr_Ugw9zweTXjrvNJnPm9Z4AaABAg.9wilZcfVL-z9wjI4zGWqS_", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytr_UgxoNNo88O_UKODXVtt4AaABAg.9wibGptt6Z59wlI8xERlAe", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "approval"},
  {"id": "ytr_UgxoNNo88O_UKODXVtt4AaABAg.9wibGptt6Z59wmwefxtyAh", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytr_UgxoNNo88O_UKODXVtt4AaABAg.9wibGptt6Z59xwvm75QeU7", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytr_UgxjKY41wlmdohLo9rV4AaABAg.9what6HG_6SA0E-Hdx1pLJ", "responsibility": "user", "reasoning": "virtue", "policy": "none", "emotion": "resignation"},
  {"id": "ytr_Ugz9QA_Z_tDTFWspqeF4AaABAg.9weFWW-n-_rAI6lzTcIL3s", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"}
]
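The raw response is a JSON array with one record per comment, keyed by a `ytr_…` comment id and carrying the four coding dimensions shown in the table above. A minimal sketch of how such a response could be parsed and matched back to an individual comment (the short ids and the `lookup` helper here are hypothetical; only the field names come from the response above):

```python
import json

# A shortened stand-in for the raw LLM response above (ids are hypothetical).
raw = """[
  {"id": "ytr_abc", "responsibility": "none", "reasoning": "deontological",
   "policy": "none", "emotion": "indifference"},
  {"id": "ytr_def", "responsibility": "ai_itself", "reasoning": "unclear",
   "policy": "none", "emotion": "mixed"}
]"""

records = json.loads(raw)

def lookup(records, comment_id):
    """Return the coding record for the given comment id, or None if absent."""
    return next((r for r in records if r["id"] == comment_id), None)

coded = lookup(records, "ytr_abc")
print(coded["reasoning"])  # deontological
```

In practice the model may return malformed JSON, so a production pipeline would wrap `json.loads` in error handling rather than assume the array parses cleanly.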