Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
What bothers me is this: before I die, I try to pass on the values of certain things to my children, grandchildren, and so on. When I depend on AI for that, all my moral values and my vision of how to do things are thrown in the trash, even if I disagree with the AI's vision. And it's not as if the AI's vision is the best of all choices, or that it always makes the better choice. You say that if it's smarter than us it won't be motivated by greed, but there is a very thin line between making a smart choice and one that is, in the end, greedy. "Smart" is only subjective. AI, for example, doesn't believe in miracles, yet they do exist; AI is not spiritual. Eventually the smart choice becomes so calculated that it will just erase us, or at least a part of us, because that's the smarter choice. For example, take Christian and Muslim, or black and white: if one of those groups is constantly causing problems, eventually AI will just erase the troublemaker as a whole. Because you have to realize that we as people see judging by black and white as racism, but AI does not; if AI calculates that one group has an overall lower IQ, it will just eliminate it. There is no more free will, and that's the danger.
youtube AI Governance 2025-11-14T15:1…
Coding Result
Dimension       Value
Responsibility  ai_itself
Reasoning       deontological
Policy          none
Emotion         fear
Coded at        2026-04-27T06:24:53.388235
Raw LLM Response
[
  {"id":"ytr_UgzS7QmSNbxaoKDAbd94AaABAg.APcDR90JnMIAPfWaDViuE3","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytr_UgymgVYVO0UcVDPGWWd4AaABAg.APc0129vKIOAT2jdoe_Ucw","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytr_UgzMzTkSlUezI_hQ0gl4AaABAg.APbpuB6v92LAPbr7sANBwQ","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytr_Ugy5lOxB1Iw1-ObLhuN4AaABAg.APb6UrHIUkuAPhF1XLQ-HC","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"resignation"},
  {"id":"ytr_Ugy55mIq3x3RP7YjizJ4AaABAg.AP_32UDRQvaAP_llMi4Cb5","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytr_UgyAyGL2cHypaqU0Fr54AaABAg.APZFBqkeYw9APZdfUltyD0","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
  {"id":"ytr_UgyzWbVgicGuAIBKbnx4AaABAg.APVmsHF3g8ZAPW9iOcj3G1","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"fear"},
  {"id":"ytr_UgyzWbVgicGuAIBKbnx4AaABAg.APVmsHF3g8ZAPY6oBFvmt0","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
  {"id":"ytr_UgyzWbVgicGuAIBKbnx4AaABAg.APVmsHF3g8ZAPh6qpy_CHb","responsibility":"distributed","reasoning":"virtue","policy":"none","emotion":"outrage"},
  {"id":"ytr_UgyKUr_tuHiJBQyIW914AaABAg.APTmanVAvZSAPh9AGPcg20","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"mixed"}
]
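A raw response like the one above can be validated and summarized with a few lines of Python. This is a minimal sketch, assuming only that the model emits a JSON array of records carrying the four coding dimensions shown (`responsibility`, `reasoning`, `policy`, `emotion`); the helper names `parse_llm_response` and `tally_dimension` are illustrative, not part of any actual pipeline.

```python
import json
from collections import Counter

REQUIRED_FIELDS = {"id", "responsibility", "reasoning", "policy", "emotion"}

def parse_llm_response(raw: str) -> list[dict]:
    """Parse the raw JSON array emitted by the coding model,
    checking that every record carries all four dimensions."""
    records = json.loads(raw)
    for rec in records:
        missing = REQUIRED_FIELDS - rec.keys()
        if missing:
            raise ValueError(f"record {rec.get('id')!r} is missing {missing}")
    return records

def tally_dimension(records: list[dict], dimension: str) -> Counter:
    """Count how often each label appears for one coding dimension."""
    return Counter(rec[dimension] for rec in records)

# Two-record sample in the same shape as the raw output above
# (hypothetical ids, used only to demonstrate the helpers).
raw = (
    '[{"id":"ytr_a","responsibility":"none","reasoning":"consequentialist",'
    '"policy":"none","emotion":"fear"},'
    '{"id":"ytr_b","responsibility":"ai_itself","reasoning":"deontological",'
    '"policy":"none","emotion":"fear"}]'
)
records = parse_llm_response(raw)
print(tally_dimension(records, "emotion"))         # Counter({'fear': 2})
print(tally_dimension(records, "responsibility"))  # Counter({'none': 1, 'ai_itself': 1})
```

Validating before tallying matters here: if the model drops a field or returns malformed JSON, the error surfaces immediately with the offending record id rather than as a silent miscount downstream.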