Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
Dr Frankenstein (the movie) was way way ahead of its time. The desire to play god in a self-learning, self-managed creation... and it went wrong, because it was never instructed in the evaluation of right and wrong. A Congo cannibal is very proud to invite you to dinner, eating the flesh of another human being he considered an opponent and a prey. Their computations is entirely based of their ethical education as to what is right or wrong - in given circumstances, and locations. It's up to us to program them in accordance with what we consider good and bad. If you were to show bravery in the presence of some natives of the Amazon basin, you might be soon rewarded with a stab. Why? They would strongly seek to eat your heart to "absorb" your courage - that's their natural behavior. In the creation of AI, we MUST impart it with some ethical rules that is proper to local humans and local ethics. If not, it might decide to terminate you as being too slow, smelly and useless. Cheers!
youtube AI Governance 2023-07-07T23:4…
Coding Result
Dimension        Value
Responsibility   developer
Reasoning        deontological
Policy           unclear
Emotion          fear
Coded at         2026-04-27T06:26:44.938723
Raw LLM Response
[
  {"id": "ytr_Ugwdqje5v_p0c0MWJH14AaABAg.9rsgEP4nmM_9rsx1KZFcEP", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "unclear"},
  {"id": "ytr_Ugyztsump_QL0Kz1Ahp4AaABAg.9rsfuDXKb2M9rtlbrzMdE3", "responsibility": "developer", "reasoning": "deontological", "policy": "unclear", "emotion": "outrage"},
  {"id": "ytr_Ugyztsump_QL0Kz1Ahp4AaABAg.9rsfuDXKb2M9rtt5eUUbqW", "responsibility": "ai_itself", "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytr_UgyfTHax6zE-wo8HH-54AaABAg.9rsdh0nbu919rt7apff8Fx", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytr_UgwGhOaeZbWlT-J4svh4AaABAg.9rsa7dp4KeP9rseFuCPqWs", "responsibility": "ai_itself", "reasoning": "unclear", "policy": "unclear", "emotion": "fear"},
  {"id": "ytr_UgwGhOaeZbWlT-J4svh4AaABAg.9rsa7dp4KeP9rt3VUw10k_", "responsibility": "developer", "reasoning": "deontological", "policy": "unclear", "emotion": "fear"},
  {"id": "ytr_Ugweg_5Mzbcmav2CpkV4AaABAg.9rs_1peIIN_9rstw7GnXov", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "resignation"},
  {"id": "ytr_Ugweg_5Mzbcmav2CpkV4AaABAg.9rs_1peIIN_9rtCno7SwCz", "responsibility": "ai_itself", "reasoning": "unclear", "policy": "unclear", "emotion": "fear"},
  {"id": "ytr_UgwzjK8EMspKs5AiQSp4AaABAg.9rsW8Ta_X0n9rsifjXWXn3", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytr_UgwzjK8EMspKs5AiQSp4AaABAg.9rsW8Ta_X0n9rsrdSAVIYU", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "approval"}
]
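The raw LLM response is a JSON array with one object per comment ID, each carrying the four coded dimensions. A minimal sketch of how such a response could be parsed into per-comment codes — the function name is illustrative, not part of the actual pipeline, and the two entries in the sample payload are copied verbatim from the response above:

```python
import json

# Two entries copied verbatim from the raw LLM response above.
raw_response = '''[
  {"id": "ytr_Ugwdqje5v_p0c0MWJH14AaABAg.9rsgEP4nmM_9rsx1KZFcEP",
   "responsibility": "none", "reasoning": "unclear",
   "policy": "unclear", "emotion": "unclear"},
  {"id": "ytr_UgwGhOaeZbWlT-J4svh4AaABAg.9rsa7dp4KeP9rt3VUw10k_",
   "responsibility": "developer", "reasoning": "deontological",
   "policy": "unclear", "emotion": "fear"}
]'''

def parse_codes(text):
    """Parse an LLM coding response into a dict keyed by comment id.

    Each value is a dict of the coded dimensions
    (responsibility, reasoning, policy, emotion).
    """
    entries = json.loads(text)
    return {e["id"]: {k: v for k, v in e.items() if k != "id"} for e in entries}

codes = parse_codes(raw_response)
```

Keying by ID makes it straightforward to look up the codes that were displayed for a given comment, e.g. `codes["ytr_UgwGhOaeZbWlT-J4svh4AaABAg.9rsa7dp4KeP9rt3VUw10k_"]` returns the developer/deontological/unclear/fear record shown in the table above.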