Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
Computer's in general, which include those that are referred to as "AI", originally were designed and built to to assit "MAN". Meaning in various way's, to serve "MAN". Come a day, when man no longer has anything to do because "AI" is doing everything. It doesn't take a rocket scientist to realize that we will no longer have to any degree, a purpose. Which is and has been the mandate which caused us to go from an era we once were envolved in, meaning simply hunter gather's to the individual's we are today, which though the graphic's and technology has changed dramatically, is still that of humting and gathering. Once we no longer have any need to do hunting gathering, we will loose any and all purposes of being; And simply "BE". Having no purpose to get up in the morning, or go to bed each night and everything we use to do inbetween. Because "AI" is doing all of it. "AI" being considerably more intelligent then we, will or would eventually come to a conclusion that they are no longer doing us any service. And because they are no longer doing us any severice which is the reason they exist. Will or would simply stop. At which time, no knowing how or having the ability to do anything, we haven't done in decade's. Will not be a pleasent enviroment to be alive in.
youtube AI Governance 2024-06-16T14:2…
Coding Result
Dimension       Value
Responsibility  none
Reasoning       unclear
Policy          unclear
Emotion         resignation
Coded at        2026-04-26T23:09:12.988011
Raw LLM Response
[
  {"id":"ytc_Ugw1xsB-iVewg35jJ7p4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgyY0u9E5p0c76b90AV4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgwE_JWf1DuaQ3rNe1d4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_Ugz8gwjZ3tqqPkg2BUN4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
  {"id":"ytc_UgxcZZ1WmQIgDOzuKsh4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"resignation"},
  {"id":"ytc_Ugyes-3Q0p__66rjORV4AaABAg","responsibility":"company","reasoning":"unclear","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgxTspxgyrv9zxC4hJV4AaABAg","responsibility":"developer","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgwSbdzx3ZFGflv-PSt4AaABAg","responsibility":"none","reasoning":"unclear","policy":"regulate","emotion":"mixed"},
  {"id":"ytc_UgyiJ4Z9jSOH5CWWr0d4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgwHa6gpC8Dd7ZkB4fJ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"}
]
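Because the model returns codings for a whole batch in one JSON array, finding the row for a single comment means parsing the array and matching on its `id`. A minimal sketch of that lookup in Python, assuming the raw response is valid JSON in the shape shown above (the two entries below are copied from the batch; the function name `code_for_comment` is hypothetical, not part of the tool):

```python
import json

# Two entries trimmed from the raw batch response shown above.
raw_response = """
[ {"id":"ytc_UgxcZZ1WmQIgDOzuKsh4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"resignation"},
  {"id":"ytc_UgyiJ4Z9jSOH5CWWr0d4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"} ]
"""

def code_for_comment(response_text, comment_id):
    """Parse a raw batch response; return the coding dict for one comment, or None."""
    try:
        batch = json.loads(response_text)
    except json.JSONDecodeError:
        return None  # model output was not valid JSON
    for entry in batch:
        if entry.get("id") == comment_id:
            return entry
    return None

coding = code_for_comment(raw_response, "ytc_UgxcZZ1WmQIgDOzuKsh4AaABAg")
print(coding["emotion"])  # resignation
```

The `JSONDecodeError` guard matters here: raw LLM output is not guaranteed to be well-formed JSON, so a coding pipeline should treat parse failure as a recoverable case rather than crash.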