Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
While this tech is neat and will eventually be slightly useful, it is nowhere near anything that would make me think that it's a serious piece of world altering technology. At most it is a new input method for user interaction with a computer. It is a piece of programming that cannot do anything not described in its programming that specializes in syntax decoding for the purpose of converting normal speech to computer instruction. Have you even tried to seriously use AI for seriously complex tasks? It's actually easier to do it on my own by the time I work out a promt that works and one that gives me something that isn't from a Lovecraftian fever dream. Imagine how far you would get if you could only do what a set of instructions allow for. In this situation how would you instruct something(or yourself) to be conscious when we don't even understand what causes it in humans? Or to act with intuition when presented with an unknown and new set of variables? It simply isn't possible to construct a consciousness when we don't even understand what it is. Now maybe spiritual stuff comes into play with AI and that is the scary dangerous part about this all but idk. All i know is that until we are able to measure and perfectly understand the how and why of consciousness l will continue to think that AI is a scam and nowhere near intelligent and is essentially a hyped up user interaction system like the GUI when it was developed
youtube AI Moral Status 2024-08-17T04:3…
Coding Result
Dimension        Value
Responsibility   none
Reasoning        consequentialist
Policy           none
Emotion          resignation
Coded at         2026-04-27T06:26:44.938723
Raw LLM Response
[
  {"id":"ytc_UgzS4gzVEoRNhTkjULV4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwXlfyRysQeBiJ5zTx4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgyEE9lM5SPyr7zqxxB4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_UgyHb5IUt37GaB7KShB4AaABAg","responsibility":"company","reasoning":"unclear","policy":"regulate","emotion":"indifference"},
  {"id":"ytc_UgxVAbrYxVzWdih2YsB4AaABAg","responsibility":"user","reasoning":"contractualist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgxsNNSoqTDSt0JHqDJ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_UgwwjG9ZDkCYU3faZjV4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_Ugz3YyAy90zbJhJXNjZ4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"mixed"},
  {"id":"ytc_Ugx0IcAKkPSNjinUdi94AaABAg","responsibility":"developer","reasoning":"unclear","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgxYZq-NnC5Sp6791vp4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"}
]
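The Coding Result table above can be recovered from the raw response by parsing the JSON array and indexing on the comment id. A minimal sketch in Python, using only the standard library and a truncated copy of the array (two of the ten entries shown; the entry for `ytc_UgwwjG9ZDkCYU3faZjV4AaABAg` is the one whose values appear in the table):

```python
import json

# Truncated copy of the raw LLM response shown above (assumption:
# the full array carries ten such objects, one per coded comment).
raw = '''
[
  {"id":"ytc_UgwwjG9ZDkCYU3faZjV4AaABAg","responsibility":"none",
   "reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgzS4gzVEoRNhTkjULV4AaABAg","responsibility":"ai_itself",
   "reasoning":"unclear","policy":"none","emotion":"indifference"}
]
'''

rows = json.loads(raw)

# Index the coded rows by comment id so one comment's dimensions
# can be looked up directly.
by_id = {row["id"]: row for row in rows}

coded = by_id["ytc_UgwwjG9ZDkCYU3faZjV4AaABAg"]
print(coded["reasoning"])  # consequentialist
print(coded["emotion"])    # resignation
```

Each object carries the same four dimensions shown in the table (`responsibility`, `reasoning`, `policy`, `emotion`), keyed by the YouTube comment id.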