Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
I feel as though this stuff has gotten too much, and I will clear shit up when it should be. AI is basically 0's and 1's, it is only and will only be that pair of duality. AI doesn't understand anything either; it merely just follows patterns and orders. It is not smart or wise, just basically a parrot with no brain. AI is digital and requires data; Humans do exist and require food. And if anyone counters that with "same thing, different labels", remember: We aren't lines of code running, we are beings running for a goal WE create for OURSELVES. AI will never replace humanity, and if anyone says anything related to jobs, remind them that electricity, maintaince, cyber defense, and alot of power-to-grip scaling requires a lot more money than a couple fellow hoomans. And I swear to whatever God there is, AI isn't the most advanced; the field of quantum computing is WAY more advanced than lines of code repeating lines of more code.
youtube AI Responsibility 2025-09-24T16:0…
Coding Result
Dimension        Value
Responsibility   unclear
Reasoning        unclear
Policy           unclear
Emotion          unclear
Coded at         2026-04-27T06:24:59.937377
Raw LLM Response
[{"id":"ytc_UgxF3uStRvq4cfaZt9B4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
 {"id":"ytc_UgySoZh0HYqWwqbGcmZ4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"mixed"},
 {"id":"ytc_UgzCZ3R0Hu6EA49yU9R4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
 {"id":"ytc_UgwNU3xdjKN_6FGzN154AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
 {"id":"ytc_Ugwgw4WxlENkGRVNjB14AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
 {"id":"ytc_UgzSmzSFCxXOdnC0ljZ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
 {"id":"ytc_UgxQBHicKrXd0uiU3w94AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
 {"id":"ytc_UgxAtoVJ1VlQwidusNB4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"indifference"},
 {"id":"ytc_UgzEGE8Wp4CBlQdbPMV4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
 {"id":"ytc_UgxtyTNFFR_bKz-dEF54AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"})
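A downstream parser presumably turns each raw response into the per-dimension result shown above, falling back to "unclear" when the response is not valid JSON (note the raw response here closes with `)` rather than `]`, which would fail parsing and could explain the all-"unclear" result). A minimal sketch of that idea — the function name `parse_coding` and the fallback behaviour are assumptions, not the tool's actual code:

```python
import json

# The four coding dimensions shown in the result table above.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def parse_coding(raw: str, comment_id: str) -> dict:
    """Extract the codes for one comment from a raw LLM response.

    Returns "unclear" for every dimension when the response is not
    valid JSON or the comment id is absent (assumed fallback policy).
    """
    fallback = {dim: "unclear" for dim in DIMENSIONS}
    try:
        records = json.loads(raw)
    except json.JSONDecodeError:
        return fallback
    for record in records:
        if record.get("id") == comment_id:
            # Missing keys within a record also default to "unclear".
            return {dim: record.get(dim, "unclear") for dim in DIMENSIONS}
    return fallback

# Valid JSON: codes are recovered for the matching id.
raw_ok = '[{"id":"ytc_abc","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"}]'
print(parse_coding(raw_ok, "ytc_abc")["emotion"])        # approval

# Malformed output (e.g. a stray trailing parenthesis): full fallback.
raw_bad = raw_ok[:-1] + ")"
print(parse_coding(raw_bad, "ytc_abc")["responsibility"])  # unclear
```

Keeping the fallback per-response rather than per-record means a single malformed batch response marks all its comments "unclear", which matches the pattern visible in this page.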