Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
Just like with everything humans have created over time, this will surely be another one of those inventions that WILL BE TURNED INTO A GUN. Governments and people with no regard for anyone's life but their own will be using AI for their own gains. No telling how this will all end. But whats more concerning is that in the society we live in today, people will lose their jobs to AI, later on people will lose their Jobs robots, and soon humanity will have to justify its existence. Think about it, whenever something new comes out we get rid of the old and keep the new, soon humanity will be serve no purpose because AI will be so advance it will have no need for humans. SMITH in the matrix had a great line. "It became the machines/AI civilization when they started to do the thinking for us" as long as humanity keeps thinking you can give something untold power and believe it can control its a pipedream. We can barely control ourselves let along trying to control something as complex as AI, its not like a car where you can just apply the break and stop or turn the key and shut off the ignition and yet we still see millions of accidents each year. Imagine something where this thing just controls government systems, weapons, military, etc.... how are you going to stop that? By kicking the plug out the socket? Its not as simple as that when you look at everything technology is growing fast and more aggressive, at some point we aren't going to have sockets to connect anything to. Just like with wifi, everything will be wireless. We are unknowingly creating something that could be used to help humanity but it can easily be turned into a gun, like everything else humanity has created.
youtube AI Jobs 2025-11-03T12:3…
Coding Result
Dimension       Value
Responsibility  government
Reasoning       deontological
Policy          regulate
Emotion         outrage
Coded at        2026-04-26T23:09:12.988011
Raw LLM Response
[
  {"id":"ytc_Ugwbk7zdZz3gPzOtbGh4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgwwU-QdMIm-Otr0a1V4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgyglBrDysKBlKjUq6t4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugx5Zz0W7Rk4-88DWNx4AaABAg","responsibility":"government","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_Ugz0q84emTFWjSAfwPt4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgzYQIjOEAKYAfx4hMp4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_Ugxoh2mb5FCPIOdrvh14AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytc_UgxZZZAY1rvWnS3zQyV4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytc_Ugw_wnLYEXTew1ZwYah4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgxlkZAXX23WNf8QsJd4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"unclear","emotion":"fear"}
]
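A minimal sketch of how a Coding Result for one comment can be pulled out of a raw response like the one above: parse the JSON array and look up the record whose `id` matches the comment. This assumes the model output is valid JSON with the field names shown (`id`, `responsibility`, `reasoning`, `policy`, `emotion`); the helper name `extract_coding` is hypothetical, not part of the tool.

```python
import json

# Abbreviated raw LLM response: a JSON array of coded comments,
# using the same schema as the output shown above.
raw_response = """[
  {"id": "ytc_Ugx5Zz0W7Rk4-88DWNx4AaABAg",
   "responsibility": "government", "reasoning": "deontological",
   "policy": "regulate", "emotion": "outrage"},
  {"id": "ytc_Ugxoh2mb5FCPIOdrvh14AaABAg",
   "responsibility": "government", "reasoning": "consequentialist",
   "policy": "liability", "emotion": "fear"}
]"""

def extract_coding(raw, comment_id):
    """Parse the model output and return the coding dict for one comment id,
    or None if the model did not code that comment."""
    records = json.loads(raw)
    return next((r for r in records if r["id"] == comment_id), None)

coding = extract_coding(raw_response, "ytc_Ugx5Zz0W7Rk4-88DWNx4AaABAg")
# coding["policy"] → "regulate", matching the Coding Result table above
```

Matching by `id` rather than by array position keeps the lookup correct even when the model drops or reorders comments in its response.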