Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
Ok so is there anything left to say? We've had movies about the dangers of AI since the 80's maybe even earlier on and still it has gotten to this. So for one how can this man just sit there and say that they did not think any negative issues would come by developing AI? Did he live under a rock? What about the movie Terminator? If he didn't think about it, others did but i think he knew all along but money short term were more important. I think we not our kids grandkids we also him also will live to see very ugly outcomes from all this technology. Listening to this interview brings in mind several movies:Total Recall, Gattaca, Terminator, Aliens and so on. Everything we thought was sci-fi, it's becoming reality. Only the super rich will survive. He needs to ask himself if his kids have enough money to survive and be part of that "elite" that wanted us to get here. In the near future we might see kids being made only in the lab the best of the best and those will grow up and have good jobs and the rest made by God through love cleaning ladies and garbage man. The kids and grandkids of rich and powerful people now will be the "kings" of the new world and we are going to be modern slaves. And so many other plausible scenarios but what's the point? The future does not sound bright for the majority of us .
youtube AI Governance 2025-08-28T06:0…
Coding Result
Dimension       Value
Responsibility  unclear
Reasoning       unclear
Policy          unclear
Emotion         unclear
Coded at        2026-04-27T06:24:59.937377
Raw LLM Response
[{"id":"ytc_UgyIw2DS8wN2s1_9n4B4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"outrage"},
 {"id":"ytc_Ugz7rmwsOn1YAFJPX4d4AaABAg","responsibility":"company","reasoning":"deontological","policy":"unclear","emotion":"outrage"},
 {"id":"ytc_UgxwFjGxaWuyPqK7qXd4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
 {"id":"ytc_Ugzaw7nKr1Y1GbyTcSR4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
 {"id":"ytc_UgxupcZ146CNGuGJfXF4AaABAg","responsibility":"user","reasoning":"deontological","policy":"unclear","emotion":"fear"},
 {"id":"ytc_Ugx6PGthgW2kqqz3ldd4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"outrage"},
 {"id":"ytc_Ugw7YZZ2HD9Z6ce6gjp4AaABAg","responsibility":"developer","reasoning":"unclear","policy":"none","emotion":"mixed"},
 {"id":"ytc_UgxgN0mnJU1va7txZzl4AaABAg","responsibility":"ai_itself","reasoning":"virtue","policy":"none","emotion":"mixed"},
 {"id":"ytc_Ugy_iZoFN_E0C-CTkTR4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
 {"id":"ytc_UgyiKfx-IaZpNCNMqmh4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"}]
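Responses like the one above can be loaded and sanity-checked before the values reach the coding table. Below is a minimal sketch, assuming the four dimensions shown here and a value set inferred only from the responses in this batch (the full codebook may define more categories, and the `SCHEMA` constant and `validate_codings` helper are hypothetical names, not part of the tool):

```python
import json

# Allowed values per coding dimension. ASSUMPTION: inferred from the
# responses in this batch; the project's actual codebook may differ.
SCHEMA = {
    "responsibility": {"none", "company", "government", "user",
                       "developer", "ai_itself", "distributed", "unclear"},
    "reasoning": {"unclear", "deontological", "consequentialist", "virtue"},
    "policy": {"none", "unclear", "regulate", "liability"},
    "emotion": {"outrage", "fear", "indifference", "mixed",
                "approval", "unclear"},
}

def validate_codings(raw: str) -> list[dict]:
    """Parse a raw LLM response and flag any out-of-schema values."""
    records = json.loads(raw)  # raises ValueError on malformed JSON,
                               # e.g. a stray ')' where ']' belongs
    for rec in records:
        for dim, allowed in SCHEMA.items():
            value = rec.get(dim)
            if value not in allowed:
                raise ValueError(f"{rec.get('id')}: unexpected {dim}={value!r}")
    return records

# Usage with a single record (id taken from the batch above):
raw = ('[{"id":"ytc_UgyIw2DS8wN2s1_9n4B4AaABAg","responsibility":"none",'
       '"reasoning":"unclear","policy":"none","emotion":"outrage"}]')
records = validate_codings(raw)
print(len(records))  # 1
```

A check like this catches both truncated or malformed JSON and hallucinated category labels in one pass, so a bad batch fails loudly instead of writing "unclear" rows into the results.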