Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
Corporations pushing AI are doing it for one reason, they need you to train your replacement. We used to train our outsourced replacements in other countries and now it’s AI. All AI will need is the ability to create robots to do the physical labor. Manufacturing plants are already automated and connected to the internet, then with reasoning greater than the sum of all humans they can determine the world is much better off without humans? Climate change, humans can’t explore space because we’re organic organisms, but AI robots acting with human reason and far superior intelligence would be unlimited without fear of anything, even death. 🤷🏻 See how quickly that escalated? My grandmother used an outhouse and an iPhone. Technology is like compounding interest, the more you have the quicker and quicker it grows. This really does have the ability to create a whole new era and humans are a blight, not a necessity. We can assume robots would be without greed, but they could develop it, we could assume they would be without remorse or human emotion, but maybe it evolves. Crazy amount of scenarios to ponder, but one thing’s for sure, our great grandchildren will live in a world far different than our own. Would AI build robots of class so their are rulers and the workers stay in there roll, or would every being have the unlimited sum of all knowledge? Lots of interesting and equally frightening things to consider.
youtube AI Harm Incident 2025-08-11T13:2…
Coding Result
Dimension        Value
Responsibility   company
Reasoning        consequentialist
Policy           liability
Emotion          fear
Coded at         2026-04-26T23:09:12.988011
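
As a minimal sketch of this coding result as a typed record (the field and class names here are illustrative assumptions, not the tool's actual schema), the four dimensions plus the timestamp can be held in a small dataclass:

```python
from dataclasses import dataclass
from datetime import datetime

@dataclass
class CodingResult:
    """One coded comment across the four analytic dimensions."""
    responsibility: str   # e.g. "company", "ai_itself", "none"
    reasoning: str        # e.g. "consequentialist", "deontological", "unclear"
    policy: str           # e.g. "liability", "regulate", "ban", "none"
    emotion: str          # e.g. "fear", "outrage", "resignation", "approval"
    coded_at: datetime    # when the coding was produced

# The result shown above, as a record.
result = CodingResult(
    responsibility="company",
    reasoning="consequentialist",
    policy="liability",
    emotion="fear",
    coded_at=datetime.fromisoformat("2026-04-26T23:09:12.988011"),
)
```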
Raw LLM Response
[ {"id":"ytc_UgzoLJd6J0kx5DiEnZN4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"}, {"id":"ytc_UgwhkdY8M9G_KTQCDAp4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}, {"id":"ytc_UgzkmJ11AC4TOzjuYNZ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"}, {"id":"ytc_Ugx0hydu42n4Nr4rRkx4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"}, {"id":"ytc_Ugx-NKs0kWHG5wFpylZ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"}, {"id":"ytc_UgwgmbAF9sPIh8U8xmd4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"}, {"id":"ytc_UgwS2WhQN6NHutpZxBF4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"resignation"}, {"id":"ytc_UgwqEJ9QlDXd-lMrWfh4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"}, {"id":"ytc_UgzAZr9J7AWDvBXqf9R4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"ban","emotion":"outrage"}, {"id":"ytc_Ugx1bX7BqEliwamr4nh4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"fear"} ]