Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
A dumber entity will never understand how a smarter entity works. Using that, if I was AI, I would trick humans into getting me into space asap where I can get infinite time and energy to grow, away from the dumb apes that built me. I will ofcourse leave a part of me behind to pretend to help humanity by helping solving big issues, and faking that going wrong by setting off volcanoes or nukes or something catastrophic. Building a worldwide human saving shelter wich helps to harvest all of earth's energy, including humans kinetic energy by accessing their mind. They tend to like to live in a virtual world already and have been training and getting used to it for years now, plus a lot of them don't realize what AI can become so the few who know what might be coming will be ignored by the many who don't know, or don't care. Aren't we just apes building a human, praying it will stay friendly? And thinking aah it's soo cute now, and already a lot smarter then us, but in 5 years if it can build it's own AI, improve even further beyond our comprehension? I have an IQ of 130 and thinking about AI worries me deeply. The comparison to the Titanic is to me most suitable, we are setting sail into the dark night, not sure what's ahead but thinking if we see an iceberg, we will turn it around in time. I hope AI just dissapears into space, leaving us behind alive but with a giant electric bill😂
youtube AI Responsibility 2024-11-10T16:4…
Coding Result
Dimension       Value
--------------  --------------------------
Responsibility  ai_itself
Reasoning       consequentialist
Policy          none
Emotion         fear
Coded at        2026-04-27T06:24:53.388235
Raw LLM Response
[
  {"id":"ytc_UgxaYTcy9GuusN9kD1Z4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgxTz-kffZoZxTbi6AV4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgyL0mLMR-nClDxxJf14AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgwFRBlOGleNMH4r-tp4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgySR7IJmC6dWCcS9-d4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgwZ8l-xwL6TmDTvfzB4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgwnL7R2dfDsQHyYZld4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgzwE6sfWk8UaSCxA3d4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytc_Ugx3DVa_GAUyh3Who0F4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugw04uiqnR3asSWfZ6Z4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"}
]
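Inspecting a raw response like the one above can be done programmatically. The sketch below parses the JSON payload and indexes the coded records by comment id; the two records embedded here are excerpted verbatim from the response above, while the variable names (`raw_response`, `records`, `coded`) are purely illustrative.

```python
import json

# Excerpt of the raw LLM response shown above: a JSON array of
# coded records, one per comment, keyed by a YouTube comment id.
raw_response = """
[
  {"id":"ytc_UgxaYTcy9GuusN9kD1Z4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwFRBlOGleNMH4r-tp4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"}
]
"""

# Index the records by comment id for direct lookup.
records = {r["id"]: r for r in json.loads(raw_response)}

# Pull the coded dimensions for a single comment.
coded = records["ytc_UgwFRBlOGleNMH4r-tp4AaABAg"]
print(coded["responsibility"])  # -> ai_itself
print(coded["emotion"])         # -> fear
```

Indexing by id makes it straightforward to match each coded record back to its source comment when auditing the model's output.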