Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
This has been a LONG time coming and we'll just have to see how this goes. I do feel as if we're getting a little closer to self-determining robots. Back in the 1990s, robots just had to be programmed to do a particular task. They needed to have programmed if/then/else type structures so they would know what to do in a particular situation - but the kind of neural-net learning and self-determination they are starting to exhibit now wasn't a thing. Terminator 2 came out in 1991, I think, and referred to this. In fact, the entire plot of the franchise from 1984 is based on it.

In "The Time Machine" by H.G. Wells, society becomes totally reliant on technology and atrophies until people basically have no idea how any of it works, splitting into an idle race of useless idiots and an underclass who unthinkingly maintain the technology - and he wrote this in 1895. I think it's almost inevitable that some roles will be replaced by robots, but this has always happened - people threw shoes into looms, or pulled up railway lines in protest at them replacing stagecoach runs. It's been going on since technology, as a concept, was invented.

We're currently some way off a C-3PO or R2-D2 type of robot, but the day when your groceries are delivered by robots is here (Starship), and I can see the day when those are able to be handed a package and told "deliver this to Amazing Technologies, 123 North 4th Street" and it'll wander off and do that on its own, and will know how to address each person it meets - whether that person prefers a mute presentation of the package or a friendly, chatty robot.
Source: YouTube · Viral AI Reaction · 2024-12-20T08:4…
Coding Result
Dimension       Value
Responsibility  ai_itself
Reasoning       mixed
Policy          unclear
Emotion         mixed
Coded at        2026-04-27T06:26:44.938723
Raw LLM Response
[
  {"id": "ytc_UgzRnTLPUGiCre3uVOd4AaABAg", "responsibility": "ai_itself", "reasoning": "mixed", "policy": "unclear", "emotion": "mixed"},
  {"id": "ytc_Ugy_vTbi-zWHAHPTdCF4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytc_UgzyJSm7TOqCn7G3iUR4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgzlfC5TW3t-lZkEN1F4AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_UgxyG7mOP7Dof5rx-nV4AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "none", "emotion": "disapproval"},
  {"id": "ytc_Ugw7ydtU1ZOsaoYZby54AaABAg", "responsibility": "user", "reasoning": "mixed", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgziT-7tBMmd6t81jGV4AaABAg", "responsibility": "developer", "reasoning": "deontological", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_Ugz2gjP8B8LFDu3jmmp4AaABAg", "responsibility": "government", "reasoning": "consequentialist", "policy": "regulate", "emotion": "approval"},
  {"id": "ytc_UgwYDX42EckbtoWPS3B4AaABAg", "responsibility": "company", "reasoning": "deontological", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_Ugx94Y-R7yHt7-h27Ep4AaABAg", "responsibility": "ai_itself", "reasoning": "mixed", "policy": "unclear", "emotion": "mixed"}
]
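A raw response like the one above is only usable if it parses as a JSON array and every record carries all four coding dimensions plus the comment id. The sketch below shows one way that validation could look; it is a minimal illustration assuming the response is valid JSON (the `parse_codes` helper and `EXPECTED_KEYS` set are hypothetical names, not part of any pipeline shown here, and the sample is truncated to two of the ten records).

```python
import json

# Truncated sample of a raw LLM response: a JSON array of per-comment codes.
raw_response = """[
  {"id": "ytc_UgzRnTLPUGiCre3uVOd4AaABAg", "responsibility": "ai_itself",
   "reasoning": "mixed", "policy": "unclear", "emotion": "mixed"},
  {"id": "ytc_Ugz2gjP8B8LFDu3jmmp4AaABAg", "responsibility": "government",
   "reasoning": "consequentialist", "policy": "regulate", "emotion": "approval"}
]"""

# Every record must supply the comment id plus the four coding dimensions.
EXPECTED_KEYS = {"id", "responsibility", "reasoning", "policy", "emotion"}

def parse_codes(text):
    """Parse the model output and reject records missing any dimension."""
    records = json.loads(text)
    for rec in records:
        missing = EXPECTED_KEYS - rec.keys()
        if missing:
            raise ValueError(f"record {rec.get('id')} is missing {missing}")
    # Index by comment id so a single comment's codes can be looked up directly.
    return {rec["id"]: rec for rec in records}

codes = parse_codes(raw_response)
print(codes["ytc_UgzRnTLPUGiCre3uVOd4AaABAg"]["responsibility"])  # ai_itself
```

Indexing by id makes it easy to join the codes back to the original comments, and the key check surfaces malformed model output early instead of failing later in analysis.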