Raw LLM Responses

This view shows the exact model output behind any coded comment, so the stored coding can be checked against the raw batch response it came from.

Comment
Level 68 equivalent Microsoft AI engineering director here - Yes, it actually does take a LOT for your story to become reality... it is completely far-fetched. You loosely base your story on a couple things:

1) AI will continue to evolve and surpass human intelligence. That requires *more than creative uses of a sentence completion methodologies*; it requires cognition which some of us are working on but a very long way from obtaining.

2) Running an AI will be cheaper than employing a human for the same job. This is also not correct for current techniques; LLMs are charged per token and it takes a TON of them to do anything meaningful. An example; we see that a developer having an AI write code for them can rack up over $200 per day in token usage; let's take the lower number and say $200. That's $216k per year and for the fairly near future you'll still need a human around to validate results. Since scripting languages such as Pearl, Python, Ruby, Scala, etc... are tailored to the power-user and not real engineers, we ca hire them cheaply - $100k or below in a DevOps organization. Mind you - you'll still need human's to validate as (remember?) cognition is nowhere in sight - we're stuck with LLMs for quite a while.

3) Media and the government will stifle ideas they see as destabilizing - yes, but you're completely ignoring the fact that some doesn't mean *all* to make your story sound more dire. Many people (not for the better) get a lot of their information from the very same outlets you're quoting as foundation for your claim.
YouTube · "Viral AI Reaction" · 2025-11-29T23:3… · ♥ 1
Coding Result
Dimension       Value
Responsibility  none
Reasoning       consequentialist
Policy          none
Emotion         indifference
Coded at        2026-04-26T23:09:12.988011
Raw LLM Response
[ {"id":"ytc_UgxSqdBUrDocME4z3m94AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"}, {"id":"ytc_Ugwq0hBKHx7UN9-xWDt4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"}, {"id":"ytc_UgyLEIsTLPONHURNOCN4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"mixed"}, {"id":"ytc_UgwNNZ6e9S_2amHPQdN4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"}, {"id":"ytc_Ugzl9NOqnMYe2nl_fBt4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"}, {"id":"ytc_Ugxf3d68JYvkHbxb1Eh4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"}, {"id":"ytc_Ugz2d4MlVzn3_O07dQ14AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"}, {"id":"ytc_UgwvNcD4I66G763qbjx4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"}, {"id":"ytc_UgwC2oh34F4zi0Ep9714AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"fear"}, {"id":"ytc_Ugx8PyfD8voFM9PehO94AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"outrage"} ]