Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
I understand the thinking that AI will not be able to do it all, that AGI is nowhere near. Robots still can't fold laundry or load a dishwasher, and LLMs are prone to hallucinations and large gaps between expected and produced outcomes. However, new human ages are born in confluence and multiplicity. Ten years ago, what we have today would have been sci-fi. Compute growth is not linear; it compounds. As we hit subatomic limits on chips, we are now adding quantum to the stack while working efficiency up on traditional compute. Robotic progress has accelerated on the back of a gruellingly slow process: the motors and sensors needed now exist, and basic solutions to motion have paved the way for current development.

It is the sum of the parts. It's not just LLMs; there's reasoning, real-time learning, multimodal understanding, and on... There will also be new models in the future, some as yet undisclosed. Together they will create this broad capacity. It will come, but when? I feel, given the risk, it's reasonable to be concerned. The timing I don't know, but the end result feels inevitable now. Three years to viability? Ten? Twenty? There are some breakout use cases where limited-use robots have improved greatly; warehouses and shipping hubs are being automated at a growing rate. How long until we see market saturation and job loss stabilize?

The ramp-up in humanoid production will show further commitment and conviction in near-term viability. Humanoid robot sales in 2025 were 13k to 16k units, with over 60k unit orders in the bag and potential for 100k units to be built. Tesla has broken ground on a facility intended to produce up to 1M units a year, and it's not alone. Also, don't forget the national security money globally available after the successes of automated systems in current conflicts. 3, 10, 20?
Source: YouTube, "AI Jobs", 2026-02-27T20:1…
Coding Result
Dimension        Value
Responsibility   none
Reasoning        consequentialist
Policy           none
Emotion          mixed
Coded at         2026-04-27T06:26:44.938723
Raw LLM Response
[
  {"id": "ytc_UgwOE1jVM9JCS9gvWd54AaABAg", "responsibility": "company", "reasoning": "virtue",           "policy": "none", "emotion": "outrage"},
  {"id": "ytc_UgywAVDpI36LqzalIyR4AaABAg", "responsibility": "none",    "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_Ugzkt87E6o4XzDANpxZ4AaABAg", "responsibility": "none",    "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_Ugx1jlQU3VHw9_o9-4B4AaABAg", "responsibility": "none",    "reasoning": "consequentialist", "policy": "none", "emotion": "approval"},
  {"id": "ytc_Ugw46TIOevi0ZGScjkh4AaABAg", "responsibility": "none",    "reasoning": "consequentialist", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgzOPaHt9opK6psgjuB4AaABAg", "responsibility": "none",    "reasoning": "deontological",    "policy": "none", "emotion": "outrage"},
  {"id": "ytc_UgydATwubMfU1ScpBvV4AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_Ugwy47CoDeikyKb6Dv54AaABAg", "responsibility": "none",    "reasoning": "consequentialist", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_Ugw81Pi9VFciry4mcBl4AaABAg", "responsibility": "none",    "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytc_UgwzvUsCKcAyEbDjCa54AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "none", "emotion": "outrage"}
]
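The raw response is a JSON array of per-comment codes, one object per comment with the four coding dimensions. A minimal sketch in Python of how such a response could be parsed and tallied per dimension (the two entries inlined here are real IDs from the array above, truncated to two for brevity; the parsing approach itself is an illustration, not the pipeline's actual code):

```python
import json
from collections import Counter

# Two entries copied from the raw response above, for illustration.
raw = '''[
  {"id": "ytc_UgwOE1jVM9JCS9gvWd54AaABAg", "responsibility": "company",
   "reasoning": "virtue", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_UgywAVDpI36LqzalIyR4AaABAg", "responsibility": "none",
   "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"}
]'''

codes = json.loads(raw)

# Count how often each value appears within each coding dimension.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")
tallies = {dim: Counter(c[dim] for c in codes) for dim in DIMENSIONS}

print(tallies["reasoning"])  # e.g. Counter({'virtue': 1, 'consequentialist': 1})
```

On the full ten-item array above, the same tally would show `consequentialist` dominating the reasoning dimension (9 of 10 items) and `none` for every policy code.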