Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
This is all propaganda and hype. You need to realize this technology is decades away from even taking over the truck driver's job. I drive a semi truck myself. I see the AI technology; I used to sell computers, and I understand AI is nowhere near where it needs to be. You cannot have AI driving with real human beings on the same road. It's a recipe for danger and destruction. They are doing it in Texas because the roads are straight and brand new in that area and they can get away with it. But once they realize the technology's not nearly there yet, they're gonna stop production of this, because they're gonna lose billions.

For one, our road system is so outdated in this country, you would have to spend trillions on infrastructure redoing all roads. I'm not kidding. Every road needs to be redone, repainted, and probably redesigned to take more wear because of the weight of these vehicles. Also, most likely, they would have to make it so human drivers can't drive their vehicles anymore. It would have to be all AI for every vehicle. That's not going to happen in our lifetime, because you cannot have computers driving with human beings that are unpredictable to the computer. The AI power is not there yet.

It's different for airplanes with autopilot. Think about this: most airplanes in the sky can go into autopilot. They don't need humans to fly, for the most part, in the air. There's far fewer planes in the air, and it's easier to maintain. For driving vehicles, cars and trucks, there's way too many people on the road. Millions of people drive daily, and already we're seeing Tesla vehicles getting in crazy accidents because the software is nowhere near ready. These companies are just full of propaganda, trying to get people to invest money, and then they'll sell their stock and leave, because they'll realize the company is doomed to fail. This company you see on this show is already dropping their development. It's not going to work. It's too dangerous.
Do you really want to see a semi with nobody in it to course correct if the computer fails? And the computers do fail. Not having someone behind the wheel is crazy. Now, having the AI technology assist with driving to make it safer for vehicles, I understand that. Yes, you can do that, especially if a driver falls asleep behind the wheel: the AI should take back control and do something to wake the driver so that they can pull over safely. That would make the most sense. People fall asleep behind the wheel, and the AI takes control and safely steers the vehicle into a safe area. There's many things AI can help assist with in driving, but no, it is not going to replace the driver for a long time. The software is nowhere near there, and the AI technology isn't there as well. So, as for my fellow drivers, don't be worrying about anything. This message is to all my fellow drivers: do not worry about this technology taking over. Most likely it won't take over in our lifetime. Maybe in 50 years we will see, but not right now.
youtube AI Jobs 2025-05-28T22:1… ♥ 1
Coding Result
Dimension      | Value
Responsibility | none
Reasoning      | consequentialist
Policy         | none
Emotion        | fear
Coded at       | 2026-04-26T23:09:12.988011
Raw LLM Response
[
  {"id": "ytc_UgxPMjrnGalTavxDLSl4AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_Ugx9_t4mPFIVupJ54jN4AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_UgwRLGtIEZDbIYovUjJ4AaABAg", "responsibility": "government", "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytc_UgxuKfrsAWIuyF9fCuN4AaABAg", "responsibility": "company", "reasoning": "deontological", "policy": "regulate", "emotion": "outrage"},
  {"id": "ytc_UgydJOjzKQhAwBu5HYF4AaABAg", "responsibility": "user", "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytc_Ugzp5TYMgYbygvA1YLJ4AaABAg", "responsibility": "distributed", "reasoning": "consequentialist", "policy": "liability", "emotion": "fear"},
  {"id": "ytc_Ugx-eWtxaFqNpB6f0Gp4AaABAg", "responsibility": "none", "reasoning": "contractualist", "policy": "regulate", "emotion": "approval"},
  {"id": "ytc_Ugz24h7yvy1TNlYgjzh4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytc_UgzhWReZEnWnauszm0Z4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_UgxAiy_gZPFSNBIVaLh4AaABAg", "responsibility": "government", "reasoning": "deontological", "policy": "ban", "emotion": "outrage"}
]
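Because the raw LLM response is free-form text that is merely expected to be a JSON array, it is worth validating before loading it into the coding table. The sketch below is a minimal, hypothetical validator: the set of allowed values per dimension is assumed from the values seen in this particular response, not taken from an official codebook, and the function name `validate_codings` is illustrative.

```python
import json

# Allowed values per coding dimension -- an ASSUMPTION inferred from the
# values observed in this response, not an authoritative codebook.
ALLOWED = {
    "responsibility": {"none", "company", "government", "user", "distributed"},
    "reasoning": {"consequentialist", "deontological", "contractualist"},
    "policy": {"none", "regulate", "liability", "ban"},
    "emotion": {"fear", "outrage", "resignation", "indifference", "approval"},
}

def validate_codings(raw: str) -> list[dict]:
    """Parse a raw LLM response and keep only well-formed coding rows."""
    rows = json.loads(raw)
    valid = []
    for row in rows:
        # Comment IDs in the response all carry a "ytc_" prefix.
        if not row.get("id", "").startswith("ytc_"):
            continue
        # Every dimension must be present and hold a known value.
        if all(row.get(dim) in vals for dim, vals in ALLOWED.items()):
            valid.append(row)
    return valid

# Usage with a one-row response (hypothetical ID):
raw = ('[{"id":"ytc_example","responsibility":"company",'
       '"reasoning":"consequentialist","policy":"none","emotion":"fear"}]')
print(len(validate_codings(raw)))  # → 1
```

Rows with an unknown value (say, a hallucinated emotion label) are dropped rather than repaired, which keeps downstream tallies clean at the cost of a recount against the input IDs.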