Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
I have a good glimpse on the future: I wanted to make a website using node.js. I know javascript very well, and I know web development very well, but I'm not familiar with Node tech stack at all, so I used AI pretty successfully to start the project, but with the first real problem it got into a loop of solutions(try this, then that, then the other thing, then the first thing again) and I had to solve it myself, and I knew exactly what to ask it and what I want to achieve. On my workplace, where I work as a senior dev, I have copilot switched off almost immediately after it was introduced, because using it on a real complex project is counter-productive and wastes my time. I tried to solve several complex work problems using AI and not once did it help me, always leading to wrong conclusions, and hallucinating settings and facts along the way. I think people who have money invested in AI/working on developing AI, are straight up lying about the future to drive up the stock prices. And AI enthusiasts are just imagining things, just like people imagined we would have flying cars in the future.
youtube 2025-08-27T07:5…
Coding Result
Dimension       Value
Responsibility  none
Reasoning       unclear
Policy          none
Emotion         mixed
Coded at        2026-04-27T06:24:53.388235
Raw LLM Response
[
  {"id": "ytc_UgyxMkQjuaTYPgAqwNB4AaABAg", "responsibility": "none", "reasoning": "mixed", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_UgxHDd2WzvsqpmjIAeF4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "approval"},
  {"id": "ytc_Ugzr_ytdybgJ27k9mLF4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgycST_fj9_yjdrhLK94AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_UgwTYwgDs8zFqSzkNgd4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_UgxApQpkuMMsF75nVPx4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "unclear", "emotion": "fear"},
  {"id": "ytc_UgzTGo5a-XgH0Bzp63V4AaABAg", "responsibility": "none", "reasoning": "deontological", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgwMMZfXSY-8WkWf7bF4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_UgxfP1KRbSpeGlmHbQV4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "approval"},
  {"id": "ytc_Ugyx_036T1ilBFK8KYd4AaABAg", "responsibility": "none", "reasoning": "deontological", "policy": "none", "emotion": "approval"}
]
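The raw response is a JSON array of per-comment codes. A minimal sketch of loading it for inspection, assuming the field names shown in the sample above (id, responsibility, reasoning, policy, emotion); the two entries used here are copied from that batch:

```python
import json
from collections import Counter

# Raw LLM response: a JSON array of per-comment codes. Field names are
# taken from the sample above; a real batch has one object per comment.
raw = (
    '[{"id":"ytc_UgyxMkQjuaTYPgAqwNB4AaABAg","responsibility":"none",'
    '"reasoning":"mixed","policy":"none","emotion":"resignation"},'
    '{"id":"ytc_UgycST_fj9_yjdrhLK94AaABAg","responsibility":"none",'
    '"reasoning":"unclear","policy":"none","emotion":"mixed"}]'
)

codes = json.loads(raw)

# Index the codes by comment id so any single comment's coding
# can be looked up when inspecting the exact model output.
by_id = {c["id"]: c for c in codes}
print(by_id["ytc_UgycST_fj9_yjdrhLK94AaABAg"]["emotion"])  # mixed

# Tally one dimension across the batch (here: emotion).
emotion_counts = Counter(c["emotion"] for c in codes)
print(emotion_counts)
```

The id-keyed dictionary mirrors how the page pairs a comment with its coding result: the table above is just the `by_id` entry for this comment's id, rendered one dimension per row.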