Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
I don't see AI as strictly a good thing or a bad thing. Displacing real artists sucks. AI has gone through many boom and bust cycles in the past. Maybe investors lose faith in AI in the next few years; they are over-speculating in it, after all. Even the internet was once a bubble. Every time AI comes back, it comes back slightly less shit. I think this tortoise is going to keep coming for us. Many think AI is just a copycat of things humans do and will take our work for less pay and do a worse job. I think that is true for art; however, I feel the newly branded 'thinking models' can solve some novel problems with reasoning instead of copying. It might be a sign of things to come; based on first-hand experience this is more than marketing hype. For problems like math and coding there is interesting emergent behavior. It's because of this I believe someday AI will do a better job than us at lots of things. Which means more jobs are getting replaced, which really sucks. But with it may come new discoveries that enhance our lives while other parts get worse. I can understand your line of thinking more so than people who think it is all going to be sunshine and roses. I think we get even poorer and laid off, but hopefully AI brings more frequent good, like protein folding advancing medicine did. I feel it will need to do a lot more good to make up for all the bad it's done. I'm still hopeful. I recommend to anyone who made it this far an old video called "Humans Need Not Apply" by CGP Grey. I think it will come to pass that one day AI is slightly less shit. And many will be out of work.
Source: YouTube, "Viral AI Reaction", 2025-03-31T04:4…
Coding Result
Dimension       Value
--------------  --------------------------
Responsibility  distributed
Reasoning       consequentialist
Policy          none
Emotion         fear
Coded at        2026-04-27T06:24:59.937377
Raw LLM Response
[
  {"id": "ytc_Ugw_K-KFYmEjVOz9co14AaABAg", "responsibility": "ai_itself", "reasoning": "virtue", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_UgzOm6UD_Em88TFwZsN4AaABAg", "responsibility": "none", "reasoning": "mixed", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgwrxD_UfOjmoNe8zn54AaABAg", "responsibility": "user", "reasoning": "consequentialist", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_Ugw4K5Tk-OujCXU4UJB4AaABAg", "responsibility": "ai_itself", "reasoning": "virtue", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_UgydL2usNkyrZhBgeJF4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_UgwuSu322fM3wIevmrl4AaABAg", "responsibility": "ai_itself", "reasoning": "deontological", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_UgwBsOXoieLht83czSN4AaABAg", "responsibility": "none", "reasoning": "mixed", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_Ugys2z8WVw75KAgAWYB4AaABAg", "responsibility": "ai_itself", "reasoning": "virtue", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_UgwAXx6QupsOU7XNvv14AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgxHz03OWUeMnHDwI7N4AaABAg", "responsibility": "distributed", "reasoning": "consequentialist", "policy": "none", "emotion": "fear"}
]
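A raw response like the one above can be parsed and sanity-checked before the codes are accepted. The sketch below is a minimal, hypothetical validator: the allowed category values are inferred from the data shown on this page, not from any documented coding schema, so treat the vocabularies as assumptions.

```python
import json

# Allowed values per coding dimension, inferred from this page's data
# (an assumption, not a documented schema).
ALLOWED = {
    "responsibility": {"ai_itself", "user", "none", "distributed"},
    "reasoning": {"virtue", "consequentialist", "deontological", "mixed", "unclear"},
    "policy": {"none"},
    "emotion": {"outrage", "indifference", "resignation", "mixed", "approval", "fear"},
}

def validate(raw: str) -> list[dict]:
    """Parse a raw LLM response (JSON array) and keep only records
    whose every dimension holds an allowed value."""
    records = json.loads(raw)
    return [
        rec for rec in records
        if all(rec.get(dim) in vals for dim, vals in ALLOWED.items())
    ]

# Usage with the last record from the response above:
sample = ('[{"id":"ytc_UgxHz03OWUeMnHDwI7N4AaABAg",'
          '"responsibility":"distributed","reasoning":"consequentialist",'
          '"policy":"none","emotion":"fear"}]')
print(len(validate(sample)))  # → 1
```

Records that fail validation (e.g. an out-of-vocabulary label) are silently dropped here; a real pipeline would more likely log them for re-coding.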