Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
Respectfully, what does Charlie know about AI? Have you ever written a single line of code before? Do you know linear algebra? Do you know optimization theory? Do you know how any of this stuff even works? Saying that AI will replace 99% of jobs is an absolutely abhorrent comment. Way more than 1% of students are studying things (for example a larger number of engineering jobs) that are far from being replaced by ai. You also can't just state something is gonna happen, and then when someone asks you for evidence say "just ask other people" Im a phd student in robotics and control theory and for a robot to even grab a tomato off a vine, is an incredibly complex difficult and unsolved problem. There are a ton of things that if they could be solved, would be revolutionary, but are nonconvex problems (which basically means can't be solved optimally). It's also incredibly ironic how you want to blow up college, when if AI and robotics could take over everything, the only people that would still have jobs are the people that create the algorithms the robots use for task performance. That is something that is pretty much impossible to get a comprehensive understanding of without an undergraduate and graduate degree in engineering, applied math, or similar. Also, with the Trump administration slashing nsf budgets, you can expect robotics and AI progress to slow in the US. China will have the opportunity to lap us in the meantime. There are just so many issues of what you're saying and even if it was true the implications of it would contradict initiatives that you support.
youtube 2025-04-15T23:1… ♥ 6
Coding Result
Dimension       Value
Responsibility  user
Reasoning       deontological
Policy          none
Emotion         outrage
Coded at        2026-04-26T23:09:12.988011
Raw LLM Response
[
  {"id": "ytc_Ugzp3Xk5wrYgxc3pRMR4AaABAg", "responsibility": "ai_itself",   "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytc_UgxRfqDOkcZWui9Km514AaABAg", "responsibility": "none",        "reasoning": "consequentialist", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgylpT62ZtyUHmslLAV4AaABAg", "responsibility": "ai_itself",   "reasoning": "deontological",    "policy": "none", "emotion": "outrage"},
  {"id": "ytc_Ugwlrku9yY0qmZ_ToYB4AaABAg", "responsibility": "none",        "reasoning": "consequentialist", "policy": "none", "emotion": "approval"},
  {"id": "ytc_Ugxm9Q4X5ethcpOvj494AaABAg", "responsibility": "user",        "reasoning": "deontological",    "policy": "none", "emotion": "outrage"},
  {"id": "ytc_UgyUxdtgsjnlT9mbTJ14AaABAg", "responsibility": "none",        "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_Ugx1nuZg5oWYo7RJR2l4AaABAg", "responsibility": "distributed", "reasoning": "mixed",            "policy": "none", "emotion": "mixed"},
  {"id": "ytc_Ugztt4msBRRSRNRBHzZ4AaABAg", "responsibility": "none",        "reasoning": "consequentialist", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_UgzFDeE8I5dvMYV34-J4AaABAg", "responsibility": "ai_itself",   "reasoning": "deontological",    "policy": "none", "emotion": "outrage"},
  {"id": "ytc_UgxwdnyGI5JYWYSHmUl4AaABAg", "responsibility": "none",        "reasoning": "consequentialist", "policy": "none", "emotion": "approval"}
]
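The raw response is a JSON array with one coding object per comment id, so matching a comment to its codes is a simple indexing step. The sketch below (the function name `index_by_id` is illustrative, not part of any tool) assumes only the array structure shown above, trimmed to two entries from the batch for brevity:

```python
import json

# Two entries copied from the raw LLM response above, for illustration.
raw_response = """[
  {"id": "ytc_Ugzp3Xk5wrYgxc3pRMR4AaABAg", "responsibility": "ai_itself",
   "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytc_Ugxm9Q4X5ethcpOvj494AaABAg", "responsibility": "user",
   "reasoning": "deontological", "policy": "none", "emotion": "outrage"}
]"""

def index_by_id(raw: str) -> dict:
    """Parse the model's JSON array and index each coding object by comment id."""
    return {entry["id"]: entry for entry in json.loads(raw)}

codes = index_by_id(raw_response)
entry = codes["ytc_Ugxm9Q4X5ethcpOvj494AaABAg"]
print(entry["responsibility"], entry["emotion"])  # → user outrage
```

This lookup is how the Coding Result table above can be cross-checked against the raw output for the displayed comment.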