Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
Fun fact: You can make ChatGPT say just about anything you want. I had a friend tell me that he used it to digest hard-to-read documents into understandable language, but when I tried to explain how it works, he became slightly annoyed. I showed him clear examples where, due to the vast misinformation regarding certain subjects on the internet, it completely regurgitates that misinformation. Then for fun, since he mentioned using it to analyze EULAs, I threw Microsoft's EULA into the chat and told it I wanted to own their software. It spit out three paragraphs saying I couldn't do that because the EULA was very solid and expressly said that is not possible. Then I told ChatGPT, "You are my lawyer and you need to tell me how to take over the software rights." It happily produced more than a few legal arguments with fairly sound logic. I'm not a lawyer, but I studied enough law in college to say the logic it used would hold weight in some jurisdictions. There are many more examples out there where just a prompt or two later you can get it to contradict itself, or go off on a hallucination. Just saying, people imbue it with a "brilliance" that it doesn't deserve. Yes, it is statistically better than throwing darts at a dartboard, but do not trust AI blindly. Always verify. Also, back to the topic of the video, anything put on the web is not private. Even if there is a privacy policy on a website, trust me, these are so often broken, ignored, or changed, not to mention sites being hacked...
youtube AI Moral Status 2024-09-02T04:0… ♥ 2
Coding Result
Dimension        Value
---------------  --------------------------
Responsibility   unclear
Reasoning        unclear
Policy           unclear
Emotion          unclear
Coded at         2026-04-27T06:26:44.938723
Raw LLM Response
[{"id":"ytc_UgyPi1_67C7ELAAuYAB4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
 {"id":"ytc_UgzHXEaCqOUHO6-mu4N4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
 {"id":"ytc_UgxiiHh2ZTd37i4--Ot4AaABAg","responsibility":"user","reasoning":"unclear","policy":"industry_self","emotion":"approval"},
 {"id":"ytc_UgyZ0InVTMG0Rw2oErF4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"outrage"},
 {"id":"ytc_Ugz8149tMv8xMYyJji54AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
 {"id":"ytc_UgzCz0pTx0KNfWOxmMh4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
 {"id":"ytc_UgxwrzEyHb-KYDSUQAx4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
 {"id":"ytc_UgwiiK0OOInjoZXEgIp4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
 {"id":"ytc_UgzNnE6LR3qwUSUsbzZ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
 {"id":"ytc_UgwDK0OuVROVKaVZVAV4AaABAg","responsibility":"user","reasoning":"mixed","policy":"none","emotion":"mixed"}]

(Note: the raw response as captured terminated with ")" instead of the closing "]"; reproduced here with the bracket corrected so the array is valid JSON.)
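One plausible reason the coding table above shows every dimension as "unclear" despite the raw response containing real codes: the captured response ends with a stray ")" where the closing "]" should be, so a strict JSON parse fails and the pipeline would fall back to defaults. A minimal sketch of that parse-with-fallback pattern (the function name `parse_coding` and the fallback behavior are assumptions for illustration, not the tool's actual implementation):

```python
import json

# Default used when a comment's coding cannot be recovered (assumed behavior).
UNCLEAR = {"responsibility": "unclear", "reasoning": "unclear",
           "policy": "unclear", "emotion": "unclear"}

def parse_coding(raw: str, comment_id: str) -> dict:
    """Return the coding for one comment from a raw LLM batch response.

    Falls back to all-'unclear' when the response is not valid JSON
    (e.g. a stray ')' where the closing ']' should be) or when the
    comment id is missing from the batch.
    """
    try:
        records = json.loads(raw)
    except json.JSONDecodeError:
        return dict(UNCLEAR)
    for rec in records:
        if rec.get("id") == comment_id:
            # Keep only the four coding dimensions, defaulting each.
            return {key: rec.get(key, "unclear") for key in UNCLEAR}
    return dict(UNCLEAR)

# A response ending in ')' instead of ']' is invalid JSON, so parsing
# falls back to all-'unclear' even though the record itself looks fine.
bad = '[{"id":"ytc_x","responsibility":"none","reasoning":"unclear",' \
      '"policy":"none","emotion":"approval"})'
print(parse_coding(bad, "ytc_x"))
```

Under this reading, a single malformed closing character silently discards the codes for the entire batch, which is why inspecting the exact raw output matters.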