Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
Delusions + like-minded ppl = crazy
Delusions + books = double crazy
Delusions + Google = crazy * crazy
Delusions + social media = you raised to the power of crazy You^crazy
Delusions + AI = infinity crazy

(I use AI as an interactive notebook. The very first thing I do is prep it for what I want to come out of it. I have a 160 credits when I only needed 120. I've read a lot of books and I love textbooks more than anything else I know what's under the hood... So I spend 10 hours on AI and I have a website done exactly how I want. People are spending 30 minutes on on AI to get a website... There's no magic. No humanity. But you can do something amazing with it if you prep and vet ) In the past I would have spent... 40 hours f****** with one line of code gotten bored and moved on to the next big idea... I'm finishing my ideas because I'm not doing the tedious things. 80% is good 20% is just straight out of its ass and if you keep talking to it that reverses. So I copy and paste with another notebook open and there we go it is changed my productivity and it is the greatest invention for me ever I hate you all f****** weirdos... We never get nice things... I work in AI now and I had to take a math test years ago in order to get this job and now I'm just going through Instagram and linking pictures with products... But on the side I'm opening my own School and we will not use AI for anything ever. )
youtube AI Moral Status 2025-12-08T12:5…
Coding Result
Responsibility: user
Reasoning: mixed
Policy: none
Emotion: mixed
Coded at: 2026-04-26T23:09:12.988011
Raw LLM Response
[
  {"id": "ytc_Ugyuk1hBtKCsoVIMlGV4AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_UgxCMM_nHx7vx3CUi4B4AaABAg", "responsibility": "none", "reasoning": "mixed", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgzVveglUuOdEOPDz0Z4AaABAg", "responsibility": "none", "reasoning": "mixed", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_UgxX7TxDaYQ34a1_0RB4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_UgxtOHpYkiOjd13ruUR4AaABAg", "responsibility": "developer", "reasoning": "mixed", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_UgzysJ0DzXsAajmE7B54AaABAg", "responsibility": "developer", "reasoning": "deontological", "policy": "regulate", "emotion": "approval"},
  {"id": "ytc_UgwAYhjcm3oJXCmDcaR4AaABAg", "responsibility": "ai_itself", "reasoning": "mixed", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgxqlRre80WFcsF3yyF4AaABAg", "responsibility": "developer", "reasoning": "consequentialist", "policy": "liability", "emotion": "outrage"},
  {"id": "ytc_UgzNGyLesXo-3GaVwj94AaABAg", "responsibility": "user", "reasoning": "mixed", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_UgwRHE4F_qhgUGRtXdJ4AaABAg", "responsibility": "distributed", "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"}
]
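The raw response is a JSON array with one object per comment id, each carrying the four coded dimensions (responsibility, reasoning, policy, emotion). A minimal sketch of how such output can be parsed and looked up by comment id (the single-entry `raw` string here is an excerpt of the array above; the variable names are illustrative, not part of any pipeline):

```python
import json

# Excerpt of the raw LLM response: one coded record per comment id.
raw = """[
  {"id": "ytc_UgzNGyLesXo-3GaVwj94AaABAg",
   "responsibility": "user", "reasoning": "mixed",
   "policy": "none", "emotion": "mixed"}
]"""

# Index the array by comment id for fast lookup.
codes = {row["id"]: row for row in json.loads(raw)}

# Retrieve the coding for the comment shown on this page.
row = codes["ytc_UgzNGyLesXo-3GaVwj94AaABAg"]
print(row["responsibility"], row["reasoning"], row["policy"], row["emotion"])
# → user mixed none mixed
```

Note that this id's values match the coded dimensions displayed above (Responsibility: user, Reasoning: mixed, Policy: none, Emotion: mixed), which is how the per-comment table is populated from the batch response.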