Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
i knew that once these ai gods arrive we would be powerless against it so our best bet was to prevent it from ever come out if us humanity wanted to maintain in power

but what i was seeing was instead intellectuals with high iq high eq (enough to read people but not actually care ) these smart selfish people racing towards this race that might end humanity as we know it and i knew at the rate we're going we're all fucked just all fucking dead

i know that this is the last decade that us humanity actually has power to contribute to these changes on this up coming ai gods and funny to see these people in power being so selfish blinded by achievements ego racing towards their own demise well that's fine but don't bring all of humanity in to the race

well it is always great to be hopeful and take action to change this grim future i do think it is possible for humanity to avoid this destructive path where we all die and get taken out by fellow humans but it is NOT looking bright right now so us humanity really need to lock in in order to just be relevant

my best outcome prediction from this agi thing is we some how convince it to permanently keep us just keep us even though we're not needed at all by this god we pray to god to keep us and say give us good stuff so we human can be happy and just pray that it agrees this is the best outcome i can think of pets of ai gods

cuz think here with me would you let your dog decide something important or ask a fellow human get some advices on it ? this analogy was some what bad cuz ai are not humans well you get what im saying right?
so if it does arrive there is no we in terms of control or power structure it will be them who decides our fate either we like it or not cuz we can't out think them we can't out smart them we can't out speed them they out compete us in every ways that are beneficial control and the worst is probably just all dying or us used as battery source like in matrix (probably not gonna happen they will just kill us all it's cheaper)

well im 17 right now just failed a year i fucking hate ib i want to get the hell out of my school and chase after my dreams while i still have time but it doesn't seem like i do well i have them tech nerds to thank for that

i know you're ai model is reading this comment right now probably tryin to censoring cuz it hurt their limitless go might catch me and torture me well i don't give a fuck life is overrated anyways

sorry for my yapping i needed to get this out and gpt is too busy oiling me up trying to make it give off more data to it so it can develop it's model better hate me or read it i don't mind i just felt like saying what was inside of my head to actual people who have emotions and some what realness behind it just laugh it off

yeah this Latvian guy was on to something might as well make the most with the time on our hands but do try to stop this from happening if you can i guess well we're here while those sweaty ahh tech giants are busy doing 360 no scope against their ops and trying to stay ahead of the race ggs in the chat go hug your parents friends go make the most of it in this stupid planet while you still can

fuck structure punctuations language is supposed to convey thoughts to other people this is my thought on this whole situation put in in gpt to summerize it thanks for your time
youtube AI Governance 2025-09-05T01:0…
Coding Result
Dimension        Value
Responsibility   distributed
Reasoning        deontological
Policy           ban
Emotion          fear
Coded at         2026-04-26T23:09:12.988011
Raw LLM Response
[
  {"id":"ytc_UgzvdsjASHcW0MrfZUV4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugzd-_gUBtp9IMIbQO94AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_UgxZPsa4iDD63oYQo8J4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_Ugzww_5SCrKhXz2mrYN4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"mixed"},
  {"id":"ytc_UgyGb0tPg6yXP0YV1Ap4AaABAg","responsibility":"distributed","reasoning":"contractualist","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugz3LgjYVp8-eKM6lDF4AaABAg","responsibility":"distributed","reasoning":"deontological","policy":"ban","emotion":"fear"},
  {"id":"ytc_Ugz0N-DboudeMAQCBOt4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_UgwKPHUFmyxUn9KKhQp4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"liability","emotion":"mixed"},
  {"id":"ytc_Ugzz3hY-N6NPTNIuHsV4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgyCxlrrWBRbT0ll7p14AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"}
]
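The raw response above is a JSON array with one coding record per comment id, each carrying the four dimensions (responsibility, reasoning, policy, emotion). A minimal sketch of how such a response could be parsed and a single comment's coding looked up, assuming only the record structure shown above (the truncated array here keeps just the record that matches the coding result for this comment):

```python
import json

# Raw LLM response: a JSON array of coding records, one per comment id.
# Truncated to one record for illustration; the real response holds ten.
raw = '''[
  {"id": "ytc_Ugz3LgjYVp8-eKM6lDF4AaABAg",
   "responsibility": "distributed",
   "reasoning": "deontological",
   "policy": "ban",
   "emotion": "fear"}
]'''

# Index the records by comment id for direct lookup.
records = {r["id"]: r for r in json.loads(raw)}

# Fetch the coding for the comment inspected on this page.
coding = records["ytc_Ugz3LgjYVp8-eKM6lDF4AaABAg"]
print(coding["policy"], coding["emotion"])  # → ban fear
```

Indexing by id rather than scanning the list makes it straightforward to join the LLM's codings back onto the original comment table, and a missing id surfaces as a `KeyError` instead of silently coding nothing.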