Raw LLM Responses

Inspect the exact model output for each coded comment.

Comment
👈 (Feb 2025) Smh 🤦‍♂️ wild that word hasn’t traveled much. I was invited to be a guest speaker at the 2026 World Venture Capital Summit in March….. if anyone is bored and want decides to also go to Singapore Haha. Total jobless imminent? Why generate that crap? That is so far from reality… 🤦‍♂️ All anybody is doing right now is guessing, when nobody really knows. Y’all are just damn stirring up fear to generate viewership. Here’s what third party told me. The day I go live with my AGi agent is the the day AGi kills the whole Ai industry. But AGi agent is under control as far as I can tell. And with the way my brain works, I’m pretty sure the multi layered control will work just fine. I get why you ask the Ai industry leaders, but really, how much sense does that make when they don’t even know how to make it, or how it will work etc? Not only am I in assembly of my AGi agent but I have a production ready basic framework for an entirely different architecture and method of processing. So I’m putting out 2 completely unique architectures. Two completely different AGi model in in architecture and processing and computing. Follow the pages and join the communities on LinkedIn and YT where I’ll be continuing interactions with everyone and answering questions etc. the communities are to educate, shared ideas, work together, help each other, and respect of everyone. No datasets and no tokenization used.
youtube AI Jobs 2026-01-07T05:1…
Coding Result
Dimension        Value
Responsibility   none
Reasoning        unclear
Policy           none
Emotion          mixed
Coded at         2026-04-26T23:09:12.988011
Raw LLM Response
[
  {"id": "ytc_UgwvVcqT9iA5O3zpIGJ4AaABAg", "responsibility": "none", "reasoning": "mixed", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgwcIusds_roju8Gim14AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_UgxMPz57mXqhbIsagyR4AaABAg", "responsibility": "government", "reasoning": "deontological", "policy": "ban", "emotion": "fear"},
  {"id": "ytc_UgwdOD5-rrv2-TcYh-N4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_UgwVrKEp1MCE7oVEltl4AaABAg", "responsibility": "developer", "reasoning": "virtue", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_UgxnW1q2i6dYdL_38dd4AaABAg", "responsibility": "government", "reasoning": "mixed", "policy": "liability", "emotion": "outrage"},
  {"id": "ytc_UgxL91fbai5LCuAEWVh4AaABAg", "responsibility": "ai_itself", "reasoning": "deontological", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_UgztMPBChla1ARbHfYd4AaABAg", "responsibility": "company", "reasoning": "virtue", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_Ugw38CMVQOquzZXaKO14AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_UgysggqHo77mgtiDH0t4AaABAg", "responsibility": "government", "reasoning": "deontological", "policy": "liability", "emotion": "outrage"}
]
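As a minimal sketch of how such a raw response could be parsed and checked before use: the function below (hypothetical, not part of any confirmed pipeline) loads the JSON array and keeps only records whose id has the `ytc_` prefix and whose dimension values fall in the code sets inferred from the values visible above. The real codebook may contain additional codes; the `ALLOWED` sets are an assumption.

```python
import json

# Allowed values per dimension, inferred from the codes seen in this
# response; the actual codebook may differ (assumption).
ALLOWED = {
    "responsibility": {"none", "company", "government", "developer", "ai_itself"},
    "reasoning": {"unclear", "mixed", "consequentialist", "deontological", "virtue"},
    "policy": {"none", "regulate", "ban", "liability"},
    "emotion": {"indifference", "fear", "resignation", "outrage", "mixed"},
}

def parse_raw_response(raw: str) -> list[dict]:
    """Parse a raw LLM response (a JSON array of coded comments),
    dropping records with a malformed id or an out-of-codebook value."""
    records = json.loads(raw)
    valid = []
    for rec in records:
        # Comment ids in this dataset appear to use a "ytc_" prefix.
        if not str(rec.get("id", "")).startswith("ytc_"):
            continue
        # Every coded dimension must use a known code.
        if all(rec.get(dim) in codes for dim, codes in ALLOWED.items()):
            valid.append(rec)
    return valid

if __name__ == "__main__":
    sample = ('[{"id":"ytc_example","responsibility":"none",'
              '"reasoning":"unclear","policy":"none","emotion":"mixed"}]')
    print(parse_raw_response(sample))
```

Validating against a fixed code set at ingestion time catches the common failure mode where the model invents a value outside the codebook, which would otherwise surface later as an unexplained category in the analysis.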