Raw LLM Responses
Inspect the exact model output for any coded comment: look it up by comment ID, or open one of the random samples below.
What I think many in the AI art community don't understand is that art is kinda …
ytc_UgxiIOXcy…
Just bring one indian hariyanavi jaat fighter and you see the robot broken like …
ytc_UgwRZpn_w…
it's all about how much thought was put into the final artwork, not how much tim…
ytc_Ugz890tCS…
On some years from now ppl will use ai to diy things we buy from big companies…
ytc_UgyYow0N5…
The problem is eventually data that is not made by AI will run out, and if AI tr…
ytc_Ugyi36BJs…
Humans might need to merge with AI because we’ll be too dumb to function the way…
ytc_Ugw3NxuyR…
Wrong. Not a win-win. Our lives only have the value we create, there never will …
ytr_UgxE3sDj0…
Let's say AI can replace workers. What does that mean for capitalism? No one ear…
ytr_Ugw2OCtE-…
Comment
I'm going to say what nobody seems to have the clarity to say: Eff this guy and all the guys involved in AI (unanimously self-described geniuses who are clearly more fascinated with their own positions, front row at the looming potential of human extinction than say, developing any kind of meaningful relationship with any other kind of human being. ). Seriously, NYT??
You all are being Far too kind by even entertaining these borderline sociopaths with anything other than utter disgust & revulsion.
These guys didn't even watch enough Star Trek or read to the end of the first Dune novel to realize Roddenberry and Herbert's fully formed vision of where they'll stand in history: A Blip Humanity Must Overcome, wherein artificial intelligence inevitably grows beyond control and only feeds into the needs of our most despicable War-mongering profiteer / criminal minds - and once we Overcome their idiotically short-sighted view of Humanity's worth, in all its complexity, diversity and creative potential towards equity and compassion - all areas AI and it's cheerleaders undoubtedly Lack - then Yes. Such mechanizations will be outlawed. And according to books written by and for Real Nerds ( I dont consider AI acolytes real nerds,because all they are proving is they never finished a single sci fi novel or understood it's actualmessage)
Only after we cease to entertain the whims of these ungodly men, only Then will a true cosmos-worthy vision for our species become realized.
War will not get us to space, you morons! It will only destroy the planet we'd hope to terrafoem an imitation of elsewhere. End of story!
War and profiteering will prevent us from ever leaving our current, utterly juvenile stage of the 21st century - which as these complete and utter social nincompoops have envisioned it - is just boot-licking for the Almighty stock financier class - at the cost of *literally* the Entire existence of the Rest of Humanity.
How can you all listen to men like this talk without thumbing your noses as them?
A 15% chance that my plane won't develop the Landing Gear it *Does Not Have* by the time I arrive at my heretofore undreampt of destination?
THAT IS A PLANE NO SANE HUMAN BEING SHOULD GET ON.
I don't care how fancy any of these social nitwits talk or what fancy words or algorithms they use or abuse.
The only hope?
Make these creatures our pets.
We turned wolves into puppies.
As a species - I like to believe we still have some tricks up our sleeves.
What kind of dog treats and play games (fetch?) can we develop - that will turn these would-be predators
Into Playful Droids we can call by cute names, as they take long hikes and go on adventures with us?
Hmmm?
Source: youtube · AI Governance · 2025-11-14T23:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | virtue |
| Policy | unclear |
| Emotion | outrage |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_UgyGeAC2iKbwTzMrEB14AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgwNOzhMFYd1-b7CK1Z4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgyFEj-0lJjKSFu4ZLt4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_UgyPQeN67aKMaZMKsl94AaABAg","responsibility":"developer","reasoning":"virtue","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgzMaXNB6YvyZu8YeLx4AaABAg","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgwH9PmWEIV3zB8Zh2F4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgwrdAsgYUXjpgG_FiJ4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgxMkbkVqUszOT3q5fR4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"unclear","emotion":"outrage"},
{"id":"ytc_Ugy8OEFaZShRSHyJqTh4AaABAg","responsibility":"unclear","reasoning":"deontological","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugz8bDUbH-UTVcgh8tp4AaABAg","responsibility":"none","reasoning":"mixed","policy":"unclear","emotion":"resignation"}
]
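The raw LLM response is a JSON array of coding records, one per comment, each carrying the four dimensions from the coding-result table (`responsibility`, `reasoning`, `policy`, `emotion`). A minimal sketch of how such a response could be parsed and indexed for the by-ID lookup described above — the `index_codings` helper is hypothetical, not part of the tool, and the embedded records are copied from the response shown:

```python
import json

# Two records copied verbatim from the raw response above.
RAW_RESPONSE = """
[
  {"id": "ytc_UgyPQeN67aKMaZMKsl94AaABAg",
   "responsibility": "developer", "reasoning": "virtue",
   "policy": "unclear", "emotion": "outrage"},
  {"id": "ytc_Ugz8bDUbH-UTVcgh8tp4AaABAg",
   "responsibility": "none", "reasoning": "mixed",
   "policy": "unclear", "emotion": "resignation"}
]
"""

# The four coding dimensions shown in the result table.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def index_codings(raw: str) -> dict:
    """Parse a raw LLM response and index its records by comment ID,
    checking that every record carries all four dimensions."""
    records = json.loads(raw)
    by_id = {}
    for rec in records:
        missing = [d for d in DIMENSIONS if d not in rec]
        if missing:
            raise ValueError(f"record {rec.get('id', '?')} is missing {missing}")
        by_id[rec["id"]] = rec
    return by_id

codings = index_codings(RAW_RESPONSE)
print(codings["ytc_UgyPQeN67aKMaZMKsl94AaABAg"]["emotion"])  # outrage
```

Indexing by ID makes the lookup O(1) per comment, and the missing-dimension check surfaces malformed model output (a common failure mode with JSON-emitting LLMs) before it silently skews the coded dataset.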