Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment
Folks, I'm not just another guy screaming that the AI Sky is falling. I was studying Marvin Minsky's work on AI 35 years ago and have followed AI longer than that. Now I want you all to understand one unbelievably simple reason why self-driving in traffic will never, ever, _EVER_ work. They say, Never say never, but I'm saying it now, spinning on top of a mountain like Julie Andrews in _The Sound of Music: _*_*N-E-V-E-R ! ! !*_*
Why? Because System_A, no matter how "smart" it is, cannot co-operate (note the hyphen) with System_B, System_C, System_D ... System_N which, if not co-operating, will eventually operate outside the boundaries presumed by System_A. Now add in the fact that System_B thru System_N all heavily use Fuzzy Logic -- sometimes incorrectly -- and you have all the ingredients of System_A damaging or destroying System_B thru System_N -- _AND_ System_A.
As you've no doubt figured out, System_A is a so-called "self-driving" car and System_B thru System_N are cars piloted by humans.
No matter how complex and well-trained an AI system is, it cannot EVER safely co-operate with systems which neither know nor care that it exists, and that don't follow a presumed complex set of operating rules, either willfully or mistakenly, and cannot be marshaled to do so.
Capitalism is a wager: will the portion of this company's debt that I have just purchased through its stock offering eventually meet or exceed the value I expect to realize from it when I sell it in future. It doesn't even have to be an actual product or service; it can just be a dream about a concept that dupes people into believing in the same impossible dream.
If you can convince a potential investor that unicorns sh1t cupcakes, that you know where to find unicorns, that you want to open a cupcake chain, and that the potential investor will make a million bucks by investing in it, then you have just made a Capitalism, my friend!
But unicorns don't sh1t cupcakes, do they? Well, no one will ever know because unicorns don't exist. But neither do so-called "self-driving" cars.
youtube
2025-12-05T14:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | deontological |
| Policy | ban |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
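A coded row like the one above can be modelled as a small record and validated against the category sets observed in this sample. This is a minimal sketch: the `CodedComment` class and the allowed-value sets are assumptions drawn only from the values visible on this page, and the actual codebook may define more categories.

```python
from dataclasses import dataclass

# Category values observed in this sample; the real codebook may be larger.
RESPONSIBILITY = {"none", "developer", "company", "ai_itself"}
REASONING = {"unclear", "consequentialist", "deontological", "virtue"}
POLICY = {"unclear", "none", "ban", "regulate", "liability"}
EMOTION = {"indifference", "approval", "fear", "outrage"}


@dataclass
class CodedComment:
    """One coded comment: four dimensions keyed by comment ID."""
    comment_id: str
    responsibility: str
    reasoning: str
    policy: str
    emotion: str

    def validate(self) -> bool:
        """Check each dimension against its allowed category set."""
        return (self.responsibility in RESPONSIBILITY
                and self.reasoning in REASONING
                and self.policy in POLICY
                and self.emotion in EMOTION)


# The coding shown in the table above, as a record.
row = CodedComment("ytc_UgyuUTz-3FPvAKlJjoV4AaABAg",
                   "ai_itself", "deontological", "ban", "fear")
print(row.validate())  # -> True
```

A `validate()` pass like this is a cheap guard against the model inventing off-schema labels before the codes are stored.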
Raw LLM Response
```json
[
{"id":"ytc_UgxisDonWR5EfGguGal4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgxPNzGFuvvq_pSZMFV4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgyWoDbKU3UvaLuDIJ14AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgyuUTz-3FPvAKlJjoV4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"ban","emotion":"fear"},
{"id":"ytc_UgzNglMXN-55zUB8RMV4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"fear"},
{"id":"ytc_UgwpaD_hseQeY5pxWVt4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgzMzxT-JCgylWRc0tV4AaABAg","responsibility":"company","reasoning":"virtue","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgwBVEDGJbhNRqwrLmF4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgzY0a11ebebLyCsLmR4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgzxUQoKQtf5O8AjhU54AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"}
]
```
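The raw response is a JSON array with one object per coded comment, so looking up a code by comment ID is a parse-and-index step. A minimal sketch, assuming the exact shape shown above (the two records and the `codes_by_id` helper are illustrative, not part of the tool):

```python
import json

# Two records copied from the raw response above, in the same shape.
raw_response = """
[
 {"id": "ytc_UgyuUTz-3FPvAKlJjoV4AaABAg", "responsibility": "ai_itself",
  "reasoning": "deontological", "policy": "ban", "emotion": "fear"},
 {"id": "ytc_UgzNglMXN-55zUB8RMV4AaABAg", "responsibility": "developer",
  "reasoning": "deontological", "policy": "liability", "emotion": "fear"}
]
"""


def codes_by_id(response_text: str) -> dict[str, dict]:
    """Parse the array and index it by comment ID for O(1) lookup."""
    records = json.loads(response_text)
    return {record["id"]: record for record in records}


codes = codes_by_id(raw_response)
print(codes["ytc_UgyuUTz-3FPvAKlJjoV4AaABAg"]["policy"])  # -> ban
```

Indexing by ID is what makes the "look up by comment ID" view possible: each displayed comment maps straight to its row in the model's response.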