Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples:

- I disagree with Elon Musk’s seatbelt argument. Okay, fine. You want to force au… (ytc_Ugwn1p--f…)
- Imo AI vs Art by actual artist is still vastly different, AI art is at a point w… (ytc_Ugx1r1zEa…)
- If all you can afford is chatGPT therapy, please do it. Its good, I know. And if… (ytc_UgwDIHDB2…)
- Why do you morons think LIDAR is bad? More sensors would be better in every inst… (ytc_UgxY9S-U3…)
- "you wouldn't be as critical if I hadn't used AI" bish what are you smoking… (ytc_UgzYUG7Eb…)
- @RawrxDev I don't think I every claimed that a machine learning algorithm was ma… (ytr_UgzlH7xf5…)
- so what happens if the u.s. stops these data centers and lets other countries de… (ytc_UgyaUxJ_i…)
- All books are written by man, too. A crazy person killed people with a gun, so … (ytc_UgxHTnbgU…)
Comment
You know, they literally made a movie series that covers why AI use is a bad idea and why integrating AI into sensitive military faculties is a worse idea. They literally made movies about why this is a bad idea.
THEY. MADE. MOVIES.
As flawed as human beings have proven themselves to be throughout time, they MUST be a component of sensitive faculties such as the military. Humans have the ability to reason. To question things. To be suspect or suspicious. To be wary. They possess the ability to second guess and work on hunches. AI doesn't possess rationale like that. AI is arbitrary. Humans are not. It's actually their human flaws that make them invaluable and intrinsic. Integrating AI into critical faculties is a mistake and a genie we wouldn't be able to put back in the bottle.
I can only hope that mankind is savvy enough to know what's at stake if they do integrate AI into critical infrastructure and to avoid doing that at all costs.
Source: youtube · Topic: AI Governance · Posted: 2023-07-08T20:1… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | deontological |
| Policy | regulate |
| Emotion | outrage |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id": "ytc_UgzYuuuQ8NZT3X4BJRp4AaABAg", "responsibility": "none", "reasoning": "mixed", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_Ugx-DXe9XgKmMQFuu8J4AaABAg", "responsibility": "developer", "reasoning": "consequentialist", "policy": "ban", "emotion": "fear"},
  {"id": "ytc_Ugwmo9f-qny_u60Bxrd4AaABAg", "responsibility": "ai_itself", "reasoning": "mixed", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_Ugw5eTYiGWCuQQY-6PV4AaABAg", "responsibility": "user", "reasoning": "consequentialist", "policy": "liability", "emotion": "outrage"},
  {"id": "ytc_Ugyg8k4_XG_-ZMv00Ct4AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_Ugxv7-aia9V5tCK0_9F4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "ban", "emotion": "fear"},
  {"id": "ytc_Ugxe8Iu0bmdbUnqBZ6R4AaABAg", "responsibility": "ai_itself", "reasoning": "deontological", "policy": "ban", "emotion": "outrage"},
  {"id": "ytc_UgwAonnojDfa8MveD2J4AaABAg", "responsibility": "ai_itself", "reasoning": "unclear", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_UgzcMdOTRvYiaG1hRVt4AaABAg", "responsibility": "developer", "reasoning": "deontological", "policy": "regulate", "emotion": "outrage"},
  {"id": "ytc_Ugz0GDYhZBYt42L7jXR4AaABAg", "responsibility": "government", "reasoning": "mixed", "policy": "none", "emotion": "indifference"}
]
```
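
The look-up-by-ID workflow above can be sketched as a small parser: take a raw batch response shaped like the JSON array shown (a list of objects with `id`, `responsibility`, `reasoning`, `policy`, and `emotion` keys), validate each row, and index the codings by comment ID. This is a minimal illustration, not the tool's actual implementation; the function name and the inline `raw_response` sample are invented for the example.

```python
import json

# Illustrative raw batch response, shaped like the JSON array above.
# The two rows here are sample data, not real tool output.
raw_response = """
[
  {"id": "ytc_UgzcMdOTRvYiaG1hRVt4AaABAg", "responsibility": "developer",
   "reasoning": "deontological", "policy": "regulate", "emotion": "outrage"},
  {"id": "ytc_Ugx-DXe9XgKmMQFuu8J4AaABAg", "responsibility": "developer",
   "reasoning": "consequentialist", "policy": "ban", "emotion": "fear"}
]
"""

# The five keys every coded row is expected to carry.
EXPECTED_KEYS = {"id", "responsibility", "reasoning", "policy", "emotion"}


def parse_codings(raw):
    """Parse a raw batch response and index the codings by comment ID."""
    rows = json.loads(raw)
    by_id = {}
    for row in rows:
        missing = EXPECTED_KEYS - row.keys()
        if missing:
            # A malformed row is easier to debug if we fail loudly here.
            raise ValueError(f"coding {row.get('id')!r} missing keys: {missing}")
        by_id[row["id"]] = {k: v for k, v in row.items() if k != "id"}
    return by_id


codings = parse_codings(raw_response)
print(codings["ytc_UgzcMdOTRvYiaG1hRVt4AaABAg"]["policy"])  # regulate
```

Indexing by ID up front makes the "inspect the exact model output for any coded comment" view a constant-time dictionary lookup instead of a scan over the whole batch.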