Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "Not anytime soon... but maybe sooner than you'd think too. There are already mul…" (ytc_UgwqKfixJ…)
- "@Therealmalamarhonestly AI is a great tool and only a tool, asking it to move h…" (ytr_Ugyixex6Q…)
- "Yeah this is complete nonsense and he knows nothing about how neural networks wo…" (ytc_UgzDN340C…)
- "Without fresh water 💦, life (humans, animals, plants) will die. So these idiots …" (ytc_UgxRqHQuS…)
- "@DWS205 “well it’s clear from your question you don’t even understand algorithms…" (ytr_UgxkqCUJb…)
- "Why would a robot fear death if it knew it was able to live again by simply bein…" (ytc_UgxyOHFNZ…)
- "AI is bigger than the industrial revolution. It's comparable to letting the nucl…" (ytc_UgxeVjx5q…)
- "Its hard to find a human responsible for an autonomous thing. Is it the robot p…" (rdc_cqikxcw)
Comment

> If a computer is only as smart as the person programming it, or its basically a key word algorithm and is simply data harvesting and then just suggesting the most commonly used answers.
> There must have been research done, with control groups, a computer programmed buy a bunch of lunatics isn't going to be as good as one programmed by a bunch of PHD scientists for example. You simply don't put a paranoid schizophrenic in charge of a nuclear weapon. How does AI differentiate between sanity and insanity, does it know it's artificial. Can you teach an algorithm to impersonate, convey empathy, by definition it's a mindless psychotic lunatic. How can a non biological electronic entity convey empathy, if a psychotic person can't.

| Field | Value |
|---|---|
| Platform | youtube |
| Topic | AI Governance |
| Posted | 2025-09-04T01:0… |
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | unclear |
| Reasoning | unclear |
| Policy | unclear |
| Emotion | unclear |
| Coded at | 2026-04-27T06:26:44.938723 |
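For readers reproducing this pipeline, here is a minimal sketch of the record type behind the table above, assuming each dimension is a flat string label. The class and field names are illustrative, not taken from the actual tool, and the vocabularies are inferred from the raw responses shown on this page, so they may be incomplete.

```python
from dataclasses import dataclass

# Label vocabularies inferred from the raw responses on this page;
# treat them as a lower bound, not an exhaustive codebook.
RESPONSIBILITY = {"developer", "none", "unclear"}
REASONING = {"consequentialist", "deontological", "unclear"}
POLICY = {"regulate", "liability", "industry_self", "none", "unclear"}
EMOTION = {"approval", "indifference", "fear", "mixed", "resignation", "unclear"}

@dataclass
class CodingResult:
    """One coded comment; defaults mirror the 'unclear' fallback above."""
    comment_id: str
    responsibility: str = "unclear"
    reasoning: str = "unclear"
    policy: str = "unclear"
    emotion: str = "unclear"

    def is_valid(self) -> bool:
        """True if every dimension uses a known label."""
        return (self.responsibility in RESPONSIBILITY
                and self.reasoning in REASONING
                and self.policy in POLICY
                and self.emotion in EMOTION)
```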
Raw LLM Response
[{"id":"ytc_Ugw2T1KgKVj-1-U2-pZ4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
{"id":"ytc_Ugw9WxnNB4hY3MO3lCx4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"indifference"},
{"id":"ytc_UgwY-hHMICCvaJtX-8d4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"fear"},
{"id":"ytc_UgxslmepTP1tI3g4E6R4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyqmbNVnu1AgOA_zNh4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"},
{"id":"ytc_UgxwHjYgBXvclUNDt2p4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_UgzA_y-qwq40xdjJfbp4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgxoGQXhndawsdUIFbt4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgyGRWEjUPQ4Chx-n1d4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"fear"},
{"id":"ytc_UgxAUoS_rdHx6b5RqKl4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"fear"]}