Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- Putting automated cars on public roads is irresponsable and criminal, IMO. No r… (ytc_Ugw9XV88e…)
- when you hear "used AI to program a robot" - you know that it some kind of BS, b… (ytc_Ugw4kMUda…)
- Anyone who doesn't have a high enough IQ to understand the solutions to these si… (ytc_UgxSVLHUY…)
- Musk has no moral compass,'wow 'what a statement . What did musk do to give him … (ytc_UgyhD2jQH…)
- So much cynicism in the comments but at least some millionaires are trying to do… (rdc_espw9o4)
- @Bradley_UA In the simplest terms, “system prompt”. The simplest example “You ar… (ytr_Ugy8PDQoG…)
- I’m 39 and can’t believe there was once a time when I was shocked by the iPod. I… (ytc_Ugwgokjeb…)
- Unfortunately corporations notoriously plead good grace and guidance as they lau… (ytc_Ugy2UFYtp…)
Comment
Compared to most videos on this channel, this one is very close to truth.
I've seen it coming for years, but noone took it seriously.
Now it is too late.
Unless a miracle happens, humanity is doomed.
We probably have time till 2030.
At this point it would take a massive revolution all around the world with international cooperation of good-willed people in order to pull it off and save ourselves.
We probably lack a couple hundreds (thousands?) years of evolution as a society.
Too many selfish people exist and this will be the end of us all.
Albeit, there is a slim chance that AI will be a "good god" for humans, but I wouldn't count on it.
Enjoy life, while it lasts.
Source: youtube | Topic: AI Governance | Posted: 2023-07-10T14:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_UgxsYdVS1DhXPoKbud14AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgyjYP4ZZqIezW9tg_F4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgyTSNKFIzU7S-t1rqt4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwONmUadVrBFoNiSGt4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_Ugw_WxR9Q9raRKyh7xF4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgyXHu4aCt1HgyBk4Zd4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugxz8NhGDz6DAX9F2ql4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"fear"},
  {"id":"ytc_Ugx6NJYaE5ZEd04UFe14AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"industry_self","emotion":"outrage"},
  {"id":"ytc_UgxUn8UIninO6nwf1Nl4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
  {"id":"ytc_Ugzwoc41Be5Y9OlpWLZ4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"mixed"}
]
```
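The lookup-by-comment-ID workflow above can be sketched as a small parsing step: the model returns one JSON array per batch, and indexing it by the `id` field makes any single comment's coding retrievable. This is a minimal sketch, not the tool's actual implementation; the field names (`id`, `responsibility`, `reasoning`, `policy`, `emotion`) come from the raw response shown, while the function name `index_by_comment_id` is a hypothetical helper.

```python
import json

# Two records copied verbatim from the raw LLM response above,
# trimmed to keep the example short.
raw_response = """
[
  {"id": "ytc_UgxsYdVS1DhXPoKbud14AaABAg", "responsibility": "ai_itself",
   "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytc_UgyXHu4aCt1HgyBk4Zd4AaABAg", "responsibility": "distributed",
   "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"}
]
"""

def index_by_comment_id(response_text: str) -> dict:
    """Parse a batch coding response and key each record by its comment ID."""
    records = json.loads(response_text)
    return {rec["id"]: rec for rec in records}

coded = index_by_comment_id(raw_response)
# Look up the coding for one comment by its ID.
print(coded["ytc_UgyXHu4aCt1HgyBk4Zd4AaABAg"]["policy"])  # regulate
```

In practice a step like this would also validate that each record's values fall within the coding scheme (e.g. `policy` in {"none", "regulate", "industry_self", "unclear"}), since raw model output is not guaranteed to be well-formed.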