Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples:

- `ytc_UgypmoZA1…`: "What's the point of intelligence in humans if AI can just do the effort? Our int…"
- `ytc_UgwdbM_P-…`: "This is the whole idea behind human creativity... It is to create solutions to p…"
- `ytr_UgynUiCZx…`: "It is. Not in the sense that there is a generalist AI currently watching and con…"
- `rdc_dcwnif8`: "That doesn't mean it's not a horrible policy. All available evidence we have has…"
- `ytr_UgwN-qYvn…`: "Asimov did not in fact know how to build an AI. He wrote stories about robots an…"
- `ytr_UgzpaBrL_…`: "It’s already doing it and in the operating room. Not necessarily like a robot. I…"
- `ytc_UgwTy4n1Q…`: "Can I please just die with same biology I was born with, (no chips, no nano-tech…"
- `ytc_UgxhLYT-U…`: "I saw a thing where a meeting of CEOs were celebrating being able to steal copyr…"
Comment
> I miss ALL the facts at the end of the video. Point be there is money to be made in AI doomsday scare, stifling competition by scaring away the funders etc etc. This included Ydkowsky and the owner of chatGPT, because guess what, with a moratorium on machine learning, other companies developing the code can be eliminated. ChatGPT is actually not smart and you can train it to give you the answers you want. It is prone however to making up stuff citing fake references, because it is incapable of accessing papers. It is not sentient. Anyone, you can trace throughout out modern history all the examples of moral panic surrounding new inventions. We are not on the brink of extinction yet. I’d much rather be worried about the brain disease called tik tok.
Platform: youtube · Topic: AI Governance · Posted: 2023-07-07T06:5… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | liability |
| Emotion | outrage |
| Coded at | 2026-04-26T23:09:12.988011 |
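Each coded record assigns one value per dimension, so a coding can be sanity-checked against a controlled vocabulary before it is stored. A minimal sketch in Python; the vocabularies below are inferred only from the values visible on this page (the actual codebook may define more categories), and `invalid_fields` is a hypothetical helper, not part of any real pipeline:

```python
# Controlled vocabularies inferred from values visible on this page;
# the real codebook may include additional categories.
SCHEMA = {
    "responsibility": {"company", "ai_itself", "distributed", "none", "unclear"},
    "reasoning": {"consequentialist", "deontological", "mixed", "unclear"},
    "policy": {"liability", "ban", "none", "unclear"},
    "emotion": {"outrage", "fear", "mixed", "resignation", "indifference"},
}

def invalid_fields(record: dict) -> list:
    """Return the dimensions whose value falls outside the known vocabulary."""
    return [dim for dim, allowed in SCHEMA.items()
            if record.get(dim) not in allowed]

# The coding shown in the table above passes the check.
record = {"responsibility": "company", "reasoning": "consequentialist",
          "policy": "liability", "emotion": "outrage"}
print(invalid_fields(record))  # [] -> all four dimensions are valid
```

A record missing a dimension, or carrying an unexpected value, shows up in the returned list, which makes malformed LLM output easy to flag in bulk.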
Raw LLM Response
```json
[
{"id":"ytc_UgxbeBHfy2-MZrq4nnJ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgwP_m4ojqtYaDosLmF4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugy9TIa-COms-4OCwzF4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"resignation"},
{"id":"ytc_UgzVkJXwmSI74W8deYd4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgwVgymMdbWhpJTDlph4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgzGAOsQADSQDbmcjfB4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwMIoyPAPmNNhXPBmh4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugyhd_jqWhuY0y3oNQB4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_UgyAkKVHsdSNwk-aMw14AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyTgX0Y6GJAdnCY0dZ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"mixed"}
]
```
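Because the raw response is a JSON array keyed by comment ID, inspecting the coding for a specific comment is a matter of parsing the array and indexing it by `id`. A minimal sketch in Python, with `raw_response` abridged to two records copied from the output above:

```python
import json

# Two records copied verbatim from the raw LLM response above (abridged).
raw_response = """
[
 {"id":"ytc_UgwVgymMdbWhpJTDlph4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
 {"id":"ytc_UgzGAOsQADSQDbmcjfB4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"}
]
"""

# Index the coded records by comment ID for constant-time lookup.
codings = {rec["id"]: rec for rec in json.loads(raw_response)}

coding = codings["ytc_UgwVgymMdbWhpJTDlph4AaABAg"]
print(coding["policy"], coding["emotion"])  # liability outrage
```

The same dictionary supports the "look up by comment ID" workflow: any ID from the samples list resolves directly to its coded dimensions, provided it appears in the parsed batch.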