Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
Hearing Bret debate about AI becoming its own 'species' really makes you think a…
ytc_Ugyy8kees…
This video made me donate to charity so good job IDK about the AI but you succes…
ytc_UgxhU96e8…
UBI is a huge and important part of the solution. Although, it could be better t…
ytc_Ugz8wrcda…
Architects barely starts a career already taken away by AI hahahhah . Tjats why …
ytc_UgwrLRGVs…
... this is just homeschooling. Minus the AI. You can get all of your stuff done…
ytc_UgyZ1_eQf…
In this age of obsolescence, the disadvantages of AI outweigh. This is a good sa…
ytc_UgxYaqen3…
I’d argue a banana taped to a wall is more thought-provoking and interesting tha…
ytc_UgxZWvBrm…
You mean the thing that's already happening, there's new AI that can take a stil…
rdc_kcj8ktn
Comment
"Our mission is to ensure that artificial general intelligence -- AI systems that are generally smarter than humans -- benefits all of humanity." -- Sam Altman and OpenAI
"They think they are positioned to decide what 'benefits all of humanity'." -- Critic
No, that's not what he said, at all. He said the mission is to *ensure* that it benefits all of humanity. He didn't say that the people in his company alone are going to decide what benefits all of humanity. He didn't specify how that would be figured out, because, as these "Utopic tech bros" have admitted before, they don't know that yet, but it will likely be a multidisciplinary process.
youtube
AI Responsibility
2023-05-15T05:2…
♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | mixed |
| Policy | unclear |
| Emotion | mixed |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
[
{"id":"ytc_UgwljBWmxzgxcaXOQ9B4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgwZYhJdAKAJtYh6C-14AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgxdB0d6ADB3PyXXAop4AaABAg","responsibility":"developer","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgzCLSZbg6utOqRsat14AaABAg","responsibility":"distributed","reasoning":"unclear","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugw_oPrcYN7A9J6nXtV4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugwhq586lNWboAYmuf94AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugxc2Rep3p7Qc3XvxYN4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugw55r1btDusdw_xqJR4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgwqNidnglB6R1fVhu94AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"resignation"},
{"id":"ytc_UgyfL2kFDuFf2dnHpUp4AaABAg","responsibility":"industry","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"}
]
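The "Look up by comment ID" step above amounts to parsing this JSON array and indexing it by `id`. A minimal sketch in Python — the variable names are illustrative and the array is a two-entry excerpt of the raw response shown above, not the app's actual code:

```python
import json

# Excerpt of the raw LLM response: a JSON array of per-comment codes.
raw_response = """
[
  {"id": "ytc_UgxdB0d6ADB3PyXXAop4AaABAg", "responsibility": "developer",
   "reasoning": "mixed", "policy": "unclear", "emotion": "mixed"},
  {"id": "ytc_Ugwhq586lNWboAYmuf94AaABAg", "responsibility": "developer",
   "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"}
]
"""

# Index the batch by comment ID so a single comment's codes can be inspected.
codes_by_id = {row["id"]: row for row in json.loads(raw_response)}

record = codes_by_id["ytc_UgxdB0d6ADB3PyXXAop4AaABAg"]
print(record["responsibility"])  # -> developer
```

The first entry matches the "Coding Result" table above (responsibility: developer, reasoning: mixed, policy: unclear, emotion: mixed), which is how a coded table row can be traced back to the exact model output.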