Raw LLM Responses

Inspect the exact model output behind any coded comment: look one up directly by its comment ID, or pick from the random samples below.
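For scripted access rather than the web UI, a lookup might look like the sketch below. This is a minimal sketch, assuming raw responses are kept in a local SQLite table `raw_responses(comment_id, response_json, coded_at)`; the actual storage backend behind this page is not shown, so the table and column names are hypothetical.

```python
import sqlite3


def lookup_raw_response(db_path: str, comment_id: str) -> str | None:
    """Fetch the stored raw LLM response for one coded comment.

    Assumes a table raw_responses(comment_id TEXT PRIMARY KEY,
    response_json TEXT, coded_at TEXT); these names are hypothetical,
    as the real schema is not shown on this page.
    """
    con = sqlite3.connect(db_path)
    try:
        row = con.execute(
            "SELECT response_json FROM raw_responses WHERE comment_id = ?",
            (comment_id,),
        ).fetchone()
        return row[0] if row else None
    finally:
        con.close()
```

For example, `lookup_raw_response("codings.db", "ytc_Ugwii6JiA6KIRWLtcHZ4AaABAg")` would return the JSON array shown under "Raw LLM Response" below (the database filename is likewise hypothetical).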
Random samples (click one to inspect):

- "I asked chat gpt if AGI is conscious. It told me that AGI is not conscious becau…" (ytc_Ugy900VEl…)
- "the reason that “bad” art like fountain or the banana can work is because there …" (ytc_UgzATfn4T…)
- "What will happen is an increase in media coverage for tragic accidents caused by…" (ytc_UgzGmLXAo…)
- "Already we are debating what is real and what is not and AI has created so much …" (ytc_UgxgXCukl…)
- "I would argue that AI art, can be art, but not in most cases. Most AI “art” we s…" (ytc_UgztEv8qc…)
- "Lately when I mention I'm a software developer there's a 50/50 chance the person…" (ytc_UgzOxPeD3…)
- "Sometimes your right but i need ai to answer sometimes i don't need ai cause i k…" (ytc_UgyZzMBuB…)
- "Don't want to believe what Chatgpt says about black people/ African Americans bu…" (ytc_UgxguoxNE…)
Comment

> Sadly Tucker got a few crucial facts wrong in his summary statement:
> - An artificial intelligence that is smarter than humans in a general way has not yet been created as of April 2023. it might be only a few years away and there is a world wide arms race to be the first
> - Google is very well not ignoring AI safety as Tucker (and Elon) make it sound
> - OpenAI is not Open Source and they are debating whether it would be a sensible idea to opensource their code. Also, it is no longer entirely non-profit, they had to stop being non-profit because required Computational power is so darn expensive.
> Also, it Must be noted that Microsoft has lots of control over OpenAI.

youtube · AI Governance · 2023-04-19T16:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | unclear |
| Policy | unclear |
| Emotion | indifference |
| Coded at | 2026-04-26T23:09:12.988011 |
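The four coding dimensions and their values can be collected into a small validation schema. The sketch below is a hypothetical reconstruction from the values visible in the raw response on this page, not the project's actual codebook, which may define additional values.

```python
# Values observed in this sample only; the full codebook may be larger.
CODEBOOK = {
    "responsibility": {"none", "ai_itself", "developer", "government", "distributed"},
    "reasoning": {"unclear", "consequentialist", "deontological"},
    "policy": {"unclear", "none", "regulate", "ban", "liability"},
    "emotion": {"indifference", "mixed", "fear", "outrage"},
}


def validate(record: dict) -> list[str]:
    """Return problems found in one coded record (empty list if clean)."""
    return [
        f"{dim}={record.get(dim)!r} not in codebook"
        for dim, allowed in CODEBOOK.items()
        if record.get(dim) not in allowed
    ]
```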
Raw LLM Response

```json
[
  {"id":"ytc_Ugwii6JiA6KIRWLtcHZ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_Ugzpuymob-5QzAeP8kZ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_Ugy_aSj1b3dxideeQSR4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgwwF4vh-YmBqoi7krV4AaABAg","responsibility":"government","reasoning":"deontological","policy":"none","emotion":"fear"},
  {"id":"ytc_Ugycb7zkx34bE86s6eZ4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgyFKX6wxxVoBLpStjl4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgxYSMnK-xjKDSr3PDt4AaABAg","responsibility":"government","reasoning":"deontological","policy":"ban","emotion":"outrage"},
  {"id":"ytc_Ugz0Z3SJ7IwEQThOZwx4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgxyBLKrkutQ-dIfo8h4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgwXpjDtrvJ_Yt7L1Y14AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"none","emotion":"fear"}
]
```
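Each batch response is one JSON array with one object per comment. A minimal sketch of how a downstream step might parse and sanity-check such a response follows; the function name and its strictness are my own, since the pipeline's actual parser is not shown here.

```python
import json

DIMENSIONS = {"responsibility", "reasoning", "policy", "emotion"}


def index_by_comment(raw: str) -> dict[str, dict]:
    """Parse a raw batch response and index each record by comment ID.

    Raises if a record is missing its ID or any coding dimension, so
    malformed model output fails loudly instead of being coded silently.
    """
    indexed: dict[str, dict] = {}
    for rec in json.loads(raw):
        cid = rec["id"]
        missing = DIMENSIONS - rec.keys()
        if missing:
            raise ValueError(f"{cid}: missing dimensions {sorted(missing)}")
        indexed[cid] = {dim: rec[dim] for dim in DIMENSIONS}
    return indexed
```

Applied to the response above, this yields ten entries; the record for ytc_Ugwii6JiA6KIRWLtcHZ4AaABAg matches the Coding Result table shown for the selected comment.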