Raw LLM Responses
Inspect the exact model output for any coded comment. Responses can be looked up by comment ID; the samples below were drawn at random.
- "Fun fact. People took stem cells and put them against an Ai at pong. The stem ce…" (ytc_UgwdUydAm…)
- "honestyl but how did they replace construction worker and builder? giant robot t…" (ytc_UgwEyq-hS…)
- "Literally done this for hours on copilot crazy how it always goes in the cycle o…" (ytc_Ugyu-ioPz…)
- "Here's a terrifying scenario for you: As we develop smarter and smarter AI syst…" (ytc_UgxaCe8v2…)
- "The EU should think more about innovation on Ai instead of making regulations th…" (ytc_Ugx9XUTnz…)
- "I’m more worried out insurance carriers won’t cover going to a therapist and wil…" (rdc_jifgvc0)
- "We appreciate your concern about the potential impact of AI on future generation…" (ytr_UgwtF_Ydh…)
- "And I genuinely share your concern about democracy as somebody from Middle East,…" (ytr_UgwIACSOj…)
Comment
> @@SetaroDeglet-Noor Yes. But GPT-4 isn't an existential threat. It is not AGI.
> AGI poses existential threat.
> That's what Bengio and Tegmark are arguing for, not that GPT-4 poses existential threat.
> GPT-4 poses risks, but they are not existential. I think Melanie can't think of existential threats of AI, because she is only considering current AIs, like GPT-4, so let's not do that. We need to consider future AI, AGI, which will indeed be able to do things that we cannot prevent, including things that might go against our goals, if they are misaligned, and in those cases, they could cause our extinction.
> I'm a bit disappointed that they didn't talk about instrumental convergence explicitly, but they just kind of mentioned it vaguely, without focusing on it much. I wish someone like Yudkowsky or Robert Miles could have been there, to provide more concrete technical examples and explanations.
youtube · AI Governance · 2023-06-26T00:3… · ♥ 8
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | mixed |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
{"id":"ytr_Ugx_TYMnuaoDy8oOl2Z4AaABAg.9rP-tW7VcEz9rTDYYTzOwx","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytr_UgzILEF6c80rviWuew14AaABAg.9rOfRNYsrK_9rPr9ZZo3Gx","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytr_UgzvAE9c82jICQGq7754AaABAg.9rOd7tgv5fY9rPIqd6kVCb","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"mixed"},
{"id":"ytr_UgzvAE9c82jICQGq7754AaABAg.9rOd7tgv5fY9rPllc8WblS","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytr_UgwHyJCTsfpVlfgY4id4AaABAg.9rOUtK6eOPP9rOmEEPdeXC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytr_UgwHyJCTsfpVlfgY4id4AaABAg.9rOUtK6eOPP9rPFptFVcrQ","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytr_UgwHyJCTsfpVlfgY4id4AaABAg.9rOUtK6eOPP9rPrGXr5C7N","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytr_Ugw_vDk_1yjEcZa0Su14AaABAg.9rOTUtSVYLV9rOmdhjqwjF","responsibility":"government","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytr_Ugw_vDk_1yjEcZa0Su14AaABAg.9rOTUtSVYLV9sGpoflejiH","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytr_Ugxsrxsaoh_cpFVOt0J4AaABAg.9rOMdAZj3ax9rOhT20qZ81","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"}
]
```
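The raw response is a JSON array with one object per comment ID, carrying the four coded dimensions. A minimal sketch of parsing such a response into a lookup by comment ID is below; the allowed values are inferred only from the samples on this page, not from a full codebook, so treat them as assumptions:

```python
import json

# Value sets inferred from the examples above; the actual codebook may allow more.
ALLOWED = {
    "responsibility": {"none", "company", "government"},
    "reasoning": {"consequentialist", "mixed"},
    "policy": {"none", "regulate"},
    "emotion": {"outrage", "fear", "mixed", "resignation", "approval", "indifference"},
}

def parse_codes(raw: str) -> dict:
    """Parse a raw LLM response into {comment_id: codes}, flagging unexpected values."""
    coded = {}
    for row in json.loads(raw):
        codes = {dim: row[dim] for dim in ALLOWED}
        for dim, value in codes.items():
            if value not in ALLOWED[dim]:
                print(f"unexpected {dim}={value!r} for {row['id']}")
        coded[row["id"]] = codes
    return coded

# Hypothetical one-element response, just to show the shape.
raw = '[{"id":"ytc_example","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"fear"}]'
codes = parse_codes(raw)
print(codes["ytc_example"]["emotion"])  # -> fear
```

Keying by the `id` field is what makes the "look up by comment ID" view above possible: each coded comment maps directly to its dimension values.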