Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect:

- "A woke AI is definitely up there with world ending technology. Teaching AI to be…" (ytc_UgyidywTY…)
- "Personally I don't think robots deserve rights, they are created to make our lif…" (ytc_Ugggw7gD7…)
- "So it's up to the engineers to not make it self-replicating… which engineers, a…" (ytc_UgyHcd-I6…)
- "In the end, 'Happy Birthday' played on a ukulele has more value than an AI-creat…" (ytc_UgxcE7BBI…)
- "'If anyone builds it, everyone dies' - yudokowsky, the founder of the alignment…" (ytc_UgzHLQ93n…)
- "I remember when Black Sabbath and Judas Priest (I believe) were sued over their …" (ytc_Ugz9MOF2K…)
- "That man's got his own house two cars one in the garage one outside and you goin…" (ytc_Ugy99rQjC…)
- "I think we can use ai to our advantage as a source of reference and inspiration…" (ytc_UgyXTY2Wt…)
Comment

> This was absolutely amazing (at least for me). Funnily enough I found the parts where they werent engaging in the "AI will kill everyone topic" even more interesting. And to be honest I felt like Yudkowsky sometimes felt that Wolfram had fair points when asking him what he ment by that. This wasnt a debate imo so there isnt a winner but I kinda sympathysize with Wolfram here overall. Very stimulating talk!

youtube · AI Governance · 2024-11-12T12:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | unclear |
| Reasoning | mixed |
| Policy | unclear |
| Emotion | approval |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
[
{"id":"ytc_Ugwrs34dPhKqwKUNjat4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgwmQr70otrPbIqXgvd4AaABAg","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"approval"},
{"id":"ytc_Ugx3ZZ0fBzZs6ikKE8F4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugw0ULd8JLncDiNnD794AaABAg","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgzKFqMFOBRSRYNZbgJ4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgxPIOza9X46ztg-tHN4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"resignation"},
{"id":"ytc_UgxrKilhY4pmsWdQc5B4AaABAg","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"approval"},
{"id":"ytc_UgyHjEFXpXmACP1JfAZ4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"approval"},
{"id":"ytc_UgweOSrgE_M3vGGTCQB4AaABAg","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugxu0SgXLnEB6z3gy4R4AaABAg","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"indifference"}
]
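The raw response is a JSON array of per-comment codes keyed by `id`. A minimal Python sketch of the look-up-by-comment-ID step described above, assuming the model output is available as a JSON string (the two rows are copied from the response; variable names are illustrative):

```python
import json

# Two rows copied verbatim from the raw LLM response above.
raw_response = """[
  {"id": "ytc_Ugwrs34dPhKqwKUNjat4AaABAg", "responsibility": "ai_itself",
   "reasoning": "consequentialist", "policy": "unclear", "emotion": "fear"},
  {"id": "ytc_UgwmQr70otrPbIqXgvd4AaABAg", "responsibility": "unclear",
   "reasoning": "mixed", "policy": "unclear", "emotion": "approval"}
]"""

# Index the coded rows by comment ID so any coded comment can be
# retrieved in O(1) time.
codes_by_id = {row["id"]: row for row in json.loads(raw_response)}

row = codes_by_id["ytc_Ugwrs34dPhKqwKUNjat4AaABAg"]
print(row["emotion"])  # fear
```

In practice the same dictionary could be built once per batch response and reused for every inspection request, since the IDs are unique within a batch.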