Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- "As someone whos self taught and uses ai a lot, im glad im not this cooked lol…" (ytc_UgyiPmu5O…)
- "check your premise. It's pure speculation that self-driving cars would drop deat…" (ytc_Ugwrs67iZ…)
- "It makes me laugh how he speaks about people working on AI as smart. Funniest t…" (ytc_Ugx0HMpxU…)
- "All AI images, videos and voices must be clearly marked as AI and if not the cre…" (ytc_UgxGTCJ1i…)
- "AI art is so easy to make, (obviously), the AI does the art for you, no effort …" (ytc_UgwGjHfPC…)
- "Still looks very fake. They have a lot of work to do to get the eye blink to loo…" (ytc_Ugwog8avq…)
- "These Ai bros are just lazy, and don't even try to pick up a pencil…" (ytc_UgzGq7Q3S…)
- "He said to learn resilience cause life will be harsh, even in an advanced world.…" (ytc_UgxK-Vvj0…)
Comment
@Wolram: you seem to only want to accept the idea of extincition of humanity once you can see how the AI systems would ever set out to do that. It's almost like you feel the need to understand how it would happen and if you can't see the path, then the path may not exist and you are not worried as much. However, if you take a step back and consider that AI will be developed until it can do anything a human can do, and even better, then it is only logical that its (sub)goals could be as quirky as some goals that humans have. And then, if you consider these AI are eventually vastly more intelligent you can deduce that some of them will be extremely dangerous to humans. You don't have to understand the technical details beforehand to understand the risk.
Source: youtube · AI Governance · 2024-12-01T20:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
{"id":"ytc_UgwhetJyaa7zaqwDeDN4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"unclear"},
{"id":"ytc_Ugyk-OKW3TGozQB_eBp4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgzX7RB92cDYXHhwK_x4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgwKpMq6JhlIFF28Tex4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugz2JBNpRY8BevvEIeN4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgyrO0-H4ZqH4OnPqH94AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UgxzMsmvWvck0BqRPNx4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"unclear"},
{"id":"ytc_UgyeUBmE_5VG4ux2gSp4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugz2FpwtTbLpyQH34Kl4AaABAg","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"unclear"},
{"id":"ytc_UgxIY763xz6KMKWDgDl4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]
```
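The raw response is a JSON array with one object per coded comment, keyed by the four coding dimensions (`responsibility`, `reasoning`, `policy`, `emotion`). A minimal sketch, assuming only that shape, of how such a batch can be indexed by comment ID to recover a single comment's coding (the two-item batch here is shortened for illustration):

```python
import json

# A batch response in the same shape as the raw output above
# (two real IDs from the batch, kept short for illustration).
raw_response = """
[
  {"id": "ytc_UgwhetJyaa7zaqwDeDN4AaABAg", "responsibility": "unclear",
   "reasoning": "unclear", "policy": "unclear", "emotion": "unclear"},
  {"id": "ytc_UgxIY763xz6KMKWDgDl4AaABAg", "responsibility": "developer",
   "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"}
]
"""

# Index the batch by comment ID so one coding can be looked up directly.
codings = {row["id"]: row for row in json.loads(raw_response)}

coding = codings["ytc_UgxIY763xz6KMKWDgDl4AaABAg"]
print(coding["responsibility"], coding["emotion"])  # developer fear
```

This is the lookup the "Coding Result" view above performs: one ID in, one row of dimension values out.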