Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- Winner Julia. 🏆 ai only has the advantage if we stop trying. 💯 Never stop! Cr… (`ytc_Ugwd4fMyh…`)
- AI is a Tool, not a Plant. We may not leave it to grow and improve itself, like … (`ytc_UgxEYs-km…`)
- Im a man and i easily figured out that the image is ai generated ☠️ Bros come o… (`ytc_UgwB_oyR6…`)
- AI has also started acting and scripting these days 😂😂 Imagine if AI response- I fe… (`ytc_Ugyrhz9Od…`)
- @h7productions286 If you’re making $60 a week you’re not gonna be able to afford… (`ytr_UgziTpDi5…`)
- We all get a piece of the ai right? we’d all own it? it’d be dystopian otherwise… (`ytc_UgyY4yEqo…`)
- 🎯 This. It's a tool, just like the Photoshop that many artist still think is "ch… (`ytr_UgzUip_uk…`)
- The problem is that there will no longer be "programs." People will converse in … (`ytc_UgwDgIqdk…`)
Comment
I like Eliezer. He's grounded in his beliefs. I don't have his expertise or extensive knowledge, but I think he is right about the existential risk to humans. I can easily see how humans - with their limited intelligence - can't recognise the many ways that AI could try to carry out their objectives, without really considering the infinite ways in which AI could achieve these objectives, probably at whatever necessary cost to the living things the AI share the planet with, even if this means leaving humans with no resources.
youtube · AI Governance · 2024-11-14T20:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | liability |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:53.388235 |
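Each coded comment carries four categorical dimensions. A minimal validation sketch in Python; the allowed category sets below are inferred only from the values visible in this page's sample output, so the full codebook may define additional categories:

```python
# Allowed values per dimension, as observed in this page's sample output.
# NOTE: assumption — the real codebook may include additional categories.
ALLOWED = {
    "responsibility": {"none", "developer", "user", "ai_itself"},
    "reasoning": {"unclear", "deontological", "consequentialist", "virtue"},
    "policy": {"unclear", "regulate", "liability", "ban", "industry_self"},
    "emotion": {"indifference", "outrage", "fear", "mixed", "approval"},
}

def validate(row):
    """Return a list of problems with one coded row (empty if valid)."""
    problems = []
    for dim, allowed in ALLOWED.items():
        value = row.get(dim)
        if value not in allowed:
            problems.append(f"{dim}: unexpected value {value!r}")
    return problems

# The coding shown in the table above passes validation.
row = {"responsibility": "ai_itself", "reasoning": "consequentialist",
       "policy": "liability", "emotion": "fear"}
print(validate(row))  # -> []
```

A check like this is useful as a guard between the raw LLM response and the database, since the model can emit labels outside the codebook.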
Raw LLM Response
```json
[
  {"id":"ytc_UgwRvWP_k7v_jN9-Te14AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_Ugyksdh6rn-4hBjfu214AaABAg","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_UgxlTd1d2AkohR8lVSZ4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgxF1_HmuOODIl8KiOF4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytc_UgzTMs1seu-Hm2wg1tB4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugzck-R6lKxbvEb8M5Z4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
  {"id":"ytc_UgyyLzF6cJe301DdxjF4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"mixed"},
  {"id":"ytc_UgyL07Rq-EVfO1ActR94AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"mixed"},
  {"id":"ytc_Ugz6Llf_yDF9Gc34V9B4AaABAg","responsibility":"ai_itself","reasoning":"virtue","policy":"industry_self","emotion":"approval"},
  {"id":"ytc_UgzaIf0jFeodxvBJt2d4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"ban","emotion":"outrage"}
]
```
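The raw response is a JSON array with one coding object per comment. A minimal sketch of parsing it and indexing codings by comment ID for lookup; the shape is taken from the sample above, and the two rows embedded here are just a subset of it for illustration:

```python
import json

# Raw LLM response: a JSON array of coding objects, one per comment
# (two rows copied from the sample above for illustration).
raw = """
[
  {"id": "ytc_UgxF1_HmuOODIl8KiOF4AaABAg", "responsibility": "ai_itself",
   "reasoning": "consequentialist", "policy": "liability", "emotion": "fear"},
  {"id": "ytc_Ugzck-R6lKxbvEb8M5Z4AaABAg", "responsibility": "ai_itself",
   "reasoning": "consequentialist", "policy": "ban", "emotion": "fear"}
]
"""

# Build an id -> coding map so any coded comment can be looked up directly.
codes_by_id = {row["id"]: row for row in json.loads(raw)}

coding = codes_by_id["ytc_UgxF1_HmuOODIl8KiOF4AaABAg"]
print(coding["policy"])   # -> liability
print(coding["emotion"])  # -> fear
```

Keying on the comment ID is what makes the "inspect the exact model output for any coded comment" workflow cheap: one dictionary lookup instead of a scan over the array.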