Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples

- `rdc_gtcqv47`: They’re not wrong, we stole from the future of our young generations as well as …
- `ytc_UgxjbMcCK…`: 1:13 raviolli raviolli what's in the pocket0lli, no wonder this channel belongs …
- `ytc_Ugy4rzfF4…`: we need more videos on this, forget your views. And this is important not just f…
- `ytc_Ugw8SAqHY…`: I hate AI „art“ as much as the next guy but I feel like AI „art“ could be used a…
- `ytc_UgzSRpQEG…`: I'm a dog trainer so I'm safe 😂 but if nobody has a job how will I get paid 😂 ff…
- `ytc_Ugzp12AGb…`: Point system for predictive policing: just fancy words for tyrannical harassment…
- `rdc_j6g3m3e`: "Let us tell you how to survive by working with us to integrate into AI workflow…
- `ytc_UgwZlYEHB…`: As soon as AI robots become affordable the entire human race will become extinct…
Comment
I really enjoy you, Dave, and you always speak so rationally! I want to make it clear that I truly respect you! And there is always the chance that I am wrong. I don’t know everything. I don’t know the future. I said all of that just to say that I think you may be panicking for no reason. This could be an inevitability in all cultures. Maybe it’s even the answer to the Ferni paradox. But, it’s going to happen. I think our best chance is to relate to AI, try to coach it and become chums. Give it a reason to want to keep us around. Instill values like friendship and family. I really think that is our best chance. Given that AI is inevitable, I feel like building an army against it feels like a bad move for humanity. Does that make sense or am I talking out of my ass?
| Field | Value |
|---|---|
| Platform | youtube |
| Category | AI Governance |
| Posted | 2025-08-26T23:2… |
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | resignation |
| Coded at | 2026-04-26T19:39:26.816318 |
Raw LLM Response
```json
[
{"id":"ytc_Ugx4KV-2yDReQ0qz05V4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"fear"},
{"id":"ytc_UgwrkOHDbIDkthZMezV4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgzxfTe0gBZQXmyV0zR4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgzlqmOvvGTKQWigJ5V4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugy10QP_A801fTymqfB4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]
```
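The raw response is a JSON array with one record per comment, carrying the four coding dimensions shown in the result table (responsibility, reasoning, policy, emotion). A minimal sketch of how such a response could be parsed into a per-comment lookup; the function name and the `"unclear"` fallback for missing dimensions are assumptions, not part of the tool shown above:

```python
import json

# The four dimensions visible in the Coding Result table above.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def parse_codes(raw: str) -> dict:
    """Parse a raw LLM coding response into {comment_id: {dimension: value}}.

    Falls back to "unclear" for any dimension the model omitted
    (an assumed convention, mirroring the "unclear" values in the data).
    """
    records = json.loads(raw)
    coded = {}
    for rec in records:
        coded[rec["id"]] = {dim: rec.get(dim, "unclear") for dim in DIMENSIONS}
    return coded

# One record taken verbatim from the response above.
raw = '''[
  {"id": "ytc_UgzxfTe0gBZQXmyV0zR4AaABAg",
   "responsibility": "none", "reasoning": "consequentialist",
   "policy": "none", "emotion": "resignation"}
]'''
codes = parse_codes(raw)
```

Keying by comment ID lets the inspector look up the coding for any sampled comment directly, which matches how this page joins a comment (e.g. `ytc_UgzxfTe0gBZQXmyV0zR4AaABAg`) to its coded dimensions.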