Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- "Read the Bible, believe in Jesuschrist and do what AI will never do: be saved.…" (ytc_UgzvdsjAS…)
- "Lawyers will never be replaced by ai because client relationship is necessary. H…" (ytc_Ugz5iyIjk…)
- "The media isn’t going to post gooners replicating models of their favorite strea…" (rdc_nk509f3)
- "While I find the conversation intriguing, I don't believe these are signs of sen…" (ytc_UgwUvbjwn…)
- "There are actually AI image users who make money off AI, on patreon which pisses…" (ytc_Ugw2h0E2R…)
- "I was in the same boat as you. They converted my EI application to CERB automati…" (rdc_fn5n8ct)
- "Wouldn't it be nice if we built a society where people celebrate when a job is a…" (ytc_UgxUrT-sz…)
- "It is not AI if it believes liberal policies are good for humanity. It is basica…" (ytc_UgwS78pyu…)
Comment
**“In this podcast, we’re talking about artificial intelligence and how it might shape human life on our planet in the future. We all know that life on Earth could vanish if just 10% of the world’s nuclear potential were ever unleashed. The fear of destroying one’s own nation by using nuclear weapons has, so far, been the thing keeping humanity from wiping itself out.
But with the rise of artificial intelligence, new theories have appeared — some suggesting that AI could one day eliminate humans entirely, simply because it wouldn’t need us anymore. The truth is, the future of humanity over the next 50 years is deeply unpredictable. One thing is certain, though: it was humans who created artificial intelligence.
And so we’re left with a haunting question: if AI were to bring about the end of humanity on this planet… what would be the purpose of its existence here at all?”**
youtube · AI Governance · 2025-09-07T08:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_UgwS3TYsQ7saUOEL3L54AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgwqncVO2a1VQ6Iofd94AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugy_be5Dz4WeKhv0nmF4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UgzZPu0BbQXZuSot9at4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"mixed"},
{"id":"ytc_UgyaSUPqlVgkSrjsYmh4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"fear"},
{"id":"ytc_UgwX9w0SvTBeBO5q-qF4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgxKF_ZKr08H_Da1jN14AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgyAxytus95tzJNdv2R4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_Ugwi2RYhpWPZLd4wavt4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgzqcNP8yEe__pPpZxB4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"resignation"}
]
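The raw response above is a JSON array of per-comment codings, each keyed by a comment ID with four coding dimensions (responsibility, reasoning, policy, emotion). A minimal sketch of how such a dump could be indexed for an ID lookup, assuming only the field names visible in the response (the helper name and the two-row sample below are illustrative, not part of the tool):

```python
import json

# Two rows copied from the raw LLM response above, as an illustrative sample.
RAW_RESPONSE = """
[
  {"id": "ytc_UgwS3TYsQ7saUOEL3L54AaABAg", "responsibility": "none",
   "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytc_UgxKF_ZKr08H_Da1jN14AaABAg", "responsibility": "ai_itself",
   "reasoning": "consequentialist", "policy": "none", "emotion": "resignation"}
]
"""

def index_by_id(raw: str) -> dict:
    """Map comment ID -> its coded dimensions (everything except the ID)."""
    return {
        row["id"]: {k: v for k, v in row.items() if k != "id"}
        for row in json.loads(raw)
    }

codes = index_by_id(RAW_RESPONSE)
print(codes["ytc_UgwS3TYsQ7saUOEL3L54AaABAg"]["emotion"])  # fear
```

A real pipeline would also want to validate that each dimension's value falls within the coding scheme's allowed labels before accepting a batch.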