Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "I agree with many things in this video, especially when it comes to indirectly (…" (ytc_Ugy6lLMCW…)
- "How about I state before my songs « this composition is not made by AI » and I m…" (ytc_UgyCq9uT4…)
- "The ai: can i ask you a question / Can i ask you another question / Can i ask you an…" (ytc_UgwjJ5CEb…)
- "@Bebebebe22- What don't I understand? / They're upset that their work is being l…" (ytr_Ugz6CssZr…)
- "The fingers being correct is the best giveaway that it's not actually AI, these …" (ytc_UgztY8fQP…)
- "While no where near the level of what AI could reach, what is a super intelligen…" (ytc_Ugwa_yGSx…)
- "He's right. I've been trying to get Detroit style rap, and it keeps giving Wiz K…" (ytr_UgyHb8wiF…)
- ">Lipps is now back home but says the experience has had lasting consequences.…" (rdc_oa4z1bv)
Comment
Everyone here except Nate Soares sounds mind numbingly myopic, their arguments countering Nate, bordering on wishful thinking.
The majority of experts in the field including the Godfathers of AI have warned there is at minimum a 5-10% existential risk concerning a fast approaching AGI/ASI. Forget 10 or even 5... just imagine if the next flight you were taking had a 1% chance of crashing, how many would board that flight?
youtube · AI Governance · 2026-03-23T10:3… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
  {"id":"ytc_UgxRHj_GqoTuKUuo8z54AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgxKkolzCmNiXNpum1F4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgximKBdniY8witwtEp4AaABAg","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"unclear"},
  {"id":"ytc_UgzbIo26YunXGXwSagR4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_Ugw0w9lGkc22srY7CX54AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgwrWF_VuGcSgrSOyqt4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgyaAcgmkYhN03Aei0x4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
  {"id":"ytc_UgxSeaQIdDAAFYvWuOt4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
  {"id":"ytc_UgzHj2EQ7AGsA9en_854AaABAg","responsibility":"distributed","reasoning":"contractualist","policy":"regulate","emotion":"mixed"},
  {"id":"ytc_UgznswjF1WAiIvs34pl4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"}
]
```
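The lookup-by-comment-ID view above can be sketched in a few lines: parse the raw model output (a JSON array of per-comment codes) and index it by `id`. This is a minimal sketch, not the tool's actual implementation; `codes_by_id` and `lookup` are hypothetical names, and the two rows are copied from the raw response shown here.

```python
import json

# Two rows copied from the raw LLM response above: a JSON array of
# per-comment codes, each keyed by a comment ID.
raw_response = """
[
  {"id": "ytc_UgxRHj_GqoTuKUuo8z54AaABAg", "responsibility": "none",
   "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytc_Ugw0w9lGkc22srY7CX54AaABAg", "responsibility": "developer",
   "reasoning": "deontological", "policy": "regulate", "emotion": "outrage"}
]
"""

# Index the coded rows by comment ID for constant-time lookup.
codes_by_id = {row["id"]: row for row in json.loads(raw_response)}

def lookup(comment_id: str) -> dict:
    """Return the coded dimensions for one comment (raises KeyError if absent)."""
    return codes_by_id[comment_id]

print(lookup("ytc_Ugw0w9lGkc22srY7CX54AaABAg")["emotion"])  # outrage
```

The same index supports the "Coding Result" panel: each value of `codes_by_id` holds exactly the four coded dimensions (responsibility, reasoning, policy, emotion) shown in the table.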