Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples (truncated previews):
- ytc_Ugyl32vY8…: "ai is already here, and its growing daily whether we want it to or not…"
- ytc_UgxgXmTb2…: "🤡 I am not sure if AI replaces coders but it definitely replaces you 🤡🤡…"
- ytc_UgwYOE0Lh…: "Videos like this are why I keep a few of my AI subscriptions active. Go back to…"
- rdc_l4px0zt: "I mean the government can just making AI use illegal. Problem solved. Edit - S…"
- ytr_UgwX8XJ2s…: "@raymondfranko2894 Didn't see that one. Give me the link. I saw quite a few tha…"
- ytc_UgwEPFrud…: "Not a fan of AI art, but I disagree on youtubers part. If it's entertaining enou…"
- ytc_Ugy4GCBF3…: "Wow, we don't even know if that is really the case with this product but if it i…"
- ytc_Ugx4pDezC…: "I can definitely see his input in the Supergirl: he made many edits to fetishize…"
Comment
Artificial control of anything organic always leads to death. Every. Single. Time. We need to EMP ourselves back to human independence before AI takes that, too. I don't care if we're "Stone Age" again for a while, at least we'll have a say so in how things go from there. The only way forward with AI is artificial transhumanism for space travel. It might sound cool, but it's not. It's lonely and painful.
youtube · AI Governance · 2025-12-30T22:5… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | deontological |
| Policy | ban |
| Emotion | fear |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
[
{"id":"ytc_UgxKYfjWLFSc4iSyUkh4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_Ugx6bVrUqDZL3_xoGRB4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgyRm5EKpdV4BMand054AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgzunMlFgfaH51EVmtt4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgyP1HowEzPZ-VSJROt4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"ban","emotion":"fear"},
{"id":"ytc_UgwKJJ6Or187e5CN7sl4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgxDC2-e0YLQia0sCRR4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"resignation"},
{"id":"ytc_UgxDajY5arf94zG3tCV4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgwLuT7n9TWLZtKsExZ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugz5mfbQ7U6wPSqWdYh4AaABAg","responsibility":"user","reasoning":"virtue","policy":"industry_self","emotion":"approval"}
]
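A raw batch response like the one above can be consumed by parsing the JSON array and indexing the codes by comment ID. The sketch below validates each dimension against the value sets that actually appear in this sample; this is a minimal illustration only, since the tool's full codebook and its real ingestion logic are not shown here, and the allowed-value sets are assumptions drawn solely from the visible rows.

```python
import json

# Values observed in the sample response above. Assumption: the real
# codebook may define additional categories not seen in this batch.
OBSERVED = {
    "responsibility": {"none", "company", "ai_itself", "distributed", "government", "user"},
    "reasoning": {"unclear", "consequentialist", "deontological", "mixed", "virtue"},
    "policy": {"none", "regulate", "unclear", "ban", "industry_self"},
    "emotion": {"approval", "outrage", "fear", "mixed", "resignation"},
}

def parse_batch(raw: str) -> dict:
    """Parse a raw batch response and index the coded dimensions by
    comment ID, rejecting any value outside the observed sets."""
    rows = json.loads(raw)
    coded = {}
    for row in rows:
        cid = row["id"]
        for dim, allowed in OBSERVED.items():
            if row.get(dim) not in allowed:
                raise ValueError(f"{cid}: unexpected {dim}={row.get(dim)!r}")
        # Keep only the coding dimensions, dropping the "id" field.
        coded[cid] = {dim: row[dim] for dim in OBSERVED}
    return coded

# Hypothetical single-row batch for illustration.
raw = ('[{"id":"ytc_UgyP1HowEzPZ-VSJROt4AaABAg","responsibility":"ai_itself",'
       '"reasoning":"deontological","policy":"ban","emotion":"fear"}]')
codes = parse_batch(raw)
print(codes["ytc_UgyP1HowEzPZ-VSJROt4AaABAg"]["policy"])  # → ban
```

Indexing by ID makes it cheap to join the parsed codes back to the source comments, and raising on unknown values surfaces model drift (a new or misspelled label) at ingestion time rather than at analysis time.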