Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
You'd think he would just not use AI art because he doesn't like making art but …
ytr_UgwykDHGa…
These guys need to watch Vedal and his AI daughters Neuro and Evil, they are tra…
ytc_UgyQF2iKq…
JESUS✝️REIGNS FOREVER MORE!!!!!!! HALLELUJAH!!!!!!! AMEN.
I LOVE YOU, LOVE YOU, …
ytc_UgycDd5ML…
You don't know what the automated trucks will do. When so.e of the sensor's fai…
ytr_UgwRLi69X…
I think Elon should have explained to Tucker the difference between Specialized …
ytc_Ugz2YYEgh…
why? we have so many humans now that need jobs, need to learn etc.. help them in…
ytc_UgzidcNj8…
Haha this video is funny, totally bullshit, don’t believe the AI-hype, AI doesn’…
ytc_UgxSYQNPr…
They need to implement UBI NOW before job loss reaches catastrophic levels. It w…
ytc_UgzZHhVYZ…
Comment
Friends - we ( Software services)are going in wrong direction with addiction on the word AI...
For a joke - this above subject is what shown director Shankar in Tamil movie Endhiran...where Chitti loves Heroine.
Already machines started controlling humans with addiction to mobiles. Don't know what else have to face...
Even News Channels want fastest way to publish themselves...
Let's face the fate...soft killing software 😢
youtube
AI Governance
2025-05-28T03:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | mixed |
| Policy | unclear |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
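A coding like the one above can be checked mechanically against the dimension vocabularies. The sketch below is illustrative only: the allowed values are inferred from the codings visible on this page, not from an official codebook, so the real vocabularies may be larger.

```python
# Hypothetical validator for one coding record. The allowed values below are
# inferred from codings visible on this page; the actual codebook may differ.
VOCAB = {
    "responsibility": {"developer", "ai_itself", "distributed", "none"},
    "reasoning": {"consequentialist", "deontological", "mixed", "unclear"},
    "policy": {"regulate", "liability", "industry_self", "none", "unclear"},
    "emotion": {"fear", "outrage", "resignation", "indifference", "approval"},
}

def validate(record):
    """Return (dimension, bad_value) pairs for out-of-vocabulary codes."""
    return [(dim, record[dim]) for dim in VOCAB if record[dim] not in VOCAB[dim]]

# The record corresponding to the Coding Result table above.
record = {"responsibility": "distributed", "reasoning": "mixed",
          "policy": "unclear", "emotion": "fear"}
print(validate(record))  # [] — every value is in vocabulary
```

An empty list means the LLM stayed inside the expected label set; anything else flags a value to review by hand.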
Raw LLM Response
[
{"id":"ytc_UgxgKfOaHDdN_rcNqd94AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"approval"},
{"id":"ytc_UgwU6jUHAwTtkwX2Af54AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgzsMY8cOXXACkmDZ0p4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgyUBASy2QNqZQdPjTx4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugw1NCsO0rfG5cF3MUl4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgyeNcbn6-8d7eBt-YR4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"resignation"},
{"id":"ytc_Ugx3Ni37noR36ZwlllZ4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"industry_self","emotion":"indifference"},
{"id":"ytc_Ugxi0LEFrF4YxnhSde14AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgxMPZJ7fEqviJQs5Sp4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"resignation"},
{"id":"ytc_UgyYSmgK1SIqJBjVM_l4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"liability","emotion":"fear"}
]
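The "look up by comment ID" view above amounts to parsing this JSON array and indexing it by `id`. A minimal sketch in Python (illustrative, not the tool's actual implementation; the two records are copied from the response above):

```python
import json

# Raw LLM response: a JSON array of coding records, one per comment.
# Field names (id, responsibility, reasoning, policy, emotion) match the
# dimensions shown in the Coding Result table.
raw_response = """
[
  {"id": "ytc_UgxgKfOaHDdN_rcNqd94AaABAg", "responsibility": "ai_itself",
   "reasoning": "mixed", "policy": "none", "emotion": "approval"},
  {"id": "ytc_Ugw1NCsO0rfG5cF3MUl4AaABAg", "responsibility": "distributed",
   "reasoning": "mixed", "policy": "unclear", "emotion": "fear"}
]
"""

# Build an id -> coding index so any comment's coding is a direct lookup.
codings = {row["id"]: row for row in json.loads(raw_response)}

row = codings["ytc_Ugw1NCsO0rfG5cF3MUl4AaABAg"]
print(row["responsibility"], row["emotion"])  # distributed fear
```

Building the dict once makes each subsequent ID lookup O(1), which matters when the same response is queried for many comments.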