Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment
[Translated from Italian] Too much intelligence becomes madness. It makes no sense. The marginal curve of possible human actions, as intelligence rises beyond certain levels, goes to infinity... In other words, it conceives impossible actions and slams into hard, very hard reality... Man resigns himself, but will the machine?
| Field | Value |
|---|---|
| Platform | youtube |
| Category | Cross-Cultural |
| Posted | 2026-03-19T08:4… |
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | unclear |
| Policy | unclear |
| Emotion | mixed |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
{"id":"ytc_UgwT49VG1nk4TtIeAvt4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgxE7TWTgU0L36Dht9p4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgzWeffqCNGtl8Tx3Zl4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"liability","emotion":"indifference"},
{"id":"ytc_UgyU_B9pvbYBPhUn5Tp4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgywkN3Kq0xacq5tkIJ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"regulate","emotion":"indifference"},
{"id":"ytc_Ugym_Z5tVq4q1jf1uPF4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"resignation"},
{"id":"ytc_UgxtEOMYiqPBYBUjPxJ4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgwTZdfD9bxrRzlIGP14AaABAg","responsibility":"government","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugxv4V97hNQu0nsy6EV4AaABAg","responsibility":"distributed","reasoning":"virtue","policy":"none","emotion":"fear"},
{"id":"ytc_UgzATzpVUtux-FTVXnd4AaABAg","responsibility":"company","reasoning":"mixed","policy":"industry_self","emotion":"approval"}
]
```
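The raw response above is a JSON array of per-comment records keyed by `id`, with one value for each coding dimension. A minimal sketch of how such a batch response could be parsed and validated is shown below; the allowed value sets are inferred only from the examples on this page (the project's actual codebook may differ), and the function name `parse_coding_response` is hypothetical.

```python
import json

# Allowed values per dimension, inferred from the coded examples above.
# The real codebook may define additional categories (assumption).
ALLOWED = {
    "responsibility": {"none", "company", "distributed", "ai_itself",
                       "developer", "government"},
    "reasoning": {"unclear", "consequentialist", "deontological",
                  "virtue", "mixed"},
    "policy": {"unclear", "regulate", "liability", "ban", "none",
               "industry_self"},
    "emotion": {"fear", "indifference", "mixed", "resignation",
                "outrage", "approval"},
}

def parse_coding_response(raw: str) -> dict:
    """Parse a raw LLM batch response into {comment_id: dimensions},
    silently dropping any record with an unknown dimension value."""
    coded = {}
    for record in json.loads(raw):
        comment_id = record.pop("id")
        # Keep the record only if every dimension has a known value.
        if all(record.get(dim) in values for dim, values in ALLOWED.items()):
            coded[comment_id] = record
    return coded
```

Validating against a closed vocabulary like this catches the common failure mode where the model invents a label outside the codebook; rejected records can then be queued for re-coding.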