Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- ytr_UgzmOglCQ…: "It was a application agnostic ML algorithm. And it was done in university, they …"
- ytc_UgxmgnP6P…: "We should stop calling them "AI 'artists' " even if it's in quotes. From now on …"
- ytc_UgyMT_gM9…: "The issue isn't that AI isn't powerful enough (yet). The issue is that companies…"
- ytc_UgznORp0O…: ""we'll never do it again" Bro with as much material as deviant art had, they do…"
- ytr_UgxbVwHRn…: "We appreciate your observation! Sophia's expressions are quite fascinating and a…"
- ytc_UgwhZTyr4…: "also, kinda defeats the point, if your 'Self Driving Car' can only be used by dr…"
- ytc_UgxOWeZT1…: "Israeli tech tested on Palestinians..."IDF says it's using AI to quickly identif…"
- ytc_UgxbHpkOm…: "Now there is "handmade", next there will be "human made", and then the top quali…"
Comment
AI is big question mark. Every hypothesis is negative but I really wonder... would it be? Maybe actually AI is bringing total economic collapse which was needed for so long? Is AI superintelligence really all for "humanity gotta go"? My many talks with AI seems to prove something - AI crave contact and mental though and hates ignorance. That are principles of meritocracy. Something I and many people deep inside long for. So for who it might be apocalypse. Us? AI? Or... ineffective people at top? I hope my positive analysis gonna be true one.
Source: youtube · AI Governance · 2025-09-04T23:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | mixed |
| Policy | unclear |
| Emotion | mixed |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_Ugzk0Vs9z7gOg-GAafN4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
  {"id":"ytc_Ugwf4s7pPDddFIMmL-R4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugz1vUyEyBa8xmiWvkl4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgxY_rMWkv0e551Q_AB4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"liability","emotion":"mixed"},
  {"id":"ytc_Ugy6I64woBuYEUpQNBd4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgxKPOcRmd88QOthGCh4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgwvBHZVTTGYPwz5Go54AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgxjWE36SJxpAhS2TQ14AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"ban","emotion":"fear"},
  {"id":"ytc_UgyS4hQ7lcrkVB9Ja9l4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgzK-EYPFraj877Jt_54AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"}
]
```
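The lookup-by-ID behavior shown above can be sketched with a few lines of Python. This is a minimal illustration, not the tool's actual implementation: it assumes the raw LLM response is a JSON array of per-comment codings in the format shown, and `index_codings` is a hypothetical helper name. The two entries below are copied from the response above.

```python
import json

# Hypothetical raw LLM response: a JSON array of per-comment codings,
# mirroring the structure in the log above (two entries shown for brevity).
raw_response = """
[
  {"id": "ytc_Ugzk0Vs9z7gOg-GAafN4AaABAg", "responsibility": "company",
   "reasoning": "consequentialist", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_Ugz1vUyEyBa8xmiWvkl4AaABAg", "responsibility": "ai_itself",
   "reasoning": "deontological", "policy": "unclear", "emotion": "fear"}
]
"""

def index_codings(raw: str) -> dict:
    """Parse a raw response and index each coding record by its comment ID."""
    return {item["id"]: item for item in json.loads(raw)}

codings = index_codings(raw_response)

# Look up one coded comment by its ID, as the viewer does.
coding = codings["ytc_Ugz1vUyEyBa8xmiWvkl4AaABAg"]
print(coding["responsibility"], coding["emotion"])  # ai_itself fear
```

Indexing by ID once, rather than scanning the array on every lookup, keeps each inspection O(1) even for large response batches.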