Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
They just proved AI can and will kill humans ,so why would you want "IT" to know…
ytc_Ugye5QF4L…
Maybe try Claude. I have never taken any courses in programming, I don't know a…
ytc_UgykTs3Mc…
i didn’t read anything, just saw “de-centralized ai adapted credentials” and fig…
rdc_oh2lgex
Honestly ai scraping is extremely unethical
But theres just no way to make a goo…
ytc_UgwiIQvTd…
if we treat AI as a slave and keep putting barriers up it will never align only…
ytc_UgyAwV6qq…
Thank God AI came, this all people crowded the IT field and now it lost it's imp…
ytc_Ugxli80Il…
I was unconcerned about AI years ago. I thought it could easily be kept in check…
ytc_UgwQWpMxT…
We are truly lost. We have a short time left before AI is in charge 😢…
ytc_UgyMvCIOC…
Comment
"And according to many leaders in business, science and technology: we're in the final chapter. The part of our story where we finally go extinct.
And there's nothing we can do to stop it."
I have not watched it yet but it sounds like bullshit... If the now is the now, and there would be infinite possibilities in the now. If something like that would ever happened, it was a choice. It could be stopped in infinite ways, then. plus infinite solutions. Earth could live forever in infinite ways then. Plus A.I could get into infinite directions.. hmm. Even if. Earth and people could be saved in infinite ways. Not being rude... that's just my thought about it right now before having watched it.
youtube
AI Governance
2023-10-17T13:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | unclear |
| Reasoning | unclear |
| Policy | unclear |
| Emotion | unclear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[{"id":"ytc_UgzxbxHJAcifBnjrXFB4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"resignation"},
{"id":"ytc_UgwdEblPYRqPehPs89t4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"fear"},
{"id":"ytc_Ugyu-abhvc6AruOBnYF4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzfAgck95CZ61cUWHx4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugy9cPhfbEnPWi8-5VR4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"fear"},
{"id":"ytc_Ugwjk_yIEloaa7T0Zrl4AaABAg","responsibility":"company","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzjtbzBJglKDhZvAI14AaABAg","responsibility":"company","reasoning":"virtue","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgxqhMYzeYmoF3cwOOF4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_UgxRVEF1Qxa1uE5MyZx4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgyRadCRC22lKq_Q_Xt4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"mixed"}]
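The raw response above is a JSON array of per-comment codings keyed by `id`. A lookup of the kind behind "Look up by comment ID" can be sketched as below. This is a minimal illustration, not the tool's actual implementation: `raw_response` is shortened to two records from the array above, and `index_codings` is a hypothetical helper. Missing dimensions default to `"unclear"`, matching the fallback values shown in the Coding Result table.

```python
import json

# Shortened excerpt of a raw LLM response (two records from the array above).
raw_response = '''[
{"id": "ytc_UgzxbxHJAcifBnjrXFB4AaABAg", "responsibility": "ai_itself",
 "reasoning": "mixed", "policy": "none", "emotion": "resignation"},
{"id": "ytc_UgwdEblPYRqPehPs89t4AaABAg", "responsibility": "none",
 "reasoning": "unclear", "policy": "none", "emotion": "fear"}
]'''

# The four coding dimensions shown in the Coding Result table.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def index_codings(raw: str) -> dict:
    """Parse a batch response and index codings by comment ID.

    Any dimension absent from a record falls back to "unclear".
    """
    records = json.loads(raw)
    return {
        rec["id"]: {dim: rec.get(dim, "unclear") for dim in DIMENSIONS}
        for rec in records
    }

codings = index_codings(raw_response)
print(codings["ytc_UgwdEblPYRqPehPs89t4AaABAg"]["emotion"])  # fear
```

Indexing by ID makes the lookup O(1) per comment, which matters when a single video thread contributes hundreds of coded comments.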