Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment
I love how everybody thinks Elon Tusk is so smart but if you really listen to him, you understand that he only has trivial knowledge, surface knowledge about deeper topics. The problem is that the general population has even less than surface knowledge of deep topics and thinks he's the smartest dude.
Look again at the second question he's given "what are the exact dangers of AI?" He's blabbering on about "dangers to society" but can't even but the finger on it. Instead he laughs about "it sounds like terminator" but you can see that's exactly the extend of his knowledge there. "something like terminator"
Then, at the end, tucker asks the question again because tusk hasn't answered it at all. And all he comes up with is the dangers to social media? An AI writing posts on twitter?
LOL this guy is a rich sham just like so many others. Like trump almost
youtube · AI Governance · 2023-04-18T10:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | virtue |
| Policy | unclear |
| Emotion | outrage |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_UgzyCAGAEUyMoPg1YUV4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugz90xk3jw_FeSEGoYx4AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgwAgfY-AWMJzrZ3RW94AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugyh8jUcUxQdvA7JoCN4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgzRU9wbEjA2os2s02t4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgwjHxfNS8zEvAwYuDV4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgyMu9BAFLyDUm_Zutd4AaABAg","responsibility":"government","reasoning":"deontological","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgzRtC--xntBBEwAUDV4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugw0EIttSl3g3zlsxud4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugw8ccmyjDoMcAOMQxp4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"}
]
```
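A response like the one above can be parsed and sanity-checked before the coded dimensions are stored. The sketch below is a minimal validator, assuming the label vocabularies inferred from the values visible in this dump (the full codebook may define additional labels); `validate_batch` and `VOCAB` are hypothetical names, not part of the pipeline shown here.

```python
import json

# Allowed labels per coding dimension, inferred from this dump.
# ASSUMPTION: the real codebook may contain more values than these.
VOCAB = {
    "responsibility": {"developer", "company", "government", "ai_itself", "none", "unclear"},
    "reasoning": {"consequentialist", "deontological", "virtue", "unclear"},
    "policy": {"regulate", "none", "unclear"},
    "emotion": {"fear", "outrage", "resignation", "indifference"},
}

def validate_batch(raw: str) -> list[dict]:
    """Parse a raw LLM response and check every record against VOCAB."""
    records = json.loads(raw)
    for rec in records:
        if not rec.get("id"):
            raise ValueError("every record needs a comment ID")
        for dim, allowed in VOCAB.items():
            if rec.get(dim) not in allowed:
                raise ValueError(
                    f"{rec['id']}: unexpected {dim!r} value {rec.get(dim)!r}"
                )
    return records

# Hypothetical single-record batch, shaped like the response above.
raw = ('[{"id":"ytc_example","responsibility":"developer",'
       '"reasoning":"virtue","policy":"unclear","emotion":"outrage"}]')
coded = validate_batch(raw)
print(coded[0]["emotion"])  # outrage
```

Validating at ingest time is what makes the "Coded at" table reliable: a record with a label outside the vocabulary fails loudly here instead of silently producing an uninterpretable dimension value later.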