Raw LLM Responses
Inspect the exact model output for any coded comment. Look up a comment by its ID, or browse the random samples below.
- ytc_UgxpM7ITE… — “AI is making me sick. My Instagram feed is full of AI slop. It’s kinda hard for …”
- ytc_UgypajNt4… — “Someone needs to create anti-nightshade software... that actually does nothing a…”
- ytc_Ugw1aMN5e… — “Only idiots believe in self driving will actually take them to their destination…”
- ytc_UgwogNexD… — “Guys calm down sora ai basically made a chair fly and turn into some weird sh!t …”
- ytr_Ugz1lFxR9… — “He didn't say it was going away, this wasn't a jab at AI art, this was a jab at …”
- ytc_UgwAYa3s_… — “Man got so mad about people hating AI that he started making propaganda, crazy w…”
- rdc_mrrov95 — “Yeah, no where near enough people are paying attention to this part. Take the h…”
- rdc_lw8frec — “ChatGPT doesnt have users, only believers. It's just a vast database of conversa…”
Comment (youtube · AI Governance · 2023-07-18T08:3…)

> A voice vote would have worked. Anyway since most of the panel believed that a sentient AI will occur in our lifetime, we can assume the AI would have free will. Would you wish to live in a world where an AI could decide to get rid of all humans? After all, humans would pose an existential threat to it.
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | liability |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id":"ytc_Ugx3Yomb1ql_-dEtFtN4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgyWbCbuAjmWT6bpZ994AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytc_UgxJq2JxO4SYjpa2xW94AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
  {"id":"ytc_UgyKDGc6e-uo3t7QRTp4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgzTUdNVTZr-fTkOZvp4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgwnuZXHvc1zYkQfVxl4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgyFCQyEsq9SLEWAFFJ4AaABAg","responsibility":"distributed","reasoning":"contractualist","policy":"industry_self","emotion":"mixed"},
  {"id":"ytc_Ugy0Mfz3nE0rvMxxRwJ4AaABAg","responsibility":"company","reasoning":"virtue","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_Ugw10Q77hy6GXacoZml4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgzIBKMnPKGkUnkq2Z14AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"liability","emotion":"fear"}
]
```
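The lookup-by-ID step above can be sketched in a few lines: parse the raw JSON array the model returned and index each coded row by its comment ID. This is a minimal sketch, not the tool's actual implementation; the field names (`id`, `responsibility`, `reasoning`, `policy`, `emotion`) come from the coding-result table above, while the function name and the sample rows are illustrative.

```python
import json

# Example raw LLM response: a JSON array with one coded object per comment.
# The two rows below are copied from the sample response above.
raw_response = """[
  {"id": "ytc_UgyWbCbuAjmWT6bpZ994AaABAg", "responsibility": "ai_itself",
   "reasoning": "consequentialist", "policy": "liability", "emotion": "fear"},
  {"id": "ytc_UgxJq2JxO4SYjpa2xW94AaABAg", "responsibility": "user",
   "reasoning": "consequentialist", "policy": "ban", "emotion": "outrage"}
]"""

def index_by_comment_id(response_text: str) -> dict:
    """Parse a raw LLM response and index the coded rows by comment ID."""
    rows = json.loads(response_text)
    return {row["id"]: row for row in rows}

coded = index_by_comment_id(raw_response)
print(coded["ytc_UgyWbCbuAjmWT6bpZ994AaABAg"]["emotion"])  # fear
```

With the rows indexed this way, inspecting any coded comment is a single dictionary lookup rather than a scan over the full response.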