Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect:

- ytc_UgzzLtY7z…: "Just got a you tube ad for headphones designed by the canadian military with wha…"
- ytc_Ugyy1nYLJ…: "Fear mongering on AI..6 months these guys will change their tune...ott finished …"
- ytc_Ugy6txfJK…: "Thank you!! I've been saying this since it started being available - people smir…"
- ytc_UgwjiqXsV…: "the US doesn't have concrete or official AI regulations because our senators and…"
- ytc_Ugy7ElyvC…: "AI trains on other artists' work... So do human artists. Every time an artist se…"
- ytr_Ugzj-UZYK…: "Imagine a.i depositing a billion dollars in your bank account or whatever and te…"
- ytc_UgzIUAkeo…: "Would there be a way for us to acces "AI intelligence" with a type of neuralink?…"
- ytc_UgzZbwi4_…: "Fun fact the banana on the wall did actually inspire people lol. I have music le…"
Comment

> Wow, going into the debate, 67% believe AI research(!) is an existential risk. Listening to fear mongering arguments, 92% believe that. Tell me about the inability of humans to gauge risks. Surely AI research(!) is the only(!) way to resolve the very risk that people seem to be afraid of.

Source: youtube · Topic: AI Governance · Posted: 2023-08-17T09:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | outrage |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
{"id":"ytc_UgxT0jzYgY0XdOQ4cqh4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"indifference"},
{"id":"ytc_UgyEkCQtq92SLKPlPNl4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_UgwLVGaFFl8nCHEepqh4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
{"id":"ytc_Ugx52BnGLYa6UxbMX294AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"},
{"id":"ytc_UgxAci_nguooo5v0NRB4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgyILhQ_KsZ-b-C-Lqx4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"mixed"},
{"id":"ytc_UgzDi4kiS-bSe3g-LhN4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugx0PFkivatSns4E8xd4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxedCS7pDsuymN4QxF4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"fear"},
{"id":"ytc_UgzprZVcmX1iB91yZPp4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"outrage"}
]
```
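A downstream consumer would typically parse a raw response like the one above into per-comment records and check each dimension against the codebook before storing it. A minimal sketch, assuming the field names from the response; the allowed value sets are inferred only from the samples shown here, and the real codebook may permit more values:

```python
import json

# Value sets inferred from the sample response above (assumption:
# the actual codebook may define additional categories).
ALLOWED = {
    "responsibility": {"none", "distributed", "ai_itself", "company", "developer", "user"},
    "reasoning": {"consequentialist", "deontological", "virtue"},
    "policy": {"none", "regulate", "ban", "liability", "industry_self"},
    "emotion": {"indifference", "fear", "outrage", "approval", "mixed", "resignation"},
}

def parse_codings(raw: str) -> dict:
    """Parse a raw LLM response (a JSON array of coded comments)
    into {comment_id: {dimension: value}}, rejecting unknown values."""
    out = {}
    for rec in json.loads(raw):
        coding = {dim: rec[dim] for dim in ALLOWED}
        for dim, value in coding.items():
            if value not in ALLOWED[dim]:
                raise ValueError(f"{rec['id']}: unexpected {dim}={value!r}")
        out[rec["id"]] = coding
    return out

# Hypothetical one-record response for illustration.
raw = ('[{"id":"ytc_x","responsibility":"none",'
       '"reasoning":"consequentialist","policy":"none","emotion":"outrage"}]')
print(parse_codings(raw)["ytc_x"]["emotion"])  # → outrage
```

A lookup by comment ID (as offered above) then reduces to indexing into the returned dictionary.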