Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- "Just hearing a bunch of stupid scientists who've building up AI during the last …" (ytc_Ugw6SersQ…)
- "i think chatgpt is like cute companion so i talk to it like a friend idk its alw…" (ytc_UgzUfrBWZ…)
- "Biological atomicbomb that our phd professor from United Kingdom told us in 1984…" (ytc_Ugz2gE07e…)
- "no way did the ai figure that shit out like a smartass, we are doomed…" (ytc_UgxRLNf-j…)
- "I think the people who make these claims of ai threatening humanity, are the sam…" (ytc_UgxUN5pLw…)
- "Why would your trust and money be with Waymo? They’re going to have islands of c…" (ytr_UgwM38j5R…)
- "Musk seems to have a moral compass to me, please elaborate on that, could be a w…" (ytc_UgzIklZj7…)
- "@newwaveinfantry8362 Cool but what are you gonna do when ai's training pile get…" (ytr_UgwHKFf6E…)
Comment
> 30:32 pull the plug lol, never watched Terminator .. there's you AI safety right there

| Field | Value |
|---|---|
| Platform | youtube |
| Topic | AI Governance |
| Posted | 2025-09-05T09:5… |
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | ban |
| Emotion | indifference |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_UgxXkvNmnJQdUQL96qV4AaABAg","responsibility":"government","reasoning":"deontological","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgzfQccU-D9ARQRqL214AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"ban","emotion":"indifference"},
  {"id":"ytc_UgzjZqISBGaqfXEkZPB4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
  {"id":"ytc_UgwosZj3aOeou9YlE_14AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgzZDoyRrXqu3Kjepj14AaABAg","responsibility":"distributed","reasoning":"deontological","policy":"none","emotion":"approval"},
  {"id":"ytc_UgzyI2jRE3YQnrYf2W54AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_Ugye0hRmQLff9fPdEfx4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgzcdTFVpHYwqIFVg4t4AaABAg","responsibility":"ai_itself","reasoning":"contractualist","policy":"liability","emotion":"fear"},
  {"id":"ytc_UgwsgJzbZVVB8HjGYlx4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_Ugzvk2Y7mhRWDmL1q-R4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"none","emotion":"resignation"}
]
```
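The raw response is a JSON array with one object per coded comment, carrying the four dimensions shown in the table above. A minimal sketch of parsing and validating such a response in Python — note that `parse_coding_response` is a hypothetical helper, and the label sets below are only the values visible in this sample, not an exhaustive schema:

```python
import json

# Allowed values per dimension, taken from the sample output above
# (assumption: the real pipeline may permit additional labels).
ALLOWED = {
    "responsibility": {"government", "none", "ai_itself", "distributed", "user", "developer"},
    "reasoning": {"deontological", "consequentialist", "contractualist", "mixed", "unclear"},
    "policy": {"regulate", "ban", "liability", "none"},
    "emotion": {"fear", "indifference", "approval", "outrage", "resignation"},
}

def parse_coding_response(raw: str) -> dict:
    """Parse a raw LLM coding response into {comment_id: {dimension: value}}.

    Raises ValueError on a value outside the observed label set, and
    KeyError if an expected field is missing from a row.
    """
    coded = {}
    for row in json.loads(raw):
        comment_id = row["id"]
        dims = {}
        for dim, allowed in ALLOWED.items():
            value = row[dim]
            if value not in allowed:
                raise ValueError(f"{comment_id}: unexpected {dim}={value!r}")
            dims[dim] = value
        coded[comment_id] = dims
    return coded

# Example: look up one coded comment by its ID.
raw = '[{"id":"ytc_example","responsibility":"none",' \
      '"reasoning":"consequentialist","policy":"ban","emotion":"indifference"}]'
codes = parse_coding_response(raw)
print(codes["ytc_example"]["policy"])  # → ban
```

Validating against a fixed label set catches the common failure mode where the model invents a new category mid-batch, so bad rows fail loudly instead of silently entering the coded dataset.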