# Raw LLM Responses

Inspect the exact model output for any coded comment.
Codings can be looked up by comment ID.

## Random samples
- LLMs have no route to AGI or the ability to remove hallucinations, we are a long… (ytc_Ugx8BXufo…)
- I tried a bill cipher ai, long story short I was put in Mabel land 😀… (ytc_UgydpzM32…)
- Yeah that's not going to work. AI or any other computer system not adhering to n… (ytc_Ugwyl10Bu…)
- So if I understand correctly, Elon outright rejected extra safety measures and s… (ytc_Ugyux2RlL…)
- Lol Google does not have a policy of not creating sentient AI, it has an inabili… (ytc_UgwgqY6Ny…)
- Second robot in the row was peeking in ur video / Like who noticed that / 👇… (ytc_UgzjZDn7H…)
- Well Buddha would understand that all things are transient, especially material … (rdc_e27lkk1)
- Before 11:17..." Bla Bla Bla teehee 11:11... And We as a DEMOCRACY SHOULD DECIDE… (ytc_UgzMxJDSe…)
## Comment

> Sensationalist BS from a professor, AI is not ready for primetime, my company tried it and got their clock cleaned by crooks that easily outsmarted their algos. We were worried about our jobs until we saw the results, they wasted tens of millions on AI

| Field | Value |
|---|---|
| Source | youtube |
| Topic | AI Governance |
| Posted | 2025-09-05T02:3… |
## Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | industry_self |
| Emotion | approval |
| Coded at | 2026-04-26T23:09:12.988011 |
## Raw LLM Response
[
{"id":"ytc_Ugz_hFoWZZpql4PGP6J4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugy5CAQw3uHVnsf4FOJ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugzf6bgqGVOawGn-ADt4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgyasTxkHP7C4bAx9Nt4AaABAg","responsibility":"government","reasoning":"unclear","policy":"unclear","emotion":"outrage"},
{"id":"ytc_Ugztl6JwSPP95HskwDZ4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyEFEweUfNj1U_t5194AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"sadness"},
{"id":"ytc_Ugy7sPYN3o0IhEhQicN4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"},
{"id":"ytc_Ugyk9dvBnVTnGU3y9_J4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_Ugz4UnAMIOeZ_waSJ794AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"regulate","emotion":"mixed"},
{"id":"ytc_Ugx3BkAsymvpNaCadjp4AaABAg","responsibility":"ai_itself","reasoning":"virtue","policy":"ban","emotion":"disapproval"}
]
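The per-comment coding shown above is pulled out of a batch response like this one. A minimal sketch of that lookup, assuming the model returns a JSON array of objects keyed by `id` (the variable names here are hypothetical, not the tool's actual code):

```python
import json

# A trimmed copy of one entry from the batch response shown above.
raw_response = """
[
  {"id": "ytc_Ugy7sPYN3o0IhEhQicN4AaABAg", "responsibility": "company",
   "reasoning": "consequentialist", "policy": "industry_self",
   "emotion": "approval"}
]
"""

# Index the parsed batch by comment ID so any single coding can be
# retrieved directly.
codings = {row["id"]: row for row in json.loads(raw_response)}

coding = codings["ytc_Ugy7sPYN3o0IhEhQicN4AaABAg"]
print(coding["policy"], coding["emotion"])  # → industry_self approval
```

With the full batch loaded the same way, the dictionary gives constant-time lookup from a comment ID to its coded dimensions, matching the Coding Result table above.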