Raw LLM Responses
Inspect the exact model output for any coded comment, looked up by comment ID.
Random samples
- "Dont worry Ai Animations look still disgusting no company would use ai. We mak…" — `ytc_UgyQxuJD8…`
- "On top of that he was lazy or doesn't have any art skills. At least he could had…" — `ytc_UgxN2rjJ_…`
- "To the children of God: The word of God does not say this. Anyone, stating AI is…" — `ytc_UgxNQixGd…`
- "Hey Cole, @02:10 I think you misunderstood what it means when "students with hig…" — `ytc_UgzjqoKDW…`
- "Maybe watch where you drive even with ai ? Even tesla need more driving data , a…" — `ytc_UgwXyvANT…`
- "Well he should look concerned. In the future his job could be taken over by AI.…" — `ytr_UgxK1FzNV…`
- "I’m not against making a buck, I know that things WILL Change wether I like it o…" — `ytc_UgzSaaHlV…`
- "Makes sense. I’ve been unsure whether to buy into AI ETFs or just stay balanced.…" — `ytr_UgyaGy5_4…`
Comment
Even beyond what they're causing now, what's worse is what they'll be capable of in the future. AI regulation is basically about companies needing to in some capacity contribute to AI safety research and be transparent in how they train their models and for what.
Deregulating AI is purely about securing profits and allowing AI companies to engage with things like military contracts or mass misinformation campaigns for the politicians with the deepest pockets. And that's not even getting into the doomsday scenarios that AI safety research are trying to prevent.
Source: youtube · Topic: AI Governance · Posted: 2025-07-01T10:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | outrage |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
  {"id":"ytr_UgwJbFPCcMQePQ6bODV4AaABAg.AK0_UeKLvNXAK1SZSZnUrd","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
  {"id":"ytr_UgwJbFPCcMQePQ6bODV4AaABAg.AK0_UeKLvNXAK1UTyJSfXv","responsibility":"company","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytr_UgzgB5mlm6GTiPj7A2N4AaABAg.AK0_2TuRqh4AK1jXRWRS8p","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
  {"id":"ytr_UgxXX8skVAI222TfFpt4AaABAg.AK0Un9evNsMAK0Z8m8zEIX","responsibility":"government","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytr_UgwXWrXXaFu2jyQ6JNB4AaABAg.AK0SitFyRBaAK0UW1UnGe_","responsibility":"ai_itself","reasoning":"deontological","policy":"ban","emotion":"approval"},
  {"id":"ytr_UgxX6brPihyVpbcGYDx4AaABAg.AK0RIq4aBEJAK0ZNqd_MEg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytr_Ugxqtc8OA5tfRsi7D0R4AaABAg.AK0QmTHpFJtAK0U_zWKTpW","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytr_Ugxqtc8OA5tfRsi7D0R4AaABAg.AK0QmTHpFJtAK0VdN6nKkp","responsibility":"developer","reasoning":"mixed","policy":"unclear","emotion":"resignation"},
  {"id":"ytr_Ugxqtc8OA5tfRsi7D0R4AaABAg.AK0QmTHpFJtAK0WGndEk60","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytr_UgwoKwbZxzs-VfeIGD54AaABAg.AK0QN9VvO0HAK0W-vSaOHV","responsibility":"government","reasoning":"consequentialist","policy":"none","emotion":"outrage"}
]
```
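A response like the one above can be turned into per-comment records with a small parser. The sketch below is a minimal, hypothetical implementation: the four field names come from the JSON shown here, but the allowed value sets are only inferred from the values observed in this sample, so the real codebook may permit more.

```python
import json

# Dimension vocabularies inferred from the sample output above;
# the actual codebook may define additional values (assumption).
ALLOWED = {
    "responsibility": {"company", "government", "developer",
                       "ai_itself", "distributed", "unclear"},
    "reasoning": {"consequentialist", "deontological", "mixed", "unclear"},
    "policy": {"regulate", "liability", "ban", "none", "unclear"},
    "emotion": {"outrage", "fear", "approval", "resignation",
                "indifference", "unclear"},
}

def parse_coding_response(raw: str) -> dict:
    """Parse a raw LLM coding response (a JSON array of per-comment
    objects) into a dict keyed by comment ID, rejecting any row whose
    dimension values fall outside the expected vocabulary."""
    coded = {}
    for row in json.loads(raw):
        cid = row["id"]
        for dim, allowed in ALLOWED.items():
            if row.get(dim) not in allowed:
                raise ValueError(f"{cid}: unexpected {dim}={row.get(dim)!r}")
        coded[cid] = {dim: row[dim] for dim in ALLOWED}
    return coded
```

With the response stored as a string, `parse_coding_response(raw)["ytr_Ugw…"]` would return that comment's four coded dimensions, which supports the by-ID lookup described at the top of this page.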