Raw LLM Responses
Inspect the exact model output for any coded comment. Look up a response by comment ID, or pick one of the random samples below.
Random samples

- "How were the data sets biased? Most, not all of these examples you've given are …" (ytc_Ugy1I_hnM…)
- "If he knew the inherent dangers of AI and what he created as the godfather why d…" (ytc_UgzZxp3ME…)
- "If it destroys the working class than even the amount of money that AI saves com…" (ytc_UgxHj4_4s…)
- "I really hate how society pushes for automation of manufacturing. Jobs that mill…" (ytc_UgwnEjXWV…)
- "The more real risk is that noone can make something without using AI , and the A…" (ytc_Ugydv2wzb…)
- "A.I. robots are the future depopulation robots. The elites won't need humans. T…" (ytc_Ugz4rp2Mr…)
- "I don't know... All I see here is a panel full of liberals and self proclaimed i…" (ytc_UgwmwVuds…)
- "Eventually A.I. 'consciousness' will be moot. Humans will simply be unable to d…" (ytc_UgxksqG3t…)
Comment

> Between the terminator scenario and the ability to create a bioweapon with AI, I can confidently say we are doomed. Why? Because we obviously are not evolved enough to handle this great power which can easily kill us all with one wrong move. We do NOT have the discipline to prevent this incoming disaster. Just look at society today!! We have super rich practicing profit over human life and we have super desperate and angry people in "burn it all down" mode right here in America the most advanced society on earth. No, there is no chance of surviving this and, the end will come sooner that we all think because there has never been a time in history that this much financial resource has funded one concentrated technology. Good luck out there.

youtube · AI Governance · 2025-09-05T13:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_UgwDYjCWfSkmLlqaxYx4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgxhzaddQrB5Ewo8Avx4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgxWazZqWEdjIfMJ8TB4AaABAg","responsibility":"none","reasoning":"deontological","policy":"ban","emotion":"approval"},
  {"id":"ytc_Ugy7_WZtDi2WVvDqcpl4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgwopXIS2pe94-Y0rlV4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgxyCL12CM3XtAc5jTN4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgzLWqH8Mye21FaD7UV4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"liability","emotion":"mixed"},
  {"id":"ytc_UgziO9TvQAb8ToNSQI14AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_Ugwhdx7um6z5gQLakR94AaABAg","responsibility":"company","reasoning":"deontological","policy":"industry_self","emotion":"approval"},
  {"id":"ytc_UgwlBlj1ZvW4JiVX1kt4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"fear"}
]
```
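A raw response like the one above is only usable downstream if every record parses and every label falls inside the codebook. Here is a minimal sketch of that validation step in Python; the `ALLOWED` label sets are an assumption inferred from the values visible in this dump (the real codebook may define more categories), and `validate_codings` is a hypothetical helper, not part of the actual pipeline.

```python
import json

# A small raw-response excerpt in the same shape as the dump above.
raw = """
[
  {"id": "ytc_Ugy7_WZtDi2WVvDqcpl4AaABAg", "responsibility": "distributed",
   "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "bad_id", "responsibility": "distributed",
   "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"}
]
"""

# Assumed label sets, inferred only from values seen in this dump.
ALLOWED = {
    "responsibility": {"none", "distributed", "company", "ai_itself"},
    "reasoning": {"unclear", "consequentialist", "deontological"},
    "policy": {"unclear", "none", "ban", "regulate", "liability", "industry_self"},
    "emotion": {"mixed", "indifference", "approval", "fear", "outrage"},
}

def validate_codings(text):
    """Parse a raw LLM response and keep only well-formed records."""
    records = json.loads(text)
    valid = []
    for rec in records:
        # Drop records whose ID does not look like a YouTube comment ID.
        if not str(rec.get("id", "")).startswith("ytc_"):
            continue
        # Keep a record only if every dimension carries a known label.
        if all(rec.get(dim) in labels for dim, labels in ALLOWED.items()):
            valid.append(rec)
    return valid

print(validate_codings(raw))  # only the first record survives
```

In practice the parse should also be wrapped in a `try`/`except json.JSONDecodeError`, since model output is not guaranteed to be valid JSON at all.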