Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- ytc_Ugw0OHtVN… — "There is a huge difference between the Human Brain and AI. Even AI like DeepMind…"
- ytc_UgzTu8x8d… — "Colleges should be shaking in their boots. No jobs because of AI means a lot few…"
- ytc_UgxOE_epl… — "I typically loathe the ACLU. However, in this particular instance, I would argue…"
- ytc_UgwIIpdaP… — "AI learning is no different then what she did. Should she pay royalties on her w…"
- ytc_UgxKu8FPB… — "The first mistake with autopilot, was calling it autopilot. I know it sounds coo…"
- ytc_Ugx81XeRK… — "Talks about Algorithms, but not once did he mention the man to whom the word \"Al…"
- ytr_UgxBG1byk… — "@owo2610 my bad. ChatGPT 4, not 4o, you're correct. However, there's a lot of ex…"
- ytr_UgxlF4Q9K… — "For anyone - Everyone is talking about polar bears right now because AI uses a t…"
Comment
Palantir is already killing us. Their main purpose is to destabilize governments and control populations. ENTIRE populations. Just look at the ridiculous anti-American legislation and the clown show in the White House we have — that’s not pure stupidity — it’s _designed_ to make us think it is — but it’s really AI destabilizing the country enough to distract us from what’s happening in the background — COVID, for example — whether you think it’s Chinese made or not — was allowed to proliferate America — by the same person destabilizing the country now. They don’t care about the loss of human life — they care about controlling human life and making sure the populations don’t rise up against this all.
youtube · AI Governance · 2025-07-04T03:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | outrage |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
{"id":"ytc_Ugwj4LxmCA-k9ZHHvJV4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgyuWKE7mgR-BO31eet4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgxHYIH3d46O2KWQDcp4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgxXWQ0F07IyY2JcBzp4AaABAg","responsibility":"distributed","reasoning":"virtue","policy":"regulate","emotion":"mixed"},
{"id":"ytc_UgzAAk1NfSe5A2bRiqt4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"approval"},
{"id":"ytc_Ugw_OgitN4NW-UTkmSl4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgxVSuyyKbU3NP9WsT14AaABAg","responsibility":"developer","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_Ugwvj4tNPl9uSJoFObZ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_Ugx5JVIdIFkIermh7XB4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"industry_self","emotion":"indifference"},
{"id":"ytc_UgwbpKGqwtEUtIWSvR14AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"}
]
```
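The look-up-by-comment-ID view above can be sketched as follows. This is a minimal illustration, not the tool's actual implementation: the helper name `index_coded_rows`, the expected key set, and the two-row sample payload are assumptions based only on the response format shown above.

```python
import json

# Sample raw LLM batch response in the format shown above (two rows,
# copied from the array; the full response has one row per comment).
RAW_RESPONSE = """
[
 {"id":"ytc_Ugwj4LxmCA-k9ZHHvJV4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
 {"id":"ytc_UgyuWKE7mgR-BO31eet4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"fear"}
]
"""

# The four coding dimensions plus the comment ID, per the table above.
EXPECTED_KEYS = {"id", "responsibility", "reasoning", "policy", "emotion"}

def index_coded_rows(raw: str) -> dict:
    """Parse the JSON array and return {comment_id: row}, skipping malformed rows."""
    rows = json.loads(raw)
    return {
        row["id"]: row
        for row in rows
        if isinstance(row, dict) and EXPECTED_KEYS <= row.keys()
    }

by_id = index_coded_rows(RAW_RESPONSE)
print(by_id["ytc_Ugwj4LxmCA-k9ZHHvJV4AaABAg"]["policy"])  # regulate
```

Indexing by `id` makes each look-up constant-time, and filtering on the expected key set guards against partially formed rows in a raw model response.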