Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- ytc_Ugy8Ivs2P…: "All it takes is like 100 nukes to literqlly destroy the earth So ai aint more da…"
- ytc_Ugx8TxN4C…: "Last month I asked to copilot to implement linked list and it responded with the…"
- ytc_UgxHfczd4…: "I don't know the difference between prompting an image generator and comissionin…"
- ytc_UgwaooWkf…: "imagine that somewhere an AI system completely overcame is biological creators a…"
- ytc_Ugz4aQzpT…: "AI is probably better at running the world. It definitely would be better and mo…"
- ytc_Ugw3cgm-X…: "Holy peak video. I've argued with lots of AI haters with a pro-AI stance, aware …"
- ytc_Ugz8zq1Ls…: "Just remember, Steve Jobs, the inventor of the iPhone would not allow his childr…"
- ytc_Ugzo_eRpx…: "no, they werent getting destroyed. its just the internet, words on the internet …"
Comment

> The very purpose of A.I. is to line the pockets of a few billionaire CEO's. A.I. will only be profitable as long as society as we know it can exist. Once we reach the point where that is no longer the case, we will become essential cows farting in a field. Nothing essentially. A.I. will destroy the environment, society, and Harvard itself. Idiocracy wasn't meant to be a documentary of the future. It turned out to be just one step along the way to the apocalypse.

youtube · AI Jobs · 2025-10-06T16:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | ban |
| Emotion | outrage |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_UgwjXFvVn_0OYr481yZ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugys1TK_Ri_-SsgIh5V4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugx3_9eY1DbJbtlel-l4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgyJthb4kj2ztM1Eqy94AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_UgxnhFukRU2P3fiRVxh4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgxB_GFtG8E5S3eujiJ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugxl87w0UxG0VK4ygKN4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
  {"id":"ytc_Ugz1uVW2Yuc7hYGDymx4AaABAg","responsibility":"company","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgyAWrvFC2f_8E2Zdtd4AaABAg","responsibility":"investor","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgxjhDNQo4dXzkUB9lF4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"}
]
```
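The raw response is a JSON array with one record per comment ID, each carrying the four coding dimensions from the table above. A small validator can reject malformed or off-codebook records before they enter the dataset. This is a minimal sketch: the allowed-value sets below are inferred only from the samples shown on this page and may be incomplete relative to the full codebook.

```python
import json

# Allowed values per dimension, inferred from the sample batch above;
# the real codebook may define additional categories (assumption).
CODEBOOK = {
    "responsibility": {"none", "distributed", "company", "investor", "ai_itself"},
    "reasoning": {"unclear", "consequentialist", "deontological"},
    "policy": {"none", "regulate", "liability", "ban"},
    "emotion": {"approval", "fear", "outrage", "indifference"},
}

def validate_batch(raw: str) -> list[dict]:
    """Parse a raw LLM response and verify every record against the codebook."""
    records = json.loads(raw)
    for rec in records:
        # Comment IDs on this page all carry the ytc_ prefix.
        if not rec.get("id", "").startswith("ytc_"):
            raise ValueError(f"bad comment id: {rec.get('id')!r}")
        for dim, allowed in CODEBOOK.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{rec['id']}: {dim}={rec.get(dim)!r} not in codebook")
    return records

# One well-formed record taken from the batch above.
raw = ('[{"id":"ytc_Ugxl87w0UxG0VK4ygKN4AaABAg","responsibility":"company",'
       '"reasoning":"consequentialist","policy":"ban","emotion":"outrage"}]')
print(validate_batch(raw)[0]["policy"])  # prints "ban"
```

Validating eagerly like this surfaces hallucinated category labels at ingest time, when the offending comment ID is still easy to re-code, rather than later during analysis.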