Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment
Whats took 20,000 years for humans would most likely take them 10 years. They don't need to eat or sleep, they don't have to worry about financial problems, health problems, or any of the other things that has slowed down or stopped innovation or a genius from continuing to grow. And the biggest implication is they don't have to worry about death. Ai will eventually be a run away train.
youtube · AI Governance · 2025-06-28T13:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | fear |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
  {"id":"ytc_UgzaIMIHReEDMImKywl4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
  {"id":"ytc_UgxEHMeks5sr2FIgBhl4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_Ugxao9NwHv-iNsnnlr54AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgwGXRiF1_7tZOnexs14AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_Ugz-FSleuiO0i5ih5uV4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgxqXbkevsi6Dtas9-d4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_Ugwhxvnmg7mXiXfrV0l4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_Ugw1SLU2BSm531PgowF4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"unclear","emotion":"resignation"},
  {"id":"ytc_UgxDp0uqlJbhTHnzp554AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"fear"},
  {"id":"ytc_UgyGNZhOxJN4kYmxC3V4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"}
]
```
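A batch response like the one above can be parsed and sanity-checked before the values are stored. The sketch below is a minimal example, assuming the category sets shown in the records here are the full codebook (the real codebook may define additional values); `validate_records` and `ALLOWED` are hypothetical names, not part of any pipeline shown on this page.

```python
import json

# Assumed codebook: allowed values inferred from the records displayed above.
ALLOWED = {
    "responsibility": {"ai_itself", "developer", "distributed", "unclear"},
    "reasoning": {"consequentialist", "deontological", "mixed", "unclear"},
    "policy": {"ban", "liability", "unclear"},
    "emotion": {"fear", "mixed", "indifference", "resignation"},
}

def validate_records(raw: str) -> list[dict]:
    """Parse a raw LLM response and keep only records whose values
    all fall inside the assumed codebook categories."""
    records = json.loads(raw)
    return [
        rec for rec in records
        if all(rec.get(dim) in values for dim, values in ALLOWED.items())
    ]

# Usage with a single (hypothetical) record:
raw = ('[{"id":"ytc_example","responsibility":"ai_itself",'
       '"reasoning":"consequentialist","policy":"ban","emotion":"fear"}]')
print(len(validate_records(raw)))  # 1
```

Records with an out-of-codebook or missing value are silently dropped here; a production pipeline would more likely log them for re-coding instead.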