Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- I Will say there are uses for AI art, but not how its used publicly. a good exam… (`ytc_Ugy5yE4w5…`)
- ai "artists" when they realize they could just pick up a pen and paper instead o… (`ytr_UgwUPVguo…`)
- Grok is better than GPT, I have used both. Amazing how apples trying block Grok… (`ytc_UgwYdVXLA…`)
- "Humanity's" greatest mistake is actually "creating" things like* this robot (w… (`ytc_UgzPElGOn…`)
- It's just like AI is an actual problem. If people have to get sick and injured i… (`ytc_UgwZT9TZQ…`)
- AI cant use your art if it isn't online! So draw, paint, sell at galleries or ar… (`ytc_UgzsBvUVH…`)
- "ULTIMATUM to #Yanukovych from #euromaidan: if president will not resign until 1… (`rdc_cfl6pnz`)
- people should not be teaching AI this. Altho it's cool now. it won't be later as… (`ytc_Ugz5noq88…`)
Comment

> NB: I work in and deploy AI. If you talk about AGI, people give you a voice. Doomsday or glory of AGI invention. Reality is probably somewhere between the two polar opposite viewpoints that drive all the social media conversations. 99% of jobs by 2030 gone. So laborers in preindustrial jobs across the planet who don't have reliable power or water, so have no AI datacenters to take their jobs, who are in micro economies and not part of the G30 economies. Their jobs are going to?
>
> Far more likely AI will enable human productivity like the calculator, like excel, like HPC, like clouds. AI enabled humans will win while likely AI illiterate humans will be struggle. So start using AI for your job.

Source: youtube · AI Governance · 2025-09-07T15:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_Ugwu7NjIjdys934LGLt4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwDGeavmB7W9bKrA8F4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"mixed"},
{"id":"ytc_UgwML5LCFVLq1nz1HYt4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgxuBFQvsQwvjvAjtSB4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgyuMJsncMnIvKb_gE14AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzAcCDMSKjDt2Zer_J4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugx94QOw_i7SAgzuWll4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugwcox7I38SHZ0PfYVt4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgwgNZjRyMWaJ2af3MZ4AaABAg","responsibility":"none","reasoning":"virtue","policy":"regulate","emotion":"approval"},
{"id":"ytc_UgyGRP4sP3l3E7r7q7l4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"outrage"}
]
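The raw response above can be consumed programmatically. As a minimal sketch, the snippet below parses such a JSON array and indexes records by comment ID, checking that each record carries the four dimensions from the table above. The allowed label sets are inferred from this one sample and are assumptions, not an official codebook.

```python
import json

# Label sets inferred from the sample response above (an assumption,
# not the project's full codebook).
DIMENSIONS = {
    "responsibility": {"none", "developer", "ai_itself"},
    "reasoning": {"consequentialist", "deontological", "virtue", "unclear"},
    "policy": {"none", "regulate"},
    "emotion": {"indifference", "mixed", "outrage", "approval",
                "resignation", "fear"},
}

def validate(raw: str) -> dict:
    """Parse a raw LLM response and index valid records by comment ID."""
    records = json.loads(raw)
    coded = {}
    for rec in records:
        for dim, allowed in DIMENSIONS.items():
            if rec.get(dim) not in allowed:
                raise ValueError(
                    f"{rec.get('id')}: unexpected {dim}: {rec.get(dim)!r}")
        coded[rec["id"]] = {d: rec[d] for d in DIMENSIONS}
    return coded

sample = ('[{"id":"ytc_Ugwu7NjIjdys934LGLt4AaABAg",'
          '"responsibility":"none","reasoning":"consequentialist",'
          '"policy":"none","emotion":"indifference"}]')
coded = validate(sample)
print(coded["ytc_Ugwu7NjIjdys934LGLt4AaABAg"]["emotion"])  # indifference
```

A failed lookup or an out-of-vocabulary label raises immediately, which is useful when the model occasionally emits malformed or extra fields.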