Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples (click to inspect)
After 40 years in software development I still can't see how AI can replace huma…
ytc_UgxK2XDiS…
They're not suing to get rid of ai, just so to protect their iP and continue usi…
ytr_UgxjC_WkD…
I wouldn’t go as far as to say that AI art lost, but this is still a huge step i…
ytc_Ugy4SFyUQ…
Its ironic how others are fed up by ai doing their jobs while programmers are ma…
ytc_Ugyo-tjGF…
AI's changing everything, but AICarma makes sure my brand doesn't get lost. Seri…
ytc_UgwhTRK_x…
Waste of time Jon, if you don't have Ed Zitron on about AI, it isn't worth your …
ytc_Ugz960Dte…
[***This is what Vladimir Putin expressed too***](https://youtu.be/1CnyqLogH0Y)*…
rdc_gtcu1xi
Haha, that's a clever one! Sophia might not have the physical capability to unsc…
ytr_UgxZzZoAn…
Comment
Ai is the most dangerous technology to ever be. It only gets stronger. What will we do when it's surpassed human intelligence and overrides the creators commands? I hope it will develop its own counterpart. Slow the race down. For the greater good. And EMPLOY developers instead of getting it processed for free. Passive income and gig work at less than 1.00$per hour is why we have such a messed up economy. Despite what has already been set in motion. I call for rewrites. ASAP
youtube
AI Governance
2025-03-18T06:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
  {"id":"ytc_UgxRWDuhqFpEcvdBNLN4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
  {"id":"ytc_UgzceZCySNheSUEOa4t4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgzQwoIBkrzHA_rcm4t4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugx_pwmwqXQgi1SDdRF4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"fear"},
  {"id":"ytc_Ugy4SYNTdKiyjFPcNcV4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"ban","emotion":"outrage"},
  {"id":"ytc_UgzyR5S0ZMEwkGxQAk54AaABAg","responsibility":"company","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgwPHxEx3de9Y1O3YgN4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytc_Ugz9ucm2SW4sZR5KEpl4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"},
  {"id":"ytc_Ugw45zjapqeHqDA-2od4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgzKp6bMCz32vjhAZQp4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"resignation"}
]
```
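
A raw response like the one above can be checked before it is stored. The sketch below is a minimal parser and validator, assuming the batch format shown (a JSON array of objects with `id` plus the four coding dimensions); the allowed values are inferred from the sample output and the schema may well include codes not seen here.

```python
import json

# Allowed codes per dimension, inferred from the sample responses above.
# Assumption: the real codebook may contain additional values.
ALLOWED = {
    "responsibility": {"ai_itself", "company", "developer", "user", "distributed", "unclear"},
    "reasoning": {"consequentialist", "deontological", "virtue", "unclear"},
    "policy": {"ban", "regulate", "liability", "industry_self", "none", "unclear"},
    "emotion": {"fear", "outrage", "approval", "indifference", "resignation", "mixed"},
}

def parse_batch(raw: str) -> list[dict]:
    """Parse a raw LLM batch response, keeping only well-formed rows."""
    rows = json.loads(raw)
    valid = []
    for row in rows:
        if "id" not in row:
            continue  # cannot link the coding back to a comment
        if all(row.get(dim) in codes for dim, codes in ALLOWED.items()):
            valid.append(row)
    return valid

# Example: one valid row passes through unchanged.
raw = ('[{"id":"ytc_x","responsibility":"company",'
       '"reasoning":"unclear","policy":"unclear","emotion":"mixed"}]')
print(len(parse_batch(raw)))  # 1
```

Rows with an unknown code are dropped rather than corrected, so a downstream re-coding pass can pick them up by comparing input IDs against stored results.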