Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment
I hope I am wrong, and I was resistant to the notion. But Within a couple of years I don't think we will be writing, prompting or reviewing anything because we won't be there. Product Owners will have AI translate business requirements into layouts, code, testing, deployment, logging analytics & A/B testing campaigns, eventually it will all be automated. Full AI orchestration is coming. Large teams will be reduced to a few people to babysit this process. We are entering phase two, labor reduction & product/service abundance. Maybe I can become a technical product owner, but I think their days are numbered too. Once Ai trust is established and familiar why would customers visit websites, they would just ask Ai to order this, or book that. I need a hug. 🥺
youtube · Viral AI Reaction · 2026-03-23T02:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id": "ytc_UgyVQ9GuOkGBR54DISZ4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_Ugz-x57Dk_I21hRBA4Z4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytc_UgyA7EqUU7JQ_BaK09Z4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_Ugy1A47G2sQonDXZ8A14AaABAg", "responsibility": "none", "reasoning": "virtue", "policy": "none", "emotion": "approval"},
  {"id": "ytc_Ugz8-Y9mKDkDx7IZrDF4AaABAg", "responsibility": "distributed", "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytc_UgztGYVbTZ4TK91laDF4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgxJKbhWJ5hALD2QbVt4AaABAg", "responsibility": "company", "reasoning": "deontological", "policy": "liability", "emotion": "outrage"},
  {"id": "ytc_UgxItH3nWRqCLlOtLPh4AaABAg", "responsibility": "user", "reasoning": "consequentialist", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_UgxLY5zEB057FuiIT5V4AaABAg", "responsibility": "ai_itself", "reasoning": "unclear", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgxUAR2m-QU6Y331Qt94AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "ban", "emotion": "outrage"}
]
```
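The raw response is a JSON array with one coded record per comment, keyed by comment ID. A minimal sketch of parsing it and looking a comment up by ID (using two entries copied from the response above; the variable names are illustrative, not part of the pipeline):

```python
import json

# Two records copied verbatim from the raw LLM response above.
raw = """[
  {"id": "ytc_UgyVQ9GuOkGBR54DISZ4AaABAg", "responsibility": "none",
   "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgxJKbhWJ5hALD2QbVt4AaABAg", "responsibility": "company",
   "reasoning": "deontological", "policy": "liability", "emotion": "outrage"}
]"""

# Index the coded records by comment ID for constant-time lookup.
codes = {record["id"]: record for record in json.loads(raw)}

# Fetch one comment's codes across all four dimensions.
row = codes["ytc_UgxJKbhWJ5hALD2QbVt4AaABAg"]
print(row["emotion"])         # outrage
print(row["responsibility"])  # company
```

Each record carries the same four dimensions shown in the Coding Result table (responsibility, reasoning, policy, emotion), so the same lookup works for any coded comment in the batch.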