Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples

- "AI should never be taken as correct over a state issued ID. This is nuts.…" (ytc_Ugxw98Z5A…)
- "Not a fair fight 😂 that robot is built ford tough not a piece of human flesh in …" (ytc_Ugwdflznn…)
- "Each shareholder is currently training a personal AI that is directing their dec…" (ytc_UgxuQsw0_…)
- "What's hilarious is the idea that Disney is spending $350,000,000 on films and t…" (ytc_Ugx_EQhA9…)
- "It's not 600 million on specifically abortions. Any service that provides aborti…" (rdc_dcwmppn)
- "I think progress with AI is going to be exponential. We probably got a few decad…" (ytc_UgyXJ7SJl…)
- "Ai may be inevitable, but real art will always be forever. Real art will never …" (ytc_UgyuxWmE_…)
- "The only AI artists I would call an artist is one that trained their own model w…" (ytc_UgyVXyTU2…)
Comment

> I can see the pitfall they fell into. Copilot has allowed me with no python experience to put together a simple program that does everything I need for a single use case with verifiable output. I do not trust this code to do anything but this one thing or get mixed in with any other code… but apparently the leaders of these companies do. They need to learn how to use a tool and where a tool has limits.

youtube · AI Jobs · 2026-02-10T14:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | virtue |
| Policy | unclear |
| Emotion | mixed |
| Coded at | 2026-04-26T23:09:12.988011 |
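A coded record like the one above can be sanity-checked against the label sets this page actually uses. Below is a minimal sketch, assuming the allowed values are exactly those observed in the raw LLM response further down; the `ALLOWED` dict and `validate` helper are illustrative, not part of the coding tool itself.

```python
# Hypothetical validator: the value sets below are the ones observed in the
# raw LLM response on this page, not an exhaustive schema.
ALLOWED = {
    "responsibility": {"company", "developer", "ai_itself", "none", "distributed"},
    "reasoning": {"consequentialist", "deontological", "virtue", "unclear"},
    "policy": {"none", "unclear", "liability", "industry_self", "ban", "regulate"},
    "emotion": {"resignation", "outrage", "mixed", "indifference", "approval", "fear"},
}

def validate(record: dict) -> list:
    """Return the dimensions whose value falls outside the observed set."""
    return [dim for dim, ok in ALLOWED.items() if record.get(dim) not in ok]

# The record from the "Coding Result" table above passes cleanly.
record = {"responsibility": "company", "reasoning": "virtue",
          "policy": "unclear", "emotion": "mixed"}
print(validate(record))  # → []
```

An unknown value (say, a misspelled label from a bad model run) would show up in the returned list, making malformed rows easy to flag before analysis.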
Raw LLM Response
```json
[
{"id":"ytc_UgxncSomFDtuIo0-weV4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgyH6oQDU2Ozipx71UJ4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"unclear","emotion":"outrage"},
{"id":"ytc_Ugwh6_v21TXRFwCB-y94AaABAg","responsibility":"company","reasoning":"virtue","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgzXJI6itbYhB6plLvd4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxX7y0qJE2DncfTjxd4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgxMgCYwYJjtzws5vhx4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgwRY9yOdd8PMBn6VB14AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgxlOUZizhSSCfnMm1J4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"},
{"id":"ytc_UgzjkYxP5Y_kD0Rmxp94AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"ban","emotion":"mixed"},
{"id":"ytc_Ugymhxyhy7jyF2kCTnZ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]
```
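The "look up by comment ID" step can be sketched in a few lines: parse the raw batch response and index each record by its `id` field. This is a minimal illustration using two records from the response above; the `by_id` variable name is an assumption, not something the tool exposes.

```python
import json

# Two records copied from the raw LLM response shown above.
raw = '''[
{"id":"ytc_Ugwh6_v21TXRFwCB-y94AaABAg","responsibility":"company","reasoning":"virtue","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgxMgCYwYJjtzws5vhx4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"}
]'''

# Build a lookup table keyed by comment ID, so any coded comment can be
# inspected directly without scanning the whole array.
by_id = {rec["id"]: rec for rec in json.loads(raw)}
print(by_id["ytc_Ugwh6_v21TXRFwCB-y94AaABAg"]["emotion"])  # → mixed
```

A dict keyed by ID makes each lookup O(1), which matters once a run contains thousands of coded comments rather than the ten shown here.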