Raw LLM Responses
Inspect the exact model output for any coded comment, or look up a comment by its ID.
Random samples:

- Why we expect AI to behave better than humans? At least quarter of humanity had … (ytc_UgyiaFx-r…)
- Ai not only takes away 100% of your privacy but can take away your freedoms. Co… (ytc_Ugz3IlQ5u…)
- If everyone starts using AI to replace most jobs, where will the AI computing re… (ytc_UgzM4MXTb…)
- Calling people who use AI to 'draw' is like calling someone a chef who has emplo… (ytc_Ugy8VX-QB…)
- Here's my more realistic approach. Yes art can be generated, yes it can resemble… (ytc_Ugw8hwnM7…)
- Everyone is thinking the wrong way. Every job that exists was created. By a busi… (ytc_UgxSq4K4F…)
- im mainly on the side of "it looks like shit", i kid you not i have such a repul… (ytc_UgzA67uHo…)
- Ghibli's style was never perfect and smooth. It's even rough at points. Because … (ytc_UgzLcCSmq…)
Comment
Don't worry too much, folks. Replit thought they could replace software programmers and found out the hard way when their AI went rogue, deleting all of their production data and creating 4,000 fake data profiles. As someone who understands these things on an internal level, I promise they will soon realize that humans cannot be replaced because anything we build will inevitably have errors. Also, fuk those drivers that are in those trucks, let's protest and create a union, but ppl are too greedy and dumb to do it.
youtube · AI Jobs · 2025-07-27T13:2… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | liability |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_UgzLkWK22Po5lbkmEKl4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
  {"id":"ytc_Ugx5ieB3Y3hLjHxAmyp4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgzCQSlS9Ar9hWslHn14AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
  {"id":"ytc_Ugz5mp0MGEEVF0Lhl-14AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytc_Ugx6zCc5LRA6nMbvY4Z4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"},
  {"id":"ytc_UgztgT_3Bo81Bt10LYF4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_Ugydasvz5a3WSAthJfV4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
  {"id":"ytc_Ugwco83afL79xp-K51V4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_Ugz5a92tLyAdFmRjoIB4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgwG8PiiVK2Wm1MOnQB4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"}
]
```
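A raw batch response like the one above can be parsed and sanity-checked before the records are stored. The sketch below is a minimal illustration, not the pipeline's actual code; the allowed label sets are inferred from the values visible on this page and the real codebook may contain additional categories.

```python
import json

# Label sets inferred from the responses shown above (assumption: the
# full codebook may define more categories per dimension).
ALLOWED = {
    "responsibility": {"company", "government", "developer", "ai_itself", "none"},
    "reasoning": {"consequentialist", "deontological"},
    "policy": {"regulate", "ban", "liability", "industry_self", "none"},
    "emotion": {"approval", "outrage", "fear", "resignation"},
}

def validate_batch(raw: str) -> dict:
    """Parse a raw LLM response and index valid records by comment ID."""
    records = json.loads(raw)
    coded = {}
    for rec in records:
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{rec.get('id')}: unexpected {dim}: {rec.get(dim)!r}")
        coded[rec["id"]] = rec
    return coded

raw = (
    '[{"id":"ytc_Ugz5mp0MGEEVF0Lhl-14AaABAg",'
    '"responsibility":"developer","reasoning":"consequentialist",'
    '"policy":"liability","emotion":"fear"}]'
)
coded = validate_batch(raw)
```

Indexing by ID makes the "look up by comment ID" view a dictionary access, and the validation step catches malformed or off-codebook labels before they reach the results table.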