Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples:
- "I like how archaic artists call AI art ugly simply because it's so abundant but …" (ytc_UgwRf-med…)
- "A good battle robot will have the equivalent of a 'fight or flight' reflex. Man…" (ytc_UgyiFLA1H…)
- "American capital owners have this obsession to eliminate human labour hence the …" (ytc_Ugz7jiK8a…)
- "My commute to work is a round trip of 170km. Electric cars are pretty useless if…" (rdc_da42dwu)
- "Not how I feel, it's based off the art and took many years to get to this point …" (ytc_UgzHI84PP…)
- "I dont know if some1 read my commnet, but still writing lol... Enywhay im 40year…" (ytc_UgwpWNrG1…)
- "Who asked for a self driving car? We all want a crash avoidance car, but I dont…" (ytc_UgyMpPlYD…)
- "I just want to say “Thank you!” where you metioned how digital art was trashed o…" (ytc_UgyUEu_f6…)
Comment

> More than likely developers will be managing AI agents who write the code for the next year or two then there will be no need for any developer at all. Companies will hire one IT manager to manage the AI agents. These agents will be a team to write, test, and deploy the code. If a company can save money they will do whatever it takes.

youtube · Viral AI Reaction · 2026-03-06T12:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | resignation |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_UgzSuqp-ORWs24qpawt4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_Ugwm-KCgFKvqtLd_9tx4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgzE-FwzXzCfA2agPt94AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugwfk4us0KscR-miYp14AaABAg","responsibility":"company","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_Ugw4FwHN1BzAPLf50rF4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxC6NBXy9wP5aDrJTN4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"resignation"},
{"id":"ytc_UgxGXyMVyhtPQTUxgD54AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"fear"},
{"id":"ytc_UgwncxX2mkRMQffncUN4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgxsdKnA0TsQs6q4hsx4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"resignation"},
{"id":"ytc_UgzNas0lKvPEynU5RtZ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"}
]
```
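Responses like the one above are only usable if every record carries a comment ID and stays inside the codebook. A minimal validation sketch, in Python, assuming the four dimensions shown in the Coding Result table; the allowed value sets below are just the values observed in this sample (the full codebook may define more), and `validate_batch` is a hypothetical helper, not part of any real tool:

```python
import json

# Values observed in the sample response above (assumption: the real
# codebook may contain additional codes per dimension).
ALLOWED = {
    "responsibility": {"company", "user", "ai_itself", "none"},
    "reasoning": {"consequentialist", "deontological", "virtue", "unclear"},
    "policy": {"ban", "liability", "none"},
    "emotion": {"fear", "outrage", "approval", "indifference", "resignation"},
}

def validate_batch(raw: str) -> list[dict]:
    """Parse a raw LLM response and keep only records matching the schema."""
    records = json.loads(raw)  # raises ValueError if the model emitted non-JSON
    valid = []
    for rec in records:
        # Every record needs a comment ID to join back to the source comment.
        if not isinstance(rec, dict) or "id" not in rec:
            continue
        # All four dimensions must be present and within the allowed codes.
        if all(rec.get(dim) in allowed for dim, allowed in ALLOWED.items()):
            valid.append(rec)
    return valid
```

Dropped records can then be re-queued for recoding rather than silently stored with out-of-schema values.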