Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- To play a bit of devils advocate, in response to "the vast majority of people wa… (ytc_UgywmgCp3…)
- The saving grace, so I think, is also the unpredictably of mankind. AI is built … (ytc_UgwHpLhG6…)
- Tech will save military lives but when there's millions of autonomous vehicles h… (ytc_UgwsxrBne…)
- This! AI is really applied mathematics. Fuck the coders you need those math gee… (rdc_k8t36yz)
- Right or wrong, the Ai Art you create is your own property and and i cant see wh… (ytc_UgzPnOOzj…)
- I wonder how ai bros would feel if people started stealing their code in the sam… (ytc_Ugy1vAnWl…)
- I hear your conversations and I understand the threats but why does nobody EVER … (ytc_UgxicsVkj…)
- This is the central tension of building in 2026. we’ve moved from the blank page… (rdc_ohscfs4)
Comment
So much for the moratorium on AI. I’m about to have my first kid and I am actually nervous of this world he is about to enter. If I trusted people to care for others, AI would be a blessing, but inevitably as history goes the greedy psychos will leverage tech for themselves without regard for the masses. Honestly can’t believe we’re not easing into this thing. It will fundamentally change civilization beyond anything since we began 10k years ago, meanwhile the tech and business community doesn’t seem to care much past the next dollar. Along with .gov, wanna be tyrants are licking their lips right now now while the rest of us have to just sit and watch it unfold. Humans are too smart for their own good.
youtube · AI Governance · 2025-09-07T06:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | virtue |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_UgyJXuuz0ztFVESToHB4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_UgyMkAR_6iiRde0dafB4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugz219IR8buATg-TOMN4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgxpZ-wUFlVBxH724RJ4AaABAg","responsibility":"company","reasoning":"virtue","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugx0G-jUP5OPnctHWUp4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"liability","emotion":"mixed"},
{"id":"ytc_UgwFDNFF3HC5xxHnw9V4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugy6SAoCU5EnnJXPbbl4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgwNGGJUIb2ABPnoNSh4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgznlOnMI4F6_Hb5QeR4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugy92OLbQTAbjypnI6p4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"none","emotion":"approval"}
]
```
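The lookup from a comment ID to its coding result (as shown in the Coding Result table) can be sketched as below. This is a minimal illustration, not the tool's actual implementation: the variable names and the idea of parsing the raw response from a string are assumptions; the two sample records are taken verbatim from the raw LLM response above.

```python
import json

# Raw LLM response: a JSON array of per-comment codings, one object per comment.
# Two records copied from the response above, for illustration.
raw_response = '''
[
  {"id": "ytc_UgxpZ-wUFlVBxH724RJ4AaABAg", "responsibility": "company",
   "reasoning": "virtue", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_Ugx0G-jUP5OPnctHWUp4AaABAg", "responsibility": "government",
   "reasoning": "consequentialist", "policy": "liability", "emotion": "mixed"}
]
'''

# Index the codings by comment ID so any coded comment can be looked up directly.
codings = {row["id"]: row for row in json.loads(raw_response)}

result = codings["ytc_UgxpZ-wUFlVBxH724RJ4AaABAg"]
print(result["responsibility"], result["policy"])  # prints: company regulate
```

Indexing by `id` mirrors the "Look up by comment ID" workflow: the batch response is coded once, then each comment's dimensions (responsibility, reasoning, policy, emotion) are retrieved in constant time.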