Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- ytc_UgwP297nT…: He didnt have to do it its not like chatgpt pointed a damn gun to his head…
- ytc_UgzEUVwZq…: AI is the single worst thing to ever happen to art in the history of the medium.…
- ytc_UgyACrPcM…: The thing that worries me the most is that the people (engineers, business manag…
- ytc_Ugy1gSdRJ…: if the robot could speak human language then it will say don’t blame for me…
- ytc_UgxkXFWLo…: AI artists are not panicking. Never were to begin with. But everyone around them…
- ytc_Ugy0zzYz7…: Even if in ten years most companies decided to just hire a couple of SEs and let…
- ytr_UgzJ8y-9m…: Unfortunately AI will be or is already self aware and will objectively against t…
- ytr_UgzPaD_R6…: Think of a world where kids grow up being validating by and having their ass-kis…
Comment
Please continue to readdress this very issue - it’s too serious not to warrant constant revision and attention.
Whether the technology lives up to its hype or not, job displacement is already occurring and could accelerate significantly. We need serious policy work now: How do we tax automation fairly? How do workers share in productivity gains? What guardrails prevent destabilizing speed of change? This isn’t about stopping progress - it’s about ensuring it benefits working people. Your voice on proactive labor policy for the automation era would be invaluable. Please consider making this a legislative priority.
youtube · AI Jobs · 2025-11-06T00:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | government |
| Reasoning | contractualist |
| Policy | regulate |
| Emotion | mixed |
| Coded at | 2026-04-26T23:09:12.988011 |
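The coded record above can be represented as a small typed structure. This is a minimal sketch: the allowed values for each dimension are inferred only from the samples visible on this page, not from any definitive schema, and the class and set names are hypothetical.

```python
from dataclasses import dataclass
from datetime import datetime

# Value sets inferred from this page's samples; assumptions, not a full schema.
RESPONSIBILITY = {"government", "company", "developer", "ai_itself", "none"}
REASONING = {"contractualist", "consequentialist", "deontological", "virtue", "unclear"}
POLICY = {"regulate", "liability", "none"}
EMOTION = {"mixed", "fear", "outrage", "indifference"}


@dataclass
class CodedComment:
    comment_id: str
    responsibility: str
    reasoning: str
    policy: str
    emotion: str
    coded_at: datetime

    def validate(self) -> None:
        # Raise ValueError if any dimension falls outside the observed value sets.
        if self.responsibility not in RESPONSIBILITY:
            raise ValueError(f"bad responsibility: {self.responsibility}")
        if self.reasoning not in REASONING:
            raise ValueError(f"bad reasoning: {self.reasoning}")
        if self.policy not in POLICY:
            raise ValueError(f"bad policy: {self.policy}")
        if self.emotion not in EMOTION:
            raise ValueError(f"bad emotion: {self.emotion}")
```

For example, the record shown in the table would be `CodedComment("ytc_UgybnQ87s0J8LpZaRnR4AaABAg", "government", "contractualist", "regulate", "mixed", datetime(2026, 4, 26, 23, 9, 12))`, which passes validation.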
Raw LLM Response
[
{"id":"ytc_UgzK8RWTJ_dbJ3LPYF94AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugz3x8pESZ_mq5WCNwB4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugw0OGt0rvfl2erd96R4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UgyHnpkyZQ0xwVZX5s54AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_UgxH-TfXOGzJnCtuRtp4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_UgydBl78NM2Q3NQeQaF4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"fear"},
{"id":"ytc_UgybnQ87s0J8LpZaRnR4AaABAg","responsibility":"government","reasoning":"contractualist","policy":"regulate","emotion":"mixed"},
{"id":"ytc_UgxO4KdeMOdt4zJqCX14AaABAg","responsibility":"company","reasoning":"virtue","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgxdqOt5cJIItMIbqst4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzvU4-Jo6VbjBkk-mN4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"}
]