Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up a comment by its ID, or inspect one of the random samples below.
- "You want to fix AI bias by adding YOUR bias to the data set. This is dumb.…" (`ytc_Ugz0UAEkL…`)
- "> That's the other thing, human communication into these systems / Which is an…" (`rdc_f1eqdrv`)
- "Sue the police, sue the hotel and sue the fukking AI BS company for millions…" (`ytc_UgzyBHG4y…`)
- "He is wrong on Bitcoin being a limited resource. If at any given time the miner…" (`ytc_UgwJTexEQ…`)
- "I've been writing scripts and building systems that automate jobs for over a dec…" (`ytr_UgyzW8whB…`)
- "UBI is stupid. Just have people as mindless government controlled drones? No, ha…" (`ytr_UgxMXB1--…`)
- "if its just a machine its a imaginary love like n imaginary …" (`ytc_Ugy-69w3D…`)
- "1. We have no idea how this will go. Just look at the constant failures of self …" (`ytr_UgzjkaKbV…`)
Comment
Oh, absolutely.
The worst part? They don’t even see the problem. They’re not learning, they’re outsourcing their own brains. If this keeps up, we’re looking at a future where they know how to ask AI questions but have no idea how to think for themselves.
reddit · AI Jobs · 1738457182.0 (Unix timestamp) · ♥ 216
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | fear |
| Coded at | 2026-04-25T08:33:43.502452 |
Raw LLM Response
```json
[
  {"id":"rdc_magzg44","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"rdc_mah7ttz","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"fear"},
  {"id":"rdc_maikn74","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"rdc_maiyilb","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"rdc_mahxrgu","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"}
]
```