Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- "Hyperbole and exaggeration. The only real risk in 24 months is that the AI bubbl…" (`ytc_Ugw9bcbk8…`)
- "So what is the difference between a dumb iron bomb and an unattached intelligent…" (`ytc_UggsIQHlA…`)
- "Except that most of the negatives of consciousness are caused my biological func…" (`ytc_Ugx0sQCDw…`)
- "@harrisjm62 i agree but this isn’t really the time for that. these companies are…" (`ytr_Ugwave2k6…`)
- "The two ways I've used AI at work are: 1. Building code in other programming l…" (`ytc_UgySicemK…`)
- "America is turning into a shitshow with Trump. Time to start thinking about a mo…" (`rdc_fnxo4yx`)
- "I really hate obfuscating language that talks around pinning responsibility for …" (`rdc_l4z15ej`)
- "When ever a person claims they can easily beat an ai show them this video lol…" (`ytc_UgyrB9K5h…`)
Comment

> If it was up to me, I would put a electric axle on a dolly and move road trains. AI would be much safer and cheaper on an axle rather than on a whole semi truck. Even better if these can maneuver by themselves.
> There will be a time for autonomous logistics but AI driving is not there yet, mostly due the unpredictable nature of AI.

youtube · AI Jobs · 2025-05-29T14:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | industry_self |
| Emotion | approval |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_Ugxo9njJyJF29J_qQtZ4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugx49Wm5OtQ4hXe3nCJ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_Ugx3uN2WtS19X898eC94AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugw6NkyPBFCvo1E6j2B4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_UgztMmfOF4uuq_7rH5x4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"},
{"id":"ytc_Ugz5aPlRvj-Hi6Ghxa14AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"indifference"},
{"id":"ytc_UgwwmibFBHBE9I2T7Gp4AaABAg","responsibility":"distributed","reasoning":"deontological","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgzsvuIerVnEAfQ4_Yl4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_UgwTXrR5c5-18aziaAN4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_UgwswrgyaLXSPCYXMxt4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"}
]
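The coding result shown above is recovered from the raw response by matching on comment ID: the model returns a JSON array of per-comment records, which can be indexed by `id` for lookup. A minimal sketch of that step (not the dashboard's actual code; the single record here is the `ytc_UgztMmfOF…` entry from the response above):

```python
import json

# Raw LLM response: a JSON array of coded-comment records.
# One record from the response above, shown for illustration.
raw_response = """
[
  {"id": "ytc_UgztMmfOF4uuq_7rH5x4AaABAg",
   "responsibility": "developer",
   "reasoning": "consequentialist",
   "policy": "industry_self",
   "emotion": "approval"}
]
"""

# Parse the array and build an id -> record index for lookup.
records = json.loads(raw_response)
by_id = {rec["id"]: rec for rec in records}

# Look up the coding result for a specific comment ID.
code = by_id["ytc_UgztMmfOF4uuq_7rH5x4AaABAg"]
print(code["policy"])  # prints: industry_self
```

Each dimension in the "Coding Result" table (responsibility, reasoning, policy, emotion) is then just a field on the matched record.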