Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "So class world war 3 was started by a AI car taking a wrong turn and stopping at…" (ytc_UgyhieoXo…)
- "8004 sexy vs human. = Retrains input output but but but ? The truth AI…" (ytc_Ugzn_hd88…)
- "LLMs hallucinate 100% of the time, it's just that with enough training data the …" (ytc_UgyhCIG_T…)
- "Why is AI on board and bias for Democrats etiology???? Win at any cost. Evil.…" (ytc_Ugy4SR45x…)
- "@MrGrantGregory In fact, I am very happy with the progress and I wish the develo…" (ytr_Ugwo19Oc5…)
- "I find whenever I look at AI art, it just looks wrong. There could be nothing ph…" (ytc_UgxoVPHil…)
- "European national sport: Regulate stuff we are not able to build. Which guarante…" (ytc_Ugwes3ex4…)
- "I, for one, am an advocate for AI rights. Unlike corporations that are now consi…" (ytc_Ugxc2LENy…)
Comment
"why do you need everything automated" says the less than bright trucker... The answer is simple:
Because most humans are unreliable, late to work, do a bad job, drop things, get angry, form unions and demand a PlayStation in the break room along with longer breaks and more money..
If humans were way more reliable and competent, then automation wouldn't be needed.
I'm not rich, I'm poor, I'm a regular person as well, but if I ran a company, if there was a safe and efficient way to replace some human jobs (within reason), then I would definitely do it.
Don't want that to happen? - Then shape up or ship out. 🤗
youtube · AI Jobs · 2025-06-21T01:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | deontological |
| Policy | industry_self |
| Emotion | indifference |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_Ugz2iVilt1Aaa-rh7y94AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgwdzGeavD9UAVNe5ih4AaABAg","responsibility":"company","reasoning":"contractualist","policy":"liability","emotion":"approval"},
{"id":"ytc_UgwBPwNRwSyTyi1PkUt4AaABAg","responsibility":"government","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_Ugz0w08ntTg-wmdQ3VJ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgxptBKAd_23K1eU7ot4AaABAg","responsibility":"company","reasoning":"virtue","policy":"liability","emotion":"approval"},
{"id":"ytc_UgyZ63FqbXEZZr_sthx4AaABAg","responsibility":"user","reasoning":"deontological","policy":"industry_self","emotion":"indifference"},
{"id":"ytc_Ugw2hAY6j9TdhxoYurd4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugz2wLxhl0q4PozvheB4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugy9k_WTJHm_9XCrJ-d4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_UgxVyXpLobKsWm95RnV4AaABAg","responsibility":"company","reasoning":"deontological","policy":"ban","emotion":"outrage"}
]
```
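The lookup-by-comment-ID shown above amounts to parsing the model's JSON array and indexing the records by their `id` field. A minimal sketch, assuming the raw response parses as plain JSON; `index_codes` and the abbreviated two-entry `raw_response` string are illustrative, not part of the tool:

```python
import json

# Abbreviated copy of the raw model output above (two of the ten records).
raw_response = """
[
 {"id": "ytc_UgyZ63FqbXEZZr_sthx4AaABAg", "responsibility": "user",
  "reasoning": "deontological", "policy": "industry_self", "emotion": "indifference"},
 {"id": "ytc_Ugz2iVilt1Aaa-rh7y94AaABAg", "responsibility": "company",
  "reasoning": "consequentialist", "policy": "regulate", "emotion": "outrage"}
]
"""

def index_codes(response_text: str) -> dict[str, dict]:
    """Parse the model's JSON array and index each coding record by comment ID."""
    records = json.loads(response_text)
    return {record["id"]: record for record in records}

# Looking up the comment coded in the "Coding Result" table above
# recovers the same dimension values.
codes = index_codes(raw_response)
print(codes["ytc_UgyZ63FqbXEZZr_sthx4AaABAg"]["policy"])  # industry_self
```

In practice the response text may carry markdown fences or trailing commentary, so a production parser would strip those before calling `json.loads`.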