Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples (click to inspect):
- "I've used AI to help myself before, but essentially, I used it to help with idea…" (ytc_Ugx0eU-IP…)
- "Wouldn't we always be able to just pull the plug? Considering AI models are stil…" (ytc_UgzGPT-bE…)
- "I'm all for this. the more jobs we replace, the closer we are to finally realizi…" (ytc_UghmR29ej…)
- "How is AI the problem? The fact that he couldnt open up to the parents speaks lo…" (ytc_UgypN64Ih…)
- "If you saw the share CORRUPTION in some industries (like the agricultural and fo…" (ytc_UgzxU3JBA…)
- "this reminds me of the cyanide and happiness episode with painbot. what if you b…" (ytc_UgjduTmNn…)
- "If you truck drivers want to keep your jobs, you need to stick together. Here'…" (ytc_UgzUu_Qej…)
- "A lot of this won't show up as clean as he's promising. I don't mean to be cynic…" (ytc_UgztY1U7t…)
Comment
"You can't generate man when you search for doctor"
Let's take that one, as it is a good example: the medical field is very close to a 50/50 split. As Linus said, if it generates more than one image, it can take that into account and generate x men and x women, fine. If it is a single image, whichever gender it picks can be considered biased (assuming a 50/50 split), at which point you can just refine the prompt if you want a specific gender instead of trying to "gotcha" the AI.
In turn, if we are talking about a gender-dominated profession, e.g. oil rig worker or pre-school teacher, and the AI generated a female oil rig worker and a male pre-school teacher without being prompted for gender, 99% of the time it would not be what the person prompting it was looking for.
youtube
2024-02-29T00:0…
♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
[
{"id":"ytc_UgzBwOiT10CFBQE1hd14AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwDSFxWLSsEQNNDpD94AaABAg","responsibility":"company","reasoning":"mixed","policy":"none","emotion":"outrage"},
{"id":"ytc_UgwL-LexZyj-9tZLUDF4AaABAg","responsibility":"company","reasoning":"deontological","policy":"industry_self","emotion":"outrage"},
{"id":"ytc_UgzfAYAmD2hz7FZ-6-t4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_UgxntBwLmJWCN5I7w5x4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_UgzHA1sPnGyLndslwrV4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgypPNDn61nBUJUBZm14AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzLH4jS5VK47jmepX14AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
{"id":"ytc_Ugz8lov1GFNEZUpHzv94AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgylRxw6df1sii0EJ694AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"ban","emotion":"outrage"}
]
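The raw response is a JSON array of per-comment codes, one object per comment with the four dimensions shown in the table above. A minimal sketch of how such a response could be parsed and validated before counting (the allowed values below are inferred only from the codes visible on this page, not from a published codebook, and the real schema may define more categories):

```python
import json
from collections import Counter

# Allowed values per dimension, inferred from the codes visible on this page.
# Assumption: the actual codebook may include additional categories.
SCHEMA = {
    "responsibility": {"none", "company", "user", "ai_itself"},
    "reasoning": {"consequentialist", "deontological", "mixed"},
    "policy": {"none", "industry_self", "liability", "regulate", "ban"},
    "emotion": {"indifference", "outrage", "fear"},
}

def parse_codes(raw: str) -> list[dict]:
    """Parse a raw LLM response, keeping only entries whose values are in-schema."""
    entries = json.loads(raw)
    return [
        entry for entry in entries
        if all(entry.get(dim) in allowed for dim, allowed in SCHEMA.items())
    ]

# Hypothetical two-entry response; the second entry has an out-of-schema value.
raw = """[
 {"id":"ytc_UgzBwOiT10CFBQE1hd14AaABAg","responsibility":"none",
  "reasoning":"consequentialist","policy":"none","emotion":"indifference"},
 {"id":"ytc_example_bad","responsibility":"robots",
  "reasoning":"mixed","policy":"none","emotion":"outrage"}
]"""

codes = parse_codes(raw)
print(len(codes))                            # the malformed entry is dropped
print(Counter(c["emotion"] for c in codes))  # tallies per dimension
```

Dropping out-of-schema entries (rather than repairing them) makes it easy to flag comments that need re-coding.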