Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- Yeah i think we're WAYYYY off from that. And even then that robot is as uncreati… — ytr_Ugz-0sAaq…
- I find AI like plastic replacing cotton and wool. Industrial food replacing real… — ytc_UgwLBAPdl…
- Until the resource drain that AI is currently doing causes it to self destruct. … — ytr_UgwRp0EZi…
- I'm a artist, using ai. I agree. It could be some kind of katholo, spaghetti mon… — ytc_UgzHEOH8P…
- I believe that ai generated art should only be for a.) personal use or b.) humor… — ytc_UgxAyCcNi…
- Let's do an actual criticism of "AI"... AI or Artificial Intelligence is not ju… — ytc_Ugw1khlyI…
- Driverless trucks is the easiest thing to boycott , there's no one in the truck!… — ytc_Ugz4Khi5a…
- No, I disagree. Just because it was public and free *does not make it yours* *… — ytr_UgzkAuC5d…
Comment
I think fundamentally a better version of what Google was trying to do should be the way they go forwards. It just needs more training to know when a prompt was specific enough to override the added diversity weights. So when you search “doctor” it should output a wide range of options but when you search “male Japanese doctor” it gives you that. And it also needs to be aware of extra context, for example if you ask it for a current US soldier it should output both male and female versions but ask it for a WW2 US soldier and it should only be male unless you specify female. If you care about the subject’s ethnicity or gender or whatever then specify it and the AI should follow that guidance. If you don’t specify it then what exactly is there to be mad about that it didn’t output a white male? This is far more a failure in execution than a failure in premise.
youtube · 2024-02-28T17:5… · ♥ 14
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | approval |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id": "ytc_UgxQT84I9TEhX9fNJmZ4AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgxGlAs6eDhGd_KoC9V4AaABAg", "responsibility": "none", "reasoning": "deontological", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_Ugzpk9UeXHdKK61KP7F4AaABAg", "responsibility": "company", "reasoning": "deontological", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_Ugx0xnvYRFdJLI8RIYN4AaABAg", "responsibility": "developer", "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytc_UgzL8-WJrH9gxDnMaPR4AaABAg", "responsibility": "ai_itself", "reasoning": "mixed", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_UgwchABnkzTL6ywOd554AaABAg", "responsibility": "user", "reasoning": "consequentialist", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_UgwpGAr4iqyCy7VDmWl4AaABAg", "responsibility": "company", "reasoning": "deontological", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_UgxPXejnZosmwaCSB5l4AaABAg", "responsibility": "user", "reasoning": "consequentialist", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgwJxCooqE4Dx4DQOmB4AaABAg", "responsibility": "company", "reasoning": "deontological", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_Ugx6LDNfnx15krw3JKR4AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "regulate", "emotion": "approval"}
]
```
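The lookup-by-comment-ID flow shown on this page can be sketched as follows. This is a minimal illustration, not the tool's actual implementation: `index_by_id` is a hypothetical helper, and the raw response is abbreviated to two of the ten entries above.

```python
import json

# Abbreviated copy of the raw LLM response shown above (first and last entries).
RAW_RESPONSE = """
[
  {"id": "ytc_UgxQT84I9TEhX9fNJmZ4AaABAg", "responsibility": "company",
   "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_Ugx6LDNfnx15krw3JKR4AaABAg", "responsibility": "company",
   "reasoning": "consequentialist", "policy": "regulate", "emotion": "approval"}
]
"""

def index_by_id(raw: str) -> dict:
    """Parse the model's JSON array and index each coding row by its comment ID."""
    return {row["id"]: row for row in json.loads(raw)}

codings = index_by_id(RAW_RESPONSE)
coding = codings["ytc_Ugx6LDNfnx15krw3JKR4AaABAg"]
print(coding["policy"], coding["emotion"])  # regulate approval
```

In a real pipeline the same index would let you jump from a coding-result table back to the exact model output that produced it, which is what "Look up by comment ID" does here.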