Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
Just get a real person. Don’t use a theft tool.
All the art here was miles bett…
ytr_UgyK8ulO5…
It's not a robot.. it's a human who injected a configuration into the search eng…
rdc_dsmgrdl
This is… false, a person will not replace themselves the US government is based …
ytc_Ugz7U2Llb…
You did so much better than AI! I’m not really good at drawing, but ur drawing i…
ytc_UgwJK7WCQ…
You don’t need AI to kill off people, just put a dangerous closed box on a table…
ytc_UgzCdmzLa…
No, but it feels like an ad for the service he writes.
He delivers a premise, pr…
ytr_UgytUIT9Z…
Really? People think that AI won't make most of the job obsolete. Autonomous c…
ytc_UgyUmHgpF…
AI tech bros aren't trying to save people's time with AI image generators.
I don…
ytc_UgwFXpclx…
Comment
Not everything on the internet is factual and most of it is opinionated or fake anyways, why tf would anyone ever think that this would be a great way to teach an AI about our society? I’m sorry but wtf was going through their heads, the internet is almost completely disgusting and unregulated, so are you trying to make your AI as disgusting and unregulated as the internet? So dumb, you’d think it would be obvious to not teach anyone solely using the internet, but here we are 🙄
Just spoon feed information, like how teachers do it, so that you can actually regulate what they’re learning, it would take longer but you’ll have a way more or completely unbiased AI afterwards.
youtube
AI Bias
2022-12-22T17:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | virtue |
| Policy | regulate |
| Emotion | outrage |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
[
{"id":"ytc_UgzBgq84hAC8XJ7aTu14AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxgBQKRuRRbF03YVhp4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugx1YaQHzD0C140Mh954AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugwit7ej6L59J5fai894AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugzlgmnh92fWHNcNhMZ4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgwTeFzi5k0mdJj7mVB4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"mixed"},
{"id":"ytc_UgyINQKr_hQoIEj9dXF4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"liability","emotion":"fear"},
{"id":"ytc_UgwRCA44vT7-1Ma5C-l4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgzcxzPiFBVymj2CU6t4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugz9UmD0BLjhJFNqD1p4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"}
]
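A minimal sketch of how a raw response like the one above can be parsed and looked up by comment ID. The field names (`id`, `responsibility`, `reasoning`, `policy`, `emotion`) come from the JSON shown; the two-entry array here is abridged for illustration, and the validation logic is an assumption, not the tool's actual implementation.

```python
import json

# Abridged raw model output, copied from the response above (two entries only).
raw_response = """
[
  {"id": "ytc_UgwRCA44vT7-1Ma5C-l4AaABAg", "responsibility": "developer",
   "reasoning": "virtue", "policy": "regulate", "emotion": "outrage"},
  {"id": "ytc_Ugz9UmD0BLjhJFNqD1p4AaABAg", "responsibility": "none",
   "reasoning": "consequentialist", "policy": "none", "emotion": "approval"}
]
"""

# Keys every coding row is expected to carry (inferred from the response above).
EXPECTED_KEYS = {"id", "responsibility", "reasoning", "policy", "emotion"}

def index_by_id(raw: str) -> dict:
    """Parse the model's JSON array and index codings by comment ID,
    silently skipping rows missing any expected key."""
    rows = json.loads(raw)
    return {row["id"]: row for row in rows if EXPECTED_KEYS <= row.keys()}

codings = index_by_id(raw_response)
print(codings["ytc_UgwRCA44vT7-1Ma5C-l4AaABAg"]["policy"])  # regulate
```

Indexing by `id` makes the "Look up by comment ID" operation a single dictionary access rather than a scan over the array.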