Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples:

- `ytc_Ugzr8Z4OD…`: "Oh yeah Geoffrey....how bout telling everyone watching you how we are filled wit…"
- `ytr_UgyOPf4vo…`: "It doesn't have to be, check out Tom Waits v Frito-Lay. The problem is AI genera…"
- `ytr_UgzMbU0_2…`: "If that's the case, then there is literally no other group who should be dominat…"
- `ytc_UgyFn_ZK7…`: "11:01 the ai button would be more useful as an ear reference than a button refer…"
- `rdc_ljok2wj`: "Not really, for them the rules are slightly different, also absurd. So that the …"
- `ytr_UgyqkoHZl…`: "Over 1000 GPUs are needed to run a basic AI. It isn't that practical and compani…"
- `ytr_UgwCOPu7F…`: "AI is not bounded by national borders. That’s what people who try to rely on gov…"
- `ytc_UgwL5gOJO…`: "I was looking for a lawyer and called a bunch of law offices. Maybe 10% of the c…"
Comment
As someone who's debating between Christianity, this would be my argument on that first question as the atheist.
You said God allowed evil because he didn't want robots, but I dont feel like a robot with laws. A wife can still love her husband with laws, but laws can still be broken. All I'm wondering is why God didn't create something that would stop evil, I'm not even asking that God made it impossible for petty crimes I'm just wondering why God didn't stop the worst evil, like murder, genocide, wars, rape, tourcher, child abuse, natural disasters, disease, poverty ext: and I'm not saying that God kill or punish someone for eternity, but why didn't God create a better species that, the thought wouldn't even cross their minds, why not a better world where natural disasters or disease wouldn't happen
Source: youtube · Posted: 2024-11-06T22:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | mixed |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id": "ytc_UgyFhPGd_UO5RPsAu8N4AaABAg", "responsibility": "developer", "reasoning": "deontological", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgzzE1FRcUzwkxCvlPN4AaABAg", "responsibility": "ai_itself", "reasoning": "mixed", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgysRnMIgH-Fnbx7FMN4AaABAg", "responsibility": "developer", "reasoning": "unclear", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_Ugxz_K72QObuVM8QLmN4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_UgxHFqdhTzW7bneSPd14AaABAg", "responsibility": "ai_itself", "reasoning": "unclear", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_UgzAsyfSKld3pW2EBY54AaABAg", "responsibility": "developer", "reasoning": "deontological", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_UgwFQy5NOwRfmSsdfiZ4AaABAg", "responsibility": "developer", "reasoning": "deontological", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_Ugw_oAgfbX_dbZjkgqB4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_Ugyhoj2KqOc5katB6gx4AaABAg", "responsibility": "user", "reasoning": "deontological", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_UgzfJKm22HRL9vTXhiR4AaABAg", "responsibility": "user", "reasoning": "mixed", "policy": "none", "emotion": "resignation"}
]
```
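The raw response is a JSON array of per-comment codings. A minimal sketch of turning one of these responses into a lookup table by comment ID, assuming the field names shown above (`id`, `responsibility`, `reasoning`, `policy`, `emotion`) and a hypothetical `parse_codings` helper; adjust if the actual schema differs:

```python
import json

# Coding dimensions assumed from the example response above.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def parse_codings(raw: str) -> dict[str, dict[str, str]]:
    """Index a raw JSON-array model response by comment ID, keeping only
    the expected coding dimensions and skipping malformed entries."""
    out: dict[str, dict[str, str]] = {}
    for entry in json.loads(raw):
        if not isinstance(entry, dict) or "id" not in entry:
            continue  # skip anything that is not a well-formed coding object
        # Missing dimensions fall back to "unclear" rather than raising.
        out[entry["id"]] = {d: entry.get(d, "unclear") for d in DIMENSIONS}
    return out

# Usage with a single illustrative (made-up) coding:
raw = '[{"id":"ytc_abc","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"outrage"}]'
codings = parse_codings(raw)
print(codings["ytc_abc"]["emotion"])  # outrage
```

Indexing by ID is what makes the "look up by comment ID" view cheap: each inspection is a single dictionary access rather than a scan of the raw response.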