Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- I think AI could be used for referance images, but it isn't art because there is… (ytr_UgxCPAACc…)
- The devs 'rolling' their eyes at this point in the AI cycle are just behind the … (ytc_UgxVZ7Qxd…)
- It's crazy hearing xqc telling artist to be a plumber when he's literally the la… (ytc_UgybynIyS…)
- @juliagisella ai art always has its flaws yes it can look very good but it has n… (ytc_UgxalYk22…)
- We should be more afraid of humans than of AI. It wasn't AI that created more th… (ytc_Ugwe9vBs2…)
- Total shii / I would rather call a human customer service than an ai / What if the… (ytc_UgzKN3O1k…)
- "Why are you raising an alarm?" Because Dario is the only AI juggernaut CEO tha… (ytc_UgxmXMrvW…)
- "Ai art is the same as digital art" Me who spent 9 hours on a Digital art that… (ytc_UgwFpT_lp…)
Comment
@fen3311 No, the issue isn't when it decides. The issue is when what it's been designed to do DOES conflict with our self-interests. The issue behind AI isn't AI itself. It's the approximate, ballpark thinking of the humans that design it. As artificial intelligence becomes more complex and gains generalized utility, our slightest biases and mental shortcuts that we used when developing it will become more apparent and pronounced. We're playing with a monkey's paw, so our intentions for AI and the way we design it need to be perfectly aligned, without any human error.
youtube
AI Governance
2024-02-19T05:5…
♥ 7
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | virtue |
| Policy | none |
| Emotion | fear |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
{"id":"ytr_UgzoiqacwDnTsirvXtl4AaABAg.9zb5LdYWvsR9zmWQZlgGtJ","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytr_UgzoiqacwDnTsirvXtl4AaABAg.9zb5LdYWvsR9zobHOTJRd6","responsibility":"ai_itself","reasoning":"virtue","policy":"none","emotion":"outrage"},
{"id":"ytr_Ugxqa0xJTu6jeoZc2Vp4AaABAg.9zax21AhBMW9zb0XwiL8YM","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytr_UgxO1r6O8j6UlDPOd_F4AaABAg.9zamwSx9CB2A-UHouISpuH","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytr_UgxO1r6O8j6UlDPOd_F4AaABAg.9zamwSx9CB2A-zeYdA34G6","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"fear"},
{"id":"ytr_UgxO1r6O8j6UlDPOd_F4AaABAg.9zamwSx9CB2A0BTnkQyGBq","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytr_UgxO1r6O8j6UlDPOd_F4AaABAg.9zamwSx9CB2A0EIUq1Gy0b","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytr_UgxO1r6O8j6UlDPOd_F4AaABAg.9zamwSx9CB2A0EPCVssybp","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
{"id":"ytr_UgyAn9UMXeFG0p5fasN4AaABAg.9zaeaIkd0ji9zaxX0297JL","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytr_UgzwoaXz9oqsdxViHWF4AaABAg.9zaPkUsCS8F9zaxm-ga8gV","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"}
]
```
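The raw response is a JSON array of per-comment codes, one object per comment ID, with one value for each coding dimension. A minimal sketch of how a pipeline might parse and validate such a batch before storing it (the allowed category sets below are inferred from the visible samples and the "Coding Result" table, and are an assumption, not the full codebook):

```python
import json

# Allowed values per dimension, inferred from the samples shown above.
# Hypothetical: the actual codebook may define additional categories.
SCHEMA = {
    "responsibility": {"developer", "user", "government", "ai_itself", "none"},
    "reasoning": {"virtue", "consequentialist", "none"},
    "policy": {"regulate", "none"},
    "emotion": {"fear", "outrage", "approval", "indifference", "none"},
}

def parse_raw_response(raw: str) -> list[dict]:
    """Parse a raw LLM batch response and keep only records whose
    dimension values all fall inside the expected schema."""
    records = json.loads(raw)
    valid = []
    for rec in records:
        in_schema = all(
            rec.get(dim) in allowed for dim, allowed in SCHEMA.items()
        )
        # Comment IDs in this dataset start with "ytc_"/"ytr_" prefixes.
        if in_schema and rec.get("id", "").startswith("yt"):
            valid.append(rec)
    return valid
```

Records that fail validation (e.g. a hallucinated category label) can then be flagged for re-coding rather than silently written to the database.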