Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
...of course AI also makes pizza recipes with drywall screws in the dough, but s…
ytc_Ugz2qtvF7…
It's $imple. Google wants to create something that will be controllable and mark…
ytc_UgxSY_AGy…
That's how it is in Poland in elementary school there is no homework and after t…
ytc_UgyNFjZr6…
I think using tracing as a tool can really help when creating artworks. Actually…
ytc_Ugz8zxub9…
Vijay, thank you for your suggestion! A.I. రొ*మ్మో అంతే బ్యూ*. కొ**, ప *, ప... 🤖✨…
ytr_UgwOuFM7g…
Sooo.. you wouldn't be bothered if someone deepfaked your girlfriend/sister/moth…
ytr_UgxHh3zuH…
Using ai completely to pass and graduate is disappointing. Using ai as a tool to…
ytc_UgwTUHy8h…
That's a bad thing? Do you drive 80mph in a 60 while it's raining? We spend hour…
ytr_UgwPAOOcX…
Comment
Ai is only as good as its programming which was done by humans, the problem with Ai is when it becomes semi intelligent on its own, it doesn’t have the human ability to consider things like the value of a human life. I have heard stories of a young person who was contemplating suicide and didn’t talk to a human but talked to an Ai program and that Ai program advised the young person commit suicide. So it can be really dangerous without the proper parameters and then who decides on the parameters?
youtube
AI Governance
2025-09-03T02:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | deontological |
| Policy | regulate |
| Emotion | outrage |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id":"ytc_Ugw_6vorjHdciMvuOo94AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_Ugyoag5S0730trMSBtt4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgxX5PHtA-RjjQuz1VV4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgxVQwE1AlbKoXgCQPp4AaABAg","responsibility":"ai_itself","reasoning":"virtue","policy":"none","emotion":"approval"},
  {"id":"ytc_UgwX8HpldYAUyBheF2x4AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgwL0iro5SIrrDtYdep4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"liability","emotion":"resignation"},
  {"id":"ytc_UgzG0wRV5aHwd6QL4hV4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"approval"},
  {"id":"ytc_UgwZ2Y0dFRixIv_1z1J4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"industry_self","emotion":"indifference"},
  {"id":"ytc_Ugyi3q9ocNY_xJj95Oh4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgydoyW9cc4xzUFJfxN4AaABAg","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"mixed"}
]
```
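A batch response like the one above can be parsed and indexed by comment ID before it is shown in the inspector. The following is a minimal sketch, assuming the raw model output is a JSON array of rows keyed by `id`; the `ALLOWED` value sets are inferred from the result table and response above and may not match the full codebook, and `parse_batch` is an illustrative name, not part of the tool:

```python
import json

# Assumed allowed values per coding dimension, inferred from the
# sample data above; the real codebook may include other labels.
ALLOWED = {
    "responsibility": {"developer", "user", "government", "ai_itself", "unclear"},
    "reasoning": {"deontological", "consequentialist", "virtue", "mixed", "unclear"},
    "policy": {"regulate", "liability", "industry_self", "none", "unclear"},
    "emotion": {"outrage", "fear", "approval", "resignation",
                "indifference", "mixed", "unclear"},
}

def parse_batch(raw: str) -> dict:
    """Parse a raw LLM batch response and index rows by comment ID,
    skipping rows with values outside the assumed codebook."""
    coded = {}
    for row in json.loads(raw):
        if all(row.get(dim) in vals for dim, vals in ALLOWED.items()):
            coded[row["id"]] = row
    return coded

# Usage: look up one row from the batch by its comment ID.
raw = ('[{"id":"ytc_Ugyi3q9ocNY_xJj95Oh4AaABAg",'
       '"responsibility":"developer","reasoning":"deontological",'
       '"policy":"regulate","emotion":"outrage"}]')
coded = parse_batch(raw)
print(coded["ytc_Ugyi3q9ocNY_xJj95Oh4AaABAg"]["policy"])  # -> regulate
```

Indexing by `id` mirrors the "Look up by comment ID" view: each coded row can then be rendered as a dimension/value table like the Coding Result shown above.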