Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
(French) The world no longer wants the truth of the false realities, with AI, with fake huma…
ytc_Ugwazgvih…
AI cannot turn against us, because it does not have any understanding of ..Well …
ytc_Ugx_zIuAw…
I want right wingers to (for once) be so damn honest. The left has NEVER "silenc…
ytc_Ugx7CHX7N…
So we have a panel that is either falling asleep (Neil) or making dumb comments …
ytc_UgzM3V786…
I was casually mentioning a conflict where someone mistreated me and deepseek su…
ytc_Ugzzl_KxE…
I always say please & thankyou to my AI's i just can't help being polite…
ytc_UgwOGaeVR…
Google already HAS a better one, they just can't release it as the technology it…
rdc_jcc01xe
Is this real ai? why does it keep stuttering and going "uhh" ai really does that…
ytc_Ugzszo9bf…
Comment
Its great to listen to Professor Stuart Russell, the first ever AI expert on earth. The possibilities and concerns he expressed should be taken seriously by AI tech developers. Humanity always largely ensured tech development are always safe, i have that hope and Progress without safety is surely disastrous and surely we kill ourselves.

youtube · AI Governance · 2026-01-31T11:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | approval |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_UgzAezVIopAX5AVZW_p4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgyeIKovKyjPAudNN0J4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgzxaCYqlJR5ML-tRw54AaABAg","responsibility":"government","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgzuyjIGGXbyO6Lr5Ex4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgxbNWvrlXFfEyZMg8Z4AaABAg","responsibility":"government","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgyWBU_MqLDunqy7dvt4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"resignation"},
{"id":"ytc_Ugytmt7N4vSgli-4wFJ4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgzMMVD-V5IoIoigmLN4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_Ugz730z-lRc23uJ40fF4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
{"id":"ytc_UgxGBvQbI_-494WjPo94AaABAg","responsibility":"user","reasoning":"virtue","policy":"unclear","emotion":"resignation"}
]
```
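A raw response like the one above can be parsed and sanity-checked before the codes are stored. The sketch below is a minimal, hypothetical validator: the allowed value sets are inferred only from the codes visible on this page (the real codebook may define more), and `validate_coding` is not part of the actual pipeline.

```python
import json

# Allowed codes per dimension, inferred from the samples shown above.
# Hypothetical: the real codebook may include additional values.
ALLOWED = {
    "responsibility": {"developer", "government", "user", "ai_itself", "distributed", "none"},
    "reasoning": {"consequentialist", "deontological", "virtue", "unclear"},
    "policy": {"regulate", "unclear", "none"},
    "emotion": {"approval", "fear", "outrage", "indifference", "resignation"},
}

def validate_coding(raw: str) -> list:
    """Parse a raw LLM response and keep only records whose fields are valid codes."""
    records = json.loads(raw)
    valid = []
    for rec in records:
        # Every record must carry a string comment ID.
        if not isinstance(rec.get("id"), str):
            continue
        # Every dimension must hold a value from the allowed set.
        if all(rec.get(dim) in codes for dim, codes in ALLOWED.items()):
            valid.append(rec)
    return valid

raw = ('[{"id":"ytc_x","responsibility":"developer","reasoning":"consequentialist",'
       '"policy":"regulate","emotion":"approval"},'
       '{"id":"ytc_y","responsibility":"robot","reasoning":"unclear",'
       '"policy":"none","emotion":"fear"}]')
print(len(validate_coding(raw)))  # → 1 ("robot" is not a valid responsibility code)
```

Filtering rather than raising keeps one malformed record from discarding a whole batch; rejected records could instead be logged and re-queued for recoding.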