Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- "One thing I don’t quite get, is why is this an issue but inspiration is not. If …" (ytc_Ugx3dZP--…)
- "My first thought as well, the response in Wuhan was swift^1 and effective, so a …" (rdc_g9t4n2b)
- "You... do realise she didn't mean copying 1:1 right? She explained the differenc…" (ytr_UgyOH4POu…)
- "Idgaf if I'm wrong for saying this, NO AI model is better at drawing than any ar…" (ytc_Ugzhhe0fc…)
- "I love how in the FAQs, the third thing is "What are hallucinations?" And it goe…" (rdc_mcwvgjz)
- "Cops need to learn this ai shit so it doesn't keep happening. Hes not qualified …" (ytc_UgwxQTRJm…)
- "I think if we all came together to teach AI "artists" and AI "artist" supporters…" (ytc_Ugx23Fn2-…)
- "I love your voice / And yeah ai art and human art can never be compared ever / Art…" (ytc_UgxwQi8nK…)
Comment
I wonder what Prof. (?) Geoffrey Hinton thinks of AI in healthcare applications — assuming AI finds a cure for cancer, would he still rue his initial foundational work on AI/neural networks (assuming of course humanity would not have invented AI in the absence of Prof. Hinton's foundational work, at least not as early as the present time). Assuming AI finds a cure for cancer would he approve of AI (with all its concomitant ills) or does he believe that the AI-invented cure for cancer is not worth it given AI's potential for all the ills that he believes it can inflict?
youtube
AI Jobs
2025-11-02T15:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | mixed |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_Ugwh8oOnG80G5bNiobF4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgwyF-igfNDK7Mq7YPh4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugw67Bb-2zcrAKOS6Dx4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwdZQ0IXSe-WdQ8lI54AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugw5f7A_gIMHeH_AwXx4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgxsHuCsr7EzxBMBN-l4AaABAg","responsibility":"none","reasoning":"unclear","policy":"industry_self","emotion":"approval"},
{"id":"ytc_UgzeeTmODU-sY9dEgLV4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_Ugw1MpgNH8UsoeBr0vN4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgxmawnJ1pnuXcWZCzN4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugwcxgo-irvT4gTaBZR4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"}
]
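The coding result shown above comes from parsing the raw JSON array the model returns. A minimal validation sketch in Python is below; the `ALLOWED` value sets and the `validate_codes` helper are hypothetical, inferred only from the labels visible in this dump, and the real codebook may define additional values.

```python
import json

# Allowed values per dimension, inferred from the labels visible in this dump.
# These sets are an assumption, not the authoritative codebook.
ALLOWED = {
    "responsibility": {"developer", "government", "distributed", "ai_itself", "none"},
    "reasoning": {"consequentialist", "unclear"},
    "policy": {"liability", "regulate", "industry_self", "unclear", "none"},
    "emotion": {"outrage", "indifference", "mixed", "fear", "approval"},
}

def validate_codes(raw: str) -> list[dict]:
    """Parse a raw model response and keep only well-formed records.

    A record is kept when it has a string "id" and every coded
    dimension holds one of the allowed values above.
    """
    records = json.loads(raw)
    valid = []
    for rec in records:
        if not isinstance(rec.get("id"), str):
            continue  # drop records without a usable comment ID
        if all(rec.get(dim) in vals for dim, vals in ALLOWED.items()):
            valid.append(rec)
    return valid
```

Filtering like this before display means a malformed or hallucinated record simply drops out of the results table instead of rendering an unknown label.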