Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- "Why don't we do this instead? Let's not even work for these greedy companies & w…" — ytc_UgzytRKRI…
- "Geniuses: "A.I. will eliminate the human species in the next 20 to 50 years." Th…" — ytc_UgzWNWiMi…
- "The power of the few are you. AI will eliminate profit and Gain concepts. Peopl…" — ytc_UgxbSpB5x…
- "utter nonsense, LLMs ar a major disappointment, we are decades away from AGI, if…" — ytc_UgyOWv5nC…
- "Ingroups always dismiss outgroup conflict. I doubt she's likely to be targeted b…" — ytr_UgxWbU_JC…
- "Yeah we're maybe 2000 years away from what he's suggesting. But there has to be …" — ytc_UgyHh_qq7…
- "What you mean is you'll be doing the same jobs once they realise actually ai is …" — ytc_UgxLPKoOD…
- "Future jobs will begin with training people on how to train humans to coexist an…" — ytc_UgxIWHsf6…
Comment (youtube, 2026-02-09T06:4…)

> Well first of all, AI gets exponentially better. Very soon we won't have to correct it very often, if at all.
>
> Second of all, I think they're nerfing the AI that's available to the public. There's much stronger AI in existence probably only available to the government and military.
>
> I don't know if it can become human or not. Who's to say we don't accidentally break space time and accidentally make something sentient? We don't have the technology to do that, but since AI gets smarter at alarming rates, it may invent a way to make its brain so similar to ours, that it literally becomes sentient
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | unclear |
| Policy | none |
| Emotion | fear |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
{"id":"ytr_UgzMblre9EH3ua9HTbt4AaABAg.9lfPLq0UQfh9lfScW-Vuq6","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytr_UgzMblre9EH3ua9HTbt4AaABAg.9lfPLq0UQfh9lfVYRaFk2t","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytr_Ugxd3sBZjVnuQXNwCLl4AaABAg.ADPDRJLbX7XANBTsFOS5pY","responsibility":"company","reasoning":"virtue","policy":"liability","emotion":"outrage"},
{"id":"ytr_UgzsI8KkltxpytpgMNh4AaABAg.ADPDNdoTOLqADPDrRIDwQ3","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"approval"},
{"id":"ytr_UgwwFt1KB1YorAnl-el4AaABAg.AUbFB8RAGPiAW0RM17SGJj","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytr_Ugzlflh926lHdUt1MMh4AaABAg.ATfqPUgOA8KATz1pu72xow","responsibility":"distributed","reasoning":"virtue","policy":"none","emotion":"resignation"},
{"id":"ytr_UgwyJ2IN1lkb3qsojy14AaABAg.AT_kU3jrR-uATcXPoXadrw","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytr_UgzsCWrmS5EAYHUB7Zp4AaABAg.ASzdRBc1DM_AT-GXd6o74S","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"fear"},
{"id":"ytr_UgzsCWrmS5EAYHUB7Zp4AaABAg.ASzdRBc1DM_AT-HTWETKx_","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytr_UgwOO46qv2J-I4DQhMJ4AaABAg.ASyMZym56FvASyyeWJRAdr","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"}
]
```
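The raw LLM response is a JSON array with one record per comment, carrying the four coding dimensions shown in the table above (responsibility, reasoning, policy, emotion). A minimal sketch of how such a response could be parsed and indexed by comment ID — the `index_codes` helper and the sample IDs below are hypothetical illustrations, not part of the tool:

```python
import json

# The four coding dimensions, matching the Coding Result table above.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

# Hypothetical sample response in the same shape as the raw output above.
RAW = """[
  {"id": "ytc_example1", "responsibility": "none",
   "reasoning": "unclear", "policy": "none", "emotion": "fear"},
  {"id": "ytc_example2", "responsibility": "company",
   "reasoning": "deontological", "policy": "regulate", "emotion": "outrage"}
]"""

def index_codes(raw: str) -> dict:
    """Map comment ID -> {dimension: value}, skipping malformed records."""
    out = {}
    for rec in json.loads(raw):
        # Keep only records that carry an ID and every coding dimension.
        if all(key in rec for key in ("id", *DIMENSIONS)):
            out[rec["id"]] = {d: rec[d] for d in DIMENSIONS}
    return out

codes = index_codes(RAW)
print(codes["ytc_example2"]["emotion"])  # outrage
```

Indexing by ID makes the "look up by comment ID" view above a single dictionary access, and dropping records that are missing a dimension guards against partially malformed model output.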