Raw LLM Responses
Inspect the exact model output for any coded comment, or look a response up by comment ID.
Random samples — click to inspect
- AI is about to learn a whole bunch of gamer words. And have "colorful" opin… (`rdc_l4fdysr`)
- Ngl imma debate with the Atheist AI. Its easy, the answer is Humans. We are the … (`ytc_Ugw84e2Mp…`)
- These robots r going 2b so perved on lol no wonder they'll rise up & annihilate … (`ytc_UgwySj5Vc…`)
- This is very interesting, but not surprising. Our biology and so our history and… (`ytc_Ugx_4tGPU…`)
- 100 different frameworks, crazy utility platforms, yadda yadda. Modern web dev s… (`ytc_UgypgW4mq…`)
- Trump just signed a bill for AI to be incorporated into education!! An entire ge… (`ytc_UgzuEwR5Y…`)
- 4:44 what is the point of getting a robot to do something through torture when y… (`ytc_UgyR5qQOv…`)
- This isn't true. Capitalism does not require products to become cheaper. If it b… (`rdc_l4qfg5a`)
Comment
I feel like NaNoWriMo just got too big; as a bunch of random people basically daring each other to write 50K words during the month of November, it had no need for forums that needed moderation, or a Board, or official statements, or a policy on AI, or sponsors, or anything like that.
Having all that other stuff did enable NaNoWriMo as an Entity to do some good in the world, but also inevitably led to this kind of mess. I feel like they got big without acquiring some of the important requirements of being big, like a skilled (and paid!) person to handle public PR. Such a person could just have said "we have no policy on AI or anything else; create your 50K words however you want, not our business", and I imagine no one would have batted an eye.
Source: youtube · Posted: 2024-09-10T20:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | mixed |
| Policy | none |
| Emotion | resignation |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_Ugz2Be6K6KNee9vbk-B4AaABAg","responsibility":"ai_itself","reasoning":"virtue","policy":"none","emotion":"outrage"},
  {"id":"ytc_Ugw3JHU8990ei5sdv5h4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_Ugxnxn9FhRu9zcANZvR4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgxAOg-_ropi_LSb2Q14AaABAg","responsibility":"company","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_Ugxh9cIHtAlq57aiSoZ4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgzU0mlzCKf2pUYJaN54AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgyjyQvyIa2mGeQvBEx4AaABAg","responsibility":"ai_itself","reasoning":"virtue","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgxMX2qsIxvH7lX-LUx4AaABAg","responsibility":"company","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_Ugzsor_OmEEJZ6qAwmZ4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_UgxFpqnyt_Ik-mHVfAd4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"mixed"}
]
```
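The raw response is a JSON array of per-comment codes, so "look up by comment ID" reduces to parsing the array and indexing it on the `id` field. A minimal sketch in Python, using two rows taken verbatim from the response above (variable names like `codes_by_id` are illustrative, not part of any actual tooling):

```python
import json

# Two rows copied from the raw LLM response shown above; the real
# response is a longer array with the same shape.
raw_response = """
[
  {"id": "ytc_Ugxnxn9FhRu9zcANZvR4AaABAg", "responsibility": "distributed",
   "reasoning": "mixed", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_UgzU0mlzCKf2pUYJaN54AaABAg", "responsibility": "company",
   "reasoning": "consequentialist", "policy": "liability", "emotion": "outrage"}
]
"""

# Index the coded rows by comment ID for constant-time lookup.
codes_by_id = {row["id"]: row for row in json.loads(raw_response)}

# Look up the coding for one comment.
code = codes_by_id["ytc_Ugxnxn9FhRu9zcANZvR4AaABAg"]
print(code["responsibility"], code["emotion"])  # distributed resignation
```

The dictionary comprehension assumes IDs are unique within a response; a duplicate ID would silently keep only the last row.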