Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "How can a chatbot feel anything, like love? This is crazy! It is A1, but how did…" (ytc_UgxDUUCH_…)
- "Not every doctor is dr house. It makes sense that an enhanced google search (ai)…" (ytc_UgzRo9bWl…)
- "I saw that Asmondgold clip before, and when you listen to it IN FULL, he has a p…" (ytc_Ugwg16SMu…)
- "2:43 GABRIEL FROM ULTRAKILL MENTIONEEEED????????????? OOOO YEAAAA BABBYYYYYYYYYY…" (ytc_UgxlA4NEh…)
- "So obviously money is the motivating force in speeding AI along…so you play it o…" (ytc_UgxNePVM1…)
- "How many fucking AI companies do we need, and what does a shoe company think it …" (rdc_ogpro14)
- "LLMs are a word blender. They spit out bad stuff because humans put a bunch of b…" (ytc_UgzIwYd1A…)
- "@thlightest9827 all robots are connected to a central server via a local WIFi n…" (ytr_Ugwgi9K9z…)
Comment
I mean, it's not so much creating something new as it is collaging hundreds of thousands of other peoples' ideas together based on human-generated tags. That's a big difference from how human beings use their own ideas and discernment when taking inspiration from other works.
There's also just the fact that there isn't enough data to train these things to replace humans. That's why AI firms are now flirting with Habsburg AI, basically using generative AI to make training data for generative AI, which seems like it'll just comically exacerbate AI's current "hallucination" problem.
youtube · AI Responsibility · 2024-06-29T07:0… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | mixed |
| Policy | industry_self |
| Emotion | indifference |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_UgyruCTGrPO7H-1OXrN4AaABAg","responsibility":"company","reasoning":"contractualist","policy":"liability","emotion":"approval"},
{"id":"ytc_Ugwt18sLdGaTHIMmbqJ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugy8Cbah6Sn3aWd7HzZ4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgwO9SN4Q8SV5bAHIgV4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgyoHtJF14xLGVaN43p4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"industry_self","emotion":"indifference"},
{"id":"ytc_UgwsOR8LWmuhqeZnkKp4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"},
{"id":"ytc_UgzwJ_iG3HmWQEqvnAx4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"resignation"},
{"id":"ytc_UgxTk6EDbEaxHQ4g9NZ4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugw32fz-O-DTYwia3SV4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_Ugw9kXZe2aWJXuDzTZ14AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"unclear","emotion":"mixed"}
]
```