Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples

- Ai isn't even intelligent. It's just a program. People just trying to get excite… (`ytc_UgwGBldi4…`)
- I know a designer who has many friends who are visual artists and she dumps midj… (`ytc_UgxaDHE9G…`)
- Well if you do it with your own wallet, and mouth, makes it easier for me to lea… (`ytr_Ugwnnlnd7…`)
- I suck at art, especially humans. I use AI as a tool and to pass time. I WOULD s… (`ytc_Ugy6LX60W…`)
- "OMGEES HIDLER SO EBIL OH EM GEEEEE!" disliked blocked and reported for being an… (`ytc_Ugw5BksAA…`)
- ai "art" or content is such a large problem now that it is justified to accuse p… (`ytc_UgxvW209s…`)
- Yeah no build your AI God as soon as humanly possible where were you headed it's… (`ytc_UgzVRxCB5…`)
- One thing I've always wondered about is where is the line drawn between we human… (`ytc_UgzMRqQYM…`)
Comment
I asked AI where it is on the Gartner hype cycle:
AI technologies overall are in a mixed phase, but generative AI (LLMs, foundation models) is likely in the Trough of Disillusionment, while supporting technologies (engineering, tooling, data infrastructure) are still riding high on inflated expectations.
And what the future holds:
Near-term (2025–2027): Painful correction, skepticism, and regulation.
Mid-term (2027–2029): More stable, industry-focused adoption and real value.
Long-term (2030+): AI is just part of everyday infrastructure, powering new industries.
I guess we will see if it comes true or if it is all just a hallucination.
Source: youtube · AI Responsibility · 2025-09-30T15:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_UgzG8j2YVhTg3NJUMBF4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzM3UId6tuDENXTW7J4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgztKncGvoygx4a_daZ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_UgyspVHKIIbiMnkGxix4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugx0zmLrJnEeFx5pTs54AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"approval"},
{"id":"ytc_UgxD2QSDvqvyCnRML_N4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"outrage"},
{"id":"ytc_UgweNe3I2t4OEG5sO_J4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgxlysMwc2Lu6jVmqrd4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"approval"},
{"id":"ytc_UgyUdDtwJk7jis43e1Z4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_UgyrTzXLDPD26R7NiFh4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"}
]
```
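The raw response is a JSON array with one coding object per comment, keyed by `id` and carrying the four coding dimensions (`responsibility`, `reasoning`, `policy`, `emotion`). A minimal sketch of looking up a record by comment ID, assuming only the field names visible in the response above (the two sample objects are copied from it):

```python
import json

# Raw LLM response: a JSON array with one coding object per comment.
raw_response = '''
[
  {"id": "ytc_UgzG8j2YVhTg3NJUMBF4AaABAg", "responsibility": "company",
   "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgzM3UId6tuDENXTW7J4AaABAg", "responsibility": "none",
   "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"}
]
'''

def lookup_coding(raw: str, comment_id: str):
    """Return the coding dict for a given comment ID, or None if absent."""
    codings = {row["id"]: row for row in json.loads(raw)}
    return codings.get(comment_id)

coding = lookup_coding(raw_response, "ytc_UgzM3UId6tuDENXTW7J4AaABAg")
print(coding["policy"], coding["emotion"])  # regulate fear
```

Indexing the array into a dict once makes repeated ID lookups constant-time, which matters when the same batch response backs many inspector queries.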