Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
so if no truck drivers over the road, how will chains be put on or taken off for…
ytc_UghzAg5h2…
Couldn’t agree more, also we need to be realistic, AI is trained on data, if 2 y…
ytr_Ugyy0sarh…
I feel like you may be missing a key part of the AI debate, that it's not just t…
ytc_Ugwu80lwT…
Well, I agree. It's just another tool, like nuclear weapons. But we all know wha…
ytc_UgyYz43cu…
I would be careful producing anything on chatgpt. The user agreement is brutal w…
ytc_UgyZaRW7t…
Really, you're ranting against windmills?
As turbines age, they can be repowered…
ytr_UgwbfvML2…
>To be fair
Not really.
Although first world countries have a larger deman…
rdc_eudmcrr
Good presentation and here are some further thoughts on Consciousness, AI, and o…
ytc_Ugx-qal0V…
Comment
People love control, so this is not going to happen anytime soon.
You can read about Anthropic’s Project Vend — even so-called “AI” still makes a lot of stupid decisions. It’s not real intelligence; it’s just a large language model.
We do not have true AI today.
And the economy isn’t only about digital things. Digital doesn’t mean “real life.” In reality, no one can clearly say where all of this is going or where we’ll be in five years.
Yes, LLM-based agents can replace a lot of jobs. But humans will always be needed to supervise and take responsibility for critical decisions.
LLMs won’t replace humans in many professions — not now, and likely not ever.
youtube
Viral AI Reaction
2026-01-02T10:5…
♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | resignation |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_UgxM8nJd2_KBRyYnpk94AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugzu5YXNBWNoXSi1xL14AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"fear"},
{"id":"ytc_UgxlNOXLRN6ZNnlFKoh4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugzvdnjw6FISt3OlaF94AaABAg","responsibility":"government","reasoning":"mixed","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugz_CxxGyBbbHz5jI5l4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"mixed"},
{"id":"ytc_UgxcWqw2z7W0dsOiYSF4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugw8nVTHvmBh1ND83UR4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_UgyY3qXAJpgbgY3iBel4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgzZzEpOBhycsBWjksh4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgwA_oyZ2Cz8ssIG3o94AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]
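The raw response above can be loaded into the per-comment lookup the tool offers. Below is a minimal sketch of such a parser, assuming the allowed values per dimension are exactly those visible in this record and the table above; the real codebook may define additional categories, and `parse_coding_response` is a hypothetical helper, not part of the tool itself.

```python
import json

# Allowed values per dimension, inferred from the codes visible in this
# record (assumption -- the actual codebook may list more categories).
ALLOWED = {
    "responsibility": {"none", "government", "company"},
    "reasoning": {"consequentialist", "deontological", "mixed"},
    "policy": {"none", "regulate"},
    "emotion": {"approval", "fear", "indifference", "outrage",
                "mixed", "resignation"},
}


def parse_coding_response(raw: str) -> dict:
    """Parse a raw LLM coding response and index it by comment ID.

    Raises ValueError if an entry is missing a dimension or uses a
    value outside the (assumed) allowed set.
    """
    by_id = {}
    for entry in json.loads(raw):
        cid = entry["id"]
        for dim, allowed in ALLOWED.items():
            value = entry.get(dim)
            if value not in allowed:
                raise ValueError(f"{cid}: bad {dim!r} value {value!r}")
        by_id[cid] = {dim: entry[dim] for dim in ALLOWED}
    return by_id
```

With the response indexed this way, "Look up by comment ID" is a plain dictionary access, e.g. `by_id["ytc_UgxM8nJd2_KBRyYnpk94AaABAg"]["emotion"]`.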