Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment
I can't claim to understand LLMs properly, but I've been aware of efforts to write software to convince people they were conversing with a person since the 70s, when I found a port of a program called Eliza. It was fairly rudimentary, but entertaining. I've looked at any news concerning more recent efforts, too. It's been a gradual buildup until a couple of years ago. I can't see AI as anything but a much more sophisticated Eliza. My wife just explained to me that the guy who wrote the original version of Eliza predicted this problem back in the late 60s, because he'd noticed someone getting weird about using Eliza in private.
It would make sense for kids growing up with tech to have a more thorough understanding of computers, but it seems like it's the other way round. Kids make jokes about "Boomers" not understanding computers, forgetting where this technology came from, while failing to understand the basics of computing themselves.
youtube
AI Moral Status
2025-07-09T23:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | unclear |
| Reasoning | unclear |
| Policy | unclear |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
[
{"id":"ytc_UgxOJUjmA9veOqrfEvR4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgzuDeH891FEe9s44cN4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"outrage"},
{"id":"ytc_UgzEpWJzA2_UwJ6CASd4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwinFZR6zCYNlt4IIl4AaABAg","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"approval"},
{"id":"ytc_Ugy-LA-03J2-NVAgx7d4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugx3lX8Ms_5C7sG6Ra94AaABAg","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"approval"},
{"id":"ytc_UgxZJU80NIg7MuA73qN4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyYXaJ68AW_ZWN7oep4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugw7_EVbIM_A1iTOxVx4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgwTj5j0ZWJNa0T57ZF4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"outrage"}
]
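The raw response above is a plain JSON array, one object per comment, keyed by comment ID with the four coding dimensions from the table. A minimal sketch of parsing such a response into an ID-indexed lookup (the field names are taken from the records shown; the `lookup` helper and the two sample records are illustrative, not part of the tool):

```python
import json

# Two records copied from the raw response above; the full response
# has the same shape, just more entries.
raw = """
[
 {"id":"ytc_UgxOJUjmA9veOqrfEvR4AaABAg","responsibility":"unclear",
  "reasoning":"unclear","policy":"unclear","emotion":"indifference"},
 {"id":"ytc_UgyYXaJ68AW_ZWN7oep4AaABAg","responsibility":"user",
  "reasoning":"deontological","policy":"none","emotion":"outrage"}
]
"""

# Index the coded rows by comment ID for O(1) lookup.
codings = {row["id"]: row for row in json.loads(raw)}

def lookup(comment_id: str) -> dict:
    """Return the coded dimensions for a comment ID (KeyError if uncoded)."""
    return codings[comment_id]

print(lookup("ytc_UgxOJUjmA9veOqrfEvR4AaABAg")["emotion"])  # indifference
```

Validating that every object carries all four dimensions before indexing (e.g. `{"responsibility", "reasoning", "policy", "emotion"} <= row.keys()`) is a cheap way to catch malformed model output before it reaches the coding table.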