Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- Also the AI models are "cheap" for now, but AI is mind boggling expensive, I wou… (ytr_UgwaaxLqX…)
- I would not watch art made by AI, the main thing that interests me in art is who… (ytc_Ugxq_p6xi…)
- They should have made the male robot look like Biden and make him really stupid.… (ytc_UgzdCcSCq…)
- I haven't watched yet, but just off the title: same. I've gotten ChatGPT to prod… (ytc_Ugwp42dPI…)
- if the type of ai you’re talking about is what I’m thinking of this is because t… (ytc_Ugwit7ej6…)
- @leedolian4482 This is far from the first technology that was said to make LOTS… (ytr_UgwKhKBI2…)
- There are two types of algorithms, the one that predicts user’s intentions, and … (ytc_UgxRamO3C…)
- A lot of people would claim using ai at all to generate art is bad because of th… (ytc_UgyJogvE0…)
Comment
As an BSc in CS and an ML engineer I don’t agree with some of the things you say. This whole discussion is flawed in my opinion. Because the same way no one remembers Turing test, Moravec’s paradox, etc. Basically the market will sort this out. No one will care about this philosophical discussion “is this thing conscious or not yet?”. Why no one talks about turing test? Because it is irrelevant now. At some point these learning algorithms will be so good at doing some particular tasks no one will care anymore about philosophy. I’m exaggerating of course, but if a robot comes to your house, takes you out on the street to put you in an energy generator, this discussion “is this thing conscious though?” will not be relevant anymore.
youtube
AI Moral Status
2023-08-21T07:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | industry_self |
| Emotion | indifference |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_UgyiXF6J1w07hcYWQzp4AaABAg","responsibility":"developer","reasoning":"mixed","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgysAI4NXVF1XsbDSol4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"resignation"},
{"id":"ytc_UgxUQtGJbCvTp8Mo_354AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugzigh0InrBLROQjyTR4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugyx0oKufu_NJVAZu5B4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgxKI9AUH-yjYL6CD0R4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"industry_self","emotion":"indifference"},
{"id":"ytc_UgxAR_m4exbkzzvva4t4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgxkAGFI1o39n0PataV4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgztixwOLuoP78-87Hx4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgwNqXm3Lk4bqL50AT54AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"}
]
```
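Since the raw LLM response is a plain JSON array of per-comment records, the "look up by comment ID" step this page performs can be sketched in a few lines of Python. This is an illustrative sketch, not the tool's actual code: the function name `lookup_coding` and the inline sample record are assumptions, but the record schema (`id`, `responsibility`, `reasoning`, `policy`, `emotion`) matches the response shown above.

```python
import json

def lookup_coding(raw_response, comment_id):
    """Return the coded dimensions for one comment ID, or None if absent.

    raw_response is the raw LLM output: a JSON array of objects, each
    carrying an "id" plus the four coding dimensions.
    """
    for record in json.loads(raw_response):
        if record.get("id") == comment_id:
            # Drop the ID so only the coded dimensions remain.
            return {k: v for k, v in record.items() if k != "id"}
    return None

# One record copied from the raw response above, used as sample input.
raw = json.dumps([
    {"id": "ytc_UgxKI9AUH-yjYL6CD0R4AaABAg",
     "responsibility": "none",
     "reasoning": "consequentialist",
     "policy": "industry_self",
     "emotion": "indifference"},
])

print(lookup_coding(raw, "ytc_UgxKI9AUH-yjYL6CD0R4AaABAg"))
# → {'responsibility': 'none', 'reasoning': 'consequentialist',
#    'policy': 'industry_self', 'emotion': 'indifference'}
```

A linear scan is fine for a ten-item response; a real inspector indexing thousands of coded comments would presumably build a dict keyed by ID once and reuse it.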