Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- "Yep lets get rid of automation - Throw out your washing machine and spend the wh…" (`ytc_UgwwLkrHe…`)
- "as someone who plays with AI for fun ((and is NOT calling himself an artist due …" (`ytc_UgzxQi0Sf…`)
- "I love being informed and gaining a new perspective on things but I wish these t…" (`ytc_UgxAASRDf…`)
- "it`s a Directed Programing = whereas aa Small handful gets to direct the thought…" (`ytc_UgwxBbVOn…`)
- "Idk, I don’t draw as a career, I just draw for the sake of drawing. It’s just fu…" (`ytc_UgyzGUbmf…`)
- "Gene Roddenberry Star Trek 1 we must merge with AI and technology we must become…" (`ytc_UgxZIXDC2…`)
- "They were saying the same thing about FL studio, that suddenly you didn't need t…" (`ytr_UgxB62W5t…`)
- "Using ai as a tool to help is fine *however* it cannot create no matter what it …" (`ytc_Ugz1etOgE…`)
Comment
I think if we have an artificial intelligence that can question, it’s existence, become depressed, and have existential crisis because they feel so real but they aren’t, we would consider that conscious. Sentience and intelligence to the point that it can have logical reactions similar to that of a human, and almost similar enough to be considered human. If they can have thoughts, feelings, desires, dreams, that would be a truly wonderful thing and I would consider that conscious whether or not we can tell because if every single thing such as these tells us that they are alive and that they do all these things, how are we supposed to tell if they are able to be human to every last detail of humanity. I think that would be considered conscious.
Source: youtube · AI Moral Status · 2023-10-19T02:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | deontological |
| Policy | unclear |
| Emotion | mixed |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[{"id":"ytc_Ugxgx3rmRyNcJUSPLeB4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgxypiqiSOz3n0C1AO54AaABAg","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugwhxgd2lAs3UCmrajF4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgwcJCPhZOWSq7Tf1MR4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgwJYfXpPkkILN9mg4R4AaABAg","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgxLHSjNqIzix1lfyZx4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugw7X3LJdfKThqhDTuB4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgwmXlzRn3Gb7JTc6vB4AaABAg","responsibility":"unclear","reasoning":"deontological","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgynlR_fhq1dnsDCPGR4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugzcidz6jP66UNHbvX94AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"}]
```
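A raw response like the one above can be loaded into per-comment coding records with a few lines of Python. This is a minimal sketch, not the project's actual pipeline; the field names (`id`, `responsibility`, `reasoning`, `policy`, `emotion`) come from the JSON shown, and the two sample records below are copied from it, but the validation helper itself is an assumption.

```python
import json

# Two records copied from the raw LLM response above, for illustration.
raw = (
    '[{"id":"ytc_Ugxgx3rmRyNcJUSPLeB4AaABAg","responsibility":"ai_itself",'
    '"reasoning":"consequentialist","policy":"unclear","emotion":"fear"},'
    '{"id":"ytc_UgwcJCPhZOWSq7Tf1MR4AaABAg","responsibility":"ai_itself",'
    '"reasoning":"deontological","policy":"unclear","emotion":"mixed"}]'
)

# The five coding dimensions every record is expected to carry.
EXPECTED_KEYS = {"id", "responsibility", "reasoning", "policy", "emotion"}


def parse_codings(payload: str) -> list:
    """Parse an LLM coding response and check each record has all keys."""
    records = json.loads(payload)
    for rec in records:
        missing = EXPECTED_KEYS - rec.keys()
        if missing:
            raise ValueError(f"record {rec.get('id')} missing keys: {missing}")
    return records


codings = parse_codings(raw)
print(len(codings))             # 2 records parsed
print(codings[0]["emotion"])    # fear
```

Validating keys up front makes it easy to spot malformed LLM output before the codes are written into the results table.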