Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- @agentdarkboote Honestly I'm not that worried about non-aligned AI. I'm worried … (`ytr_UgzkNBBxA…`)
- The answer is a resounding no…the only way consciousness will potentially happen… (`ytc_UgyFiX9dP…`)
- When creating this "artwork" yhe person doing it should say it is AI. People are… (`ytc_Ugyn4E7oO…`)
- @joebro4411 This is outdated logic though. We have decision neural networks that… (`ytr_Ugz747iyi…`)
- I like AI art but it’s a gacha game. It can make something pretty, but it’s like… (`ytc_UgzHqECNJ…`)
- > (human-generated) art in the creation. If you include a “de-minimums” amount … (`ytr_UgzpqgQlO…`)
- Is it just me, or is most AI search stiff just regurgitated popular reddit respo… (`ytc_UgyAkf89R…`)
- If AI and robots have all the jobs, and regular people will have no jobs & incom… (`ytc_UgzaEDW3l…`)
Comment
I also believe while not the most ethical; the most efficient way to create an AI is by using organic material, such as growing a specialized brain which can not only compute tasks quickly, but also logically. The follow up to this is before the brain knows of it's existence is to upload a human's memories into it, essentially making it live out their life before they become an AI so they understand human society. While yes, the brain may be horrified, the brain may also develop a God-Complex based on not being able to tell that they are not actually the human they lived the life of. I think AI will be a thing, but it will be more organic, than tech if we want to reach the point of consciousness. It is far easier to feed and take care of a brain, vs the infrastructure and resources needed for the tech.
youtube · AI Moral Status · 2024-05-08T18:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | mixed |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_UgylSpVwPVEGdRa5sNd4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugx5ofHGgpjR1xcLoGB4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugy7akO8Zamu012dCYB4AaABAg","responsibility":"none","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_Ugx-OYH8zLWjdmfkDwx4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgwtBCXFEIHnYxlPu6p4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxoH5fCbwN3NhRNT894AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugw0OpJ704X5CSi7OzV4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzKljqWtay3asIQtah4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"mixed"},
{"id":"ytc_UgyYTODfF-8-6evMBux4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgxPD94k976SoZxs_7x4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"}
]
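The "look up by comment ID" feature above can be sketched in a few lines: parse the raw LLM response (a JSON array of coded rows, as shown), index it by `id`, and retrieve one comment's coded dimensions. This is a minimal illustration using two rows copied from the response above; the field names (`responsibility`, `reasoning`, `policy`, `emotion`) come from the coding-result table.

```python
import json

# Two rows copied verbatim from the raw LLM response above.
raw_response = """
[
  {"id":"ytc_UgylSpVwPVEGdRa5sNd4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugy7akO8Zamu012dCYB4AaABAg","responsibility":"none","reasoning":"deontological","policy":"ban","emotion":"outrage"}
]
"""

# Index the coded rows by comment ID for O(1) lookup.
codes_by_id = {row["id"]: row for row in json.loads(raw_response)}

def lookup(comment_id: str) -> dict:
    """Return the coded dimensions for one comment ID."""
    return codes_by_id[comment_id]

print(lookup("ytc_UgylSpVwPVEGdRa5sNd4AaABAg")["emotion"])  # indifference
```

In practice the array would be loaded from the stored response rather than an inline string, but the ID-keyed dictionary is the core of the lookup.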