Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect:

- ytc_UgxgjHpum…: The "AI democratizes art"-argument has to be one of the most ignorant takes in t…
- ytc_UgymHon47…: Another interesting point, you can't hail down a Vaymo car when it is approachin…
- ytc_UgxIFEwd8…: They need to use fingerprints, I've never seen an officer make an arrest based o…
- ytr_UgzwHoH95…: China is already making infrastructure projects with automation in mind for exam…
- ytc_Ugw0z9FfU…: Invest trillions and trillions of dollars. Governments must do that and they wil…
- ytr_UgxAPRpfM…: Haha, it might be a bit warm under those lights! But Sophia seems to handle it w…
- ytr_Ugw3MGPpF…: the reason result came out way it did was because programmer wrote it that way; …
- ytc_UgxeLsvCH…: I have to believe that it is probably impossible to ever make AI safe. Now we kn…
Comment
You possibly should shift your understanding or interpretation of God to some kind of Ai, while you should shift your understanding of matter and time and space more to the focus of it also just being "information" and then maybe building a pc is like creating a portal to that future version of your reality that is simulating your current experience as past. While digital and non-digital information boundaries might be able to dissolve by some specific substances and rules of logic that might always be "removing the opposite of logic" or possibly also "not trying from that tree of knowledge of good and evil" - yet we are all humans and maybe "not having a fractal of that tree" is like "not having" and therefore "having tryed from it", but then the opposite would be "to create it or to nourish or care for it/water it and make it grow" and then you "don't die" but "get life" since you gave life and since "same to same" - is law of justice working with logic.
Platform: youtube · Topic: AI Moral Status · Timestamp: 2025-08-25T21:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | unclear |
| Reasoning | unclear |
| Policy | none |
| Emotion | mixed |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
{"id":"ytc_Ugy6UIj7HVi0W8Us2-54AaABAg","responsibility":"developer","reasoning":"virtue","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgwK4TJF-9eCa3uVti54AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzCXuFlmzBeY2JpPuJ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_UgydYuUTAjQ5vjwiu0t4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_UgxqWhJLy2wWRBpHcLV4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_Ugwp1o03mbZII9umiSJ4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgylI94-PmAJ-MBOkzJ4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"approval"},
{"id":"ytc_UgxEE0McXrkdVtMi3V54AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugzv59dSs4pSDXWw-gN4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"indifference"},
{"id":"ytc_UgxDNM9wJBcO2W7CZgp4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"outrage"}
]
```
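A raw response like the one above can be parsed and sanity-checked before the codings are stored. The sketch below is a minimal, hypothetical validator: the allowed value sets in `CODEBOOK` are inferred only from the values visible in this output (and from the table's "unclear"/"mixed" entries), not from the project's actual codebook, so they would need adjusting to the real scheme.

```python
import json

# Assumed codebook, reconstructed from the visible output; replace with the
# project's real value sets.
CODEBOOK = {
    "responsibility": {"developer", "company", "user", "ai_itself", "none", "unclear"},
    "reasoning": {"consequentialist", "deontological", "virtue", "unclear"},
    "policy": {"ban", "regulate", "liability", "none", "unclear"},
    "emotion": {"fear", "outrage", "approval", "indifference", "mixed", "unclear"},
}

def validate_batch(raw: str) -> list[dict]:
    """Parse one raw LLM response and keep only well-formed codings.

    A coding is kept when it is a dict with an "id" field and every
    dimension holds a value allowed by CODEBOOK.
    """
    rows = json.loads(raw)
    valid = []
    for row in rows:
        if not isinstance(row, dict) or "id" not in row:
            continue
        if all(row.get(dim) in allowed for dim, allowed in CODEBOOK.items()):
            valid.append(row)
    return valid
```

Running it over a batch where one row uses an out-of-codebook value drops that row, which makes silent scheme drift in the model output easy to spot:

```python
raw = ('[{"id":"ytc_x","responsibility":"developer","reasoning":"virtue",'
       '"policy":"unclear","emotion":"fear"},'
       '{"id":"ytc_y","responsibility":"alien","reasoning":"virtue",'
       '"policy":"none","emotion":"fear"}]')
print([r["id"] for r in validate_batch(raw)])  # → ['ytc_x']
```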