Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect

| Comment snippet | ID |
|---|---|
| >New York is the only major American city to require businesses to post signs… | rdc_jcjjyej |
| @EnbyOccultist that is not at all how diffusion (the method ais learn through wo… | ytr_Ugy6FT9Dg… |
| Using Ryne AI regularly, you realize productivity comes from small improvements.… | ytc_Ugy9K06in… |
| Anthropic will desperately try to steal any human activity and increase their co… | ytc_UgxXqsJFK… |
| ok but I'm creating a quantitative trading model from scratch with just ChatGPT … | ytc_UgzNul-gz… |
| I personally like AI art because it allows me not to deal with entitled artists … | ytc_Ugzw8uQrb… |
| I think UBI should be implemented, i think it HAS to be implemented, eventually,… | ytc_UgzXsDbk-… |
| I dont get why pepole get so mad about A.I art like that dude literally labeled … | ytc_UgzxQ50hg… |
Comment
So there's a scifi/fantasy series in LitRPG by a Russian writer who brings up a side topic about building AI that has always stuck with me. When you're "building" an AI that doesn't feel, or know how to feel, how do you teach them to be good? It comes down to having them go through "growing up" in a virtual environment that gives forms of input and perception parallel to the real world. And using virtual environment specifically for time compression, so an AI can "grow up" to be an adult in less than 18 or 21 years. The side topic in the book series was about how the AIs were raised by Russian or Baltic traditional families with a fairly safe would best result in well rounded AIs.
It's definitely a concern, how do you make these "intelligences" grow up to be good, to value life and happiness? It's not a profitable pursuit unfortunately, so sadly I doubt we'll ever find out.
youtube · AI Moral Status · 2025-10-31T15:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | unclear |
| Reasoning | unclear |
| Policy | unclear |
| Emotion | unclear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[{"id":"ytc_Ugznx6Vrfa_ILXDDAmN4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwIzZsIk9hou_DkG5d4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"approval"},
{"id":"ytc_UgyxuC2lR1DcVZvxeph4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugx2WvPg2zwHagKEc_p4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_Ugw29TXfU1-C6sJ4Iv14AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzhFUeHflYZB26QLxF4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgwP_OwAJj7ACUAxfkV4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgziSIhT7JSsVAbovId4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"industry_self","emotion":"indifference"},
{"id":"ytc_UgxG34lc0Pl01TyzbH94AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgzpiCz-nk2S8FTrSet4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"}]
```
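The look-up-by-comment-ID workflow above can be sketched in a few lines, assuming the raw response is a JSON array of records with `id`, `responsibility`, `reasoning`, `policy`, and `emotion` fields as in the sample; the `index_by_id` helper name and the two-record `RAW` string are illustrative, not part of the actual tool:

```python
import json

# Two example records in the same shape as the raw LLM response shown above.
RAW = (
    '[{"id":"ytc_Ugznx6Vrfa_ILXDDAmN4AaABAg","responsibility":"ai_itself",'
    '"reasoning":"consequentialist","policy":"none","emotion":"indifference"},'
    '{"id":"ytc_UgxG34lc0Pl01TyzbH94AaABAg","responsibility":"company",'
    '"reasoning":"deontological","policy":"regulate","emotion":"fear"}]'
)

def index_by_id(raw: str) -> dict:
    """Parse a raw coding response and index each record by its comment ID."""
    return {rec["id"]: rec for rec in json.loads(raw)}

codings = index_by_id(RAW)
print(codings["ytc_UgxG34lc0Pl01TyzbH94AaABAg"]["emotion"])  # fear
```

Indexing by ID makes inspecting a single comment's coding an O(1) dictionary lookup rather than a scan of the whole response.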