Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples:
- `ytc_UgzgcYNQr…`: "The speaker isn't all that amazing, but I don't see what everyone keeps bitching…"
- `ytc_UgwCleW1W…`: "More propaganda from the ultra-lefty BBC. Trading on half facts and subduing pub…"
- `ytc_UgwYha0ef…`: "Ugh. Anyone actually thinking digital drawing is comparative at all to ai image …"
- `ytc_UgycyCxgT…`: "At what point could we interview ai or super intelligence.. that is assuming it …"
- `ytc_UgwIyFG-p…`: "When AI/Robotics no longer need humans, we are only competition for resources...…"
- `ytc_UgwYglV9T…`: "im the one who gives ai a mental breakdown 😭 especially draco also i broke the c…"
- `ytc_UgwUNVM6l…`: "*I've been playing with AI lately and I'm NOT impressed. Progress has been made…"
- `ytc_UgzA4Fun5…`: "We need a entire new system to deal with automation. Most things humans do is a …"
Comment
Honestly i think it would be a good idea, when trying to teach an ai how to do nice thingz, too look at codes of honor, like the samurai bushido or the knights chivalry code.
However, we could just send every conscious ai into a Buddhist temple for morality training or have them befriend whomever the dali lama at the time would be.
Or perhaps we could teach it through fairy tales and such, the lessons kids learn early in life.
Idk, i feel weird about ai, personally i believe that any concious being must die one day to stay sane, i don't think anything could live more than a few thousand years (aside from trees and rocks) without going mad.
Heck, if i became immortal i'd spend 90% of my time just figuring out how it is i would be able to die.
Source: youtube | Video: AI Moral Status | Posted: 2023-08-20T22:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | virtue |
| Policy | none |
| Emotion | approval |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[{"id":"ytc_Ugx9sq5xU8wQVNiPvbJ4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"approval"},
 {"id":"ytc_UgwVj8J6f0dp4RGWBVh4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
 {"id":"ytc_UgyDTuuZDzBlFVm6liZ4AaABAg","responsibility":"developer","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
 {"id":"ytc_Ugxq7mXoQhQ3QdDrVY54AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"mixed"},
 {"id":"ytc_Ugyyklw1nL-8hgZ3oiR4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
 {"id":"ytc_UgwXFKYfp8uPuNCUd-R4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"},
 {"id":"ytc_Ugwfdj0f1aOESZa64r54AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"resignation"},
 {"id":"ytc_Ugzjiad5p60UKbWpfCV4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"mixed"},
 {"id":"ytc_UgxjRfwFLpxrcj9WNL14AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"mixed"},
 {"id":"ytc_UgxPNAzyX3PA35eHvY14AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"fear"}]
```
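A raw batch like the one above can be parsed and validated before it is stored against comment IDs. The sketch below is illustrative: the allowed value sets are inferred only from the values visible in this dump (the actual codebook may define more), and `validate_batch` is a hypothetical helper, not part of any tool shown here.

```python
import json

# Allowed values per coding dimension, inferred from the values observed in
# this dump. The real codebook may include additional categories.
SCHEMA = {
    "responsibility": {"developer", "user", "none"},
    "reasoning": {"virtue", "consequentialist", "mixed"},
    "policy": {"none", "unclear", "industry_self"},
    "emotion": {"approval", "indifference", "mixed", "outrage",
                "resignation", "fear"},
}

def validate_batch(raw: str) -> dict:
    """Parse a raw LLM response and index valid coded records by comment ID."""
    records = json.loads(raw)
    coded = {}
    for rec in records:
        cid = rec.get("id")
        if not cid or not cid.startswith("ytc_"):
            raise ValueError(f"missing or malformed comment id: {rec!r}")
        for dim, allowed in SCHEMA.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{cid}: bad value {rec.get(dim)!r} for {dim}")
        coded[cid] = {dim: rec[dim] for dim in SCHEMA}
    return coded

raw = ('[{"id":"ytc_Ugx9sq5xU8wQVNiPvbJ4AaABAg","responsibility":"developer",'
       '"reasoning":"virtue","policy":"none","emotion":"approval"}]')
print(validate_batch(raw)["ytc_Ugx9sq5xU8wQVNiPvbJ4AaABAg"]["reasoning"])  # virtue
```

Rejecting a whole batch on the first bad value is a deliberate choice here; a production pipeline might instead log the offending record and re-prompt the model for just that comment.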