Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- "as to animal treatment. I would argue the opposite. it's not that humans are so…" (ytc_UgxwOutAU…)
- "Have you seen the deep fakes of the founding fathers produced by Google’s AI. Pr…" (ytc_UgwECVjBy…)
- "Professor Stuart Russell warns that the current AGI race could lead to human ext…" (ytc_UgzS5Q8aI…)
- "AI will usher in the collapse of the first world!!! Time to move to Thai land.…" (ytc_UgwNElNlo…)
- "A girl in my school said “Just because my art and stories are made with AI doesn…" (ytc_UgyBqSisy…)
- "People can set type and enhance and manipulate photos with much greater speed th…" (ytc_UgzaRzrrO…)
- "People use AI for the \"artwork\" / I used to use it as a tool of reference or an i…" (ytc_UgzR_aw_B…)
- "What you said is a lie. 1. The reason why ArtStation does not look like that now…" (ytr_UgxTOWyDR…)
Comment
Great podcast. I wonder what you and Roman think on Ai integration with human biology? Roman's points were mostly around Ai/AGI/ASI being a distinctly separate thing from human biology. Ray Kurzweil believes Ai and human biology will be one of the earliest integrations as the next evolutionary step in human development. We'll have a set of "augmented humans" and non-augments as AGI becomes integrated with humans on a biological level. We'll go from avg lifespan of 80 years to 180 years and so on.
Source: youtube · Posted: 2024-06-13T12:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | mixed |
| Policy | none |
| Emotion | approval |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id":"ytc_UgywyNkEJJTP-cET_9l4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugy-M0Ls8ztQaqIeYR14AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugzk8tfA_XBVFPvGiud4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
  {"id":"ytc_Ugwptr3ij6Bh0ojGKsN4AaABAg","responsibility":"distributed","reasoning":"contractualist","policy":"regulate","emotion":"approval"},
  {"id":"ytc_UgwgNwU9DjmJoMz5Aph4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgwUyQGutFz8rvH9AiV4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytc_UgyA1-Wkdb7wKrcTTsp4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_Ugwgm1XB0kPy7Fj8jcl4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgydrgoesQvcBzt3HhN4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgwuKgsdaExoxWJc5JJ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"}
]
```
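A raw response like the one above can be parsed and indexed by comment ID so that any coded comment can be looked up directly. The sketch below is a minimal, hypothetical example: the allowed values per dimension are inferred from the samples shown here, and the actual codebook may define additional categories.

```python
import json

# Allowed values per coding dimension (inferred from the samples above;
# the real codebook may include more categories -- adjust as needed).
SCHEMA = {
    "responsibility": {"none", "user", "ai_itself", "distributed"},
    "reasoning": {"consequentialist", "contractualist", "mixed"},
    "policy": {"none", "regulate", "liability"},
    "emotion": {"fear", "approval", "mixed", "resignation", "indifference"},
}

def parse_codings(raw_response: str) -> dict:
    """Parse a raw LLM coding response and index records by comment ID.

    Raises ValueError if any record carries a value outside the schema,
    which flags malformed or off-codebook model output for inspection.
    """
    records = json.loads(raw_response)
    by_id = {}
    for rec in records:
        for dim, allowed in SCHEMA.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{rec.get('id')}: unexpected {dim}={rec.get(dim)!r}")
        by_id[rec["id"]] = rec
    return by_id

# Two records copied from the raw response above, for illustration.
raw = '''[
  {"id":"ytc_UgywyNkEJJTP-cET_9l4AaABAg","responsibility":"ai_itself",
   "reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugy-M0Ls8ztQaqIeYR14AaABAg","responsibility":"none",
   "reasoning":"mixed","policy":"none","emotion":"approval"}
]'''

codings = parse_codings(raw)
print(codings["ytc_Ugy-M0Ls8ztQaqIeYR14AaABAg"]["emotion"])  # approval
```

Validating against the schema at parse time is what lets the tool trust the per-comment "Coding Result" table it renders: a record that reaches the lookup index is guaranteed to have one recognized value for each of the four dimensions.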