Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples

- "Artificial intelligence was made the day the computer was invented…. As long as …" (ytc_UgzP6PQrg…)
- "I’m telling you AI art will never hit the same spot as real art does- like actua…" (ytc_UgwdJgwnm…)
- "This episode was good! Just a thought, as we inch closer to AI, I wonder if th…" (ytc_UgyGid3-w…)
- "You train an AI on depictions of human behavior and then you are surprised when …" (ytc_Ugy5AFIs7…)
- "The one single thing I as an artist think "ai" could be useful for (tween frames…" (ytc_Ugz-DU5hG…)
- "If automation eventually replaces nearly all labor and reduces the number of job…" (ytc_UgybvVvSj…)
- "AI artists fail to realize that part of what art is, is it's communication. It's…" (ytc_UgxP3a5Rk…)
- "I was going to say I don't disagree with hating robots & ai which are 100% made …" (ytc_UgzYVfQ10…)
Comment
I wonder what would happen if we safely created the most advanced AGI, but it was not functional until you flip a "switch" that enabled the AGI to essentially "start your engines... Go!".
If you could set a finite amount of time the machine had available to learn and advance itself (increase it's own skill levels), we could experiment with how quickly things turn dark or how quickly can the machine go from a more basic AI to a true AGI.
If you could air gap the expirament from the WWW, and just simulate it with the majority of the content of the internet, the expirament could theoretically run it's absolute coarse because it's sealed off from causing any real harm to the world around it.
Platform: youtube · Posted: 2025-09-12T18:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_UgySogYMmrqcBDNuNVd4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugwj9Qwqicq2m0KJH-Z4AaABAg","responsibility":"company","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UgwhcZR2yflhbd8Z9AF4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_UgzcrBpVWZpyuWYkFL94AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_UgwdUxe0xhsPCcZudQV4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgzjedkzVlGb9Pcp5DZ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgxxFh5AupAjOgm0aHN4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgyDD8AlnUAzgHZ-XyV4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugx9GuBCvnURD55G53R4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"fear"},
{"id":"ytc_Ugx4P_jAc1BXXwVH_nB4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]
```
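A batch response in this shape can be checked and indexed by comment ID with a short script. This is a minimal sketch, not the project's actual pipeline: the field names (`id`, `responsibility`, `reasoning`, `policy`, `emotion`) come from the output above, while the helper names (`EXPECTED_KEYS`, `by_id`) are illustrative, and the full codebook may allow values beyond those observed here.

```python
import json

# One record copied from the raw response above; in practice this string
# would be the full model output read from a response log.
raw = '''[
  {"id": "ytc_UgySogYMmrqcBDNuNVd4AaABAg", "responsibility": "none",
   "reasoning": "consequentialist", "policy": "none",
   "emotion": "indifference"}
]'''

# The four coding dimensions plus the comment ID, per the table above.
EXPECTED_KEYS = {"id", "responsibility", "reasoning", "policy", "emotion"}

records = json.loads(raw)

# Validate each record, then build an ID -> coding index so a single
# comment's codes can be looked up directly.
by_id = {}
for rec in records:
    missing = EXPECTED_KEYS - rec.keys()
    if missing:
        raise ValueError(f"record {rec.get('id', '?')} missing keys: {missing}")
    by_id[rec["id"]] = rec

print(by_id["ytc_UgySogYMmrqcBDNuNVd4AaABAg"]["emotion"])  # prints "indifference"
```

Failing loudly on missing keys is deliberate: a truncated or malformed model response is easier to catch at parse time than after the codes have been merged into the dataset.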