Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- `ytc_UgwMxr8nY…`: Meanwhile, in reality, AI doesn't exist; thus, "AI Data Centers" don't exist; ra…
- `ytc_Ugzx_eE2S…`: I like the obviously more robot looking ones better. The human looking ones are …
- `ytc_UgxC0Axhu…`: Oh, hello there 👋. A motion designer and an AI enthusiast? Good that there’s peo…
- `ytr_UgzeApflw…`: Ai art usually has no meaning has horrible mistakes just looks ass deserves to g…
- `ytc_UgwcBjVLV…`: Kind of sucks how big of jerks people are to the ai users. i mean, i hate ai art…
- `ytr_Ugx8ztqYo…`: That's a great question! The video touches on the idea that while AI can provide…
- `ytc_UgyPX3Jnq…`: Robot-work future has to be some kind of socialism, because capitalism theory on…
- `ytc_UgyCMAnun…`: If enough people believe in an outcome it will manifest as our future . So if Ev…
Comment (source: youtube, posted 2013-10-07T21:2…)

> Humans will always be of some form of use to computers. Humans function completely different from computers, and will be able to do things computers cannot.
>
> For a computer to master everything a human can do, it will essentially need to become human, therefore defeats the purpose of killing off humanity. If anything, AI will see to the preservation of humanity for the sake of its own preservation.
>
> At worst we will see a Borg collective take place.
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response

```json
[
{"id":"ytc_Ugxaj0qSxj3vDQlTGVd4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzrKIEsMAXBfMPcpgp4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgxWn5MR2FOw_MAycW94AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgzZIp8bU550INSwa9d4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgzyJ5EOqeiQVxhpM-Z4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_Ugx4SRQt-sSjsYTNA0d4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugy4ccxwKaV3Gg63AqN4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_Ugyf9Cghtmcxj_l0aeh4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugzq1HwO7cdbe8WhnnN4AaABAg","responsibility":"developer","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgwJJzfgi3Iw4Kz1WI94AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"fear"}
]
```
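A raw batch response like the one above has to be parsed and validated before its codes are stored per comment. The sketch below is a minimal, hypothetical version of that step: it assumes the allowed values per dimension are exactly those visible in this page's data (the real codebook may contain more categories), and it drops any record whose values fall outside those sets rather than storing a malformed code.

```python
import json

# Allowed values per coding dimension, inferred from the responses shown
# above. Assumption: the actual codebook may define additional categories.
ALLOWED = {
    "responsibility": {"ai_itself", "developer", "user", "none"},
    "reasoning": {"consequentialist", "deontological", "mixed", "unclear"},
    "policy": {"none", "regulate", "unclear"},
    "emotion": {"indifference", "fear", "outrage", "mixed", "approval",
                "resignation"},
}

def parse_coding_response(raw: str) -> dict:
    """Parse a raw LLM batch response into {comment_id: codes},
    keeping only records whose dimension values are all recognized."""
    coded = {}
    for rec in json.loads(raw):
        cid = rec.get("id")
        codes = {k: v for k, v in rec.items() if k != "id"}
        if cid and all(v in ALLOWED.get(k, set()) for k, v in codes.items()):
            coded[cid] = codes
    return coded

# Hypothetical two-record batch: the second record carries an unknown
# "responsibility" value and is dropped by validation.
raw = (
    '[{"id":"ytc_x","responsibility":"developer","reasoning":"mixed",'
    '"policy":"regulate","emotion":"outrage"},'
    '{"id":"ytc_y","responsibility":"alien","reasoning":"unclear",'
    '"policy":"none","emotion":"fear"}]'
)
coded = parse_coding_response(raw)
# "ytc_x" is kept; "ytc_y" is rejected.
```

Indexing the result by comment ID is what makes the "look up by comment ID" view cheap: one dictionary access retrieves the stored codes for any inspected comment.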