Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples (truncated previews):

- `ytc_UgxZeFC3A…`: "I dont really know a lot about art to be honest. The video was great I understan…"
- `ytc_Ugxlp3hcz…`: "Look at all the inventions that were supposed to make life better and we use the…"
- `ytc_Ugwq4xnu_…`: "You start with a valid ethical issue but start to construe it throughout the vid…"
- `ytc_UgxPTdw21…`: "2:01 this art really does have the most sole out of all of all of them and defi…"
- `ytc_UgycKzStd…`: "This reminds me of that book, \"Weapons of Math Destruction\" Great read for anyo…"
- `ytc_UgyDer9gg…`: "People have always complained that they are modern slaves. Now that AI will free…"
- `ytc_UgyaZHmxw…`: "Cops are always whining about needing more money and somehow they have enough to…"
- `ytc_UgxeQl1Hq…`: "AI is programmed by humans. They are all stupid and can't see ahead. So there is…"
Comment
Human empire formation is futile because the founder is mortal and his progeny are randomly flawed.
AI empire formation is only futile if AI can't solve interstellar space travel. Because Earth is theoretically doomed by the death of our Sun. So if AI can't escape the solar system, and Earth, then it's empire would die.
Or perhaps just go dormant residing on one of the surviving outer planets?
Long time frame scenario, but AI thinks well beyond civilization life spans.
Source: youtube
Posted: 2025-11-18T00:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id":"ytc_Ugy-DJmzl-_M3l1H3hp4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgyTe26_CcboNWZN1XZ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgzAVUGcN10LG7FHdHN4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgzcTnm2M0ew2T8st5h4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgzbftfdQRkl0IR2jWF4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"fear"},
  {"id":"ytc_UgyAV-CxB5Q3kyaScXN4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_Ugy-I4j8FvsZPixAB7N4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgwE_zgCF3VNLSk5BUR4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugztrzg2rXnFHvLz8it4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytc_UgwKywTbqJZqC6bL7w14AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"outrage"}
]
```
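The lookup-by-comment-ID step above can be sketched as follows. This is a minimal illustration, not the actual pipeline code: it assumes the raw LLM response is a JSON array of records shaped like the one shown, and the function and variable names (`index_by_id`, `raw_response`) are hypothetical. The sample IDs here are abbreviated placeholders rather than real comment IDs.

```python
import json

# Abbreviated, illustrative stand-in for a raw LLM batch response;
# real records carry full "ytc_…" comment IDs.
raw_response = """
[
  {"id": "ytc_AAA", "responsibility": "ai_itself",
   "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_BBB", "responsibility": "developer",
   "reasoning": "deontological", "policy": "regulate", "emotion": "outrage"}
]
"""

def index_by_id(response_text: str) -> dict:
    """Parse the model's JSON array and key each coding record by comment ID."""
    records = json.loads(response_text)
    return {rec["id"]: rec for rec in records}

# Look up one comment's coding across the four dimensions.
codings = index_by_id(raw_response)
record = codings["ytc_BBB"]
print(record["reasoning"], record["emotion"])  # deontological outrage
```

Indexing the batch once into a dict makes each subsequent by-ID lookup constant-time, which matters when inspecting many coded comments against a large response.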