Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment

> The nature of biological life will not win the race to superintelligence. Ever. Biological life requires more time to reach that marker by default. Evolution just takes too long to reach that without multiple extinction events being survived by any species. So if we want to win the race to superintelligence against AI, just get there before making true AI.

| Field | Value |
|---|---|
| Source | youtube |
| Topic | AI Governance |
| Posted at | 2024-11-01T19:2… |
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_UgxWKaUKGeOlvtqIywJ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgymaBvm-UNOgj66mFl4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwJ6YcShNrEU8_5JPt4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgwNYaLMkaKVq2vq0Xl4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_UgztEqWbqLiBATFX1694AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"outrage"},
{"id":"ytc_UgxE4qqjLPF8Kb3EfRF4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgwsbcR4TW-y3n9q7sV4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"industry_self","emotion":"resignation"},
{"id":"ytc_UgzyKSi2KkyyovhRVX94AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgxfGUh6dLfPSQEV7kt4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugx8JkV9Nmay5bpdayZ4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"}
]
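A minimal sketch of how a raw batch response like the one above could be parsed and checked before the codes are stored. The `ALLOWED` vocabularies are inferred from the values visible in this sample (the full codebook may permit others), and `validate_batch` is an illustrative helper, not part of the actual pipeline:

```python
import json

# Dimension vocabularies inferred from this sample page; the real
# codebook may allow additional values (assumption, not confirmed).
ALLOWED = {
    "responsibility": {"none", "ai_itself", "developer", "user", "company"},
    "reasoning": {"consequentialist", "deontological", "virtue", "unclear"},
    "policy": {"none", "ban", "liability", "industry_self", "regulate", "unclear"},
    "emotion": {"indifference", "outrage", "fear", "approval", "resignation"},
}


def validate_batch(raw: str) -> list[dict]:
    """Parse a raw LLM response and reject records with unknown codes."""
    records = json.loads(raw)
    for rec in records:
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                raise ValueError(
                    f"{rec.get('id')}: unexpected {dim} value {rec.get(dim)!r}"
                )
    return records


raw = (
    '[{"id":"ytc_UgxWKaUKGeOlvtqIywJ4AaABAg",'
    '"responsibility":"none","reasoning":"consequentialist",'
    '"policy":"none","emotion":"indifference"}]'
)
batch = validate_batch(raw)
print(batch[0]["emotion"])  # indifference
```

Validating against a closed vocabulary at ingest time catches the common failure mode where the model invents a code outside the schema, so bad records fail loudly instead of silently skewing later analysis.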