Raw LLM Responses
Inspect the exact model output for any coded comment, or look a comment up by its ID.
Random samples

- "Don't introduce robot in Africa, you want to kill us with hunger, see the dark s…" (ytc_UgzxcPibR…)
- "I hate that fucking copilot key. In virtual box that’s all of my controls like f…" (ytc_UgxRol72Q…)
- "Maybe someone will fire or drop one or more atom bomb(s) and then there won't be…" (ytc_UgwYZHNMy…)
- "AI will never be conscious. Consciousness is only something biological creatures…" (ytc_UgztYkyW4…)
- "20 years is about right. By then I will be living off-grid at an undisclosed loc…" (ytc_Ugy36l0oM…)
- "Hey at least AI created a Snow White that was actually true to source material.…" (ytc_UgxjsmaPO…)
- "You're asking very weird questions to an AI grad. Those folks know TensorFlow, P…" (ytc_UgwmDYflg…)
- "No need to look at AI, look at people following a hollow meat suit of a man abso…" (ytc_UgwAS7MmT…)
Comment
I think developing AI is the only way to preserve sentience on earth. Humans are developing technology at an exponential rate yet we have unstable minds. When tech like synthetic biology is as available as cellphones, every human being becomes an existential threat to the entire human race. This spells disaster for earth-born sentience UNLESS we either improve ourselves or develop a better version of ourselves (i.e. a more stable mind that's not prone to diseases like schizophrenia)
Source: youtube
Posted: 2013-08-18T02:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | fear |

Coded at: 2026-04-27T06:24:59.937377
Raw LLM Response

```json
[
{"id":"ytc_UgzrOf6t6aLbReca1AJ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugw2S_xu9G5dFHpcYAl4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugz5wlVdvxY_T2Ag3vF4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_UgySzmVymDiha7QN81J4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgwL2XiUPA7dkQyK36t4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxHnxkSaZnrZ3l_69h4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugx7_gEnvWQqt_-j0lR4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgyHZ_VypyfLGRHFUH14AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugz5JX3dK_Cy3OSHByp4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"unclear"},
{"id":"ytc_UgzBoXUJBG9lDLN3KS94AaABAg","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"outrage"}
]
```
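The "look up by comment ID" workflow can be sketched as follows. This is a minimal illustration, assuming the raw LLM responses are stored as JSON arrays of coded records like the one shown above; the variable and function names (`raw_response`, `index_by_id`) are hypothetical, not part of the pipeline itself.

```python
import json

# Hypothetical example batch in the same shape as the raw response above
# (two records shown for brevity).
raw_response = """
[
  {"id": "ytc_UgzrOf6t6aLbReca1AJ4AaABAg", "responsibility": "none",
   "reasoning": "consequentialist", "policy": "none", "emotion": "approval"},
  {"id": "ytc_Ugw2S_xu9G5dFHpcYAl4AaABAg", "responsibility": "none",
   "reasoning": "consequentialist", "policy": "none", "emotion": "fear"}
]
"""

def index_by_id(response_text):
    """Parse a raw batch response and index each coded record by comment ID."""
    records = json.loads(response_text)
    return {record["id"]: record for record in records}

codes = index_by_id(raw_response)
print(codes["ytc_Ugw2S_xu9G5dFHpcYAl4AaABAg"]["emotion"])  # -> fear
```

Indexing by ID once per batch makes each subsequent lookup a constant-time dictionary access rather than a scan of the array.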