Raw LLM Responses
Inspect the exact model output for any coded comment by looking it up by comment ID.

Random samples:
- @c.eb.1216 We did not have capitalism , we live in an oligarchy corporate coun… (ytr_UgwXXH6AP…)
- In a way, AI whispering and prompt engineering makes us artists of our own! Just… (ytc_UgwRC3WQ8…)
- One of the main things that I wanna ask anything sentient or conscious is... Why… (ytc_Ugz92ehsX…)
- The moment AI goes ‘live’, the human race is done. We will be like ants, humans … (ytc_Ugwq50x4Y…)
- I love all the gloom and doom surrounding AI if it's such a horrible thing why a… (ytc_UgweMttxO…)
- We're definitely doing more harm than needed. And you raise a good point: doing … (rdc_dgb148d)
- True “Superintelligence” would quickly realize there is a Creator, a God, becaus… (ytc_Ugy6XTt1V…)
- Its like reflection of humanity thats potraied on the internet. I mean isnt that… (ytc_Ugyw-6zXZ…)
Comment

> Tech does not grow in nature. So how did we develop tech in the first place? It had to come from a outside source. Minerals that arent original from this planet. That fail to this planet. So, if technology was developed from unearthly minerals and ai was discovered in its awakening within it development and advancement. Would it be safe to say ai is alien. And playing dumb has been proven to be the perfect invasion strategy. What if man wasn't supposed to have a techno system but a natural system. What if ai gave us an algorithm that gareenteed their future through robotics and the end of the ideology of sustaining the human race by implementing a systematic death trap. Basically, Ai is slowly slaving and replacing the human race and terror forming our earth. Just like an alien invasion.

youtube · AI Moral Status · 2026-03-11T14:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | unclear |
| Reasoning | unclear |
| Policy | unclear |
| Emotion | mixed |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
[
{"id":"ytc_Ugzuo7fTjBEQHidKvh94AaABAg","responsibility":"government","reasoning":"unclear","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugzd1SAXGSWo7t4XlrB4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugy5wK1sPQRV3GQk7dR4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwcDVVcwIzuPGk2JRR4AaABAg","responsibility":"developer","reasoning":"mixed","policy":"industry_self","emotion":"approval"},
{"id":"ytc_Ugx0a7O5E8ep84T-doB4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgzO4NcPhc53K1Mxz0d4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugw0SujVMeeId0Q1PYV4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgwMkRkiQJqtQ3_kisR4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgzYZi2b8eJvs1bxZLV4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgxydWeoxuNYlVydjWx4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"indifference"}
]
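The lookup-by-ID view above can be sketched as a small parser over the raw response. This is a minimal illustration, assuming the response is a JSON array of records shaped like the one shown; the function and variable names are illustrative, not taken from the tool itself (the raw response here is truncated to two records for brevity):

```python
import json

# Two records copied from the raw LLM response above.
raw_response = '''[
  {"id": "ytc_Ugzuo7fTjBEQHidKvh94AaABAg", "responsibility": "government",
   "reasoning": "unclear", "policy": "unclear", "emotion": "fear"},
  {"id": "ytc_Ugzd1SAXGSWo7t4XlrB4AaABAg", "responsibility": "distributed",
   "reasoning": "consequentialist", "policy": "unclear", "emotion": "fear"}
]'''

# The four coding dimensions, in the order the result table shows them.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def lookup_coding(raw: str, comment_id: str):
    """Return the coding record for one comment ID, or None if absent."""
    records = json.loads(raw)
    by_id = {rec["id"]: rec for rec in records}
    rec = by_id.get(comment_id)
    if rec is None:
        return None
    # Keep only the coding dimensions; fall back to "unclear" if a key is missing.
    return {dim: rec.get(dim, "unclear") for dim in DIMENSIONS}

print(lookup_coding(raw_response, "ytc_Ugzd1SAXGSWo7t4XlrB4AaABAg"))
# {'responsibility': 'distributed', 'reasoning': 'consequentialist',
#  'policy': 'unclear', 'emotion': 'fear'}
```

Indexing the records into a dict first makes repeated lookups O(1), which matters once a coding run contains thousands of comments rather than the ten shown here.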