Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
In the meantime, the first known major cyber hack using AI just occurred using l…
ytc_UgyNriS6V…
You know what a good 'insurance policy' for AI might be? Building in to it a rev…
ytc_UgzN5XS3R…
@AK-gh7mc I was afraid of it when Israeli said look at this picture of Hamas but…
ytr_Ugzrv92r3…
AI isn't the issue in this situation. Industrial automation was supposed to allo…
rdc_nckaobd
What many of these predictions overlook is the human response. As people lose jo…
ytc_UgxGtTv9C…
It ocurs to me that if there were a AI capable of paying políticians it probably…
ytc_UgzRlBa8x…
Humans are the greatest danger to humanity. Who created world wars throughout hi…
ytc_UgxXr5bgs…
AI powered robots are nearly here. They will be electricians and plumbers, and m…
ytc_UgyTamQFZ…
Comment
The Center of AI Studies focused on AI safety. What was revealed, for my understanding is in progress. That being, development for "The Borg." Artificial arms for the Borg replicated, existing AND years in production. If you look hard enough, you'll see parts sold in electronic stores.
Right out of Star Trek with a link to information from government communications headquarters - Q - plus other government communications departments.
Someone told me years ago, "Pay attention."
Source: youtube · Posted: 2026-02-26T02:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | government |
| Reasoning | unclear |
| Policy | unclear |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
{"id":"ytc_Ugy-lWg1B79vDAguBnt4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgzqGQXmt2UPTWQZ4I14AaABAg","responsibility":"government","reasoning":"unclear","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugww1OYqqiwvvmO2AaR4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgwqgFUpURDajEN6M554AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgyHf156zzG6vEO6wOh4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugy52iQXZEhkyfVBNxl4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"resignation"},
{"id":"ytc_Ugwr6V52Muc2LNzb-fp4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgxzQS8JtOIDlzQlWEh4AaABAg","responsibility":"government","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgyVOXFXBlvqxctPfJR4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgwesQ6dk4NWuTpPb3x4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"}
]
```
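A raw response like the one above can be parsed and indexed to support the comment-ID lookup this page offers. Below is a minimal sketch in Python, assuming the response arrives as a JSON string and that the allowed value sets (inferred from the samples shown here; the full codebook may contain more categories) are as listed:

```python
import json

# Allowed values per coding dimension — an assumption inferred from
# the sample rows above, not a confirmed copy of the full codebook.
ALLOWED = {
    "responsibility": {"government", "company", "ai_itself", "unclear"},
    "reasoning": {"consequentialist", "deontological", "unclear"},
    "policy": {"regulate", "liability", "industry_self", "none", "unclear"},
    "emotion": {"fear", "outrage", "approval", "resignation", "indifference"},
}

def index_response(raw: str) -> dict:
    """Parse a raw LLM response (a JSON array of coded rows) and index
    the rows by comment ID, dropping any row with an out-of-schema value."""
    coded = {}
    for row in json.loads(raw):
        if all(row.get(dim) in vals for dim, vals in ALLOWED.items()):
            coded[row["id"]] = {dim: row[dim] for dim in ALLOWED}
    return coded
```

Lookup then reduces to a dictionary access, e.g. `index_response(raw)["ytc_UgzqGQXmt2UPTWQZ4I14AaABAg"]["emotion"]`; rows the model coded outside the schema are silently excluded rather than stored with invalid labels.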