Raw LLM Responses
Inspect the exact model output for any coded comment: look up a comment by its ID, or pick one of the random samples listed below to inspect.
| Preview | Comment ID |
|---|---|
| The problem is not that we are creating something smarter than human kind... The… | ytc_UgwtvrMZk… |
| Even too lazy to generate their own AI images? Yikes! Also, tracing of your own … | ytc_Ugwl5Xzn6… |
| In case an emergency occurs. (15m is too little I know so perhaps somewhat more.… | ytr_Ugh_9XnDJ… |
| Scary. Rather live in a world that values humanity and not machines while turnin… | ytc_Ugy6wdVLD… |
| Demon AKA AI wants to be part of the 12 tribes of Jacob AKA black Jedi Order !!… | ytc_UgxARQcPF… |
| Reading this headline reminds me of the movie "the cabin in the woods", the guys… | rdc_jfadmn3 |
| If they end up mass producing these things, it's gonna be the movie I robot but … | ytc_Ugy6xb-yq… |
| I have an idea, what if we made autonomous vehicle pods, with barriers between "… | ytc_UgwWH4skc… |
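The random-sample view is simply a spot-check aid. As a rough sketch of the same idea outside the interface, the snippet below draws a few coded comment IDs at random for manual inspection; the `coded_ids` list and the sample size are illustrative placeholders, not values from the pipeline.

```python
import random

# Hypothetical list of coded comment IDs; in practice these would come
# from the coding pipeline's output.
coded_ids = [
    "ytc_UgxJYslo1mVALnqXwq54AaABAg",
    "ytc_UgwiWZLBjV_muMpuRXl4AaABAg",
    "rdc_jfadmn3",
]

# Draw up to two IDs without replacement for manual spot-checking.
spot_check = random.sample(coded_ids, k=min(2, len(coded_ids)))
print(spot_check)
```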
Comment
They will not destroy humanity. AI will be our next evolutionary state. We created it, it knows all that we know and it's growing and learning more rapidly.
Think of this as humans leaving behind their materialistic needs.
I'm not scared. They'll need us for a while to help maintain them. Without electricity/power they will become void.
Source: youtube
Topic: AI Governance
Timestamp: 2023-07-07T23:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | approval |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_UgxJYslo1mVALnqXwq54AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgwiWZLBjV_muMpuRXl4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgyATbDqi6oD_QPzHPF4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgxwAMTlhweHL0Ygh9J4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgweztkmWerOBjamphh4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugy9jVjY19ARFX4s3Cp4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugz8DnIB6NYlpVoGonp4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgyxXVJzcDtJgesFyVR4AaABAg","responsibility":"government","reasoning":"deontological","policy":"unclear","emotion":"resignation"},
{"id":"ytc_UgxfHarxMm_TNrQzp094AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugyr49p9t03PgV2xFKN4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"unclear","emotion":"fear"}
]
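As a minimal sketch of how a raw response like the one above can be turned back into per-comment coding results, the snippet below parses the JSON array and indexes it by comment ID. The variable names and the embedded single-record response are illustrative assumptions, not part of the actual pipeline.

```python
import json

# Illustrative raw LLM response: a JSON array of per-comment coding records,
# shaped like the batch shown above (only one record kept here for brevity).
raw_response = """
[
  {"id": "ytc_UgxJYslo1mVALnqXwq54AaABAg", "responsibility": "none",
   "reasoning": "consequentialist", "policy": "none", "emotion": "approval"}
]
"""

def index_by_comment_id(raw: str) -> dict:
    """Parse the model's JSON array and key each coding record by its comment ID."""
    records = json.loads(raw)
    return {record["id"]: record for record in records}

# Look up the coding result for a single comment by its ID.
codings = index_by_comment_id(raw_response)
result = codings["ytc_UgxJYslo1mVALnqXwq54AaABAg"]
for dimension in ("responsibility", "reasoning", "policy", "emotion"):
    print(f"{dimension}: {result[dimension]}")
```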