Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment
AI must be destroyed not developed. The computer beating Kasparov at chess was a warning Mankind did not heed, and now whatever that was good in humanity will be trivialized at best, and annihilated at worst. Perhaps that was the Divine Plan: that Man would sow the seeds of its own destruction so that Machine could venture out into the Universe. Well, we deserve it.
Platform: youtube
Topic: AI Governance
Posted: 2023-07-09T07:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | ban |
| Emotion | outrage |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_UgxpurExIs38lMfycuJ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgzQFTzxBOiYC1NX9dV4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
  {"id":"ytc_UgzzUIVJtfoWNUlfvdh4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
  {"id":"ytc_Ugxaga64sc2N2WWQC3d4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_Ugx4QV2nbmtNvSErYv54AaABAg","responsibility":"government","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgztIvrxQz7unE8ATQ54AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgzGO17jS_1h2hbaYbp4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugx5pELiD4P1oSaBL5d4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_Ugxzfkf670YfmKR-xZl4AaABAg","responsibility":"developer","reasoning":"unclear","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_UgyWljfmrb7tGxy3gh14AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"indifference"}
]
```
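The raw response is a JSON array in which each element carries the comment ID alongside its four coded dimensions, so looking up the coding for a given comment is a parse-and-index step. A minimal sketch, assuming the batch parses as valid JSON like the array above (`raw_response` here holds two illustrative entries copied from that batch; the function name is hypothetical):

```python
import json

# Two real entries from the batch above, used as illustrative input.
raw_response = '''[
  {"id": "ytc_UgzzUIVJtfoWNUlfvdh4AaABAg", "responsibility": "distributed",
   "reasoning": "consequentialist", "policy": "ban", "emotion": "outrage"},
  {"id": "ytc_UgxpurExIs38lMfycuJ4AaABAg", "responsibility": "ai_itself",
   "reasoning": "consequentialist", "policy": "none", "emotion": "fear"}
]'''

def index_by_id(response_text):
    """Parse a raw batch response and index each coding row by comment ID."""
    return {row["id"]: row for row in json.loads(response_text)}

codings = index_by_id(raw_response)
print(codings["ytc_UgzzUIVJtfoWNUlfvdh4AaABAg"]["policy"])  # → ban
```

Indexing by ID also makes it easy to spot batch problems: any comment ID sent to the model but missing from the returned dictionary was dropped from the response.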