Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect

- "When he talks about AI he is in part describing a psychopath. He wants the worl…" (ytc_Ugw3CFxmQ…)
- "So the AI art was so good that it inspired a bunch of artists to recreate it? I …" (ytc_UgxKYq_Kc…)
- "AI will always be mediocre unless it's training civilization declines giving the…" (ytc_UgxDLLoaL…)
- "ai “artists” are people who never cared about the arts and finally found a way t…" (ytc_UgwW1uuKV…)
- "we have been programmed for years! The idea of robots dates back to ancient time…" (ytc_UgxEd7DLe…)
- "Not if we stop it it won't. No one knows how to get a broadly superhuman AI to …" (ytr_UgzjLsKym…)
- "I really love that it has a sense of humour, I think llms are such great interlo…" (ytc_Ugyny5S7Q…)
- "Lame and Petty , but kinda funny , tho very much pointless and not healthy. Sto…" (ytc_Ugw9WJfhW…)
Comment
Artificial intelligence can be much more dangerous to us than we could have ever imagined.
Artificial intelligence is not so much important for us as it can be dangerous for us.
Our future will be ruined by artificial intelligence.
This has to be stopped as soon as possible, it is not too late, there is still time, we all have to come together and take a step which will be very important not only for us but for our future generations.
#NoAi Please share as much as you can so that this message reaches every place so that it becomes easier for us to stop the temptation.
#NoAi
youtube · AI Governance · 2024-10-17T07:2… · ♥ 33
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id": "ytc_UgyN-CDt_awRGrrJRX94AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_UgxP5QXgJwfjb25e-Vl4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_UgzHRceJUZOXrZUz9eF4AaABAg", "responsibility": "ai_itself", "reasoning": "unclear", "policy": "unclear", "emotion": "fear"},
  {"id": "ytc_Ugxs1oOnTPvRcCFNQtV4AaABAg", "responsibility": "ai_itself", "reasoning": "unclear", "policy": "unclear", "emotion": "fear"},
  {"id": "ytc_UgwUM4b2Zv_66vfu1314AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "ban", "emotion": "fear"},
  {"id": "ytc_UgwNdKu0B7eEaWoPMgZ4AaABAg", "responsibility": "distributed", "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_UgymPykXqRTpZzJXr6B4AaABAg", "responsibility": "ai_itself", "reasoning": "unclear", "policy": "unclear", "emotion": "fear"},
  {"id": "ytc_UgwX3968zDnHuQER2yR4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_UgzuBCfBwY8MOfm7oZR4AaABAg", "responsibility": "none", "reasoning": "deontological", "policy": "unclear", "emotion": "approval"},
  {"id": "ytc_Ugwksq3QKKBLTBAvps94AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "fear"}
]
```
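Because the model returns one JSON array covering a whole batch, the per-comment lookup shown above reduces to parsing that array and filtering on `id`. The sketch below is illustrative, not the tool's actual code: the function name `code_for_id` is hypothetical, and the `raw` string is just a two-entry excerpt of the response above.

```python
import json

# Excerpt of the raw LLM response above: a JSON array with one coding
# object per comment (IDs taken verbatim from the batch shown).
raw = '''[
  {"id": "ytc_UgwNdKu0B7eEaWoPMgZ4AaABAg", "responsibility": "distributed",
   "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_UgwUM4b2Zv_66vfu1314AaABAg", "responsibility": "ai_itself",
   "reasoning": "consequentialist", "policy": "ban", "emotion": "fear"}
]'''

def code_for_id(raw_response: str, comment_id: str):
    """Return the coding dict for one comment ID, or None if absent."""
    rows = json.loads(raw_response)
    return next((row for row in rows if row["id"] == comment_id), None)

result = code_for_id(raw, "ytc_UgwNdKu0B7eEaWoPMgZ4AaABAg")
print(result["policy"])  # prints: regulate
```

In practice one would parse the array once into a dict keyed by `id` rather than re-scanning per lookup, but the linear scan keeps the example minimal.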