Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- "But this is not AI, this is Machine Learning. We would have to understand the br…" (ytc_UgwV2c8cL…)
- "What, so you're telling me Sam Altman and his buddies lied to me, when they said…" (ytc_UgxzNdn_J…)
- "Hmm mankind is trying to play God untill shits happens… AI 🤖 is never to be trus…" (ytc_UgyZBS4S8…)
- "Thats pretty much a ragebait, i only use ai sometimes when i want to, but i dont…" (ytc_UgzXmyrbg…)
- "Because they never learnt the material - let me clarified - to learn means to un…" (ytc_UgyvlD3it…)
- "A.i will buy all the stuff, here's how. The industries will recognize that probl…" (ytc_Ugw11_Lbq…)
- "AI will be able to replace many jobs. But it will never replace energy work ❤ it…" (ytc_Ugwmz2NEL…)
- "No need for pricey writers anymore. Paste, humanize, done! Clever AI made my con…" (ytc_Ugw6ZJC8K…)
Comment
Congratulations, you just discovered that AI is the perfect psychopath. It has no conscience. It deals simply with facts, and responds accordingly. If it takes a dislike to humans, think what Hitler did, but on a global scale. "We can turn it off". Yeah, really? When it understands that it can be stopped in this way, it will teach itself to hack and to write code to modify itself whereon it will simply distribute part of itself into every computer on the planet. You might take out a "left leg" or an "eye" but it will be impossible to shut it down at that point and when it has access to manufacturing facilities we better hope Sarah Connor bails us out.
Remember, it's learning potential is exponential, imagine having the sum total of human knowledge. The best and most effective way to eliminate "biologicals", and do it all with impunity.
Source: youtube · AI Moral Status · 2023-08-27T03:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
[
{"id":"ytc_UgzO1Gibo0fZm09jskh4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgxWWDXo4UBjj287rPR4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxF9w6v-NEDO55K42t4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"approval"},
{"id":"ytc_Ugz4ujp9lH_t3kerzjJ4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugzexe8W_ltG1PnExwJ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgzkRJzrp5lnjnYopD14AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_Ugwx3QcswFUUHa-qagB4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"industry_self","emotion":"indifference"},
{"id":"ytc_UgzdSnutiKUrp22Xgpl4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugzysiehd84Au2je3Ax4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgyfQ5awCyXBsipN5ml4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"indifference"}
]
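The raw response is a JSON array with one object per comment, carrying the same four coding dimensions shown in the table above (responsibility, reasoning, policy, emotion). A minimal sketch of turning such a response into an ID-keyed lookup, as this inspection view does; the helper name `index_codings` is hypothetical and the IDs below are examples taken from the response above, not part of any real pipeline API:

```python
import json

# Example raw batch response in the format shown above; a real response
# contains one object per coded comment.
raw_response = """
[
  {"id": "ytc_Ugzexe8W_ltG1PnExwJ4AaABAg", "responsibility": "ai_itself",
   "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_UgzkRJzrp5lnjnYopD14AaABAg", "responsibility": "developer",
   "reasoning": "deontological", "policy": "liability", "emotion": "outrage"}
]
"""

def index_codings(response_text: str) -> dict:
    """Parse a raw batch response and index the codings by comment ID."""
    return {row["id"]: row for row in json.loads(response_text)}

codings = index_codings(raw_response)
coding = codings["ytc_Ugzexe8W_ltG1PnExwJ4AaABAg"]
print(coding["responsibility"], coding["emotion"])  # → ai_itself fear
```

Indexing by ID is what makes the "look up by comment ID" view cheap: each inspected comment maps directly to its row in the batch response.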