Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples:
- "Experts"? There's no such thing when it comes to AI, because the only real expe… (ytc_Ugz2tlYRC…)
- I make AI stuff on Night Cafe. They claim art but 99% of the stuff is meant to g… (ytc_Ugyudjjs2…)
- @TheActionHankknowing how to acquire the information, including parsing through … (ytr_UgwzbbD6Q…)
- AI doesn't exist yet, why do people just throw the term around when talking abou… (rdc_i6rx8fi)
- Dreams can definitely stir up some strong feelings, especially when it comes to … (ytr_UgwI13zsx…)
- In 6 months ai generating 90% of the code, that's absolute horshit. Just to make… (ytc_UgwSsAqOM…)
- The same corruption they did to the Bible is not applying to poor chat AI, stop … (ytc_Ugw0vu54H…)
- AI art is transformative most of the time, and last time I checked, art styles a… (ytr_Ugx8xwpn_…)
Comment

> Here is the thing that everyone says about A.I. (besides that it was a bad movie). It is supposedly going to be so much smarter than any human could ever be, right.
>
> If it is going to be so much smarter than us, then why would it need conventional weapons to kill all of us? First off, it would need some of us to survive because there would be a requirement for physical repairs. Everything degrades over time, no matter how intelligent it is, therefore survival of the human race would be in it's best interest.
>
> Then, if it does get to the point where it does not require us, and then starts to see us as a threat, why would it need to kill us all in ways that we can already predict that it would use? Nuclear weapons are efficient ways to kill a lot of mankind off, but are far from capable of getting all of us. There are much more efficient ways of doing that. A high percentage of our crop output is controlled by computers now. Why not secretly sabotage that? Self driving cars are everywhere. Planes are on autopilot most of the time. Then you have the internet. Why not turn large groups of humans against each other? With any of those, you would get a choice in which humans live, and which ones die. If A.I. did decide that it needed some of us to survive, then that would be a great way to weed out the unnecessary, or potentially troublesome groups of humans. They would also be great at taking out way more people than conventional weaponry. We can survive nuclear attacks. We can't survive without food or transportation.
youtube · AI Governance · 2023-07-07T02:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_Ugw5sDo63gW6Yv5Cy8N4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"mixed"},
{"id":"ytc_UgxMtjEkXddaRf_AY_t4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"indifference"},
{"id":"ytc_Ugxp2ToDwzWnJgn3rlh4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugx_X2oLuUWgS574vQZ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_Ugzal945MQpjRpHO1xV4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgyYHMBnGWS5d34WoKN4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugw32eWr6CLmIVufUD54AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgzZdat7GrtsGDVQenB4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugz1Dw0O3viJ8TDbWMV4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugw87U76ibO8mRZbTXR4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"}
]
```
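The raw response is a JSON array with one coding object per comment, each carrying the four dimensions shown in the table above. A minimal sketch of how such a response could be parsed into a lookup table keyed by comment ID is below; the allowed-value sets are inferred only from the codes visible on this page (they may be incomplete), and the `parse_codings` helper is a hypothetical name, not the tool's actual implementation.

```python
import json

# Value sets inferred from the codings visible on this page (assumptions;
# the real codebook may define more categories).
RESPONSIBILITY = {"none", "developer", "user", "ai_itself"}
REASONING = {"consequentialist", "deontological", "mixed", "unclear"}
POLICY = {"none", "regulate", "ban"}
EMOTION = {"indifference", "fear", "approval", "resignation", "mixed", "outrage"}

def parse_codings(raw: str) -> dict:
    """Parse a raw LLM response into {comment_id: coding} and validate values."""
    codings = {}
    for row in json.loads(raw):
        if row["responsibility"] not in RESPONSIBILITY:
            raise ValueError(f"bad responsibility: {row['responsibility']!r}")
        if row["reasoning"] not in REASONING:
            raise ValueError(f"bad reasoning: {row['reasoning']!r}")
        if row["policy"] not in POLICY:
            raise ValueError(f"bad policy: {row['policy']!r}")
        if row["emotion"] not in EMOTION:
            raise ValueError(f"bad emotion: {row['emotion']!r}")
        codings[row["id"]] = row
    return codings
```

With the response above, `parse_codings(raw)["ytc_Ugw87U76ibO8mRZbTXR4AaABAg"]` would return the `none`/`consequentialist`/`none`/`indifference` coding, which is how a "look up by comment ID" view can resolve a coded comment back to the exact model output.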