Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment
The Military Industrial Complex cannot resist the temptation to put AI into weapon systems, including jets and tanks, because they must assume that their adversary is doing it too.
Imagine a state-of-the-art jet that can fly much faster than a human pilot can tolerate, maneuver much better and faster than a human pilot, and obey any orders to destroy anything or anyone.
When AI is plugged into military systems, that will be the end of civilization. COLOSSUS: The Forbin Project will destroy all of us.
Civilization is phukt.
Source: youtube · Posted: 2023-05-11T15:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | government |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
{"id":"ytc_Ugy64aWXg2olnpWAJol4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgzsGNnMT4cNnAMu6HF4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgwFiuanoGJpsooBNBx4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"resignation"},
{"id":"ytc_UgwQ6pLy0cWdsyf2Uzx4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgyZnV8tEWWkZDFr0Oh4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgyGjPjZxXIzE_jq57B4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgzHYZwG-uvO01NV4W14AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_UgztYU1FSeCixbvvhQ54AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"fear"},
{"id":"ytc_UgwidfAA62dP6xOHzUl4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgxXwg2QHV20U-1HoJd4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"mixed"}
]
```
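Looking up a single comment's codes in a raw batch response like the one above can be sketched as follows. This is a minimal illustration, not the tool's actual implementation: it assumes the model returns a JSON array of objects keyed by `id`, and the `lookup` helper name and the shortened sample payload are hypothetical.

```python
import json

# Hypothetical excerpt of a raw batch response: a JSON array of
# per-comment codes, each object keyed by the comment's "id".
raw = '''[
  {"id": "ytc_Ugy64aWXg2olnpWAJol4AaABAg", "responsibility": "government",
   "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_UgztYU1FSeCixbvvhQ54AaABAg", "responsibility": "company",
   "reasoning": "deontological", "policy": "liability", "emotion": "fear"}
]'''

def lookup(raw_response: str, comment_id: str):
    """Return the coded dimensions for one comment ID, or None if absent."""
    for row in json.loads(raw_response):
        if row.get("id") == comment_id:
            return row
    return None

codes = lookup(raw, "ytc_Ugy64aWXg2olnpWAJol4AaABAg")
print(codes["emotion"])  # fear
```

A scan like this also makes it easy to flag malformed rows (e.g. a missing `id` key) before the codes are merged back into the dataset.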