Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples

- "I just imagined the blue light on the "robot area" safety vest turning red and a…" (ytc_Ugw7kZZtQ…)
- "So get an ai to right a paper and just open a blank doc and right it that way 😂😂…" (ytc_UgxTKnqjI…)
- "I'd like to add one more analogy to the list: If a genius chef who, by all mea…" (ytc_Ugxbc82zk…)
- "It's not hard just don't use ai. Let them any many more who dwell in it perish…" (ytc_Ugx5lsNKy…)
- "The video is well made but it leaves out some relatively obvious stuff that woul…" (ytc_Ugis80PWS…)
- "I would love to see a robot take my job. Lmfao how the fuck they going to lift a…" (ytc_Ugx271OA8…)
- "You feed all of humanity's recorded knowledge up to this point, and you equate i…" (ytc_UgynvHIYU…)
- "AI: Agrees with most of the stuff, but just letting you know it is for a specifi…" (ytc_UgwrhhXFt…)
Comment
The us air force started an A.I program to pilot the air fighter, but it went wrong.
In the simulation, when instructed not to kill the enemy, it will kill the controller (human) so that no one could stop it from killing enemies.
Rewrite the program not to kill controller, then it will take out communication tower 1st so it will not able to receive any order to prevent it from killing all enemies.
This A.I shit is legitimately a fucking terminator . How the fuck can't human learn n be more cautious
Arrogant n pride will cost us the future.
youtube
2023-06-04T04:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | fear |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
[
{"id":"ytc_UgwQcystQotpefmCZwt4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgzZu9b80ojEeYdIU6t4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugwp2v048MRYSBrnUZh4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwVM2OXcclaH-3xnL94AaABAg","responsibility":"developer","reasoning":"virtue","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgwdaFbIUi1scoS_lpt4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgyuGt-TooMnvfpRJ6t4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_UgxwURRMbnqd672nAJp4AaABAg","responsibility":"none","reasoning":"unclear","policy":"ban","emotion":"mixed"},
{"id":"ytc_Ugyi175Vu9MOfasoPrl4AaABAg","responsibility":"company","reasoning":"virtue","policy":"industry_self","emotion":"approval"},
{"id":"ytc_UgwESoVi0-6Tj8PcNrF4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxfosoxQg7Dij5U-sJ4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"}
]
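The raw response above is a JSON array with one record per comment, each carrying the four coding dimensions shown in the result table. A minimal sketch of how such a response could be parsed and looked up by comment ID is below; the allowed values per dimension are inferred from the examples on this page, and the actual codebook may define additional categories.

```python
import json

# Allowed values per coding dimension, inferred from the sample records above.
# This is an assumption: the real codebook may include more categories.
SCHEMA = {
    "responsibility": {"developer", "company", "ai_itself", "none"},
    "reasoning": {"consequentialist", "virtue", "unclear"},
    "policy": {"ban", "regulate", "industry_self", "none", "unclear"},
    "emotion": {"fear", "mixed", "indifference", "approval", "outrage"},
}

def parse_response(raw: str) -> dict:
    """Parse a raw LLM response into {comment_id: codes}, validating each value."""
    records = json.loads(raw)
    coded = {}
    for rec in records:
        cid = rec["id"]
        for dim, allowed in SCHEMA.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{cid}: unexpected value {rec.get(dim)!r} for {dim}")
        coded[cid] = {dim: rec[dim] for dim in SCHEMA}
    return coded

# Hypothetical single-record response, for illustration only.
raw = ('[{"id":"ytc_example","responsibility":"ai_itself",'
       '"reasoning":"consequentialist","policy":"unclear","emotion":"fear"}]')
codes = parse_response(raw)
print(codes["ytc_example"]["emotion"])  # fear
```

A validation step like this catches the common failure mode where the model invents an off-schema label, so a bad record fails loudly instead of silently entering the coded dataset.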