Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
I'm a dude and immediately knew it was Ai so I don't know why people keep acting…
ytc_Ugyns44g8…
In the end, thats what will happen. There won't be as high of a demand for a Tay…
rdc_kjkda7y
About the consciousness debate so scroll past if you don't wanna get existential…
ytc_UgzNJ9cu9…
I don't know how dumb you have to be to think a robot could have rights. Anyone …
ytc_UgimUt_LM…
I doubt the AI Is racist... most likely the creators coding. But in your video …
ytc_Ugw984TZc…
But if everyone or even most are out of a job, the economy will plumet and there…
ytc_UgzB_s3jE…
None of us in Texas signed up to be the beta test for driverless trucks…
ytc_UgzVRH5ce…
I am 66 and socially introverted. Had many years of trauma due to people. People…
ytc_Ugy_2FOuD…
Comment
The deathly force of robots isn't even the worst part, rather then at a regular war at some point a group of soldiers or even a whole nation would give up if the damage reaches a certain point and the ongoing fight doesn't seem getting you anywhere. Now if you send robots and some of them get somehow destroyed, more can be send to the battlefield and with drones and flight robots sooner even quicker war robots can arrive at war places, so that there will be no end of war unless the other countries are fully destroyed or taken over.
The best is to pray that a hot war like in the world war or as closely as big won't happen again, because even if nuclear weapons aren't used the robot's will do tremendous killings and damages and a few leaders won't care for those of the majority (civilians especially) suffers the most.
youtube
2021-03-04T06:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | fear |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
[
{"id":"ytc_UgyNTGK9twrNcoI-U354AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugz3qtD5gyvWKd1-H4d4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugwni8F1zbtRxO8E9KV4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"resignation"},
{"id":"ytc_UgwatnYrT0z68WJxmfV4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgzQIXiA5lMd8k5hxYJ4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_UgzR_KobipTUlM4dDJp4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_UgwFuhStP2v7qtBZ5e94AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugz6vxNRnRRPvOpIBWN4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgzRcT4h6LW9QWi5qQN4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"mixed"},
{"id":"ytc_UgxY2pum3O5NPi2XNiZ4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]
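The lookup-by-comment-ID flow above can be sketched in Python. This is a minimal illustration, not part of the tool: it assumes the raw LLM response is exactly the JSON array format shown (a list of records keyed by `id`, with `responsibility`, `reasoning`, `policy`, and `emotion` fields), and indexes the records in a dict for fast retrieval.

```python
import json

# A small excerpt of the raw LLM response shown above,
# in the same one-record-per-comment JSON array format.
raw = '''[
{"id":"ytc_UgyNTGK9twrNcoI-U354AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgxY2pum3O5NPi2XNiZ4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]'''

# Parse the array and index the coding records by comment ID
# so a single comment's coding can be looked up directly.
records = {rec["id"]: rec for rec in json.loads(raw)}

# Look up one comment's coding result by its ID.
rec = records["ytc_UgyNTGK9twrNcoI-U354AaABAg"]
print(rec["emotion"])        # fear
print(rec["reasoning"])      # consequentialist
```

Validating that every record carries the four expected dimensions before indexing would catch malformed model output early.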