# Raw LLM Responses
Inspect the exact model output for any coded comment.
## Random samples
- “Your comment shows your lack of intelligence. You couldn’t even use A.I to creat…” (`ytr_UgxVWv8Ei…`)
- “AI consciousness is overrated. AGI could mess us all up, permanently, without ev…” (`ytc_UgzxF9lc-…`)
- “Human is bad disease of the Earth ! Luckily A.I will fix that. Welcome to new sp…” (`ytc_Ugyv2wW-Q…`)
- “This is a weird interview. I have a better understanding of AI. It’s odd to mak…” (`ytc_UgwlHAv9w…`)
- “@MTMC1111 only Three states in america have laws in regards to deepfakes. Nowher…” (`ytr_UgwW3Kn9k…`)
- “Stop, please, like some ho-down boxing match is gonna have the cash to buy a rob…” (`ytc_Ugyyu4bT4…`)
- “Naw...he's naive with his answer to the 'trick question' of sentience he interac…” (`ytc_UgwFihX22…`)
- “Look. Have you been not paying attention until now? We are already f**ked. Do …” (`ytc_UgwkQRPh9…`)
## Comment

> THE biggest talking point in any war that a democracy gets involved in often comes down to the subject of "Boots on the Ground". The cost of life and the count of body bags is often the deterrent to most politicians who might be considering getting involved in a global conflict.
>
> Just how far away are we from putting weapons in the hands of "Robots" and sending them off to war in the name of democracy? What would the current war with Iran look like if the U.S had fully functioning autonomous humanoid weapons of war?

Source: youtube · Posted: 2026-03-11T13:0… · ♥ 2
## Coding Result
| Dimension | Value |
|---|---|
| Responsibility | government |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
## Raw LLM Response

```json
[
  {"id":"ytc_UgwxjFYESy6tomjwN4d4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgwhSkA-8HdrOjj_7dB4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"industry_self","emotion":"indifference"},
  {"id":"ytc_Ugy0gQLgLOo6OkcUUVt4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
  {"id":"ytc_UgwaPyWNZQ6ha3J2eg94AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytc_UgymZ5qYdd5I8ml6IAh4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"industry_self","emotion":"indifference"},
  {"id":"ytc_UgxTT4_0jF2ORMzHTDR4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgwTiWKgZdR_o3XQXdJ4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_Ugy72D3wTYuNj6RjDVt4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"liability","emotion":"resignation"},
  {"id":"ytc_UgynN6nUrxQ7PRQKWVd4AaABAg","responsibility":"government","reasoning":"virtue","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugza79EGRXrOH4F_VZ14AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"}
]
```
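A response like the one above has to be parsed and sanity-checked before the codes land in the results table. The sketch below is a minimal Python example of that step; the allowed code lists are inferred only from the values visible in this dump (the real codebook may define more categories), and the `validate_batch` helper name is hypothetical, not part of any actual pipeline.

```python
import json

# Allowed codes per dimension -- inferred from this dump; the real
# codebook may include categories not seen here (assumption).
ALLOWED = {
    "responsibility": {"government", "developer", "company", "user",
                       "ai_itself", "distributed", "none", "unclear"},
    "reasoning": {"consequentialist", "deontological", "virtue", "unclear"},
    "policy": {"regulate", "ban", "liability", "industry_self", "none", "unclear"},
    "emotion": {"fear", "outrage", "indifference", "resignation", "mixed"},
}

def validate_batch(raw: str) -> list[dict]:
    """Parse a raw LLM response and keep only well-formed records."""
    records = json.loads(raw)  # raises JSONDecodeError on malformed output
    valid = []
    for rec in records:
        # Each record must be an object with a comment id and only
        # known codes in every dimension.
        if not isinstance(rec, dict) or "id" not in rec:
            continue
        if all(rec.get(dim) in codes for dim, codes in ALLOWED.items()):
            valid.append(rec)
    return valid

raw = ('[{"id":"ytc_UgwxjFYESy6tomjwN4d4AaABAg","responsibility":"government",'
       '"reasoning":"consequentialist","policy":"regulate","emotion":"fear"}]')
print(len(validate_batch(raw)))  # -> 1
```

Records that fail validation are dropped rather than repaired here; a production pipeline would more likely flag them for a re-coding pass so no comment silently loses its codes.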