Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment
robot 'lives' are basically infinite, whereas living organisms' lives are finite. So, by an argument of suffering that increases the likelihood of survival, robots wouldn't ever really feel pain, just respond to stimuli. So if rights are on the basis of pain, then they shouldn't have rights. I guess it all comes down to whether or not they become determined enough to fight for rights.
youtube
AI Moral Status
2019-04-03T19:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | mixed |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id":"ytc_UgxNM4GRi13cSkE_3bt4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgxYVFJh4J0NrQ3DEI54AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugy-Q5DKyQ4-6-ZjUKN4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_UgynrUEUKnxPZqAeYll4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"resignation"},
  {"id":"ytc_Ugy2-uaQiG8DugU4lup4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_Ugwx3Ied_p_b0xz2AEh4AaABAg","responsibility":"none","reasoning":"deontological","policy":"regulate","emotion":"approval"},
  {"id":"ytc_UgyUFKI19W56UeTHVtF4AaABAg","responsibility":"developer","reasoning":"unclear","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgwsfGbBUBgMy47XJJR4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgxJ8G11PR5IkT8fd6F4AaABAg","responsibility":"government","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgzcDgg_0Gl8shJX_0Z4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"}
]
```
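A minimal sketch of looking up one comment's coding in a raw response like the one above. It assumes the model output is a well-formed JSON array of objects with the field names shown (`id`, `responsibility`, `reasoning`, `policy`, `emotion`); the two entries here are copied from the sample response:

```python
import json

# Raw model output in the format shown above: a JSON array with
# one coding object per comment.
raw = """[
  {"id": "ytc_UgxNM4GRi13cSkE_3bt4AaABAg", "responsibility": "distributed",
   "reasoning": "mixed", "policy": "unclear", "emotion": "mixed"},
  {"id": "ytc_UgxYVFJh4J0NrQ3DEI54AaABAg", "responsibility": "none",
   "reasoning": "unclear", "policy": "none", "emotion": "approval"}
]"""

# Index codings by comment ID so a single comment can be inspected directly.
codings = {row["id"]: row for row in json.loads(raw)}

coding = codings["ytc_UgxNM4GRi13cSkE_3bt4AaABAg"]
print(coding["responsibility"])  # distributed
```

Building the ID-keyed dict once makes repeated lookups O(1), which matters when cross-referencing many coded comments against a large response batch.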