Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- "Y’know, I don’t really jive with the whole “AI art lacks a soul” argument, becau…" — ytc_Ugy7kCA2J…
- "Wondering if this can be class action against those ai illustrations company Si…" — ytc_Ugzon_7ku…
- "All these discussions about AI are very informative for what lies ahead of human…" — ytc_UgyzRiMQN…
- "if we all become plumbers now its going to be paying dirt and the loop continues…" — ytc_Ugw5JAcJP…
- "The EU is currently investing in the development of AI combined with robots to d…" — ytc_Ugw_HXZrP…
- "The thing I wish people would explore more is AI is going to have adversaries, e…" — ytc_UgySKBOAj…
- "Also in my opinion, not that I know enough to deserve one on this subject but I'…" — ytc_UgwcNw8YV…
- "Sound like sophia is learning or she can about us but robot will out think us …" — ytc_UgzEfyDLp…
Comment
Lol wat kind of sick joke is this. If someone programmed a robot to ‘feel pain’ why would they want to give them rights. Besides ‘feeling pain’ is impossible for robots. They seem like they are in pain but they do not feel anything. Besides how would a sentient robot work. Ai is made to better and help humans be more efficient. An ai uprising is close to impossible, but a sentient robot is impossible.
youtube · AI Moral Status · 2020-05-19T16:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
[
{"id":"ytc_UgxFwg13HIwDYvN1xzB4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzYCNhwxammyrS6RO14AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgwjBPwzT_Qw7n9UD2d4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugw7cMFZrGGurrR-LaB4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgzFihmfK6GnXiI18aR4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgwF7AYOXBXbRHE0Bx54AaABAg","responsibility":"none","reasoning":"deontological","policy":"unclear","emotion":"approval"},
{"id":"ytc_UgxQB_SJFgAqADe4Bm54AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgwKaH_8G5iEjt8UWut4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgzY-XCMoxUorvFlSgR4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgzRnuXxDby7aOjA9Id4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"}
]
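A raw response like the one above is a JSON array of coding records, one per comment, each carrying the four coding dimensions shown in the table (responsibility, reasoning, policy, emotion). A minimal sketch of how such a response could be parsed and indexed by comment ID for lookup (the helper name `index_by_id` is an assumption, not part of the tool; the sample records are copied from the dump above):

```python
import json

# Excerpt of a raw LLM response in the same shape as the dump above:
# an array of per-comment coding records.
raw_response = """
[
  {"id": "ytc_UgxFwg13HIwDYvN1xzB4AaABAg", "responsibility": "none",
   "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgwKaH_8G5iEjt8UWut4AaABAg", "responsibility": "developer",
   "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"}
]
"""

def index_by_id(raw: str) -> dict:
    """Parse a raw LLM response and map each comment ID to its coding record."""
    records = json.loads(raw)
    return {rec["id"]: rec for rec in records}

# Look up the coding for one comment by its ID.
codings = index_by_id(raw_response)
print(codings["ytc_UgwKaH_8G5iEjt8UWut4AaABAg"]["emotion"])  # fear
```

Indexing by `id` mirrors the "look up by comment ID" workflow of the inspection page: once parsed, any coded comment's dimensions can be retrieved in constant time.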