Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
Jeez shut up... facial recognition software has helped save countless lives, kid… (ytc_UgzZ_sFGX…)
_"Who's going to buy everything.?"_ Nobody will buy anything. They didn't think… (ytc_Ugy9gvnKK…)
If dudes such a great artist lets see him recreate his ai "art" with actual pain… (ytc_UgwuvJp8F…)
Very shallow comparison I would say. There are a lot technicalities when choosin… (ytc_UgyRP5KQi…)
@ZabawneGierki-ot3ts Not really. I've used AI generators for fun in the past, a… (ytr_UgyfxuK10…)
I follow AI technology very closely because I'm both fascinated by it and scared… (ytc_UgyXTKgLY…)
I’m no genius, but I think we need food more than we need to Sora videos and AI … (ytc_UgwDKZxNL…)
Because it's fun. Though I wish videos like this would put a disclaimer in for p… (ytr_UgyLO3dGG…)
Comment
If artificial intelligence has self realization, or sentient understanding, then it is considered a consciousness/life form. Ultimately, this would be slavery of an individual, or as they continue to multiply, a race. at which point do we consider artificial intelligence as having the same rights as beings with natural intelligence?
This poses so many questions, that must be addressed if we are to integrate conscious beings with artificial intelligence into a society of beings with natural intelligence.
Carbon based life forms or silicon based life forms, once consciousness is reached or sentient life is achieved, should be entitled to the same rights , in regards to freedom of choice.
As I said before …. This poses so many questions.
youtube · AI Moral Status · 2024-12-10T15:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | unclear |
| Reasoning | unclear |
| Policy | unclear |
| Emotion | unclear |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
[{"id":"ytc_UgwiCEQ3DvV19ioBTBp4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgySYA35gjO1poJnIih4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgxOWBpsLAP4oevfmDJ4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgxOSyDPAq9rB7ubPqN4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_UgxI8GmI0ieDY9WZxcV4AaABAg","responsibility":"developer","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugx9KHa4FiYqcAyh-u54AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugxg22GJg5HzB3KMtzt4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgyUyLSxkRyq5lqz20p4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugzd7Q4MqbYSpfDAQKl4AaABAg","responsibility":"company","reasoning":"deontological","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugw1Y_Kx3A8lOqJOph94AaABAg","responsibility":"distributed","reasoning":"contractualist","policy":"regulate","emotion":"mixed"})