Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples

- "Very interesting, but I haven't seen yet a fundamental question about AI that ul…" (ytc_Ugw2FREKp…)
- "@TinyLadyKris Did we watch same video? Haley compared all deep learning tech to …" (ytr_Ugy6Nz76p…)
- "Ai is a godsend! Humans will only benefit from this technology,it will make life…" (ytc_Ugz2uxhmj…)
- "@grayrodent8232 honestly I'm fine with people using AI for personal use, like D…" (ytr_UgzZGA_kJ…)
- "I think service jobs will always be around because we need the human touch. Thr…" (ytc_UgysjEI1L…)
- "AI used in this way will end up like Minority Report. The AI predicted they will…" (ytc_UgwZv4q5C…)
- "As a customer, I get annoyed almost every time I call customer service with a pr…" (ytc_UgwUwP8va…)
- "The profits from AI will go to the executives of the company like they have sinc…" (ytc_Ugy-XrhzW…)
Comment
> There is more meanig to those terminator movies than just killing humans. It forces us to ask more about ourselves. What does it takes to be a real human? To know the value of human life. Are we humans that kill or protect? But this question was not just meant for ourselves but also for the machines that will reflects us. Then another questions arises. What kind of human beings do you want the AI to embodie? Cause its a a reflection of us. Like a parent raising a child. Our future depends on how you treat this life.
youtube · AI Moral Status · 2025-07-21T21:5… · ♥ 4
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | mixed |
| Policy | unclear |
| Emotion | mixed |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_UgwlRCvqZ_0SdWFfWMl4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_UgwCFMF9tGfAOtKrM3h4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgznCefZCyCZjtzIqAN4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"approval"},
{"id":"ytc_Ugwu4M4CgDriyZ8JACJ4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_UgyEQDshXPhFTFIRSKd4AaABAg","responsibility":"government","reasoning":"deontological","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugz-HoM7yHmUNkHFbC54AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzCOCRt6DgRYPOhsBt4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugz76W5pwFd8qQCda694AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugy-qeTwTsEWRtYXAap4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgxfuNElRP7ihpo9pqJ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"}
]
```