Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- "It's not about the number of jobs Automation can create. It's about the aging wo…" (ytc_Ugwpzk_m-…)
- "When algorithms are built to capture our attention by reaching deep into our mos…" (ytc_UgxatEKB_…)
- "1. Yes, we are in a bubble, but AI will still turn pretty much everything upside…" (ytc_UgzBkyU9T…)
- "People are saying you need the ai to use clean datasets but what you really need…" (ytc_UgxfUFvLT…)
- "That's already happening, but it's pretty easy to tell an AI time that for a reg…" (ytr_UgxIZFcvn…)
- "@ the human artists are far worse for the environment and view far more things u…" (ytr_UgwFu19pq…)
- "I firmly don’t believe earth has enough resources to support the AI output. It …" (ytc_UgyhxLT7Y…)
- "@LeonardoGarcia-qt6lf AI is not learning the way humans do, nor it was designed …" (ytr_Ugz1wstyM…)
Comment

> Either we find a way to make them both sentient but also have wanting to help humans physically and emotionally as part of their personality than I would say our terminators, A.I.s, etc should be treated as people since they can have the level of emotion as a human. To me, if you're sentient then as far as I care you're a person even if not human

youtube · AI Moral Status · 2020-05-14T11:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | deontological |
| Policy | unclear |
| Emotion | approval |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
{"id":"ytc_UgxFwg13HIwDYvN1xzB4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzYCNhwxammyrS6RO14AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgwjBPwzT_Qw7n9UD2d4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugw7cMFZrGGurrR-LaB4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgzFihmfK6GnXiI18aR4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgwF7AYOXBXbRHE0Bx54AaABAg","responsibility":"none","reasoning":"deontological","policy":"unclear","emotion":"approval"},
{"id":"ytc_UgxQB_SJFgAqADe4Bm54AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgwKaH_8G5iEjt8UWut4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgzY-XCMoxUorvFlSgR4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgzRnuXxDby7aOjA9Id4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"}
]
```
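A batch response like the one above can be parsed and indexed by comment ID before the codes are stored. The sketch below is a minimal, hypothetical example, not the tool's actual implementation: the per-dimension value sets are only those observed in the sample response here, and the full codebook may allow more values.

```python
import json

# Dimension values observed in the sample batch response above.
# Illustrative only: the real codebook may define additional values.
OBSERVED_VALUES = {
    "responsibility": {"none", "ai_itself", "developer", "unclear"},
    "reasoning": {"consequentialist", "deontological", "unclear"},
    "policy": {"none", "regulate", "unclear"},
    "emotion": {"indifference", "resignation", "approval", "fear", "mixed"},
}


def parse_batch(raw: str) -> dict:
    """Parse a raw batch response and index coded rows by comment ID.

    Raises ValueError when a row is missing one of the expected
    dimensions, so malformed model output is caught early.
    """
    rows = json.loads(raw)
    coded = {}
    for row in rows:
        for dim in OBSERVED_VALUES:
            if dim not in row:
                raise ValueError(f"{row.get('id', '?')}: missing {dim!r}")
        coded[row["id"]] = {dim: row[dim] for dim in OBSERVED_VALUES}
    return coded


# Usage with a tiny made-up row (hypothetical ID):
raw = ('[{"id":"ytc_example","responsibility":"none",'
       '"reasoning":"deontological","policy":"unclear",'
       '"emotion":"approval"}]')
print(parse_batch(raw)["ytc_example"]["emotion"])  # approval
```

Indexing by ID is what makes the "look up by comment ID" view possible: each coded comment maps directly to the row the model emitted for it.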