Raw LLM Responses
Inspect the exact model output for any coded comment, or look up a record directly by comment ID.
Random samples (truncated previews; click to inspect the full record):
- "You really believe that banning it will make it go away? Good luck getting every…" (ytr_UgyknPmTh…)
- "If you seriously believe AI could possibly be sentient at this point in time the…" (ytc_UgzDyNgXE…)
- "The only way this will work is if the Elites also make Ai to be customers. Compl…" (ytc_UgwUJNyux…)
- "@goldenalbumen Being against whiny digital artists complaining about AI and corp…" (ytr_UgzNIk851…)
- "it cannot be reconstructed AI video because AI is not wise enough to know what g…" (ytc_UgyWp-miu…)
- "I'm not so pessimistic yet, although as a species, we tend to let our hubris a…" (ytc_UgxRo9NTc…)
- "This whole thing is just really sad, but honestly, go to any influencer's profil…" (ytc_UgyqXLzGi…)
- "I wish he asked her if she uses Ai, how she does, which network, and advice on h…" (ytc_UgzyOwNsg…)
Comment
Is a ship without a mast still a ship? What about a ship without a rudder? What about a ship with no captain? No oarsmen? No cargo? At what point when taking away "more fundamental" aspects from the ship do you say "Ok this is now just some wood and is no longer a ship."?
If a rat finds salvation on a single floating plank, is that plank not in a way a ship?
Is there a base "level" or "state" that is what consciousness or ships "are?" I don't think there is. This is why I think AI are already partially conscious, in the colloquial meaning of the word. Self-awareness seems to be a composite concept made up of other underlying component concepts such as what this video touches on regarding the literal meanings of sentience, sapience, and consciousness.
I think that if a thing has the apparent function of a ship then it is a ship. I think if a thing has the apparent function of a conscious mind then it is a conscious mind. This doesn't mean that a plank of wood with a rat on it is a galleon, and it doesn't mean that an LLM is a human mind. But both, it seems obvious to me, share the same ontological space.
youtube | AI Moral Status | 2023-08-21T17:2… | ♥ 18
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | unclear |
| Reasoning | unclear |
| Policy | unclear |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
  {"id": "ytr_UgxVsyyAAvCY45bh-AN4AaABAg.9tevz5lLQ6f9tf0mAK5nlj", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytr_Ugynxjjbs5dzR2YxAOd4AaABAg.9terRjfYAcO9tg7V4gbVvb", "responsibility": "user", "reasoning": "deontological", "policy": "none", "emotion": "fear"},
  {"id": "ytr_UgxWO7pjoCcNbzlKI4t4AaABAg.9teqquaON8J9tfgAexvizE", "responsibility": "distributed", "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytr_UgxWO7pjoCcNbzlKI4t4AaABAg.9teqquaON8J9tgGHiytZBB", "responsibility": "developer", "reasoning": "consequentialist", "policy": "liability", "emotion": "indifference"},
  {"id": "ytr_UgxWO7pjoCcNbzlKI4t4AaABAg.9teqquaON8J9tgPjsRUpt0", "responsibility": "ai_itself", "reasoning": "unclear", "policy": "none", "emotion": "mixed"},
  {"id": "ytr_UgzmC20FGYI5Xqs31SB4AaABAg.9teq--gkWr99tgFf5BBO7t", "responsibility": "unclear", "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytr_UgxeFyQlh9DyOh7a6B14AaABAg.9tekqrIyogRA9VCTqB_k2T", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "approval"},
  {"id": "ytr_UgxuXE6SoqVhL8x9ltV4AaABAg.9tebtebDZ0Y9tgs3jJODkw", "responsibility": "unclear", "reasoning": "virtue", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytr_Ugz5f30YYziqxVnBwPZ4AaABAg.9teaiIsCx9Y9telClxAoql", "responsibility": "developer", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytr_UgzkNqEKcvQ-Cb-wty14AaABAg.9teYdPQ4lM49tep_1gj2o2", "responsibility": "ai_itself", "reasoning": "deontological", "policy": "none", "emotion": "resignation"}
]
```
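A raw response in this shape is straightforward to index for the lookup-by-ID view. The following is a minimal sketch, not the tool's actual code: it parses a JSON array of codings and keys each record by its comment ID, defaulting missing dimensions to "unclear" to match the table above. The sample IDs (`ytr_abc123`, `ytr_def456`) and the `index_codings` helper are hypothetical.

```python
import json

# Hypothetical raw LLM response in the same shape as the one shown above.
raw_response = """[
  {"id": "ytr_abc123", "responsibility": "developer",
   "reasoning": "consequentialist", "policy": "none",
   "emotion": "indifference"},
  {"id": "ytr_def456", "responsibility": "unclear",
   "reasoning": "unclear", "emotion": "fear"}
]"""

# The four coding dimensions reported in the result table.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def index_codings(raw: str) -> dict:
    """Parse a raw JSON coding response and key each record by comment ID."""
    by_id = {}
    for rec in json.loads(raw):
        # A dimension the model omitted defaults to "unclear".
        by_id[rec["id"]] = {dim: rec.get(dim, "unclear") for dim in DIMENSIONS}
    return by_id

codings = index_codings(raw_response)
print(codings["ytr_abc123"]["reasoning"])  # consequentialist
print(codings["ytr_def456"]["policy"])     # unclear (omitted in the raw JSON)
```

Keying by ID makes the "look up by comment ID" view a single dictionary access, and the default handles partially malformed model output without crashing the viewer.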