Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- it's pointless. Battle already lost: people be using AI like 1) it's sentient 2)… (ytc_UgzUro2On…)
- 5:22 isn’t this just the trolley but with cars then? The ppl overseas have to so… (ytc_Ugw5g2K56…)
- Maybe look into it more. I guarantee there are literally AI songs that you have … (ytr_UgwVS5S3w…)
- I still think we need to take a look at the copyright of AI generated content. T… (ytc_UgxLYOW6S…)
- They think AI is going to kill real artists as if the fucking AI wasn't "creatin… (ytc_Ugzg6mmrG…)
- That's a great observation! Sophia's response highlights the distinction between… (ytr_UgzQM6RaI…)
- Ray Bradbury warned us about censoring books and robot police dogs, and we've ju… (rdc_jfyi65e)
- AI and the robots are less combative. No call off, no lunch breaks or breaks, ar… (ytc_UgyLqMcbU…)
Comment
I'm assuming you say oh look an asteroid as if conscious ai is the asteroid? Or did you mean that none of this ai shit will matter when we get hit by the next asteroid? Personally I think we need ai to avoid the next mass extinction event because all we care about is technological advancement. So okay let's advance our technology so much that we reach singularity and can actually save ourselves from the next mass extinction event.
youtube · AI Moral Status · 2023-08-21T04:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | approval |
| Coded at | 2026-04-26T23:09:12.988011 |
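The "Coding Result" table above is a per-comment view of one coded record. As a minimal sketch (the function name `render_coding_result` is hypothetical, not part of the tool), the table can be generated from a record using the dimension keys that appear in the raw LLM response:

```python
def render_coding_result(record: dict, coded_at: str) -> str:
    """Render one coded record as a markdown 'Coding Result' table.

    `record` uses the dimension keys from the raw LLM response
    (responsibility, reasoning, policy, emotion).
    """
    rows = [
        ("Responsibility", record["responsibility"]),
        ("Reasoning", record["reasoning"]),
        ("Policy", record["policy"]),
        ("Emotion", record["emotion"]),
        ("Coded at", coded_at),
    ]
    lines = ["| Dimension | Value |", "|---|---|"]
    lines += [f"| {dim} | {val} |" for dim, val in rows]
    return "\n".join(lines)


# Values taken from the coded record shown above.
table = render_coding_result(
    {"responsibility": "none", "reasoning": "consequentialist",
     "policy": "regulate", "emotion": "approval"},
    "2026-04-26T23:09:12.988011",
)
print(table)
```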
Raw LLM Response
```json
[
{"id":"ytc_Ugy1lZEkLxezeRB9E_x4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgzLsz93WSGC_vpFxfx4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugy7DQbIPzmJUH2bHM54AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgxCRV3OqrB0KZPJUfx4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgyTVhsyMbJrZZcgXSp4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
{"id":"ytc_UgxWO7pjoCcNbzlKI4t4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_UgzA9fHIl-j_uHD9ts14AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgzeeFXoH6KC2cJIqTl4AaABAg","responsibility":"government","reasoning":"virtue","policy":"ban","emotion":"fear"},
{"id":"ytc_UgzCP_bAQD0WVSoYy214AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"resignation"},
{"id":"ytc_UgwACwMGXQtCD7JxydR4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"}
]
```
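Because the model returns a JSON array keyed by comment ID, looking up the coded dimensions for a given comment is a parse-and-index step. A minimal sketch (the helper `index_by_id` is hypothetical; two records are excerpted from the response above for brevity):

```python
import json

# Two records excerpted verbatim from the raw LLM response above.
raw = '''[
{"id":"ytc_UgyTVhsyMbJrZZcgXSp4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
{"id":"ytc_UgzeeFXoH6KC2cJIqTl4AaABAg","responsibility":"government","reasoning":"virtue","policy":"ban","emotion":"fear"}
]'''

# Every coded record is expected to carry these keys.
REQUIRED_KEYS = {"id", "responsibility", "reasoning", "policy", "emotion"}


def index_by_id(raw_json: str) -> dict:
    """Parse model output and index coded records by comment ID,
    rejecting any record that is missing an expected dimension."""
    coded = {}
    for rec in json.loads(raw_json):
        missing = REQUIRED_KEYS - rec.keys()
        if missing:
            raise ValueError(f"record {rec.get('id')!r} missing {sorted(missing)}")
        coded[rec["id"]] = {k: v for k, v in rec.items() if k != "id"}
    return coded


coded = index_by_id(raw)
print(coded["ytc_UgyTVhsyMbJrZZcgXSp4AaABAg"]["policy"])  # prints: regulate
```

This mirrors the "Look up by comment ID" workflow: the indexed record for `ytc_UgyTVhsyMbJrZZcgXSp4AaABAg` carries exactly the dimensions shown in the Coding Result table.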