# Raw LLM Responses
Inspect the exact model output for any coded comment.
## Comment
@amerlad Alright then, you just have to convince every company and group of people working with AI to stop developing them. And I can tell you right now that that isn't going to happen.
The only people who even have a chance of stopping that would be the government, and the government themselves use AI in areas like crime prediction.
The problem is that there is no clearly defined line for when an AI becomes sentient. If we keep developing them, we will inevitably cross over that line without us knowing until it's already been done.
This also brings into the question of 'what is sentience'. Different people would draw the line (if there is even a line to draw, because it's more likely a spectrum) in different places. So it's not like we can say, you can go this far but no further.
Platform: youtube · Video: AI Moral Status · Posted: 2019-09-04T22:5…
## Coding Result
| Dimension | Value |
|---|---|
| Responsibility | government |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | resignation |
| Coded at | 2026-04-27T06:24:59.937377 |
## Raw LLM Response
```json
[
{"id":"ytr_UgxwpEk9vKe9NUA-ZQp4AaABAg.904DGqcR7LL94bR4GSfotO","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytr_UgwS-unypey-mQW8Tfh4AaABAg.9-O9fftlkQF94q4xyHR9h_","responsibility":"company","reasoning":"consequentialist","policy":"industry_self","emotion":"indifference"},
{"id":"ytr_UgzGEUJLGlkIL-s04Tt4AaABAg.8zUjiWrAmFM96sYJsdnVXU","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"fear"},
{"id":"ytr_Ugz52rg38UD6qhuUrCF4AaABAg.8zTbbRAPa578zTmfp8-h83","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytr_Ugz52rg38UD6qhuUrCF4AaABAg.8zTbbRAPa578za4jR2mq-X","responsibility":"developer","reasoning":"consequentialist","policy":"industry_self","emotion":"indifference"},
{"id":"ytr_Ugz52rg38UD6qhuUrCF4AaABAg.8zTbbRAPa578za9LT-CDfV","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytr_Ugz52rg38UD6qhuUrCF4AaABAg.8zTbbRAPa578zcxRScYP3P","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"approval"},
{"id":"ytr_UgxlLv0rMuNplN-nVHR4AaABAg.8zT_Xegkn6U8zTnEN94W_S","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytr_UgxXyh-p6pDPGItPDER4AaABAg.8zRQsIFYrAU8zTwXqfTc5K","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"resignation"},
{"id":"ytr_Ugy4JPgBw-FkT83RQRp4AaABAg.8z1OytC5V9C8z8PSKOgGak","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"}
]
```
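The raw model output above is a JSON array with one record per comment, keyed by `id` and carrying the four coding dimensions (responsibility, reasoning, policy, emotion). A minimal sketch of how such a response could be parsed and indexed to recover a single comment's coding is below; `raw_response` is an excerpt of the array shown above, and `index_codings` is an illustrative helper name, not part of the actual pipeline.

```python
import json

# Excerpt of the raw LLM response shown above (two of the ten records).
raw_response = '''
[
{"id":"ytr_UgxwpEk9vKe9NUA-ZQp4AaABAg.904DGqcR7LL94bR4GSfotO","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytr_UgxXyh-p6pDPGItPDER4AaABAg.8zRQsIFYrAU8zTwXqfTc5K","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"resignation"}
]
'''

# The four coding dimensions displayed in the Coding Result table.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def index_codings(text):
    """Parse a raw LLM response and index the codings by comment ID."""
    records = json.loads(text)
    return {r["id"]: {d: r.get(d) for d in DIMENSIONS} for r in records}

codings = index_codings(raw_response)
coding = codings["ytr_UgxXyh-p6pDPGItPDER4AaABAg.8zRQsIFYrAU8zTwXqfTc5K"]
print(coding["policy"])  # -> regulate
```

Using `r.get(d)` rather than `r[d]` means a record missing a dimension yields `None` instead of raising, which is a reasonable default when model output is not guaranteed to be complete.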