Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples

- "Wouldn't the Michelle Carter case serve as precedent? The court deemed that plac…" (rdc_nnjea60)
- "Thank you, Democracy Now team for covering this story. It really should get more…" (ytc_Ugwq0_9p0…)
- "I got all of them right, but I watch a lot of AI videos. 😅…" (ytc_UgxDNsPnq…)
- "Wow thank you for this interview. She is so smart- excited to get her book (hope…" (ytc_UgyRPCSqa…)
- "What a load of rubbish, AI will never take your job. Your employer will but NOT…" (ytc_UgzXdJKWP…)
- "The very people that made AI happen will get replaced by it.. Idk they may be he…" (ytc_UgytvBWXD…)
- "How long does this Tech Robot last ? It takes a lot of Time to clean and to ser…" (ytc_UgyeiejLl…)
- "If any HUMAN finds this video in 100 years, we was here & Sophia was the first r…" (ytc_UgwMS1Shj…)
Comment

> Definitely is a strong word. It's not coded to understand it is a very complicated pattern recognition software marketed as "AI" to make people associate it with AI in science fiction. It is coded to be convincing. That's the problem. We still don't know what it means to be conscious. It's one thing to be convincing, it's an entirely other thing to prove legitimate consciousness. That's the whole issue. Can it experience? Or is it just an extremely convincing performance by a program coded to make you think it can? If you could recreate every neural pathway of a human brain within a computer program would that program be conscious? We dont know.

youtube · AI Moral Status · 2025-05-24T08:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | deontological |
| Policy | unclear |
| Emotion | mixed |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
{"id":"ytr_UgyQfHtmGnw9kL0sbsp4AaABAg.AJD-vAKVA4QAK9VKOEiqK3","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytr_UgzukWbOyPucVjtXflF4AaABAg.AIsHOhdrxkCAIuhUOKrKDK","responsibility":"company","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytr_UgzukWbOyPucVjtXflF4AaABAg.AIsHOhdrxkCALHm3MZo0a6","responsibility":"company","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},
{"id":"ytr_Ugy3anUeUrp_s4BhaC14AaABAg.AITl2pKBnX5AIVOD_vZrgS","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"mixed"},
{"id":"ytr_UgxxOagt1Ac8e_16kkJ4AaABAg.AIMGvoiATsdAIVPnx7KVa0","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytr_UgzSb3hJasAKZuIRV354AaABAg.AIJq_0Kg-2CAIVQHEVQ3K3","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytr_UgwucDHFz5TMSdF_Vhp4AaABAg.AFplo8jx11AAFq9HiRrfpd","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytr_UgzGaRixydj-yPm3W2t4AaABAg.AFc7wU956kcAFoUzqCGFqx","responsibility":"ai_itself","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytr_UgwLZpRcgfvKUtEaEp54AaABAg.AFc58pCec5BAFc6atY-xTe","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytr_Ugx4k244ZcBlF1kJ9fp4AaABAg.ADpKrXtIKVGADqW9NNhZeT","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"}
]
```
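A response like the one above can be loaded and sanity-checked before the codes are stored. The sketch below is a minimal illustration, not the tool's actual pipeline; the allowed value sets are inferred from this one sample and the table above, so the real coding scheme likely has more categories, and the `ytr_example` ID is hypothetical:

```python
import json

# Allowed values per dimension, inferred from the sample response above
# (assumption: the full coding scheme may define additional values).
ALLOWED = {
    "responsibility": {"none", "company", "developer", "distributed", "ai_itself"},
    "reasoning": {"unclear", "consequentialist", "deontological"},
    "policy": {"unclear", "liability"},
    "emotion": {"indifference", "fear", "mixed"},
}

def parse_coding_response(raw: str) -> dict:
    """Parse a raw LLM coding response and index the rows by comment ID.

    Raises ValueError if a row is missing a dimension or uses a value
    outside the allowed sets.
    """
    coded = {}
    for row in json.loads(raw):
        comment_id = row["id"]
        for dim, allowed in ALLOWED.items():
            if row.get(dim) not in allowed:
                raise ValueError(f"{comment_id}: bad {dim!r} value {row.get(dim)!r}")
        coded[comment_id] = {dim: row[dim] for dim in ALLOWED}
    return coded

# Hypothetical one-row response for illustration.
raw = ('[{"id":"ytr_example","responsibility":"developer",'
       '"reasoning":"deontological","policy":"unclear","emotion":"mixed"}]')
coded = parse_coding_response(raw)
print(coded["ytr_example"]["responsibility"])  # prints: developer
```

Indexing by ID makes the "look up by comment ID" view a plain dictionary access, and the validation step catches the occasional malformed or off-scheme row before it reaches the results table.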