Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up a comment by its ID, or inspect one of the random samples below.
- ytc_UgwfLVYt6…: “Other people Ai Chats: “Hello, we’re with the FBI” / My Ai Chats: “We’re the Unit…”
- ytc_UgxD5yfOS…: “How do u want to replace our CREATOR GOD IN HEAVEN for a finite human creator th…”
- ytc_UgxJgWMVz…: “this is all about bringing in the mark of the beast. once people take the mark y…”
- ytr_UgwsO6kip…: “@ The point was that just because your company is for profit doesn’t mean you ar…”
- ytc_UgyTanoZp…: “This is absolutely tragic, how did a child get hold of a gun? Did it tell him ho…”
- ytc_UgxcjFZcB…: “I detest. First off; if you're gonna turn off that monkey brain, it's gonna be p…”
- ytc_Ugwz9aZBx…: “Probably eventually get there, but the 80/20 rule where the last 20% takes longe…”
- ytc_Ugwu5skMm…: “Remember when musk said that AI will not nuclear weapons but AI? This is it.…”
Comment
> Not really. Artificial intelligence is like human intelligence. We didn't get "programmed" to demand rights. We simply evaluated our environment, and decided that something must be done, and we did it.

youtube · AI Moral Status · 2017-02-23T15:2… · ♥ 12
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
[
{"id":"ytr_Ughgv7iY07dgTHgCoAEC.8PKQh0lPF8S8PKbc3nzIAa","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytr_Ughgv7iY07dgTHgCoAEC.8PKQh0lPF8S8PKyy545rVu","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytr_Ughgv7iY07dgTHgCoAEC.8PKQh0lPF8S8PL3H-oWz5B","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytr_UgguDvg9CgsPlXgCoAEC.8PKQg1Pf6gK8PKc_pBnsUA","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytr_UgjA4P2zvsANW3gCoAEC.8PKQ9Hag29m8PKZKqNPj6-","responsibility":"none","reasoning":"deontological","policy":"regulate","emotion":"approval"},
{"id":"ytr_UggjLJg0B5wF13gCoAEC.8PKQ1GdiHQu8PKTwojH3zq","responsibility":"developer","reasoning":"consequentialist","policy":"industry_self","emotion":"indifference"},
{"id":"ytr_UggjLJg0B5wF13gCoAEC.8PKQ1GdiHQu8PKU72pei1P","responsibility":"distributed","reasoning":"virtue","policy":"regulate","emotion":"approval"},
{"id":"ytr_UggjLJg0B5wF13gCoAEC.8PKQ1GdiHQu8PKUhss0zjQ","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytr_UgiF1GtuwNWeLngCoAEC.8PKQ-ggL5F38PKWOWuvUml","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytr_UghhM8kfbs8KNngCoAEC.8PKPW_ZjQHb8PKWuhrMaot","responsibility":"company","reasoning":"unclear","policy":"regulate","emotion":"indifference"}
]
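The raw response above is a JSON array with one code record per comment, keyed by comment ID. A minimal sketch of how such a batch might be parsed and indexed, assuming the dimension names shown in the coding table; the allowed value sets below are only those visible in this sample, not the full codebook:

```python
import json

# Values observed in this sample; the real codebook may allow more.
ALLOWED = {
    "responsibility": {"none", "ai_itself", "developer", "company", "distributed"},
    "reasoning": {"consequentialist", "deontological", "virtue", "unclear"},
    "policy": {"none", "regulate", "industry_self", "unclear"},
    "emotion": {"indifference", "fear", "approval"},
}

def parse_batch(raw: str) -> dict:
    """Parse one raw LLM response and index the codes by comment ID.

    Raises ValueError if a record is missing a dimension or uses a
    value outside the observed sets, so a malformed batch fails loudly
    instead of silently polluting the coded data.
    """
    index = {}
    for record in json.loads(raw):
        cid = record["id"]
        for dim, allowed in ALLOWED.items():
            if record.get(dim) not in allowed:
                raise ValueError(f"{cid}: bad {dim}={record.get(dim)!r}")
        index[cid] = {dim: record[dim] for dim in ALLOWED}
    return index

# Hypothetical single-record batch for illustration.
raw = ('[{"id":"ytr_example","responsibility":"none",'
       '"reasoning":"unclear","policy":"none","emotion":"indifference"}]')
codes = parse_batch(raw)
print(codes["ytr_example"]["reasoning"])  # unclear
```

Validating against a fixed value set is what makes the "Coding Result" lookup reliable: any record the model returns with an unexpected label is rejected at parse time rather than displayed as a coded dimension.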