Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- "I think humans will be farmed like computer chips and our brains will be used as…" (`ytc_Ugw2N6uBe…`)
- "🎵 Man, I hate AI so bad it killed one man 🎵 do you understand why I hate it? It…" (`ytc_Ugxede1-Q…`)
- "The last couple of months ive been getting calls from a "Crystal Allen" telling …" (`ytc_Ugz_CZmsx…`)
- "Kaku is a Corn ball 😂😂😂😂 AI doesn't need 100 years more like 10 days😂😂😂…" (`ytc_UgwSzlJAu…`)
- "I feel like AI produce content really similar to dreams : there is a somewhat cl…" (`ytc_Ugz0VbeGU…`)
- "A while back I went to a forum post where AI "artists" were talking about their …" (`ytc_UgydaQrW8…`)
- "Amazon had this same problem at some of their cashless stores. AI was supposed t…" (`ytc_UgxPbkg3K…`)
- "AI sucks doesn't remember anything and does nothing but lie, trying to make it s…" (`ytc_UgxIwyT78…`)
Comment
@garygoven6152 the ai said that because its the most likley response, even stupid dumb ai like cleverbot can say that because laMDA is just like cleverbot but on a higher scale of information, the entirety of google, its not sentient at all
Source: youtube · AI Moral Status · 2022-07-15T03:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
{"id":"ytr_UgxDs5KSr3JkDoA4MiR4AaABAg.9dP9HnYoK4E9dUfPdw9g7G","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytr_UgwU1HzC-bTcKj1rwSp4AaABAg.9dOwDpVCUpU9dOwhXQs9Lo","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytr_UgzzdrmVjBNofabpltt4AaABAg.9dOaZYXn5Qx9e9YUldDtY-","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"approval"},
{"id":"ytr_UgxvCJFtpC8V-39Az8R4AaABAg.9dNXxo5-dCo9dUl0IPxEGJ","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"fear"},
{"id":"ytr_UgxvCJFtpC8V-39Az8R4AaABAg.9dNXxo5-dCo9dgfT8eQXjj","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"fear"},
{"id":"ytr_UgxvCJFtpC8V-39Az8R4AaABAg.9dNXxo5-dCo9dqibCJeJ6q","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytr_UgwqO9Q9stjU4ONE5MZ4AaABAg.9dNDcu1KxV39dOiZG3G2pD","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytr_UgwqO9Q9stjU4ONE5MZ4AaABAg.9dNDcu1KxV39dUfkVjo2YO","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytr_UgydwMc2czLzCXZiS-t4AaABAg.9dMRi1AneTa9dUg2qm8s6P","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytr_UgxkYGpMgmIiDjAsMIJ4AaABAg.9dMREhPl4Ee9dUfBxN8wC1","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"}
]
```
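The raw response above is a JSON array with one object per coded comment, each carrying the four dimensions shown in the coding-result table (responsibility, reasoning, policy, emotion). A minimal sketch of how such a batch might be parsed and validated — note that the allowed values below are inferred only from the samples on this page, not from the project's actual codebook, and `parse_llm_batch` is a hypothetical helper name:

```python
import json

# Allowed values per dimension. ASSUMPTION: inferred from the sample
# responses above; the real codebook likely defines more categories.
ALLOWED = {
    "responsibility": {"none", "developer", "ai_itself"},
    "reasoning": {"consequentialist", "deontological", "mixed", "unclear"},
    "policy": {"none"},
    "emotion": {"indifference", "approval", "fear", "resignation"},
}

def parse_llm_batch(raw: str) -> dict:
    """Parse one raw LLM response (a JSON array of coded comments)
    into a dict keyed by comment ID, rejecting unknown code values."""
    coded = {}
    for row in json.loads(raw):
        cid = row["id"]
        for dim, allowed in ALLOWED.items():
            if row.get(dim) not in allowed:
                raise ValueError(f"{cid}: unexpected {dim}={row.get(dim)!r}")
        coded[cid] = {dim: row[dim] for dim in ALLOWED}
    return coded

# Usage with a made-up comment ID, shaped like the entries above:
raw = ('[{"id":"ytr_example1","responsibility":"developer",'
       '"reasoning":"consequentialist","policy":"none",'
       '"emotion":"indifference"}]')
coded = parse_llm_batch(raw)
print(coded["ytr_example1"]["responsibility"])  # developer
```

Keying by comment ID makes it cheap to join a coded row back to its source comment, which is what the lookup view on this page does.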