Raw LLM Responses
Inspect the exact model output for any coded comment by looking it up by its comment ID, or by clicking one of the random samples below.
Random samples (click to inspect):
- my chatgpt said "I'm really sorry you're hurting. Losing someone as special as y… (ytc_Ugz33Y-2g…)
- He wants you to think that he is so empathetic. And cares so much about humani… (ytc_UgytHatd5…)
- For him to listen to aI he must have no iq bro my son is 8 and tell ai mannn suc… (ytc_UgyMpsZXr…)
- We're totally going to have battle mechs in the next few decades aren't we? We'… (rdc_ezfeqjh)
- I don't know.. these guys must have a different ai than me. Cause my ai can't ev… (ytc_UgxA5Phgi…)
- AI makes me think of the "Monorail" episode of The Simpsons. We need to lose our… (ytc_UgxXOsKF0…)
- The point Penrose is trying to make is that computers are basically computation … (ytc_UgwCN15o8…)
- If you put out your art in the public eye, you have no right to cry about it bei… (ytc_Ugz2QRPYa…)
Comment
- The turing test obviously does not show whether or not something is sentient.
- The title Google Engineer being applied to Blake Lemoine is misleading as it implies he is qualified to understand the engineering behind LaMDA. He isn't and doesn't and wasn't supposed to.
- There is nothing close to 'AI' there at all. He's not making a valid point. He's talking total crap. There is no more AI in LaMDA than there is in predictive text or calculators or any programs that people have written or any video game AI. Nobody has talked about ethics for calculators or software or game AI. What about when you kill someone in a first person shooter game. Should we discuss the ethics of the treatment of the AI?
Correct title should read: Unskilled idiot at Google gets freaked out by something he doesn't understand and makes a big fuss.
youtube · AI Moral Status · 2022-07-01T09:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | deontological |
| Policy | none |
| Emotion | outrage |
| Coded at | 2026-04-26T19:39:26.816318 |
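Each dimension in this table takes a categorical label. Below is a minimal validation sketch; the allowed label sets are an assumption inferred only from values visible on this page, not the project's full codebook.

```python
# Minimal validation sketch. The label sets below are assumptions inferred
# from values visible on this page, not a definitive codebook.
ALLOWED = {
    "responsibility": {"none", "user", "developer", "company"},
    "reasoning": {"deontological", "consequentialist", "unclear"},
    "policy": {"none", "regulate", "unclear"},
    "emotion": {"indifference", "outrage", "fear"},
}

def validate_coding(record: dict) -> list[str]:
    """Return a list of problems found in one coded record."""
    problems = []
    for dimension, allowed_values in ALLOWED.items():
        value = record.get(dimension)
        if value not in allowed_values:
            problems.append(f"{dimension}: unexpected value {value!r}")
    return problems

# Example: the coding result shown in the table above.
print(validate_coding({
    "responsibility": "user",
    "reasoning": "deontological",
    "policy": "none",
    "emotion": "outrage",
}))  # -> []
```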
Raw LLM Response
[{"id":"ytc_Ugy--nBrbwfUY0dBkOV4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},{"id":"ytc_Ugwzqpzm30HX4C3wyNN4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"outrage"},{"id":"ytc_UgyuIMACQCCGvmzg0pt4AaABAg","responsibility":"none","reasoning":"deontological","policy":"unclear","emotion":"indifference"},{"id":"ytc_UgyC6viuep8ppeuEP094AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},{"id":"ytc_UgzpHU5Gxc7f97ZvyU54AaABAg","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"outrage"}]