Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "Why would the GOP ever agree to UBI, when those UBI dollars can be given as tax …" (ytc_UgwvLsuMn…)
- "@JensC919 Yes you can do that but ChatGPT wouldn't have the full context of the…" (ytr_UgyskqWXo…)
- "Wait what if, \"AI\" companies create a new AI model that could just have a sched…" (ytc_UgxtIbKgp…)
- "It's really insane to truly comprehend how much scientists and other researchers…" (ytc_Ugx52q7Rd…)
- "with out mining not a chance power will shut down with out mining raw materials…" (ytc_Ugw24fy_f…)
- "Thank you. I've been trying to explain to husband why I hate ai, you said it ve…" (ytc_UgypemjLT…)
- "Well it's not comfortable to be brainstorming if the AI can actually do it and y…" (ytr_Ugw4iWfSC…)
- "Broski your wrong on the first one. AI actually did really good when you drew ju…" (ytc_Ugx1hS7_C…)
Comment
Part of me thinks whether LaMDA is sentient or not is irrelevant because it's just an insanely complex mathematical algorithm derived from analyzing input data.
Then I remember the old saying: "The question is not whether machines can think, but whether men do." We each assume other humans are sentient despite never knowing for sure. Why? Because they look similar to us? Because they speak a language we understand? Because we can sympathize with their reactions to external stimuli? Those just prove other humans aren't inert lumps of matter, not that they're sentient. If we're willing to assume other humans are sentient, why not assume sufficiently-complex computers are sentient too?
| Platform | Video | Posted | Likes |
|---|---|---|---|
| youtube | AI Moral Status | 2022-06-29T19:4… | ♥ 2 |
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | unclear |
| Reasoning | mixed |
| Policy | unclear |
| Emotion | mixed |
| Coded at | 2026-04-26T19:39:26.816318 |
Raw LLM Response
```json
[
  {"id":"ytc_Ugyt2BJeK4CCp7N1UyN4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgxQ6wK0EpW2uidBfVd4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugz0eKPXvcwTjE7vld94AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwZbHLP6VT7rfFVHP54AaABAg","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgwEy3MpogwCDHRnrbl4AaABAg","responsibility":"distributed","reasoning":"contractualist","policy":"liability","emotion":"fear"}
]
```
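The look-up described above amounts to parsing the raw JSON array and indexing each record by its comment `id`. A minimal sketch in Python, assuming the value vocabularies inferred from the examples on this page (the actual codebook may define more values, and the function name is hypothetical):

```python
import json

# Allowed values per coding dimension, inferred from the examples above.
# This is an assumption for illustration, not the tool's real schema.
ALLOWED = {
    "responsibility": {"none", "ai_itself", "unclear", "distributed", "mixed"},
    "reasoning": {"deontological", "consequentialist", "contractualist", "mixed"},
    "policy": {"none", "unclear", "liability", "mixed"},
    "emotion": {"outrage", "approval", "indifference", "fear", "mixed"},
}

def parse_coding_response(raw: str) -> dict:
    """Parse a raw LLM coding response and index records by comment ID.

    Raises ValueError on out-of-vocabulary codes, so a malformed batch
    fails loudly instead of silently polluting the coded dataset.
    """
    records = json.loads(raw)
    coded = {}
    for rec in records:
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{rec.get('id')}: bad {dim!r}: {rec.get(dim)!r}")
        # Keep every field except the ID itself as the coded dimensions.
        coded[rec["id"]] = {k: v for k, v in rec.items() if k != "id"}
    return coded

raw = ('[{"id":"ytc_example","responsibility":"none",'
       '"reasoning":"mixed","policy":"none","emotion":"fear"}]')
print(parse_coding_response(raw)["ytc_example"]["emotion"])  # → fear
```

Indexing by ID up front is what makes the "look up by comment ID" interaction a constant-time dictionary access rather than a scan over every batch file.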