Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
Geoffrey is a knowledgable person but seems he's very bias on the subject. He do…
ytc_UgytqepWV…
@KucheKlizmaWhat you wrote makes no sense. First of all, LLMs do use gradient d…
ytr_Ugzvpkq4X…
How many people would prefer an AI-based yoga or meditation class leader to a hu…
ytc_UgxNRH9-s…
The greed and worthless pride of you AI makers is going to be the death of us. I…
ytc_Ugx7q8YlS…
quit scaring people about hypothetical problems. why dont we talk about the prob…
ytc_Ugwz6GWlZ…
I use Automatic1111 to make pictures of myself and my friends in old video games…
ytc_UgyzyBXXm…
I get this deep sad feeling in the pit of my stomach when I hear people talking …
rdc_d2xnwj5
Ah yes, ai chatbots and the dark secrets they make us spill. Fuck me dead if tho…
ytc_Ugx_bQf-t…
Comment
Mr Fry spoke "AI thinks..." When describing the process it comes to conclusions. Now I will quote DeCarte...
"I think. Therefore I am." If we agree that AI "thinks" and csn reflect on its self, I have to agree that it has a form of consciousness. One that lacks emotiona. Thats a little scary, but I can also see AI having positive goals for humanity. Here is a freaky thought. When will AI create and 3D print and entire working human brain? I functional human brain. I would bet its closer than we think. Pre 2100 I would guess. Pre 2075 even.
youtube
AI Moral Status
2025-06-16T05:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | mixed |
| Policy | none |
| Emotion | mixed |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id":"ytc_UgyxyS11YDDXwzIjtGh4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"ban","emotion":"fear"},
  {"id":"ytc_UgwP6VU-LKTCXiYCHFt4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgyNPrvSs1tvncGDYiF4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugy9p-K9KoBkaFHJJYd4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"unclear"},
  {"id":"ytc_UgzbJLT4-ZUFsfYGGxB4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgzMF7C4s5mZm_GIaOd4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"ban","emotion":"outrage"},
  {"id":"ytc_UgxcUL5gi1epHjP1uN14AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"mixed"},
  {"id":"ytc_Ugyp8UqlDkuDFWitY-B4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
  {"id":"ytc_Ugw-5Pce4VM_hzxxl5d4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgyHOH_CLRM0YIEq2v94AaABAg","responsibility":"developer","reasoning":"mixed","policy":"none","emotion":"approval"}
]
```
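A raw response like the one above can be parsed and indexed by comment ID so that individual codings can be looked up, as this panel does. The sketch below is a minimal illustration: the dimension names are taken from the table above, but the set of allowed values per dimension is not shown in full here, so the function only checks that every expected dimension key is present, not that its value is in the codebook.

```python
import json

# Two rows copied from the raw LLM response above, used as sample input.
RAW = """
[
  {"id": "ytc_UgzbJLT4-ZUFsfYGGxB4AaABAg", "responsibility": "ai_itself",
   "reasoning": "mixed", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_Ugyp8UqlDkuDFWitY-B4AaABAg", "responsibility": "government",
   "reasoning": "consequentialist", "policy": "liability", "emotion": "outrage"}
]
"""

# Dimension names from the Coding Result table; the real pipeline may
# validate values against a full codebook, which is not reproduced here.
DIMENSIONS = {"responsibility", "reasoning", "policy", "emotion"}


def index_by_id(raw_json: str) -> dict:
    """Parse a raw LLM coding response and index rows by comment ID,
    skipping any row missing an ID or an expected dimension key."""
    out = {}
    for row in json.loads(raw_json):
        if "id" in row and DIMENSIONS <= row.keys():
            out[row["id"]] = {k: row[k] for k in DIMENSIONS}
    return out


coded = index_by_id(RAW)
print(coded["ytc_UgzbJLT4-ZUFsfYGGxB4AaABAg"]["policy"])  # none
```

Looking up `ytc_UgzbJLT4-ZUFsfYGGxB4AaABAg` this way reproduces the dimensions shown in the Coding Result table for the displayed comment.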