Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment
You need a few things. Self-improvement, self-simulation, and actual memory.
It needs to be able to self-improve just from its own knowledge. We've had some models do this 2-3 times, but this isn't enough. A conscious creature can stop, think for a minute, and drastically improve its plan, and learn from that experience to have a head start next time.
It needs to be able to recognize and simulate itself. Some mental illnesses aside, conscious people can simulate themselves in their head, and recognize themselves outside of their body. It needs to be able to do this, and use it with the other two.
It needs to have actual memory. Things like ChatGPT use the chat log as an additional prompt. They don't have a memory storage they can access and make use of. Conscious creatures can go back years in past experience, and rely on those for interactions.
Now, I do believe all the pieces are out there, it'll just be difficult to put them together.
youtube · AI Moral Status · 2023-12-04T09:2… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | unclear |
| Policy | unclear |
| Emotion | approval |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[{"id":"ytc_UgyezYP6MYYmkwUL0h54AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
 {"id":"ytc_Ugw1h5_12lFmAcJUFXB4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"fear"},
 {"id":"ytc_Ugy7_G0B6JqhtLqGL-94AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"approval"},
 {"id":"ytc_UgzrrmA8nNZFsT_3kC54AaABAg","responsibility":"company","reasoning":"deontological","policy":"unclear","emotion":"indifference"},
 {"id":"ytc_UgwK1FxQXyI_jZfBe0h4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"resignation"},
 {"id":"ytc_UgwiLY7Ha-R78Jdiard4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
 {"id":"ytc_UgwD-2M0ENPvmN-M0Ix4AaABAg","responsibility":"user","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
 {"id":"ytc_Ugwa6NZYAkFrz-zOjUF4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"resignation"},
 {"id":"ytc_Ugx_EVriCNJqHkeLWIx4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"approval"},
 {"id":"ytc_UgxSmsqgPXlZej8CRB94AaABAg","responsibility":"none","reasoning":"deontological","policy":"unclear","emotion":"resignation"}]
```
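The raw response above is a JSON array with one object per coded comment, each carrying the four coding dimensions shown in the result table (`responsibility`, `reasoning`, `policy`, `emotion`). A minimal sketch of turning such a response into a lookup keyed by comment ID, using two records copied from the array above (the `parse_codings` helper is illustrative, not part of the tool):

```python
import json

# Raw LLM response: a JSON array of per-comment codings. The two records
# below are taken verbatim from the response shown above.
raw_response = '''[
  {"id": "ytc_UgyezYP6MYYmkwUL0h54AaABAg", "responsibility": "none",
   "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_UgzrrmA8nNZFsT_3kC54AaABAg", "responsibility": "company",
   "reasoning": "deontological", "policy": "unclear", "emotion": "indifference"}
]'''

# The four coding dimensions used in the result table.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def parse_codings(text):
    """Parse the model's JSON array into a dict keyed by comment ID."""
    records = json.loads(text)
    return {r["id"]: {d: r[d] for d in DIMENSIONS} for r in records}

codings = parse_codings(raw_response)
for comment_id, dims in codings.items():
    print(comment_id, dims)
```

Keying by `id` is what lets a single batched response be joined back to the individual comments it codes.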