Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- Watch AHS: Apocalypse its interesting and very relevant for a season thats 8 yea… (`ytc_UgwZ0u3hw…`)
- AI replacing all jobs would be great if the benefits were given to the people, a… (`ytc_Ugy2g5sNi…`)
- its really just a 50/50 gamble a flip of a coin. it is either going to destroy … (`ytc_UgwdCYG2B…`)
- I had a chat with somebody who works with AI—we did not chat about apocalyptic A… (`rdc_jifee4v`)
- Any non-religious arguement for why humans deserve rights would also apply to an… (`ytc_Ugy-JKQrS…`)
- My job is focused on deep interpersonal relationships and will never be taken by… (`ytc_Ugy4AAnkQ…`)
- I don't think Ai Weiwei needs any publicity to be fair, appreciate not everyone … (`rdc_loqkg53`)
- Many companies already make their income by selling things to rich people. Once … (`ytr_UgwwjMIL1…`)
Comment
If you stuff a loop with a large dataset of queries within the AI default (Idle) algorithm and just let it run infinitely, isn't that conscioussness? Didn't facebook do something like that a few years back & the chatbots actually started pinging each other with a coded pattern? Granted, nothing "Scary" but imagine you do it with a dataset the size of, idk.. Chatgpt??
youtube · AI Moral Status · 2023-11-22T01:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | unclear |
| Reasoning | unclear |
| Policy | unclear |
| Emotion | unclear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[{"id":"ytc_UgzjinUEGsjwCexVMTN4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},{"id":"ytc_UgzfSMZhC4YCwbEKPYF4AaABAg","responsibility":"none","reasoning":"mixed","policy":"unclear","emotion":"mixed"},{"id":"ytc_UgxEJ41N6bGU1hX77wZ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},{"id":"ytc_UgxM7QdYkyTp-HPyzTF4AaABAg","responsibility":"none","reasoning":"mixed","policy":"unclear","emotion":"mixed"},{"id":"ytc_Ugzq-1BFhhb4fLm8Hpd4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},{"id":"ytc_UgzilT3y3SXFP3T--c94AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"approval"},{"id":"ytc_UgwoZCfF_7r9uD74bTF4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"unclear","emotion":"approval"},{"id":"ytc_UgwhOU0EzTBLkaFYXIt4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},{"id":"ytc_Ugwc4addqL8_KptcZhN4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},{"id":"ytc_UgzdOhB8w3rCcuMjv_V4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"})
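Note that the raw response above is not valid JSON: the array opens with `[` but closes with `)`, which is consistent with every dimension in the coding table falling back to "unclear" when the parse failed. A minimal sketch of how such a response could be parsed defensively is below; the function name, the single-character repair heuristic, and the example input are illustrative assumptions, not part of the actual pipeline.

```python
import json


def parse_coding_response(raw: str) -> dict:
    """Parse a raw LLM coding response into {comment_id: codes}.

    Assumes the response is a JSON array of objects, each carrying an
    "id" field plus coding dimensions. Models occasionally emit a
    mismatched closing delimiter (e.g. ')' instead of ']'), so that one
    common failure is repaired before parsing; anything else still
    raises json.JSONDecodeError for the caller to handle.
    """
    text = raw.strip()
    if text.startswith("[") and text.endswith(")"):
        text = text[:-1] + "]"  # repair the mismatched closer
    records = json.loads(text)
    # Key the codes by comment ID, dropping the "id" field itself.
    return {r["id"]: {k: v for k, v in r.items() if k != "id"} for r in records}


# Hypothetical example mimicking the malformed tail seen above.
raw = ('[{"id":"ytc_abc","responsibility":"none","reasoning":"unclear",'
       '"policy":"unclear","emotion":"indifference"})')
codes = parse_coding_response(raw)
```

A stricter alternative is to reject malformed output outright and re-prompt the model, which avoids repair heuristics at the cost of extra API calls.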