Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment
1. soul: A robot, strong AI, or other does not have a soul, because they lack choice. Their programming is subject not to the laws of nature but to the laws of man. Two human twins, or humans created through genetic engineering from the same base model, will each act differently when placed in the same environment and environmental situation; however, a pair of robots made from the same code cannot. They will make the same assessments and produce the same results, and as such they are the same. Two people cannot share one soul, and the soul is meant to be unique to each person; as such, robots cannot possess souls.
2. For those worried about AIs taking over the world: do not be. Even if a computer were to develop human levels of intelligence, you cannot program emotion. Therefore, though a computer may have the ability to take over the world, it lacks the 'want' to take over the world.
youtube
2016-08-10T14:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | deontological |
| Policy | unclear |
| Emotion | mixed |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_UghlVHdKSsFDl3gCoAEC","responsibility":"none","reasoning":"mixed","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UghOLJXJkinIxXgCoAEC","responsibility":"none","reasoning":"mixed","policy":"unclear","emotion":"approval"},
{"id":"ytc_Ughp0m-7OLTnKngCoAEC","responsibility":"none","reasoning":"mixed","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Uggkc-b_dQ7sPXgCoAEC","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgjUboft16pmnXgCoAEC","responsibility":"none","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UggXYUtVSt6pTXgCoAEC","responsibility":"ai_itself","reasoning":"deontological","policy":"unclear","emotion":"approval"},
{"id":"ytc_UghCEDSQhCbKyHgCoAEC","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"approval"},
{"id":"ytc_UgiRZubvHnok63gCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugh_eqMzofsL5ngCoAEC","responsibility":"none","reasoning":"mixed","policy":"unclear","emotion":"approval"},
{"id":"ytc_Ught2widn_LlsngCoAEC","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"mixed"}
]
```
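The model returns one JSON array per batch, with each element carrying a comment ID and one label per coding dimension. A minimal sketch of how such output could be parsed and looked up by comment ID, assuming only the label values visible in the sample above (the full codebook may define additional values, and the `SCHEMA` and `parse_codings` names here are illustrative, not from the tool itself):

```python
import json

# Allowed values per coding dimension, inferred from the sample output above
# (assumption: the real codebook likely permits more labels per dimension).
SCHEMA = {
    "responsibility": {"none", "developer", "ai_itself"},
    "reasoning": {"mixed", "deontological", "consequentialist", "unclear"},
    "policy": {"unclear"},
    "emotion": {"indifference", "approval", "mixed"},
}

# One record from the raw response, used here as sample input.
raw = '''[
  {"id": "ytc_Ught2widn_LlsngCoAEC", "responsibility": "developer",
   "reasoning": "deontological", "policy": "unclear", "emotion": "mixed"}
]'''

def parse_codings(text):
    """Parse the model's JSON array, index records by comment ID,
    and flag any label that falls outside the known value sets."""
    records = json.loads(text)
    by_id, problems = {}, []
    for rec in records:
        for dim, allowed in SCHEMA.items():
            if rec.get(dim) not in allowed:
                problems.append((rec.get("id"), dim, rec.get(dim)))
        by_id[rec["id"]] = rec
    return by_id, problems

by_id, problems = parse_codings(raw)
print(by_id["ytc_Ught2widn_LlsngCoAEC"]["responsibility"])  # → developer
```

Indexing by ID is what makes the "inspect the exact model output for any coded comment" lookup cheap; the `problems` list catches malformed or out-of-vocabulary labels before they silently enter the coded dataset.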