Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment
If conciousness is possible in a computer then its also possible in any turing machine. We cant nessasarily prove that the other way around but if we could that also means anythibg that is not solvable by a computer is also not solvable by us or AI. also has some interesting implications in terms of with enough variations in a turing complete system then intellegence is inevatable which makes the fermi paradox all that more terrifying
| Field | Value |
|---|---|
| Platform | youtube |
| Video | AI Moral Status |
| Posted | 2023-12-05T22:4… |
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | unclear |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_UgyAaaoV1P5vH09NtcJ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"fear"},
{"id":"ytc_UgwJb15DORmOOa0KlkN4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwYBMp4SzbpM44Ay1J4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_UgxU6b4dKd7ow6hEXv94AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_UgyEerlyRvVvSQgjzWh4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"approval"},
{"id":"ytc_Ugxz2OkIPo1mO4k5wHd4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgytXOX3uy6bV40rhWt4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgzRwKw-3EvyYSEojXB4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"fear"},
{"id":"ytc_UgzR9KpYSA2xpf-_6YJ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugzjc6nT00LvyMTgfnF4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"outrage"}
]
```
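The raw LLM response is a JSON array with one coding object per comment in the batch, keyed by comment ID. A minimal sketch of looking up one comment's coding by ID, assuming only that array-of-objects shape (the `lookup_coding` helper name is illustrative, and the two entries below are copied from the response above):

```python
import json

# Raw LLM response: a JSON array, one coding object per comment.
# These two entries are copied from the batch response shown above.
raw_response = """[
  {"id":"ytc_UgyAaaoV1P5vH09NtcJ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"fear"},
  {"id":"ytc_UgwYBMp4SzbpM44Ay1J4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"mixed"}
]"""

def lookup_coding(response_text: str, comment_id: str):
    """Return the coding dict for comment_id, or None if the batch lacks it."""
    codings = json.loads(response_text)
    return next((c for c in codings if c["id"] == comment_id), None)

coding = lookup_coding(raw_response, "ytc_UgwYBMp4SzbpM44Ay1J4AaABAg")
print(coding["emotion"])  # mixed
```

Returning `None` for a missing ID (rather than raising) makes it easy to flag comments the model silently dropped from a batch.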