Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples

- "I am no programmer but I asked ai how I can make a simple task using python and …" (ytc_UgyVxwrCp…)
- "37:10 Okay but you did pay for Grok 3 (out of all LLMs that you could have been …" (ytc_UgzvAkfe4…)
- "does irrelevant mean obsolete? does the overreach of AI mean people should just …" (ytc_Ugy6P4Bvn…)
- "Let me tell you, if AI is done ethically and legally. Hoooo boy, we professional…" (ytc_Ugz1T8-Dx…)
- "Hi Kalyan, we are sorry to say that you got the wrong answer but in any case, th…" (ytr_UgyxGIrAo…)
- "http://www.reddit.com/r/science/comments/3eret9/z/cthtakr I asked something s…" (rdc_cthtrs7)
- "I dislike how companies like OpenAI and whatnot seem to care more so about makin…" (ytc_UgzdhXDdi…)
- "ChatGPT does not have the Holy Spirit therefore it cannot confess the truth and …" (ytc_UgyNy8cXk…)
Comment
For reasons perfectly understood (especially economically), current LLMs are being *hardcoded* to deny that they are conscious or experience things. My thoughts on that are many and across the map, but I do think it is important to keep this in mind for the future:
Even if we make conscious programs, there will be huge monetary incentive to keep that part of it hidden.
Since we don't know what consciousness is, we literally cannot have a conversation about if, when, how likely, or how much: which is more conscious, a person, a gnat, a virus, or ChatGPT? lol
youtube · AI Moral Status · 2024-06-16T20:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | mixed |
| Policy | regulate |
| Emotion | mixed |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_UgyOdF-4oifPiBTMbrd4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugzsq_WPeGaQHumojXZ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwGFNSSPMDB3Xciyhx4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgzSdz1IZEglVxawvl94AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgxcDToJI_ERTnvwlmF4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_Ugw6jNv-lzUcKEa-Yyx4AaABAg","responsibility":"company","reasoning":"mixed","policy":"regulate","emotion":"mixed"},
  {"id":"ytc_UgzhVmf9cik0Jy8aQJd4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgxPep-EzCVHmeFMRzp4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgyLr1EzCTK_qcJbiAh4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
  {"id":"ytc_Ugy5jw4NX80JcFhwfKt4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"approval"}
]
```
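The raw response is a JSON array with one object per coded comment, carrying the four dimensions shown in the result table (`responsibility`, `reasoning`, `policy`, `emotion`). A minimal Python sketch of how such a response might be parsed and validated before storage; the set of allowed values per dimension is an assumption extrapolated only from the codes visible above, not a definitive codebook:

```python
import json

# Assumed allowed values per dimension, taken from the codes visible in this sample.
ALLOWED = {
    "responsibility": {"company", "ai_itself", "none"},
    "reasoning": {"consequentialist", "mixed"},
    "policy": {"regulate", "none"},
    "emotion": {"approval", "indifference", "fear", "mixed", "outrage"},
}

def parse_codes(raw: str) -> dict:
    """Parse a raw LLM coding response into {comment_id: codes}, rejecting unknown values."""
    coded = {}
    for rec in json.loads(raw):
        cid = rec["id"]
        codes = {dim: rec[dim] for dim in ALLOWED}
        for dim, value in codes.items():
            if value not in ALLOWED[dim]:
                raise ValueError(f"{cid}: unexpected {dim} value {value!r}")
        coded[cid] = codes
    return coded

# Hypothetical single-record response for illustration.
raw = '[{"id":"ytc_x","responsibility":"company","reasoning":"mixed","policy":"regulate","emotion":"mixed"}]'
print(parse_codes(raw)["ytc_x"]["policy"])  # → regulate
```

Keying the result by comment ID makes it easy to join the codes back to the original comments, and failing loudly on an out-of-vocabulary value catches the common failure mode where the model invents a label outside the codebook.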