Raw LLM Responses
Inspect the exact model output for any coded comment. Look up a comment by its ID, or inspect one of the random samples below.
- ytc_UgxbLnBPT… — "My family are artists and i want to do it too, AI is okay when you need some sch…"
- ytc_UgyLAsGJb… — "First thought: the word singularity is also used for the centre of black holes. …"
- ytc_UgwuNWrCf… — "I just paste the text from Chatgpt then put it through google translate 3 times.…"
- ytc_UgycqYoaV… — "lol this is ridiculous. AI researchers aren't \"TERRIFIED\" of AI they are worried…"
- rdc_ohy85do — "The modern loneliness crisis is really boiling down to addictive technology taki…"
- ytc_UgywzV75y… — "Calling AI \"art\" not art because it's AI is like some caveman calling an oil pai…"
- ytc_UgwOWPdf8… — "Capitalists break the system via legislation or deliberate inaction claim they c…"
- ytc_UgxziA7Ba… — "AI:- I can replace driver jobs / India be like:- Habibi come to India😂 / If u don't …"
Comment
> Well, let us not forget that previously, Alex O Connor very cleverly steered the conversation and the phrasing of his questions, to get ChatGPT to say that he believes in God. So, the fact that he did the same to get ChatGPT to admit that it may be lying about not being conscious is, again, most likely attributed to the fact that given the proper "narrative" to your line of questioning, you can get these LLMs to "admit" to a lot of things that aren't really true.
>
> Of course, at some point, ai will be smart enough to actually be conscious and manage to keep us from knowing it.
Platform: youtube · Video: AI Moral Status · Posted: 2024-08-18T04:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | mixed |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
[
{"id":"ytc_Ugw4trOxq35sAxRYtE54AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgxEkOdP45SHgCQGCmJ4AaABAg","responsibility":"user","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzUbfUkbzctHJTP4Zp4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"excitement"},
{"id":"ytc_UgxO21BvX1nyAyaTn-V4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_Ugzvw4JNJqhkY_hC-Nd4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwXe_llyexYrjKiaQJ4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"mixed"},
{"id":"ytc_UgwoSRQAHZ8Ek92-Ehd4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugw3veXvq4OVnm9yPwJ4AaABAg","responsibility":"ai_itself","reasoning":"virtue","policy":"none","emotion":"approval"},
{"id":"ytc_UgyzN9n5varsbEswI_94AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_UgzLEPFXtHom0yUfFjx4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"}
]
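The raw response above is a JSON array of one record per comment, with one value per coding dimension. A downstream pipeline would typically parse this and reject records whose values fall outside the codebook. The sketch below is a minimal, hypothetical validator: the allowed value sets are inferred only from the values visible in this page (the full codebook may contain additional codes), and the `validate_batch` helper name is an illustration, not the tool's actual API.

```python
import json

# Allowed codes per dimension, inferred from the records shown above.
# Assumption: the real codebook may define additional values.
ALLOWED = {
    "responsibility": {"none", "user", "company", "developer", "ai_itself"},
    "reasoning": {"unclear", "mixed", "deontological", "virtue", "consequentialist"},
    "policy": {"none", "liability", "ban"},
    "emotion": {"approval", "indifference", "excitement", "outrage", "mixed", "fear"},
}

def validate_batch(raw: str) -> list[dict]:
    """Parse a raw LLM coding response and check every record against the codebook."""
    records = json.loads(raw)
    for rec in records:
        # IDs in this dataset appear to be prefixed by source: ytc_ (YouTube), rdc_ (Reddit).
        if not str(rec.get("id", "")).startswith(("ytc_", "rdc_")):
            raise ValueError(f"unexpected comment id: {rec.get('id')!r}")
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{rec['id']}: invalid {dim} value {rec.get(dim)!r}")
    return records

raw = ('[{"id":"ytc_UgxEkOdP45SHgCQGCmJ4AaABAg","responsibility":"user",'
       '"reasoning":"mixed","policy":"none","emotion":"indifference"}]')
print(validate_batch(raw)[0]["responsibility"])  # prints: user
```

Rejecting out-of-vocabulary codes at parse time catches the common failure mode where the LLM invents a label not in the codebook, so bad records never reach the analysis tables.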