# Raw LLM Responses

Inspect the exact model output for any coded comment. Comments can be looked up by comment ID, or drawn at random from the samples below.

## Random samples
- "Easy test. I just Ask some questions about history and I mediadly know If is rea…" (`ytc_UgyTNy-C8…`)
- "Taylor was deep fake after the Chiefs played the Bills. I felt so bad for her…" (`ytc_Ugy-8tqxP…`)
- "I agree. Even on this sub, many people are surprised that I'm using AI to study …" (`rdc_nt68xuo`)
- "We are not even a type 1 civilization yet. Humans are not gonna be not working a…" (`ytc_Ugxt3ne1G…`)
- "Next they're gonna build a machine that implants memories, so you don't actually…" (`ytc_UgxF4MUvZ…`)
- "If you don't include the incredible amount of resources consumed by data centers…" (`ytc_UgwdMa-E0…`)
- "As far as like space and stuff goes it's very very useful I want to develop some…" (`ytc_UgySl3w5V…`)
- "@SteinerNeinI mean ai generates the markdown but the point is that you can't ju…" (`ytr_UgxasaAEG…`)
## Selected comment

> ChatGPT is already known to have cognitive biases that it absorbed from human text. In addition to these, it also hallucinates on its own right. I suspect that those actually might be its _own_ "cognitive" biases. This is something to keep in mind. If something is trained by human output, it's going to absorb human biases. But on top of those, it will have its own cognitive biases alien to us, that are caused by its architecture. So it's possible that an AI system with superhuman intelligence will just run around in mental circles, just like most human beings do.

| Platform | Thread | Posted |
|---|---|---|
| youtube | AI Moral Status | 2023-08-27T02:3… |
## Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | mixed |
| Policy | unclear |
| Emotion | mixed |
| Coded at | 2026-04-26T23:09:12.988011 |
## Raw LLM Response

```json
[
{"id":"ytc_UgwW7_CTuTvy8FVFwBJ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"approval"},
{"id":"ytc_Ugz92ehsXxlJMdy8gQB4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgyS-Enz79Zg3BrY4P54AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_UgwoC3ar6x9Y_CvYRAN4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgzIY_4g_CO-NQqwfOd4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgwqTV1T1NBNUJBSMdR4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"resignation"},
{"id":"ytc_Ugx2LGGe6fPpZ0csasB4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"approval"},
{"id":"ytc_UgwtdQQbCLJwHsmA7Nh4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgxxhWmmYYo1JUpQnHl4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgwAq3Llbw1InJr6Ozd4AaABAg","responsibility":"developer","reasoning":"mixed","policy":"unclear","emotion":"mixed"}
]
```
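The raw response is a JSON array with one object per coded comment, each carrying the four dimensions shown in the table above. A minimal sketch of how such a response might be parsed and sanity-checked before storing, assuming the value sets seen in this sample (the full codebook likely defines more categories, so `ALLOWED` here is illustrative, not authoritative):

```python
import json

# Allowed values per dimension, inferred only from the sample response
# above; a real codebook would be the source of truth.
ALLOWED = {
    "responsibility": {"none", "developer", "ai_itself", "distributed"},
    "reasoning": {"unclear", "mixed", "consequentialist", "virtue"},
    "policy": {"unclear", "none"},
    "emotion": {"approval", "mixed", "outrage", "fear",
                "indifference", "resignation"},
}

def parse_codings(raw: str) -> list[dict]:
    """Parse a raw LLM coding response, keeping only rows whose
    dimension values fall inside the known vocabulary."""
    rows = json.loads(raw)
    valid = []
    for row in rows:
        if all(row.get(dim) in vals for dim, vals in ALLOWED.items()):
            valid.append(row)
    return valid

# Hypothetical one-row response for illustration.
raw = ('[{"id":"ytc_example","responsibility":"developer",'
       '"reasoning":"mixed","policy":"unclear","emotion":"mixed"}]')
print(len(parse_codings(raw)))  # 1
```

Dropping (rather than repairing) rows with out-of-vocabulary values keeps the coded dataset clean; rejected rows can be re-queued for the model to code again.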