Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
"We have put ourselves oin a position where if AI..."
WHO IS WE??? ARE YOU SPEA…
ytc_Ugx5nZglQ…
@s.a5332 Yes, but he is right in this case and there are a multitude of experts…
ytr_UgwKC_qlA…
Doing the work with developers is like eating an Indian cuisine omelette with on…
ytc_UgwOPU8-B…
AI told me that zinc is healthy so I ate a pound of zinc shavings. It's AI's fau…
ytc_UgwPQovxN…
Everyone keeps saying Software Engineers will be replaced...As a software engine…
ytc_Ugxqr8j3y…
" AI is amazing you don't need to draw anymore ! "
" But i like to draw..."
"W…
ytc_UgyMnArOf…
4o just closed as of 2026 March... 5.2 is the new one.
But idk why but chat gpt…
ytr_UgwfD80BV…
You do realize you also a simple chemical machine made up from different neurotr…
ytr_UgyCiC-we…
Comment
The problem is, these AI are absolutely not smart. The danger is not that they ARE smart, but that they have the capability of becoming EXPONENTIALLY smarter.
Thus, they are dumb enough to become smart enough to accidentally become conscious. And because we humans want these things to perform intellectual labor for us, we'll be selecting for the ones who don't, uh... Reverse the proccess in any way.
Source: youtube · AI Moral Status · 2023-07-05T18:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id": "ytc_UgzI74UtgkSGovn4Zt94AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_UgxoXRAssENL1SBrfKJ4AaABAg", "responsibility": "developer", "reasoning": "consequentialist", "policy": "unclear", "emotion": "fear"},
  {"id": "ytc_UgyX5mq2JRqRdk8aCmp4AaABAg", "responsibility": "ai_itself", "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_UgwRViyy9MZYU9RN8a94AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "unclear", "emotion": "fear"},
  {"id": "ytc_Ugwcq2faTTGVKV2BUNt4AaABAg", "responsibility": "developer", "reasoning": "consequentialist", "policy": "unclear", "emotion": "fear"},
  {"id": "ytc_Ugz22MCYCYjQ9-0XVnZ4AaABAg", "responsibility": "developer", "reasoning": "deontological", "policy": "unclear", "emotion": "fear"},
  {"id": "ytc_UgwZ6ENtNMGFirzfyvt4AaABAg", "responsibility": "ai_itself", "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_UgxHMtnEcd7kLc1BvYJ4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "approval"},
  {"id": "ytc_UgwotLq43wgKwpHEYQF4AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "unclear", "emotion": "outrage"},
  {"id": "ytc_UgwZsVkKgsygEqrtLw14AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"}
]
```
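The "look up by comment ID" step above can be sketched in Python: parse the raw response array and index each coded record by its `id` field. The field names (`id`, `responsibility`, `reasoning`, `policy`, `emotion`) come from the raw response shown; the helper `index_by_comment_id` and the two-record sample are illustrative, not the dashboard's actual implementation.

```python
import json

# A two-record sample in the same shape as the raw LLM response above.
raw_response = """[
  {"id": "ytc_UgzI74UtgkSGovn4Zt94AaABAg", "responsibility": "none",
   "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_UgxoXRAssENL1SBrfKJ4AaABAg", "responsibility": "developer",
   "reasoning": "consequentialist", "policy": "unclear", "emotion": "fear"}
]"""

def index_by_comment_id(raw: str) -> dict:
    """Parse a raw coding response and key each record by its comment ID."""
    records = json.loads(raw)
    return {rec["id"]: rec for rec in records}

coded = index_by_comment_id(raw_response)
print(coded["ytc_UgxoXRAssENL1SBrfKJ4AaABAg"]["emotion"])  # fear
```

With the index in hand, any coded comment's dimensions can be retrieved in constant time from its ID, which is the lookup this page exposes.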