Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
Well this is the first I have heard of this case. I do tend to avoid the news th…
ytc_UgyIttasU…
I’ve sensed this in models since Claude 3.7 it’s the Overton Window that has pre…
ytc_UgzMpnwCA…
What if gov set a requirement to hire certain amount of human in a company with …
ytc_UgxsHM0JE…
If i ever see a robot become self aware im just ripping the head off…
ytc_Ugx0kvpnL…
I agree with the person that said that treating the bots with decency was the wa…
ytc_UgzLYlAVw…
The REAL QUESTION IS ...
*did this man make deep fake porn of Charlie and how c…
ytc_UgxZn5gef…
So Severance got it right after all… What strikes me is that we don’t reject the…
ytc_UgzzQJ9VS…
This report is a hit job on an old version of the software. The new version is e…
ytc_UgxC-iDt5…
Comment
The problem with asking known Turing questions is that the AI model will already know the answer to it. You’d have to come up with a brand new one that isn’t derivative in order to actually test it.
It’s like asking it to generate code that prints “Hello, World!” It’ll just yank the most refactored version of that simple code and print it. It’s not very useful or interesting.
youtube
AI Moral Status
2023-06-06T01:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
[
{"id":"ytc_Ugyy69bHwXDcQ_4aPIt4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugy2LpWRVs56I78j6kR4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgwS-mTx_1Nlle1cacR4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UgyMZIkWJ5MnISHyOa54AaABAg","responsibility":"distributed","reasoning":"virtue","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugy2Z0e_VnWw_ffMHr14AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"fear"},
{"id":"ytc_UgzUWEFJNydvafIY62V4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyfY-jr2lr4qAgWu_94AaABAg","responsibility":"company","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_UgyEuHDLI1QT1APKghF4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UgzMZ-0FmYgifvhfClF4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"approval"},
{"id":"ytc_UgxjIw0X4Mn_nYgRd_Z4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"}
]
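The raw response above is a JSON array of per-comment coding records with `id`, `responsibility`, `reasoning`, `policy`, and `emotion` fields. A minimal sketch of how such a response could be parsed and validated follows; the allowed category sets are inferred only from the values visible in this page and the table above, so the full codebook may contain more values than these (an assumption), and `parse_coding_response` is a hypothetical helper name.

```python
import json

# Category values observed in the sample response above. The real
# codebook may define additional values -- these sets are an assumption.
ALLOWED = {
    "responsibility": {"none", "government", "developer", "distributed",
                       "ai_itself", "company", "user", "unclear"},
    "reasoning": {"unclear", "consequentialist", "deontological", "virtue"},
    "policy": {"none", "unclear"},
    "emotion": {"resignation", "fear", "outrage", "mixed",
                "indifference", "approval"},
}

def parse_coding_response(raw: str) -> list[dict]:
    """Parse a raw LLM coding response (a JSON array of records) and
    check each record against the observed category vocabulary."""
    records = json.loads(raw)
    for rec in records:
        # Comment IDs in this dataset all carry the ytc_ prefix.
        if not str(rec.get("id", "")).startswith("ytc_"):
            raise ValueError(f"unexpected comment id: {rec.get('id')!r}")
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{rec['id']}: bad {dim}={rec.get(dim)!r}")
    return records

# Usage with a one-record response in the same shape as above
# (the id here is abbreviated for illustration):
raw = ('[{"id":"ytc_UgyExample","responsibility":"none",'
       '"reasoning":"unclear","policy":"none","emotion":"fear"}]')
records = parse_coding_response(raw)
```

Validating against a closed vocabulary like this catches the common failure mode where the model invents a label outside the coding scheme, so bad records fail loudly instead of silently polluting the coded dataset.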