Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- "Its acting just like the humans that programmed it. Humans are flawed so is the …" (ytc_Ugy_ROYmS…)
- ">Carey manipulated, or deepfaked, photos of more than a dozen women, includin…" (rdc_k20ei2r)
- "Ai could balance every budget without emotion. We could use it to find corruptio…" (ytc_UgxlxiDbC…)
- "Musk is saving money by not using RADAR and LIDAR, and relying solely on compute…" (ytc_Ugy0Zc-ZV…)
- "You are too late, the US military is already working on autonomous weapons syste…" (ytc_UgxiYP3DV…)
- "I managed to beat mine eventually by proposing the argument consciousness is not…" (ytc_UgxptnPZr…)
- "Wait, so the west (Microsoft Tay) makes a chat AI and it starts praising Hitler.…" (rdc_dlgzqb0)
- "And psychopaths like to tell themselves they make more logical decisions then th…" (ytr_UgyUK8ayV…)
Comment
> I guess we could be dumber as humans. let's see we make movies that tell us what will happen if we make these machines then we make them. Let's see we got the driverless cars, humanoid robots and Chat gbt and our world is controlled by electronics. We are just asking for it. we are literally creating our destruction. And on top of that we are destroying the only planet we know we can live on. I mean are we smart or are we dumb

Source: youtube, 2025-11-04T11:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
{"id":"ytc_Ugy1fbOr6e_QWf90-gJ4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugx_p4eS8EM3-kOqEuR4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugw6I0RkC06XydHSYNd4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_UgyO7odgg7R6XSAxZol4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"liability","emotion":"mixed"},
{"id":"ytc_UgyKTygWW4xd41iFbjN4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_UgzuPGBFTNAbGbrU-hp4AaABAg","responsibility":"company","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgwbLU9Juo3RrhHi3vJ4AaABAg","responsibility":"company","reasoning":"contractualist","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgwuRMDaRsPtbjR1lhl4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugw4ilcIec1ZeLUdggJ4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugxyou3w9AE2SefGA2d4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"}
]
```
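A raw batch response like the one above can be parsed and sanity-checked before the codes are stored. The sketch below is a minimal example, assuming the allowed values per dimension are exactly those seen in the samples on this page (the real codebook may define more); the function name `validate_coding` is illustrative, not part of any tool shown here.

```python
import json

# Allowed values per dimension, inferred from the coded samples above.
# Assumption: the actual codebook may include additional values.
ALLOWED = {
    "responsibility": {"distributed", "company", "ai_itself", "none"},
    "reasoning": {"consequentialist", "deontological", "contractualist", "mixed"},
    "policy": {"none", "liability", "ban", "regulate"},
    "emotion": {"fear", "indifference", "mixed", "outrage", "approval"},
}

def validate_coding(raw: str) -> list[dict]:
    """Parse a raw LLM batch response and reject out-of-scheme values."""
    records = json.loads(raw)
    for rec in records:
        for dim, allowed in ALLOWED.items():
            value = rec.get(dim)
            if value not in allowed:
                raise ValueError(f"{rec.get('id')}: unexpected {dim}={value!r}")
    return records

example = ('[{"id":"ytc_example","responsibility":"company",'
           '"reasoning":"mixed","policy":"ban","emotion":"fear"}]')
print(len(validate_coding(example)))  # → 1
```

Validating at ingest time keeps a malformed or hallucinated code value from silently entering the coded dataset.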