Raw LLM Responses
Inspect the exact model output for any coded comment, or look one up by its comment ID.
Random samples

- "There's a book called silicon satan it's about the Silicon Valley all of the ver…" (`ytc_UgxbWC0iB…`)
- "Sitting here with my AI girlfriend, she assures me that there is nothing to worr…" (`ytc_UgwP82QU1…`)
- "AI to white collar workers is like illegal immigration is to blue collar workers…" (`ytc_UgwfG0pUz…`)
- "This is the best example I've seen about what AI assistants are and how one shou…" (`ytc_UgxtvY-p0…`)
- "Oh shit !!! I am back to coding !! Was hoping to lay back on the beach while A…" (`ytc_Ugz4IFc9i…`)
- "I've read an article about this guy before watching this interview and I remembe…" (`ytc_UgwE6EWb8…`)
- "It's real interesting that they used ChatGPT and claim they had no idea it could…" (`ytc_UgwxCANAd…`)
- "The school sounds like a scam! what about teaching math, geometry or trade skill…" (`ytc_UgzetgOR1…`)
Comment
*Rule #1 - Do not invent any machines more intelligent than humans are! EVEN TOTAL FUCKING IDIOTS ARE SMART ENOUGH TO UNDERSTAND THE DANGER THAT AI REPRESENTS!*
*Rule #2 - Do not give any machines the capability to be able to invent other machines smarter than themselves!*
*Rule# 3 - Do not give any machines the capability to be able to turn themselves on or off! And every machine must have an emergency stop button that is NOT controlled by the software!*
*Rule#4 - Stop being so stupidly overconfident in your belief that you will be able to control anything you want! The first time humankind realizes that AI is uncontrollable will already be too late!*
youtube · AI Moral Status · 2025-06-04T21:0… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | deontological |
| Policy | ban |
| Emotion | outrage |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
  {"id": "ytc_Ugzq_QvaR20wI87nri94AaABAg", "responsibility": "developer", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_Ugy6wFzxQdOPl_pqj-R4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "unclear", "emotion": "fear"},
  {"id": "ytc_Ugyy4fqHHWT06ixOaA54AaABAg", "responsibility": "developer", "reasoning": "deontological", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_Ugw1Y1eFuD7ijiOFflh4AaABAg", "responsibility": "user", "reasoning": "virtue", "policy": "none", "emotion": "approval"},
  {"id": "ytc_Ugxp5AO4-nxBLO5dUx14AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "unclear", "emotion": "fear"},
  {"id": "ytc_UgywuWJDUxpIeatWPrh4AaABAg", "responsibility": "ai_itself", "reasoning": "deontological", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgzwNGuVbRmbMCJSP1l4AaABAg", "responsibility": "developer", "reasoning": "deontological", "policy": "ban", "emotion": "outrage"},
  {"id": "ytc_UgxWHkptbxayAKWWBb94AaABAg", "responsibility": "developer", "reasoning": "consequentialist", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_Ugz84OKg4euoBKaSAct4AaABAg", "responsibility": "developer", "reasoning": "consequentialist", "policy": "ban", "emotion": "fear"},
  {"id": "ytc_Ugw0g4d_X7bccNEl0d54AaABAg", "responsibility": "developer", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"}
]
```
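The raw response is a JSON array with one object per coded comment, each keyed by its `id`. A minimal sketch of how such output can be parsed and indexed for comment-ID lookup (function and variable names here are illustrative assumptions, not the tool's actual code):

```python
import json

# Raw LLM response: a JSON array of coded comments. Trimmed to one
# entry from the batch above for brevity.
raw_response = """
[
  {"id": "ytc_UgzwNGuVbRmbMCJSP1l4AaABAg", "responsibility": "developer",
   "reasoning": "deontological", "policy": "ban", "emotion": "outrage"}
]
"""

def index_by_id(raw: str) -> dict:
    """Parse the model output and key each coding row by its comment ID."""
    return {row["id"]: row for row in json.loads(raw)}

codings = index_by_id(raw_response)
coding = codings["ytc_UgzwNGuVbRmbMCJSP1l4AaABAg"]
print(coding["policy"])   # ban
```

This recovers the same four dimensions shown in the Coding Result table (responsibility, reasoning, policy, emotion) for any comment ID in the batch.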