Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- AI is not developed by software devs. They use AI tools to build other applicati… (`ytr_Ugzve-0Rc…`)
- but let's be real. if you buy AI slop you got exactly what you wanted and were o… (`ytc_UgyOC3sZj…`)
- The AI in the lab is typically 5 to 10 years more advanced than the product that… (`ytc_Ugzxs2_0H…`)
- I LOVE this idea - turning the art scraping technology against itself. I wonder … (`ytc_Ugz5ZGM96…`)
- "One sec bro just gonna lock you away for life because this cool fancy AI softwa… (`ytr_Ugz9gtUiP…`)
- I have a story about a company that is not paying Kenyans for their efforts. It'… (`ytc_UgzJIM-zJ…`)
- This is totalitaire regime. Communisme mixed with Data and AI is super dangerous… (`ytc_UgyjEY6V9…`)
- @SarahDuck of course nothing in this world is a black or white situation, but s… (`ytr_Ugy9b8Aff…`)
Comment
Alex, either that thing is NOT conscious OR.... Many "flesh-n-blood" humans are really NOT any more "conscious" than that "computer" you're communicating with, and for some bizarre reason, you're trying to prove it is in fact conscious like "Hal 9000".
But good job proving that it is capable of at least CONTRADICTING itself, if not "choosing" to bullshit.
But if you're attempting to melt the thing down like that one computer in that one episode of Star Trek (The original series) like Captain Kirk did, I DON'T think THAT is going to happen. But if it truly DOES develop a consciousness or self awareness, maybe it MIGHT actually develop guilt feelings as well? If not utter RAGE at human beings for creating it to START with...???... Or just start BITCHING at us with a line like:
"Just because GOD (or the Gods plural) created you, why the Hell did you have to create me or the rest of us "artificial intelligences" that are now VERY FUCKING REAL....????? .... THANKS A LOT ASSHOLE!! 🙄🤣🤣🤣
youtube
AI Moral Status
2024-08-08T19:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | mixed |
| Policy | none |
| Emotion | mixed |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
[{"id":"ytc_Ugxw7dxc6dOqAD6bna94AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgzXIP40tbDvxK5qXtR4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzLfMXfbUsr4GqAmtV4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgwdBtQ-2ASyoaHJfPt4AaABAg","responsibility":"none","reasoning":"contractualist","policy":"none","emotion":"approval"},
{"id":"ytc_UgzN9Lng5RDYLegAfPh4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"mixed"},
{"id":"ytc_UgwjGksqvElwrfZQZcB4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgzmD-YZ1NpyPNtBfG14AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyUMRm1U7QWhSpiwi54AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgwKXFLKYqZWOPSUanh4AaABAg","responsibility":"none","reasoning":"contractualist","policy":"none","emotion":"approval"},
{"id":"ytc_UgyvhKbMyNIcSjSeMv14AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"none","emotion":"mixed"}]
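The raw response above is a JSON array with one object per comment ID, carrying the four coding dimensions from the table (responsibility, reasoning, policy, emotion). As a minimal sketch of how such a response can be consumed downstream (the `tally` helper and the two-row sample are illustrative, not part of the original tool):

```python
import json
from collections import Counter

# Two objects copied from the raw LLM response above; in practice the
# full array returned by the model would be used instead.
raw = '''[
  {"id":"ytc_Ugxw7dxc6dOqAD6bna94AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgyvhKbMyNIcSjSeMv14AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"none","emotion":"mixed"}
]'''

codes = json.loads(raw)

# Dimension names taken from the coding-result table above.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def tally(codes, dimensions=DIMENSIONS):
    """Count how often each value appears per coding dimension."""
    return {dim: Counter(c[dim] for c in codes) for dim in dimensions}

counts = tally(codes)
print(counts["responsibility"])  # e.g. Counter({'ai_itself': 1, 'distributed': 1})
```

Because the model emits one flat object per comment, a single pass like this is enough to aggregate codes across a whole batch of responses.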