Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- Idk. Bad people are gonna do bad stuff. AI isnt bad. Bad people are bad. I don't… (ytc_Ugx7QG57S…)
- It’s really sad how many people rely on immediate gratification instead of the t… (ytc_UgzIlLVQp…)
- Ai can’t do anyone’s job. Stop pushing bullshit. Even chat bots have to bail out… (ytc_UgxcB5lCL…)
- There are a million things that can kill cancer cells and all of that is meaning… (rdc_g3m0iu7)
- Photography didn't impact people's creativity in a bad way, no, it actually exte… (ytc_UgzccRO9a…)
- relax. ChatGPT isn't going to do any harm. It's parasites like the military or t… (ytc_UgzgPrX63…)
- As an artist myself, I am happy seeing this ai “artist” whining about his “art” … (ytc_UgyUYzjxQ…)
- This is a very flawed approach and not how those detectors work at all. Your app… (rdc_i6sc36c)
Comment
I'm with Roger Penrose; conscious AI is not possible.
AI is just a mimic that detects and replicates patterns that make it appear intelligent. It will never be self aware through use of only computation.
The fact that AI is passing the Turing test is more a bad reflection on the human interlocutor than a triumph for the AI being tested; the main danger of AI is that we will start failing the Turing test.
That's not to say that AI controlled war robots are not very real and scary, just ask the people of Gaza.
youtube
AI Moral Status
2025-04-27T19:0…
♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | deontological |
| Policy | none |
| Emotion | mixed |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
[
{"id":"ytc_UgzjWC2Veskr2865N_h4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugw9UeTs2XkF_2QNvJB4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugx9V0FKgQ43Id0ed8h4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugx0yNgCrO9y9aie-7t4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyvVM5YJVbD6wWSXO54AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugx_lSAkXTDCVa6rTW14AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"mixed"},
{"id":"ytc_UgxzQ-lX40fJISgrspl4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
{"id":"ytc_UgzwpmobhasP02tPdyt4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugy0T3yxLvTydudGOEh4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugyw15U-TdEgM2A0sih4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"none","emotion":"outrage"}
]
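The raw response above is a JSON array with one object per comment, keyed by comment ID. A minimal sketch of how such a batch could be parsed and sanity-checked is below; the allowed values per dimension are inferred only from the examples on this page (the real codebook may define more), and the two-entry `raw` string is an abridged copy of the response above.

```python
import json

# Abridged copy of the raw LLM response shown above (two entries).
raw = '''[
{"id":"ytc_UgzjWC2Veskr2865N_h4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgzwpmobhasP02tPdyt4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"outrage"}
]'''

# Allowed values per dimension, inferred from the examples on this page;
# the actual coding scheme may include additional categories.
ALLOWED = {
    "responsibility": {"none", "ai_itself", "government", "developer"},
    "reasoning": {"consequentialist", "deontological", "mixed"},
    "policy": {"none"},
    "emotion": {"fear", "mixed", "indifference", "approval", "outrage"},
}

def index_codes(raw_json: str) -> dict:
    """Parse a batch response and index codes by comment ID,
    dropping any row with an out-of-vocabulary value."""
    by_id = {}
    for row in json.loads(raw_json):
        if all(row.get(dim) in vals for dim, vals in ALLOWED.items()):
            by_id[row["id"]] = row
    return by_id

codes = index_codes(raw)
print(codes["ytc_UgzwpmobhasP02tPdyt4AaABAg"]["emotion"])  # outrage
```

Indexing by ID is what makes the "Look up by comment ID" view possible: each coded comment's dimensions can be fetched in constant time once the batch is parsed.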