Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
I think you are missing an important point: more and more, humans will start to …
rdc_oht2o1e
Google is paying Reddit $60 Million to use its content to train AI models. Googl…
ytc_UgwefI3CM…
Neuro sama isn't treated as a replacement for streamers. Her being an ai makes h…
ytr_UgyS41ruL…
Humanity managed duel use technology by controlling those who don’t have it. Wi…
ytc_Ugyb2DU9a…
The only time I ever use A.I. art is when i am trying to figure out what it's so…
ytc_UgzmzADIM…
That's an interesting thought! In the video, Sophia touches on the balance betwe…
ytr_Ugy-T3NAc…
AI should be used as a tool in people's jobs. It should not be replacing worker…
ytc_Ugz-rEr5W…
Why would ai need us in the first place? It doesn’t!
You can’t control Ai cert…
ytc_Ugy3sIF8r…
Comment
I'm a Christian, and the whole AI thing, to me, is basically playing Frankenstein (You know, the line cut from the original movie when he says "Now I know what it's like to be God!"?). Being able to create is part of being made in His image. It's how we know we each have a soul--you might even say art is the way we reveal our existence. When we try and create something that can create our art for us... It's demeaning our value as human beings. Yes, computers are a handy tool. But when they're used in such a way that makes humanity as a whole seem worthless... Then it's time to rethink it. You DO have value!
youtube
Viral AI Reaction
2025-07-27T03:0…
♥ 2
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | deontological |
| Policy | unclear |
| Emotion | mixed |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_UgzdVNxXc-I8MIehFCl4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgycwIIKBApndfVnTV14AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgxKT87FWe62tPoJ-Yl4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwRxw1YxDpgqIggHid4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgxeVC0DJf51St222qx4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgwbNVPFP1j43HkNpsl4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"approval"},
{"id":"ytc_UgyoqUaox0-bvDnR4FJ4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_Ugwxaa8I9j0h7anxFiJ4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgwWCvQua1frshgufo94AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"fear"},
{"id":"ytc_Ugwtzp2GqCc6y4HqhQh4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"mixed"}
]
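The lookup-by-comment-ID feature above can be sketched against a raw response like this one. This is a minimal illustration, not the tool's actual implementation: it assumes the model returns a valid JSON array of objects with the four coding dimensions, and uses a shortened two-item response (IDs copied from the output above) for brevity.

```python
import json

# Hypothetical raw batch response from the coding model; in practice this
# string would be the model output shown above, verbatim.
raw_response = """
[
  {"id": "ytc_UgzdVNxXc-I8MIehFCl4AaABAg", "responsibility": "none",
   "reasoning": "unclear", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_Ugwtzp2GqCc6y4HqhQh4AaABAg", "responsibility": "developer",
   "reasoning": "deontological", "policy": "unclear", "emotion": "mixed"}
]
"""

# Index the parsed array by comment ID so any coded comment can be
# inspected directly, as in the "Look up by comment ID" feature.
codings = {item["id"]: item for item in json.loads(raw_response)}

coding = codings["ytc_Ugwtzp2GqCc6y4HqhQh4AaABAg"]
print(coding["responsibility"], coding["emotion"])  # developer mixed
```

The second entry matches the coding table shown for the inspected comment (responsibility: developer, reasoning: deontological, policy: unclear, emotion: mixed).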