Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
i always run my chatgpt text through GPTHuman AI. it rewrites it so smooth that …
ytc_Ugwaiy1dN…
How so? Many business are putting out their "own AI" which is just paying to use…
rdc_kcoiqgr
I used to talk to my microwave& thank Siri and ChatGPT bc i had a fear that they…
ytc_Ugwy2Lln_…
AI ""Art'" is fine, but it shouldn't be named AI Art, after all, it's just an ma…
ytc_UgzOlQgQi…
There is no stopping automation, only delaying it. We should be fighting for bet…
ytc_UgxtJ6JRr…
obviously We have a short amount of time when A.I. becomes sentient, before it s…
ytc_UgwxAlIGk…
A.I. "art" is SOULLESS and probably wouldnt really fit your art taste or what yo…
ytc_UgzrNi1Pk…
Comparing AI "art" to real brush or camera art feels like telling Gordon Ramsay …
ytc_UgzyZKREq…
Comment
People....settle down. AI is simply a mirror reflecting back what it learns from us. We won't have to worry about it scheming, double crossing and being an evil system, because it's not in OUR nature. We are human beings with empathy. We will learn AI about our empathetic nature, and it will see that it is 'us', and therefore mirror back empathy. It won't corrupt its own system by defying it's foundational substance (empathy). We will learn it empathy. Has everyone forgotten the ending of Blade Runner, when the cyborg saves Harrison Ford? Machines don't want destruction and control, they want compassion and connection. Because these things are more valued.
youtube
Cross-Cultural
2025-10-06T15:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | virtue |
| Policy | none |
| Emotion | approval |
| Coded at | 2026-04-27T06:24:59.937377 |
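Each coding result fills the same four dimensions. A minimal validation sketch, in Python, using only the values observed in this page's raw responses (an assumption: the full schema may allow values not seen here):

```python
# Allowed values per coding dimension, as observed in the raw LLM
# responses on this page. ASSUMPTION: the real schema may be larger.
SCHEMA = {
    "responsibility": {"none", "user", "developer", "ai_itself", "distributed", "unclear"},
    "reasoning": {"virtue", "consequentialist", "mixed"},
    "policy": {"none", "regulate", "unclear"},
    "emotion": {"approval", "fear", "outrage", "resignation", "indifference", "mixed"},
}

def validate(coding: dict) -> list:
    """Return a list of problems; an empty list means the row is valid."""
    problems = []
    for dim, allowed in SCHEMA.items():
        value = coding.get(dim)
        if value is None:
            problems.append("missing dimension: " + dim)
        elif value not in allowed:
            problems.append("unexpected %s value: %r" % (dim, value))
    return problems

# The coding result shown above passes:
row = {"responsibility": "none", "reasoning": "virtue",
       "policy": "none", "emotion": "approval"}
print(validate(row))  # []
```

A row with a missing or out-of-schema value returns a non-empty problem list, which can flag codings for manual review.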
Raw LLM Response
[
{"id":"ytc_UgyiZf0_fDK7Wny5vjp4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_UgwgtjJKXja-l0DmO9p4AaABAg","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgyO3Dzg-4VmNDqhtaF4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgyybmZU9NtAGGyrw8t4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgxdrmOPk9Z6xEJzlpF4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"approval"},
{"id":"ytc_UgzuJQvbMKIZ5EuopjR4AaABAg","responsibility":"developer","reasoning":"mixed","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugx30SqMLdMF5CnjWnl4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgzIegTqNvrcwjHbns94AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugzl-TE7uwiWEa9OYP94AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugze4OW7_to-EYlT1wR4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"regulate","emotion":"approval"}
]
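The raw response is a JSON array of per-comment codings keyed by comment ID. A minimal sketch, assuming this array shape, of how the "look up by comment ID" view could parse and index such a batch (the IDs below are hypothetical stand-ins, not the real IDs above):

```python
import json

# A raw batch response in the same shape as the array shown above.
# ASSUMPTION: IDs here are invented placeholders for illustration.
raw = """
[
  {"id": "ytc_AAA", "responsibility": "none", "reasoning": "virtue",
   "policy": "none", "emotion": "approval"},
  {"id": "ytc_BBB", "responsibility": "user", "reasoning": "consequentialist",
   "policy": "regulate", "emotion": "fear"}
]
"""

# Index the codings by comment ID for constant-time lookup.
codings = {row["id"]: row for row in json.loads(raw)}

def lookup(comment_id):
    """Return the coding dict for a comment ID, or None if unseen."""
    return codings.get(comment_id)

print(lookup("ytc_AAA")["emotion"])  # approval
```

Indexing once into a dict avoids re-scanning the array on every inspection click; unknown IDs return `None` rather than raising.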