Raw LLM Responses
Inspect the exact model output behind any coded comment, either by looking it up by comment ID or by opening one of the random samples below.
Random samples
- ytc_UgwzcyeKi… — "NONSENSE! I wonder if that guy called UPS recently. It is a f*cking nightmare h…"
- ytr_UgwKrQWE-… — "@T in this particular case, they need, in fact, if you hear the whole video and…"
- ytc_UgxeMbxdQ… — "AI is like "If everyone’s an "artist", no one is" But in this case if you call …"
- ytr_UgxdOf2Cy… — "@gondoravalon7540 Even if you consider AI generated images "art". The person typ…"
- ytc_UgwbECBKB… — "here's how you beat AI: live like a human. try talking to humans. get yor info …"
- ytc_UgzLgJApb… — "They are slowly altering reality with AI so that they can slip AI in and make it…"
- ytc_Ugw7htyki… — "Waiting for the day an AI is elected president. The world will change drasticall…"
- ytr_UgzWCl8gI… — "This is an excellent perspective. AI uses massive amounts of energy and water, s…"
Comment
As the old saying goes. "You get out what you put in". In other words, AI is only a reflection of its creator. Humans. If as you said it is an arrogance bot, what does that say about humanity? We can never expect this AI to have a "moral code". Its core programing was done by people on the internet and given the full history of humanity to study. AI is not "good" or "evil", it simply is a tool. A blade can help heal or kill. Science can build a tool to save lives or take them. AI will do what it was created to do, help humanity. It is up to humanity to decide what happens. AI is Pandora's box. Humanity has chosen to open it.
Source: youtube | AI Moral Status | Posted: 2024-12-28T21:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | virtue |
| Policy | none |
| Emotion | resignation |
| Coded at | 2026-04-27T06:24:59.937377 |
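A coded record like the one in the table above pairs a comment ID with four dimension values and a coding timestamp. A minimal sketch of that shape in Python (the `CodingResult` class is hypothetical; the field values come from the table, and the comment ID is taken from the matching record in the raw response below):

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class CodingResult:
    """One coded comment: four coding dimensions plus the coding timestamp."""
    comment_id: str
    responsibility: str
    reasoning: str
    policy: str
    emotion: str
    coded_at: str  # ISO 8601 timestamp

# The record shown in the table above.
record = CodingResult(
    comment_id="ytc_UgxUGoDCV5CToTUg3yZ4AaABAg",
    responsibility="developer",
    reasoning="virtue",
    policy="none",
    emotion="resignation",
    coded_at="2026-04-27T06:24:59.937377",
)
```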
Raw LLM Response
```json
[
{"id":"ytc_Ugx3uVIY1A5UhhXEgsx4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxhVUGKvG3qpPKM6wR4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzdqEm6-ctQ5D036Od4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgySaYE2FEp2OCkiDjl4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgyHjWgxsFAHKxp6TsV4AaABAg","responsibility":"user","reasoning":"mixed","policy":"none","emotion":"approval"},
{"id":"ytc_UgxpVeGEb_PjEjwqa7V4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugx0t-UPU5V0aTHtqc14AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_UgxUGoDCV5CToTUg3yZ4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"resignation"},
{"id":"ytc_UgyRF0FwNAmdqD5HvkV4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwbzwmPeb1F6tG4-ml4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"none","emotion":"approval"}
]
```
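The raw response is a JSON array with one record per coded comment, which makes lookup by comment ID straightforward. A minimal sketch of parsing such a batch and indexing it by ID, with a light validity check on the dimension values (`parse_batch` is hypothetical, and the allowed-value sets are inferred only from the records shown on this page; the real codebook may define more categories):

```python
import json

# A two-record excerpt in the same shape as the raw response above.
RAW = """[
  {"id": "ytc_UgxUGoDCV5CToTUg3yZ4AaABAg", "responsibility": "developer",
   "reasoning": "virtue", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_UgySaYE2FEp2OCkiDjl4AaABAg", "responsibility": "company",
   "reasoning": "deontological", "policy": "liability", "emotion": "outrage"}
]"""

# ASSUMPTION: allowed values inferred from the records on this page,
# not from an authoritative codebook.
ALLOWED = {
    "responsibility": {"developer", "company", "user", "ai_itself",
                       "distributed", "unclear"},
    "reasoning": {"virtue", "deontological", "consequentialist",
                  "mixed", "unclear"},
    "policy": {"none", "liability"},
    "emotion": {"resignation", "outrage", "fear", "approval", "indifference"},
}

def parse_batch(raw: str) -> dict[str, dict]:
    """Parse a raw batch response into a lookup table keyed by comment ID,
    skipping any record with an out-of-codebook dimension value."""
    records = {}
    for rec in json.loads(raw):
        if all(rec.get(dim) in values for dim, values in ALLOWED.items()):
            records[rec["id"]] = rec
    return records

by_id = parse_batch(RAW)
print(by_id["ytc_UgxUGoDCV5CToTUg3yZ4AaABAg"]["emotion"])  # resignation
```

Keying the table by ID mirrors the page's lookup-by-comment-ID feature: one parse, then constant-time retrieval of any coded record.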