Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- ytc_UgwYvIGzw…: "Come on, THESE PEOPLE EXACTLY KNOWS that Ai can't get awakened.... Seeing it's T…"
- rdc_h34cpcb: "As someone born and raised in the US during the Cold War, it's taken about 40 ye…"
- ytc_UgyLaQanF…: "Ahhh 😀 there is nothing better than a morning with a doom and gloom end of the w…"
- ytc_Ugz5JMaKR…: "I don’t think they deserve rights... If you want to destroy that $10,000 robot,…"
- ytc_Ugy3m41-I…: "Its fucking bullshit that ' ai isnt just something that just happens '. Im prett…"
- ytc_Ugy9I_lYi…: "Homo Sapiens Psychosis and why it was inevitable - and curable. We are a psycho…"
- ytc_UgzTKnl07…: "But it learns. So it's only a question of time and the bad code will be consider…"
- ytc_UgzqioxF4…: "As a person Who work's in a call centre, we user AI since 2 weeks and it's alrea…"
Comment
Sorry but I don't fully agree. I get the anger, and I'm not going to make the argument that "AI art" is good for humanity, all that said I don't agree that using existing art is "stealing". The way machines take existing art is parallel to how aspiring artists progress when they are learning, both take existing art and practice or train on it. Both make similar art based on the information they trained on, but they can also make art that is not similar to the training data. The point I'm getting to is that the human artist doesn't start from scratch, absolutely no artist alive today has started from nothing, they all have been influenced by existing art. So it would be absurd to charge for using material for training, both for humans and machines.
youtube · 2025-04-10T19:2… · ♥ 2
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | mixed |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
[
{"id":"ytc_Ugyfyv0zCp1JTKEtDsR4AaABAg","responsibility":"company","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgwGoq8W-I43lo3qk-p4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"fear"},
{"id":"ytc_Ugzmz79eUp95cVju1Bh4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgwCHAzL9T__2RicPLJ4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_Ugz0RfsmTAny8rn3J-t4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgwO99uFNfZV8K1oMOt4AaABAg","responsibility":"user","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxURZ9pzxQ3bdQU1354AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgyZ_sT921sjFUZKNHB4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugyz8S7C_hFpjWsMTYx4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_UgwWiyVlXsqNSnl0ZBd4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"}
]
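The "look up by comment ID" view above can be sketched offline: the raw model response is a JSON array of per-comment coding objects, each carrying the four dimensions from the Coding Result table (`responsibility`, `reasoning`, `policy`, `emotion`). A minimal Python sketch, assuming that structure (the helper name `index_codings` and the shortened two-entry sample are illustrative, not part of the tool):

```python
import json

# Shortened to two entries from the full raw response above.
RAW_RESPONSE = """
[
 {"id":"ytc_Ugyfyv0zCp1JTKEtDsR4AaABAg","responsibility":"company","reasoning":"deontological","policy":"ban","emotion":"outrage"},
 {"id":"ytc_UgwWiyVlXsqNSnl0ZBd4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"}
]
"""

# The four coding dimensions shown in the Coding Result table.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def index_codings(raw: str) -> dict:
    """Parse the raw model output and key each coding by its comment ID."""
    codings = json.loads(raw)
    index = {}
    for entry in codings:
        # Every entry must carry an ID and all four coding dimensions.
        missing = [d for d in DIMENSIONS if d not in entry]
        if "id" not in entry or missing:
            raise ValueError(f"malformed coding entry: {entry}")
        index[entry["id"]] = {d: entry[d] for d in DIMENSIONS}
    return index

index = index_codings(RAW_RESPONSE)
print(index["ytc_UgwWiyVlXsqNSnl0ZBd4AaABAg"]["emotion"])  # indifference
```

Keying on the `id` field makes the lookup O(1) per comment and surfaces malformed entries (a dropped dimension, a missing ID) at parse time rather than at display time.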