Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect:

- `ytc_UgyobE93-…` — People are thinking about robot rights when we can't even after on human rights …
- `ytc_UgzSlo-5i…` — good for Europe. The chance that a small business/w limited to no budget innovat…
- `rdc_o7qiqs3` — Yet isn't that actually what the Pentagon is asking of Claude, from Anthropologi…
- `ytc_UgzhYIZQN…` — What about AI where your content is altered and makes you appear in a false way?…
- `ytc_UgzpQdY5i…` — Here is what AI (Gemini) has to say (and it's neither interesting or intelligent…
- `ytc_UgzOKnGO5…` — I continuously make reports through ChatGPT against its code and possible infilt…
- `rdc_jwvqofk` — I think you have to consider the use of the training data for AI. Unless you own…
- `ytr_Ugx-Gy2g5…` — @ryanbentley1965I agree. There are problems with AI computing and data centers, …
Comment
AI has no 'intent'. It does not want anything. It cannot draw even a single line by itself because it cannot intentionally choose a direction, unlike humans with consciousness. It doesn't learn from others artwork like we do. For it they're just a bunch of data it copies a part from, adds other data from its archives, randomizes, and puts out a bunch of iterations according to the users requirements. As for whether the data is copyrighted or ethical, that has nothing to do with it. That's the responsibility of the ones who feed its ever expanding data bank. At least that's how I see it from what I know. I might be wrong as I'm no expert in the matter, but I don't understand what other way a thing with no conscious inten could 'create' anything.
Source: youtube · Posted: 2024-01-01T04:1… · ♥ 4
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | deontological |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
[
{"id":"ytc_UgyzU9KwNK5cTv-dQVB4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgxEVQ2dn_jJt-tXSD14AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwdhWjKw7IPiNyY1cJ4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgwDzOr2Kh_9XnkLqOR4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugw7LhylY6MfS0EIeIt4AaABAg","responsibility":"distributed","reasoning":"contractualist","policy":"regulate","emotion":"approval"},
{"id":"ytc_Ugx5V1Bl2TZrQbYUHeN4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyS63THOHVfM0hJq0B4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_Ugw1ZTCwVBIH-nuPMVF4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgwRXft2tiGseUlsKQR4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgwOU1ObDJeUCsVrZIF4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"indifference"}
]
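The raw response is a JSON array with one record per comment ID, carrying the same four dimensions shown in the Coding Result table. A minimal sketch of how such a response could be parsed and looked up by comment ID — note the allowed value sets below are inferred only from the values visible on this page, not from the full codebook, and `parse_codes` is a hypothetical helper name:

```python
import json

# Allowed values per dimension, as observed in this page's table and raw
# response (an assumption -- the real codebook may define more values).
SCHEMA = {
    "responsibility": {"none", "company", "user", "distributed", "ai_itself"},
    "reasoning": {"consequentialist", "deontological", "contractualist",
                  "virtue", "unclear"},
    "policy": {"none", "liability", "regulate"},
    "emotion": {"approval", "indifference", "outrage", "fear"},
}

# One record copied verbatim from the raw response above.
raw = ('[{"id":"ytc_UgwOU1ObDJeUCsVrZIF4AaABAg",'
       '"responsibility":"ai_itself","reasoning":"deontological",'
       '"policy":"none","emotion":"indifference"}]')

def parse_codes(text):
    """Parse a raw LLM coding response and index records by comment ID,
    dropping any record with an out-of-schema value."""
    by_id = {}
    for rec in json.loads(text):
        if all(rec.get(dim) in allowed for dim, allowed in SCHEMA.items()):
            by_id[rec["id"]] = rec
    return by_id

codes = parse_codes(raw)
print(codes["ytc_UgwOU1ObDJeUCsVrZIF4AaABAg"]["responsibility"])  # ai_itself
```

Validating against an explicit schema before indexing is a cheap guard: LLM coders occasionally emit labels outside the codebook, and silently storing them would corrupt downstream tallies.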