Raw LLM Responses
Inspect the exact model output for any coded comment: look up a comment by its ID, or pick one of the random samples below.
- "I've been very impressed by Claude AI. I expect a lot depends on how you use it,…" (ytc_Ugyu89iTd…)
- "if its anything exurb1a is one of the few people who can say they called it with…" (ytc_UgzJ3FLTy…)
- "The billionaires and big businesses are not only going to save money with automa…" (ytc_UgytvPcdo…)
- "does anyone else find it funny all the "ai uprising" fears and all these schemes…" (rdc_myw2vqz)
- "We definitely sure as HELL do not need AI robots Anyone ever watch terminator…" (ytc_UgwxnZyXa…)
- "Tesla drivers are almost as bad as the Tesla AI. There's a full-time writer on a…" (ytc_UgzncvP7W…)
- "I wouldn't say AI "learns" but it still does do the same thing humans do, we all…" (ytc_UgxOPmvZ0…)
- "If they replace too many people with AI, then there won't be anybody that can af…" (ytc_UgyVCFGQ7…)
Comment
The whole a human does the same thing that a machine does, for example learning and taking what they learn to replicate it, is complete nonsense. I get what people are trying to get at, justifying the stealing of artist hard work that took years for them to master. That’s why I always find that whole argument stupid. There no way you can justify it. When humans attempt to replicate they will not 100% recreate the piece a human can implement there own personal touch or be inspired by another piece and implement that into the work. Humans don’t seek copy rather imitate what they see. That’s what makes human work human. The AI’s work will never feel human unless it learns to add its own personal touch
youtube · AI Responsibility · 2023-06-14T00:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | deontological |
| Policy | unclear |
| Emotion | outrage |
| Coded at | 2026-04-26T23:09:12.988011 |
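Each coded comment reduces to four categorical dimensions. A minimal sketch of checking a coded row against the label sets visible in this sample; note that the real codebook may define additional labels, so these sets are an assumption inferred from the data shown here:

```python
# Allowed labels inferred from the sample output on this page; the real
# codebook may define additional values -- treat these sets as an assumption.
CODEBOOK = {
    "responsibility": {"ai_itself", "distributed", "none", "unclear"},
    "reasoning": {"deontological", "consequentialist", "virtue", "mixed"},
    "policy": {"liability", "none", "unclear"},
    "emotion": {"outrage", "fear", "approval", "disapproval",
                "resignation", "indifference", "mixed"},
}

def validate(row: dict) -> list[str]:
    """Return the dimension names whose value is missing or off-codebook."""
    return [dim for dim, allowed in CODEBOOK.items()
            if row.get(dim) not in allowed]

# The coding result shown in the table above passes cleanly.
row = {"responsibility": "ai_itself", "reasoning": "deontological",
       "policy": "unclear", "emotion": "outrage"}
print(validate(row))  # → []
```

A check like this catches the occasional row where the model invents a label outside the closed vocabulary.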
Raw LLM Response
```json
[
{"id":"ytc_UgwqkUifUKROcJyt8IZ4AaABAg","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"resignation"},
{"id":"ytc_Ugw6j-OhdcSnVcw_MFl4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgyFiD2nquVi-MNyeQR4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxT7TJwp56WcWoQgrR4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"unclear","emotion":"outrage"},
{"id":"ytc_Ugz3UyAmzU8jRgpitTl4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugzwutzq6iolP6Gd5sZ4AaABAg","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugxhs6wzVmSWxvQ6F2d4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"unclear","emotion":"disapproval"},
{"id":"ytc_Ugxfh72tjmSKrsXDYM94AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"liability","emotion":"mixed"},
{"id":"ytc_UgyeIQAPBK_VM7d8JBx4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgwjhnUaB75sSNU9aZB4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"}
]
```
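The raw response is a plain JSON array, so the by-ID lookup the inspector offers is a one-line dictionary build after parsing. A minimal Python sketch, using two entries reproduced from the response above:

```python
import json

# Raw model output: a JSON array of per-comment codes.
# Two entries reproduced from the response above for illustration.
raw_response = """
[
  {"id": "ytc_UgxT7TJwp56WcWoQgrR4AaABAg", "responsibility": "ai_itself",
   "reasoning": "deontological", "policy": "unclear", "emotion": "outrage"},
  {"id": "ytc_Ugxfh72tjmSKrsXDYM94AaABAg", "responsibility": "distributed",
   "reasoning": "mixed", "policy": "liability", "emotion": "mixed"}
]
"""

codes = json.loads(raw_response)

# Index the rows by comment ID for constant-time lookup.
by_id = {row["id"]: row for row in codes}

record = by_id["ytc_UgxT7TJwp56WcWoQgrR4AaABAg"]
print(record["responsibility"])  # → ai_itself
print(record["emotion"])         # → outrage
```

If the model ever emits duplicate IDs, the dict comprehension silently keeps the last occurrence, so a deduplication check before indexing is worth adding in a real pipeline.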