Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- Muscles have been replaced??? Yea......nah... jobs like landscaping, maintenance… (ytc_UgxRSHsSB…)
- "pro-ai artists" are literally just lazy ass people not willing to put the time … (ytc_UgzbELDNT…)
- I'll tell you what's coming... Exponentially more month by month... humans are g… (ytc_UgzxMY6M4…)
- Per the Bible, I’ve been predicting this for a couple years, since chatGPT went … (ytc_UgyJAUPct…)
- Let them make it. Not like they can copyright it, so just make money off what th… (ytc_UgzReSjLf…)
- ChatGPT, in the further will have major issues. From that talk you can clearly s… (ytc_UgzkCAIwT…)
- @sarabeatriz5569 Fine, thats why an AI can iterate fast with modern computers a… (ytr_UgyioDVkO…)
- With my interns and junior staff I know where their pitfalls are and limits of t… (ytc_UgyZPTzNj…)
Comment
Bawsed on comments, not going to watch the interview. I do have a theory. AI is going to peak, at some point AI will produce more "content" (in whatever form that is via telling history, programming, creating video, etc) than humans do. And then AI will stagnate and actually become far less reliable because AI requires content made by humans because the way it works is it steals and mashes up human created content. Once humans are pushed out by AI, then AI will be stealing from other AI... and that is the start of the decay.
youtube
AI Moral Status
2025-05-16T06:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | indifference |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_UgySv3XF7N-WEXo6CLJ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgxJpN2GTP5iy3u3zy54AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"approval"},
{"id":"ytc_Ugz4VVyqMGIn-MpLV0x4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgykmlggxIR6hbe-jr54AaABAg","responsibility":"none","reasoning":"deontological","policy":"unclear","emotion":"approval"},
{"id":"ytc_UgyprLYxt9bA-f_BqA54AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgxqOH4GOVNCP-rUowp4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgwzQyqfKmpzzinh-Y94AaABAg","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugza5LYbjRFJPyBva_h4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgzZL06bcBNdtFldrEF4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgyPlkcEaQzV1K6Ujkx4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"}
]
```
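The raw response above is a JSON array of per-comment records, each carrying the four coded dimensions (responsibility, reasoning, policy, emotion). A lookup by comment ID, like the one this page offers, can be sketched as below. This is a minimal illustration, not the tool's actual code: `index_codings` is a hypothetical helper, and the field names are assumed to match the sample response.

```python
import json

# Two records in the same shape as the raw LLM response shown above
# (IDs and values taken from the sample; the full array works the same way).
RAW_RESPONSE = """
[
  {"id": "ytc_UgySv3XF7N-WEXo6CLJ4AaABAg", "responsibility": "none",
   "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_UgyPlkcEaQzV1K6Ujkx4AaABAg", "responsibility": "ai_itself",
   "reasoning": "consequentialist", "policy": "unclear", "emotion": "fear"}
]
"""

# The four coded dimensions from the Coding Result table.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")


def index_codings(raw: str) -> dict:
    """Parse a raw LLM response and index codings by comment ID,
    skipping any record that lacks one of the expected dimensions."""
    out = {}
    for rec in json.loads(raw):
        if all(dim in rec for dim in DIMENSIONS):
            out[rec["id"]] = {dim: rec[dim] for dim in DIMENSIONS}
    return out


codings = index_codings(RAW_RESPONSE)
print(codings["ytc_UgyPlkcEaQzV1K6Ujkx4AaABAg"]["emotion"])  # fear
```

Indexing once up front makes the "look up by comment ID" operation a constant-time dictionary access rather than a scan of the whole response.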