Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
Hinton is a socialist. His end goal is about one world government. AI was design…
ytc_Ugx460Gb8…
I know a counter attack. Spam the ai generated videos to make Trump look bad to…
ytc_Ugx7rJRJv…
well, easy thing, give AI "artists" a pencil and make them draw, done, oh wait i…
ytc_UgzKcHCHD…
Ok, this sounds good imo, and I hate ai generated content. But this is the accep…
ytc_UgykCVu4g…
Artists are butthurt over literally anything. If your art can be replaced by AI,…
ytc_UgyOxjtwb…
yeah I think a lot of people are missing the point. Yes, art can be a financiall…
ytr_UgzFcsxTx…
I'm the only one that his angry that the people that did nothing about private i…
ytc_UgyfmaZJZ…
So those tools are built to make drawing easier so you can do it longer that has…
ytc_UgwhwKq7n…
Comment
AI is gonna get better in the future, that's a fact. But there are things that AI can't do no matter how good it gets. Things like coming up with new creative ideas for a problem, creating a new system/software which has NEVER been done before. All of these things need creativity, and AI don't have the power to do these things.
AI is trained on all the existing data, and if something has NEVER been done before, AI can't do it too.
When Google was growing rapidly, they needed a completely new way of storing and managing their HUGE database, so they had to come up with a new database which is suited for their workload. I don't think AI would have come close to doing so.
Sure AI could help potential code such new software in the future, but it would still need someone(a human) to give it instructions and the design of the new software for it to code the solution.
youtube
2025-03-12T17:1…
♥ 11
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_Ugwkowvxn5FVzp-BAM94AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"frustration"},
{"id":"ytc_Ugz91JSy8sftEmz_U5d4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugw-t4Z6fWRrdbkQVjN4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgytM6z8C_8K9eKJkCV4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgyMZ5PzShQk6IlJ-at4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgzxnPkL23AvAQJYQZt4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwkdPciGNCy7ngAoNt4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyhbNSC73F1ftU23GV4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgxkublFe5_VUWRgNXF4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugy51mvxBVgKFemp9Dh4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"fear"}
]
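The raw response above is a JSON array of per-comment codes, one object per comment ID. A minimal sketch of the "look up by comment ID" step, assuming the full array parses as valid JSON (the `lookup` helper is illustrative, not part of the tool):

```python
import json

# Excerpt of the raw coder output shown above: a JSON array of
# per-comment codes keyed by comment ID.
raw = '''[
 {"id":"ytc_Ugwkowvxn5FVzp-BAM94AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"frustration"},
 {"id":"ytc_UgwkdPciGNCy7ngAoNt4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"}
]'''

# Index the array by comment ID for O(1) lookup.
codes = {row["id"]: row for row in json.loads(raw)}

def lookup(comment_id: str) -> dict:
    """Return the coded dimensions for one comment ID (KeyError if absent)."""
    return codes[comment_id]

result = lookup("ytc_UgwkdPciGNCy7ngAoNt4AaABAg")
print(result["responsibility"], result["emotion"])  # ai_itself indifference
```

The same index can back the truncated-ID display in the sample list (e.g. `ytc_UgwhwKq7n…`) by matching on an ID prefix instead of the full key.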