Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment
Ilya Sutskever said that AI is at a standstill. Scaling models to just more parameters made them better but didn't reinvent the technology. That was the most effective, low-risk strategy of the last five years. Yes, transformers are almost a decade old; since then, AI has had essentially no new leaps.
To cut a long story short, they will improve to about the same extent cars have improved in the last 50 years.
I believe the technology is there, and it will take a lot of work and hands-on time from developers to interface it with society's current infrastructure.
In a way, we are at the same stage humankind was at when the printing press was invented: having the press means nothing unless you use it to solve real-world problems. AI will be the same. It is there, it works reasonably well, and possibly most of the work starts now.
| Field | Value |
|---|---|
| Platform | youtube |
| Video | Viral AI Reaction |
| Posted | 2026-03-07T20:1… |
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | unclear |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_UgxMG1y2fVyYmyxndgp4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"indifference"},
  {"id":"ytc_UgzUqXv3nrOQoaLIbrN4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_UgwilFSKhxPoULn7xnx4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgwBXkk3QyFoSIyMqql4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"outrage"},
  {"id":"ytc_Ugxke2ynF8eKA52wSGR4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_Ugy_nRaBd-i_ryWhO-h4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgxEopswpl5R5ITvTd54AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugzz9GDGrrT7ofT8zz14AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugw8OOftEau5qjCqVx94AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugypsx32OxG71ENinMl4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"}
]
```
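A response like the one above can be turned into the per-comment dimension table by parsing the JSON and indexing each record by its comment ID. The sketch below is a minimal illustration, not the tool's actual implementation; the `index_codes` helper name is hypothetical, and the raw string is truncated to two records from the response shown above for brevity. It assumes each record carries exactly an `id` plus the four coding dimensions (`responsibility`, `reasoning`, `policy`, `emotion`) seen in the output.

```python
import json

# Two records copied from the raw LLM response above (truncated for brevity).
raw = """
[
  {"id": "ytc_UgxMG1y2fVyYmyxndgp4AaABAg", "responsibility": "developer",
   "reasoning": "consequentialist", "policy": "liability", "emotion": "indifference"},
  {"id": "ytc_UgzUqXv3nrOQoaLIbrN4AaABAg", "responsibility": "none",
   "reasoning": "unclear", "policy": "none", "emotion": "approval"}
]
"""

DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def index_codes(raw_json: str) -> dict:
    """Parse an LLM coding response and index each record by comment ID."""
    records = json.loads(raw_json)
    coded = {}
    for rec in records:
        # Reject records missing the ID or any of the four dimensions,
        # so malformed model output fails loudly instead of silently.
        missing = [d for d in DIMENSIONS if d not in rec]
        if "id" not in rec or missing:
            raise ValueError(f"malformed record: {rec!r}")
        coded[rec["id"]] = {d: rec[d] for d in DIMENSIONS}
    return coded

codes = index_codes(raw)
print(codes["ytc_UgxMG1y2fVyYmyxndgp4AaABAg"]["policy"])  # liability
```

Looking up a single ID in `codes` then yields exactly the four values rendered in a "Coding Result" table like the one above.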