Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples (click an ID to inspect):
- ytr_UgyR8SvVs…: "@stephenpeterson7558 thanks for your reply. Im assuming this is an LLM answer.…"
- ytc_UgxBOOvtl…: "i wasn't expecting this argument, but it's so accurate, AI may be a danger far s…"
- ytc_UgztUqXoG…: "Most people have an idea but do not have the skill or willpower to do it. For th…"
- ytc_UgxbDFqEf…: "Anyome who downplays AI's potential, has sold their soul to it. We have heard th…"
- ytc_UgxeRFQXg…: "I think it's generally going to depend on how you were raised, and how your worl…"
- ytc_Ugx732lBj…: "Thank you for such a thorough analysis of the driverless car problem. I really h…"
- ytc_UgzD6HEgr…: "I wish people who don't know what they're talking about (Glenn Beck) when it com…"
- ytc_UgxX2yhOX…: "DID U KNOW IN QURAN MENTION THAT LAST DAY JUDGMENT MONTER WILL KILLING HUMANS A …"
Comment
Theoretically, it is the blue collar jobs which can be easily automated.
However we got an anomaly, which is LLM, before it's time, even before the household robots. Hence this unbelievable 'shift'. What's driving this is opensource code sharing which is unlike in history.
Robotics will catch-up within 5 years. It should be easier to automate than cognitive functions.
Source: youtube · 2025-08-23T06:1… · ♥ 2
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | unclear |
| Reasoning | unclear |
| Policy | unclear |
| Emotion | unclear |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
[
  {"id": "ytc_UgwRBtzO650In68PkkZ4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgydqBdz-kbliVhyeXR4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_UgyruepogiIqV5Wx-LZ4AaABAg", "responsibility": "developer", "reasoning": "deontological", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_UgxtunJtkSMrxaZLCmp4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgyPRZ1O_Cn8hoHi6Nt4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_UgxnJo-VIYhxhGteJzZ4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_Ugx5hgartjgXVfU-AvF4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgxQJAWO1eAs9q9_owF4AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "liability", "emotion": "fear"},
  {"id": "ytc_Ugx8bnydz3PSB59Cymd4AaABAg", "responsibility": "developer", "reasoning": "deontological", "policy": "ban", "emotion": "outrage"},
  {"id": "ytc_UgwNjMS_1PGFX6Y-OQ94AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "approval"}
]
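The batch response above is a JSON array of per-comment codings keyed by comment ID, one object per comment with the four dimensions from the Coding Result table. A minimal sketch of how such output could be parsed and indexed for lookup; the `index_by_id` helper and the required-dimensions check are illustrative, not part of the tool, and the string below is truncated to two entries from the source for brevity:

```python
import json

# Two entries copied from the raw response above (IDs are from the source).
raw = '''[
  {"id": "ytc_UgwRBtzO650In68PkkZ4AaABAg", "responsibility": "none",
   "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgxQJAWO1eAs9q9_owF4AaABAg", "responsibility": "company",
   "reasoning": "consequentialist", "policy": "liability", "emotion": "fear"}
]'''

# The four coding dimensions shown in the Coding Result table.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def index_by_id(raw_json: str) -> dict:
    """Parse a raw batch response and index codings by comment ID,
    skipping any entry that is missing one of the four dimensions."""
    out = {}
    for entry in json.loads(raw_json):
        if all(dim in entry for dim in DIMENSIONS):
            out[entry["id"]] = {dim: entry[dim] for dim in DIMENSIONS}
    return out

codings = index_by_id(raw)
print(codings["ytc_UgxQJAWO1eAs9q9_owF4AaABAg"]["policy"])  # prints "liability"
```

Indexing by ID mirrors the dashboard's "look up by comment ID" flow: the same structure supports both random-sample inspection and direct retrieval of a single comment's coding.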