Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- "I am just as cautious, if not more so, when using self-driving technology as I a…" (ytc_UgwAcUNaT…)
- "Could we stop calling those people artists? they are basically the "customers" o…" (ytc_UgzKNChUB…)
- "because of the lack of spirituality within we look for machines to dictate our l…" (ytc_UgwG-XvCX…)
- "Ask yourself the question why would a sentient AGI implanted in a robot form wan…" (ytc_UgzqWEsvu…)
- "Its not true A.I. til it can write its own code. What we have now are programs n…" (ytc_UgzfBK9IL…)
- "For my opinion to make a robot is bad because it may kill a people when they thi…" (ytc_UgyRtH3JL…)
- "Me Personally, I know we cant stop ai entirely. But I think human creativity sho…" (ytc_UgzHg7IlA…)
- "Exactly - it's all about the copyright and the "cheating" against traditional ar…" (ytc_UgwPEXZXy…)
Comment
Probably one of the best videos on this topic. Just wish you had more reach.
There's no danger of drivers being replaced within the next 10-20 years. For many of the reasons you stated.
I've followed this tech since the late 80's and early 90's. It's been around that long lol. And we're just in the early stages of Stage 2, leaning hard on Stage 3 requirements.
My son works in the AI development side of self driving tech.
And he states many of the same reasons why this won't happen any time soon. Infrastructure and the rollout of technology. Technology accelerates exponentially. Infrastructure requires a massive workforce, permitting, environmental reviews, and money.
Regulations will be 50% of the battle.
youtube | AI Jobs | 2021-04-23T14:4… | ♥ 7
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | unclear |
| Reasoning | unclear |
| Policy | unclear |
| Emotion | unclear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[{"id":"ytc_UgxtbqQZ8UnVcKlHF1N4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzyRsQEkv36K8CCVn94AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgweGEUcy44ZoGljKLV4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgyASTI6F0WNSogzPDp4AaABAg","responsibility":"company","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_Ugzw6Our6ioxf_2A0_94AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugz6m8v1YAKLdZi0bV14AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"approval"},
{"id":"ytc_Ugyao_d6g2gpU_ApJC94AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgxCCrNUPo3SKZGPok14AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugx1mb8mvLDHBbfm_994AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugwi2nqNhynmTZggROJ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"})
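Note that the raw response above closes with `)` rather than `]`, so it is not valid JSON, which would explain why every dimension in the Coding Result table fell back to "unclear". A minimal sketch of how such a parser might work is below; the function name, the fallback behavior, and the required-key set are assumptions, not this tool's actual implementation:

```python
import json

# A shortened stand-in for a raw LLM coding response (two rows only,
# for brevity; real batches carry one row per sampled comment).
raw = '''[
 {"id": "ytc_UgxtbqQZ8UnVcKlHF1N4AaABAg", "responsibility": "developer",
  "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
 {"id": "ytc_UgyASTI6F0WNSogzPDp4AaABAg", "responsibility": "company",
  "reasoning": "deontological", "policy": "ban", "emotion": "outrage"}
]'''

# The four coded dimensions plus the comment ID, as seen in the table above.
REQUIRED_KEYS = {"id", "responsibility", "reasoning", "policy", "emotion"}

def parse_codings(text):
    """Parse a raw LLM coding response into {comment_id: row}.

    Returns None when the text is not valid JSON (e.g. a stray ')'
    instead of ']'), in which case the caller would record every
    dimension as "unclear" -- a hypothetical fallback, assumed here.
    """
    try:
        rows = json.loads(text)
    except json.JSONDecodeError:
        return None
    for row in rows:
        missing = REQUIRED_KEYS - row.keys()
        if missing:
            raise ValueError(f"{row.get('id', '?')}: missing {sorted(missing)}")
    return {row["id"]: row for row in rows}

codings = parse_codings(raw)
```

Feeding this function the malformed response shown above would return `None`, while the well-formed sample parses into a lookup keyed by comment ID.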