Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up a coded comment by its ID, or inspect one of the random samples below:

- "Copilot couldnt understand a deeply nested XML document I asked it to check for …" (`ytc_UgwKr0-M9…`)
- "We appreciate your curiosity about the potential future implications of AI techn…" (`ytr_UgwSVtOdM…`)
- "Why hasn't anyone stepped in to stop AI generated false info , lies , etc ... It…" (`ytc_UgxMoHvam…`)
- "Likely it's A.I. generated and tailored to be as close to the real deal as possi…" (`ytr_UgxuKHNgR…`)
- "The only people who say that it will replace programmers are people who have nev…" (`rdc_mt822ud`)
- "At this point, I'd rather look into all of AI's code by itself and try to unders…" (`ytc_UgzeI1LUO…`)
- "data analyst... and once they leave school they will be unemployed because…" (translated from French) (`ytc_UgxlfsU6h…`)
- "we keep talking about sanctioning companies and sanctioning banks what about ju…" (`rdc_luesweo`)
Comment
What Mr Musk appears to overlook is that 'governments' have been systematically whittled down, and their regulatory authorities have become insignificant extensions of moneyed interests; again, their objectives and workings are determined by the next quarter's results as compared to their competitors; interests of humanity take a far-flung back seat. There is no global authority, and all the worldwide forums are a joke in driving global interests. Competition drives AI innovation, and especially when it comes to its development by competing world powers, there is no stopping it. 'Security' and 'National Interest' will drive its advancement, human interest be damned.
The scene in Ice Age where the dodos jostle towards their own extinction comes to mind.
youtube · AI Governance · 2024-04-13T18:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | government |
| Reasoning | deontological |
| Policy | regulate |
| Emotion | outrage |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_Ugz2krT4l_erfc8pZMt4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzCCqjUs7yh1FYxgGd4AaABAg","responsibility":"government","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugyl2D7viy_okB5ZN_h4AaABAg","responsibility":"government","reasoning":"deontological","policy":"none","emotion":"fear"},
{"id":"ytc_UgzrvLvarP3dW2ZWVXt4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"resignation"},
{"id":"ytc_Ugzvj_T9keAzMy9nfgN4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"approval"},
{"id":"ytc_Ugzh5Z9ZrKrK9cNvDC14AaABAg","responsibility":"developer","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_UgxeK84F2H9c4p5v2MR4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"resignation"},
{"id":"ytc_UgxQUGOR80oq2BnCPyB4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"unclear","emotion":"outrage"},
{"id":"ytc_Ugw_14of_S6cVWFVSj54AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgxW9XrF_UgbYCul8WZ4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"mixed"}
]
```
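A minimal sketch of how a batch response like the one above could be consumed: parse the JSON array into a lookup keyed by comment ID, discarding rows whose labels fall outside the value sets observed in this sample. The label sets and the `parse_batch` helper are illustrative assumptions, not the pipeline's actual schema.

```python
import json

# Allowed values per coding dimension, inferred from the sample
# response above; this is NOT an exhaustive or official schema.
DIMENSIONS = {
    "responsibility": {"none", "government", "developer", "company", "ai_itself"},
    "reasoning": {"unclear", "deontological", "consequentialist", "virtue"},
    "policy": {"none", "regulate", "unclear", "liability"},
    "emotion": {"indifference", "outrage", "fear", "resignation", "approval", "mixed"},
}


def parse_batch(raw: str) -> dict:
    """Parse a raw batch response into {comment_id: coding}, dropping rows
    with missing IDs or labels outside the observed value sets."""
    coded = {}
    for row in json.loads(raw):
        cid = row.get("id")
        if not cid:
            continue
        coding = {dim: row.get(dim) for dim in DIMENSIONS}
        if all(coding[dim] in allowed for dim, allowed in DIMENSIONS.items()):
            coded[cid] = coding
    return coded


# Usage: the second row carries an unknown label and is filtered out.
raw = (
    '[{"id":"ytc_UgzCCqjUs7yh1FYxgGd4AaABAg","responsibility":"government",'
    '"reasoning":"deontological","policy":"regulate","emotion":"outrage"},'
    '{"id":"bad_row","responsibility":"martians","reasoning":"unclear",'
    '"policy":"none","emotion":"fear"}]'
)
result = parse_batch(raw)
```

Keying by ID makes the lookup shown in the table above ("Look up by comment ID") a single dictionary access; validating against fixed label sets guards against the model drifting outside its coding scheme in a batch.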