Ignacio Sarmiento has a post at
https://ignaciomsarmiento.github.io/2017/01/20/Quantile-and-OLS-Regression-with-Rmosek.html
that compares the speed of several implementations of quantile regression in Mosek with
that of the simplex implementation in my quantreg R package. He finds that for problems
of modest size (n = 1000) the quantreg simplex implementation, which is based on the 1972
algorithm of Barrodale and Roberts, was substantially faster than any of the Mosek implementations.
However, in a more realistic econometric problem based on data from Josh Angrist (n = 97367, p = 5)
the simplex algorithm was quite a bit slower than the best of the Mosek implementations. I wondered
how the interior point algorithms implemented in Fortran in quantreg would perform.
Now, Mosek is clearly the Ferrari in this contest. I've used Mosek in many different convex
optimization problems and always found it to be remarkably quick, responsive to special road
conditions and (especially) elegant in design. So it requires a special sort of hubris to think
that the so-called Frisch-Newton implementations that were figuratively "made in my garage"
could compete, but lo and behold, here are the results:
Unit: milliseconds
                                expr       min        lq      mean    median        uq       max neval
 qr.mosek.dual.rep(X, y, tau = tau)  346.9841  357.2982  365.9998  364.2814  370.6428  462.1361   100
        rq.fit.fnb(X, y, tau = tau)  193.7651  194.5455  202.1676  195.2470  206.7692  288.0764   100
       rq.fit.sfn(Xs, y, tau = tau)  330.3882  341.1280  365.0841  349.7058  369.8432  451.3124   100
         rq.fit.br(X, y, tau = tau) 7266.7237 7270.0682 7289.8437 7274.8538 7285.0870 7522.6136   100
The results for the basic (dense) Frisch-Newton algorithm appear as rq.fit.fnb, the sparse version of this
algorithm appears as rq.fit.sfn, and the simplex version is rq.fit.br.
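For readers who want to try a comparison of this kind themselves, a benchmark can be set up in R roughly as follows. This is a minimal sketch, not the exact script behind the timings above: the simulated design (standing in for the Angrist data), the coefficients, and the seed are placeholders, and the Rmosek fit qr.mosek.dual.rep from the linked post is omitted, though it could be added to the same call.

library(quantreg)        # provides rq.fit.fnb, rq.fit.sfn, rq.fit.br
library(SparseM)         # provides as.matrix.csr for the sparse fit
library(microbenchmark)

## Placeholder data with the dimensions quoted above (n = 97367, p = 5);
## the timings reported above used the Angrist data, not this simulation.
set.seed(1)
n <- 97367
p <- 5
X <- cbind(1, matrix(rnorm(n * (p - 1)), n, p - 1))
y <- drop(X %*% rep(1, p)) + rnorm(n)
Xs <- as.matrix.csr(X)   # sparse representation required by rq.fit.sfn
tau <- 0.5

microbenchmark(
    rq.fit.fnb(X, y, tau = tau),    # dense Frisch-Newton interior point
    rq.fit.sfn(Xs, y, tau = tau),   # sparse Frisch-Newton interior point
    rq.fit.br(X, y, tau = tau),     # Barrodale-Roberts simplex
    times = 100)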