Damian Bogunowicz, Neural Magic: On revolutionising deep learning with CPUs https://www.artificialintelligence-news.com/2023/07/24/damian-bogunowicz-neural-magic-revolutionising-deep-learning-cpus/ #ai #sparsity #llm #tech #technology
Revisiting Sparsity Hunting in Federated Learning: Why Does Sparsity Consensus Matter?
#sparse #sparsity #distributed
'Fundamental limits and algorithms for sparse linear regression with sublinear sparsity', by Lan V. Truong.
http://jmlr.org/papers/v24/21-0543.html
#sparse #sparsity #interpolation
https://www.jstatsoft.org/index.php/jss/article/view/v105i01
cglasso: An R Package for Conditional Graphical Lasso Inference with Censored and Missing Values
Augugliaro, L., Sottile, G., Wit, E. C., & Vinciotti, V. (2023). cglasso: An R Package for Conditional Graphical Lasso Inference with Censored and Missing Values. Journal of Statistical Software, 105(1), 1–58. https://doi.org/10.18637/jss.v105.i01
#ConditionalGaussianGraphicalModels #glasso #HighDimensionality #sparsity #censoring #MissingData #R
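For a feel of the core idea behind cglasso, here is a minimal Python sketch of the (unconditional) graphical lasso: estimating a sparse precision matrix via an L1 penalty, where zero off-diagonal entries encode conditional independence. This uses scikit-learn's `GraphicalLasso`, not the cglasso R package itself, and does not handle the censored/missing-data machinery the paper is actually about.

```python
# Sketch: sparse precision-matrix estimation with graphical lasso.
# Assumption: scikit-learn is available; this is an analogue of the
# R package's core model, not a port of cglasso.
import numpy as np
from sklearn.covariance import GraphicalLasso

rng = np.random.default_rng(0)
# Simulate data from a Gaussian whose true precision matrix is sparse:
# variables 0 and 2 are conditionally independent given variable 1.
precision = np.array([[2.0, 0.6, 0.0],
                      [0.6, 2.0, 0.6],
                      [0.0, 0.6, 2.0]])
cov = np.linalg.inv(precision)
X = rng.multivariate_normal(np.zeros(3), cov, size=2000)

# The L1 penalty (alpha) shrinks small partial correlations toward zero.
model = GraphicalLasso(alpha=0.05).fit(X)
print(np.round(model.precision_, 2))
```

The (0, 2) entry of the estimated precision should come out (near) zero, recovering the conditional-independence structure.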
New podcast from @thegradient with Hattie Zhou (twitter: https://twitter.com/oh_that_hat):
`Lottery Tickets and Algorithmic Reasoning in LLMs`
https://thegradientpub.substack.com/p/hattie-zhou-lottery-tickets-and-algorithmic
The first half is focused on the lottery ticket hypothesis, which is a favorite topic of mine.
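For readers new to the topic, the mechanical core of the lottery ticket hypothesis is magnitude pruning with rewinding: train, drop the smallest-magnitude weights, reset the survivors to their initialization, and retrain the resulting sparse subnetwork. A toy NumPy sketch of just the prune-and-rewind step (with random arrays standing in for real training):

```python
# Toy sketch of magnitude pruning + rewinding, the operation at the heart
# of lottery ticket experiments. No actual network or training here; the
# "trained" weights are simulated.
import numpy as np

rng = np.random.default_rng(42)
w_init = rng.normal(size=100)                          # weights at init
w_trained = w_init + rng.normal(scale=0.5, size=100)   # pretend-trained

sparsity = 0.8                                  # prune 80% of the weights
k = int(sparsity * w_trained.size)
threshold = np.sort(np.abs(w_trained))[k]
mask = (np.abs(w_trained) >= threshold).astype(float)

# The "winning ticket": surviving weights rewound to their initial values,
# which would then be retrained in a real experiment.
ticket = w_init * mask
print(f"kept {int(mask.sum())} of {mask.size} weights")
```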
If you want to know how people did matching pursuit in the 1940s to compute stresses in a network of springs (implicitly computing the product of the inverse of a "huge" 13×13 matrix with a vector), here is a slight update on an old post of mine:
"Matching Pursuit Before Computer Science" https://laurentjacques.gitlab.io/post/matching-pursuit-before-computer-science/
(a post recently moved on my homepage from WordPress) #Sparsity #SignalProcessing #GreedyMethods
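The greedy loop the post describes can be sketched in a few lines of NumPy: repeatedly pick the dictionary atom most correlated with the current residual, subtract its contribution, and continue (the springs computation amounts to the same residual-deflation scheme). The dictionary and signal below are made up for illustration.

```python
# Minimal matching pursuit sketch: greedy atom selection + residual deflation.
import numpy as np

rng = np.random.default_rng(1)
D = rng.normal(size=(20, 50))
D /= np.linalg.norm(D, axis=0)          # unit-norm atoms (columns)

# Synthesize a 2-sparse signal from atoms 3 and 17.
y = 2.0 * D[:, 3] - 1.5 * D[:, 17]

x = np.zeros(D.shape[1])                # recovered coefficients
r = y.copy()                            # residual
for _ in range(10):
    c = D.T @ r                         # correlations with the residual
    j = np.argmax(np.abs(c))            # best-matching atom
    x[j] += c[j]                        # update its coefficient
    r -= c[j] * D[:, j]                 # deflate the residual

print("residual norm:", np.linalg.norm(r))
```

Each iteration strictly shrinks the residual, which is exactly the convergence argument the old relaxation literature relied on.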