Is there a classical regression model where, for i = 1, …, n,
E(Yᵢ) = m pᵢ
with m a known constant, and
pᵢ = exp(xᵢ β) / Σⱼ exp(xⱼ β) ?
Thus pᵢ ∈ (0, 1) and Σᵢ pᵢ = 1.
Note that this is *not* a multinomial logistic regression. There is a single vector β to estimate; it should be estimated from a single set of observations Y₁, …, Yₙ (and the covariates x₁, …, xₙ).
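One way to fit this is direct maximum likelihood with optim(). A minimal sketch follows; the Poisson assumption for the Yᵢ and the simulated data are purely illustrative, not part of the question above.

```r
# Sketch: estimate a single beta for E(Y_i) = m * p_i,
# where p_i = exp(x_i beta) / sum_j exp(x_j beta).
# Poisson counts and simulated data are assumptions for illustration only.
set.seed(1)
n <- 50
m <- 100                               # known constant
x <- cbind(rnorm(n), rnorm(n))         # covariates (no intercept: it cancels in the softmax)
beta_true <- c(0.5, -1)

softmax <- function(eta) {
  e <- exp(eta - max(eta))             # subtract max for numerical stability
  e / sum(e)
}

y <- rpois(n, m * softmax(x %*% beta_true))

negloglik <- function(beta) {
  mu <- m * softmax(x %*% beta)
  -sum(dpois(y, mu, log = TRUE))
}

fit <- optim(c(0, 0), negloglik, method = "BFGS")
fit$par                                # estimate of beta
```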
#rstats #glm #statisticalmodel #statistics
This is an example that compares fitting a #logistic regression as a #generalized #linear #model #GLM and as an #AI #MachineLearning model using #tensorflow. Marvel at the computational waste (number of epochs; a typical IWLS #GLM fit converges in <10 iterations). https://atm.amegroups.org/article/view/30334/html
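For reference, a minimal sketch (simulated data, not the data from the linked article) showing the IWLS iteration count that glm() reports for a logistic regression fit:

```r
# Sketch: logistic regression fit by IWLS via glm() on simulated data;
# fit$iter gives the number of IWLS iterations, typically well under 10.
set.seed(1)
n <- 1000
x <- rnorm(n)
y <- rbinom(n, 1, plogis(-0.5 + 2 * x))

fit <- glm(y ~ x, family = binomial)
fit$iter       # IWLS iterations used
coef(fit)      # estimated intercept and slope
```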
#logistic #generalized #linear #model #glm #ai #machinelearning #tensorflow
imnotycobb: Since Jan 1 there have been 30 (now 31) bolides or meteors captured with the #GOES #GLM lightning mapper. Note: this data only goes through the 8th, since each event is verified for accuracy. More info about bolide detection can be found here. #okwx https://twitter.com/imnotycobb/status/1616468906100621314
Thread from @AndyChenML@twitter.com arguing that #GLM-130B beats GPT-175B, is open source, and can run on a single A100 with 4-bit quantization: #llm https://twitter.com/andychenml/status/1611529311390949376?s=46&t=jgk40-JPcPZRoMLFx1B0tw
Cheatsheet for linear regression in R. #rstats #linearRegression #glm #cheatsheets
I'm trying not to post things from the bad place, but this one is really good!
Originally posted by Ben Larson blarson424 on the bird site.
#rstats #linearregression #glm #cheatsheets