START
Me: you should index this 300GB EMR dataset in #Postgres before FA in #Rstats
Analyst: This is going to take forever. Let's raw dog this.
(7 days later)
Analyst: Should have put the indices in, #SQL queries take forever.
Me: No, let's raw dog this.
(5 months later)
GOTO START
Hackers target poorly secured MS #SQL servers to deploy #CobaltStrike and the new FreeWorld #ransomware.
https://thehackernews.com/2023/09/threat-actors-targeting-microsoft-sql.html
#informationsecurity #Malware #CyberSecurity #Ransomware #cobaltstrike #SQL
Popularity of #programming languages from August (#perl is 28 at 0.68%).
Has its limitations, but a couple of points worth noting:
1. #COBOL and #FORTRAN are holding up rather well
2. languages involved in some sort of data analysis and processing (#sql, #clang /c++) are doing very well. Not sure what to make of #Python: are people in #AI seeing through to the reality that it's a scripting layer over extremely performant C/C++, and that other languages can glue just as well?
#golang & #Julia are ⬆️
#programming #perl #cobol #fortran #SQL #clang #python #ai #golang #julia
SQL full course 👇🏼
FreeCodeCamp released a new SQL course. While the course was built for web developers, it focuses on general SQL topics and is good for beginners. It includes topics such as:
✅ Tables
✅ Querying
✅ Structuring
✅ Aggregation
✅ Normalization
✅ Joins
✅ Optimization and performance
#dataengineering #DataScience #Data #SQL
The @LabPlot Team has just published a new video on:
How to import data from SQL databases in LabPlot.
Watch it, like it, share it, subscribe to LabPlot's channel, and leave a comment.
https://www.youtube.com/watch?v=nntP1okY0zg
#Databases, #SQL, #SQLite, #MySQL, #MariaDB, #PostgreSQL, #LabPlot, #DataAnalysis #Statistics #Data
Three ways to create a pivot table in plain SQL.
#SQLite #Database #SQL #Programming
https://antonz.org/sqlite-pivot-table/
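The linked article covers several approaches; as a hedged sketch, here is one of the classic plain-SQL pivot techniques (conditional aggregation with CASE), shown via Python's built-in sqlite3 module. The table and values are made up for illustration:

```python
import sqlite3

# Hypothetical sample data: amounts per region and month.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE sales (region TEXT, month TEXT, amount INT);
    INSERT INTO sales VALUES
        ('north', 'jan', 10), ('north', 'feb', 20),
        ('south', 'jan', 30), ('south', 'feb', 40);
""")

# Pivot months into columns via conditional aggregation:
# each output column sums only the rows matching one month.
rows = conn.execute("""
    SELECT region,
           SUM(CASE WHEN month = 'jan' THEN amount ELSE 0 END) AS jan,
           SUM(CASE WHEN month = 'feb' THEN amount ELSE 0 END) AS feb
    FROM sales
    GROUP BY region
    ORDER BY region
""").fetchall()

print(rows)  # [('north', 10, 20), ('south', 30, 40)]
```

The same pattern works in PostgreSQL and MySQL, since it uses only standard SQL.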
💡 I just discovered @fljdin's blog, which covers #sql and #postgresql like hardly anyone else does, and in French 🇫🇷 to boot! Warmly recommended ~ https://fljd.in/
@henry @vowe also most of these "Marketplaces" are just more wasteful and ineffective closed-off walled gardens.
Regardless of whether it's #Steam's marketplace, which AFAIK runs on some regular #SQL DB, or #UbisoftQuartz, which merely adds #Blockchain #Enshittification.
Because in both cases, one doesn't even legally "own" the account or its contents!
https://mstdn.science/@henry/110766090031218984
#enshittification #BlockChain #UbisoftQuartz #SQL #steam
Part 11 of the Nerd Encyclopedia, about a mother with
#Hacker skills…
#sqlinjection #sql #mysql #schule #neuland
https://nickyreinert.medium.com/ne-11-der-kleine-bobby-tables-1a3a1d77d92d
I use SQLAlchemy, but if you're only talking about hundreds of records, that would seem to be Overkill 9000. Especially if you're using SQLite on the back-end, because it's lousy for strongly-typed data.
What strengths of #SQL do you actually need when you only have a few hundred records? I'd be tempted to just load them all into memory.
Here's how I was able to create a virtual field in the CSV data that combines two fields, title cases them, and then replaces the " De " with " de ", which looks better:
REPLACE(TITLE(CONCAT("CANTON", ', Provincia de ', "PROVINCIA")), ' De ', ' de ')
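The function names above (TITLE, CONCAT, REPLACE) suggest a CSV-querying tool's SQL dialect; as a sketch of what the expression computes, here is a plain-Python equivalent. The column names "CANTON" and "PROVINCIA" come from the post; the sample values are assumptions:

```python
def canton_label(canton: str, provincia: str) -> str:
    """Mirror REPLACE(TITLE(CONCAT("CANTON", ', Provincia de ', "PROVINCIA")), ' De ', ' de ')."""
    # CONCAT: join the two fields with the literal separator.
    combined = f"{canton}, Provincia de {provincia}"
    # TITLE: title-case the whole string (this also uppercases the 'de'),
    # then REPLACE undoes the unwanted ' De ' -> ' de '.
    return combined.title().replace(" De ", " de ")

print(canton_label("SAN JOSE", "ALAJUELA"))  # San Jose, Provincia de Alajuela
```

Note that Python's `str.title()` capitalizes after every non-letter, which is exactly why the trailing REPLACE step is needed in the SQL version too.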
Generating some medical documents from binary code, with headers to boot. The data set is large enough to warrant optimizing my #SQL to run faster, so I'm throwing around syntax, trying a few tricks, and comparing results... A #CTE is often the fastest ticket to #turbocharge a query, especially when there are complex joins to navigate. It requires re-thinking the problem and a little more coding than just flipping a JOIN or an IN/EXISTS clause (which often speeds things up too), however.
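As a minimal sketch of the CTE rewrite being described (the schema here is invented, not the poster's actual tables): instead of filtering documents through an IN (...) subquery, compute the filter set once in a WITH clause and join against it. Whether the engine actually materializes the CTE depends on the database and version, so results should be compared, as the post says:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE docs (id INTEGER PRIMARY KEY, patient_id INT, body BLOB);
    CREATE TABLE visits (patient_id INT, year INT);
    INSERT INTO docs VALUES (1, 100, x'00'), (2, 200, x'01'), (3, 100, x'02');
    INSERT INTO visits VALUES (100, 2023), (200, 2021);
""")

# CTE version: build the set of recent patients once, then join,
# rather than evaluating an IN (SELECT ...) per candidate row.
rows = conn.execute("""
    WITH recent AS (
        SELECT DISTINCT patient_id FROM visits WHERE year >= 2023
    )
    SELECT d.id
    FROM docs AS d
    JOIN recent USING (patient_id)
    ORDER BY d.id
""").fetchall()

print(rows)  # [(1,), (3,)]
```

The equivalent IN/EXISTS forms often optimize identically; the win from a CTE usually comes on complex multi-join queries where the intermediate set is reused.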
So yesterday at work - one of the dev teams mentioned in passing that they were doing #database schema cleanup. They are dropping 30+ columns from the #patient table that are no longer used by the application. I asked if they were testing for downstream impact. The response? Crickets, wind in the grass, the occasional owl floating by.
No matter how long I work in IT, I am always surprised by how blind senior architects can be when it comes to data related stuff.
#database #patient #it #SQL #stupiddevelopertricks