However, when #Wikidata emerged, I wanted to connect my database with it. Two years ago, I tried storing SMW data in #Blazegraph, but I had a bad experience with it: when the power went out, it caused a fatal error and the instance would no longer boot.
Two months ago, I made a second attempt, this time with #Virtuoso. It has been running without problems.
Now I can run #sparql queries, but it is still quite complicated, e.g. querying properties and pages with longer string names.
#wikidata #Blazegraph #virtuoso #sparql
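As an illustration of the "longer string names" pain point above, here is a hedged Python sketch that builds one plausible form of such a query: filtering items by the length of their English label. The function name, thresholds, and the assumption that the endpoint predefines the `rdfs` prefix (as WDQS does) are mine, not from the original post.

```python
# Hypothetical helper: build a SPARQL query for items whose English label
# is longer than a given length. Assumes the endpoint predefines the
# rdfs: prefix (true for WDQS; other endpoints may need a PREFIX line).

def long_label_query(min_length: int = 50, limit: int = 10) -> str:
    """Return a SPARQL query selecting items with labels longer than min_length."""
    return f"""
SELECT ?item ?label WHERE {{
  ?item rdfs:label ?label .
  FILTER(LANG(?label) = "en")
  FILTER(STRLEN(STR(?label)) > {min_length})
}}
LIMIT {limit}
""".strip()

query = long_label_query(80, 5)
print(query)
```

The string-building approach keeps the example endpoint-agnostic; the same query text could be sent to Blazegraph, Virtuoso, or QLever.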
Now available for download, the 2023-04-23 #Blazegraph dump for #Wikidata Query Service. Use it to bootstrap your own local copy of WDQS. https://datasets.scatter.red/orb/
#Wikidata Query Service #Blazegraph dump now available to download, free of charge (344 GB compressed; over 1 TB decompressed).
Wrapping up this thread with a new #blog post: "Deploying #Wikidata to different graph databases and what works best" #blazegraph #sparql #wikibase
https://harej.co/posts/2023/01/loading-wikidata-into-different-graph-databases-blazegraph-qlever/
#blog #Wikidata #Blazegraph #sparql #wikibase
The challenge here isn't going to be the initial setup, as far as I can tell. The index build is in progress. The real work will be taking the updater tech built for #Blazegraph and adapting it to #QLever, which I suspect is possible since both use #SPARQL, but I haven't actually tried it yet.
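The adaptation idea above rests on both stores accepting standard SPARQL 1.1 Update requests. A minimal sketch of that idea, assuming a delete-then-reinsert strategy per changed entity (the real WDQS updater is more involved; the entity URI scheme is Wikidata's, everything else here is illustrative):

```python
# Hedged sketch: a backend-agnostic updater only needs to emit standard
# SPARQL 1.1 Update text, so the same logic could in principle target
# Blazegraph or QLever. The delete-then-reinsert strategy is an assumption,
# not the actual WDQS updater algorithm.

def rebuild_entity_update(entity_id: str, triples: list) -> str:
    """Drop all statements with the entity as subject, then re-insert fresh ones.

    `triples` is a list of (predicate, object) pairs already serialized
    in Turtle/SPARQL syntax.
    """
    uri = f"<http://www.wikidata.org/entity/{entity_id}>"
    inserts = " .\n    ".join(f"{uri} {p} {o}" for p, o in triples)
    return (
        f"DELETE WHERE {{ {uri} ?p ?o }} ;\n"
        f"INSERT DATA {{\n    {inserts} .\n}}"
    )

update = rebuild_entity_update(
    "Q42",
    [("<http://schema.org/name>", '"Douglas Adams"@en')],
)
print(update)
```

The semicolon chains the two operations into one SPARQL 1.1 Update request, so the delete and reinsert travel together to whichever endpoint is configured.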
After successfully experimenting with and deploying #Blazegraph for #Wikidata querying I am now experimenting with #QLever which is like a breath of fresh air.
Eventually I would like to offer QLever as an experimental service alongside Blazegraph, with the goal of retiring Blazegraph later on.
Rebuilding #Wikidata in #Blazegraph was successful, as was uploading the 1 TB database file to cloud storage.
Now I am syncing the database with present day. Unlike last time, I am pretty sure I am actually pulling from Wikidata this time.
Rebuilding #Wikidata on #Blazegraph ...again... after a misconfigured updater caused data corruption. In parallel I am working on re-attempting Wikidata in #QEndpoint
#Wikidata #Blazegraph #QEndpoint
Recommended system configuration to run a #Blazegraph instance loaded with #Wikidata –
* 2x 1 TB NVMe SSDs, formatted together as a single 2 TB volume
* 128 GB of RAM
* If your CPU can address 128 GB of RAM it's probably good enough
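The recommendations above can be turned into a rough self-check. This is a sketch using only the thresholds from this post; the measurement calls (`os.sysconf`, `shutil.disk_usage`) are POSIX-oriented assumptions and may not work on every platform.

```python
# Rough self-check against the recommended spec for a Blazegraph + Wikidata
# instance: ~2 TB of fast storage and 128 GB of RAM. Thresholds come from
# the post above; the measurement calls are POSIX-only assumptions.
import os
import shutil

RECOMMENDED_RAM = 128 * 1024**3   # 128 GB
RECOMMENDED_DISK = 2 * 1000**4    # 2 TB

def meets_spec(ram_bytes: int, disk_bytes: int) -> bool:
    """True if the machine meets the recommended Blazegraph/Wikidata spec."""
    return ram_bytes >= RECOMMENDED_RAM and disk_bytes >= RECOMMENDED_DISK

total_ram = os.sysconf("SC_PAGE_SIZE") * os.sysconf("SC_PHYS_PAGES")
total_disk = shutil.disk_usage("/").total
print("OK to load Wikidata" if meets_spec(total_ram, total_disk) else "Undersized")
```

Separating the pure `meets_spec` check from the platform-specific measurement keeps the threshold logic easy to reuse against a different mount point or a remote host's reported stats.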
Successfully rebuilt #Blazegraph data.jnl from dump; now backing up my effort. (I may make such dumps available for download, since building #Wikidata from dump takes multiple days even on good hardware.)