Unlock the Black Box by Interpreting Graph Convolutional Networks via Additive Decomposition
#gnn #subgraph #interpretability
LightGlue: Local Feature Matching at Light Speed
Improves on SuperGlue with changes to the transformer (GNN matching); the iterative, adaptive design gives the speed boost. Rough usage sketch below.
My summary on HFPapers: https://huggingface.co/papers/2306.13643#64ba93f75e13c0d8659b4ce8
Links: [PapersWithCode](https://paperswithcode.com/paper/lightglue-local-feature-matching-at-light), [GitHub](https://github.com/cvg/lightglue), [arxiv](https://arxiv.org/abs/2306.13643)
#lfm #arxiv #PaperThread #newpaper #gnn
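For context, matching an image pair with the released code looks roughly like the sketch below. This follows my reading of the cvg/lightglue README; paths, the keypoint budget and the device handling are placeholders, so check the repo for the exact API.

```python
# Rough usage sketch based on the cvg/lightglue README (check the repo for the exact API).
# Image paths and max_num_keypoints are placeholders.
import torch
from lightglue import LightGlue, SuperPoint
from lightglue.utils import load_image, rbd

device = "cuda" if torch.cuda.is_available() else "cpu"

# SuperPoint extracts local features; LightGlue matches them with its adaptive GNN/transformer.
extractor = SuperPoint(max_num_keypoints=2048).eval().to(device)
matcher = LightGlue(features="superpoint").eval().to(device)

image0 = load_image("image0.jpg").to(device)
image1 = load_image("image1.jpg").to(device)

feats0 = extractor.extract(image0)
feats1 = extractor.extract(image1)
matches01 = matcher({"image0": feats0, "image1": feats1})

# rbd() removes the batch dimension; matches index into the two keypoint arrays.
feats0, feats1, matches01 = [rbd(x) for x in (feats0, feats1, matches01)]
matches = matches01["matches"]
points0 = feats0["keypoints"][matches[:, 0]]
points1 = feats1["keypoints"][matches[:, 1]]
```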
The #1000TanksFürNürnberg event kicks off this weekend. A nice complement to the #Bardentreffen in #Nürnberg!
#sdgsgolocal #GNN
#1000tanksfurnurnberg #bardentreffen #Nurnberg #sdgsgolocal #gnn
#Introduction I'm a 3rd-year PhD student focusing on bringing #haemodynamics into the clinic in the context of #cardiovascular research. To do this, I use novel graph neural network (#gnn) architectures to provide on-the-fly uncertainty quantification for clinical metrics. I regularly use #python, #r, #fortran and #sql. I plan to post mainly about interesting papers, repositories and my own work.
#Introduction #haemodynamics #cardiovascular #gnn #Python #r #fortran #sql
Graph-based Multi-ODE Neural Networks for Spatio-Temporal Traffic Forecasting
Zibo Liu, Parshin Shojaee, Chandan K. Reddy
Action editor: Ivan Oseledets.
FASTRAIN-GNN: Fast and Accurate Self-Training for Graph Neural Networks
Amrit Nagarajan, Anand Raghunathan
Releasing Graph Neural Networks with Differential Privacy Guarantees
Graph Neural Networks Designed for Different Graph Types: A Survey
Josephine Thomas, Alice Moallemy-Oureh, Silvia Beddar-Wiesing, Clara Holzhüter
Fast&Fair: Training Acceleration and Bias Mitigation for GNNs
Graphcore's partnership with PNNL (part of the US Department of Energy) unlocks dramatically faster 3D molecular modelling using SchNet Graph Neural Network #GNN https://t.co/Lv2dpaKUMK
Node-Level Differentially Private Graph Neural Networks
🧙♂️ "Show me your NFT and I tell you how it will perform" ! 🧙♀️
💥 Check out our postprint of "Multimodal representation learning for NFT selling price prediction", just accepted at #TheWebConf2023 🎉
📄 arxiv.org/abs/2302.01676
📝 w/ @starquake and @andreatagarelli
#TheWebConf #WWW2023 #NFT #Web3 #Blockchain #Metaverse #AI #NLP #NetworkScience #MachineLearning #DeepLearning #Transformers #ComputerVision #GNN
@webscience @economics
@networkscience
@complexsystems
@computationalsocialscience
#thewebconf2023 #thewebconf #www2023 #nft #web3 #blockchain #metaverse #ai #nlp #networkscience #machinelearning #deeplearning #transformers #computervision #gnn
Postdoc position in graph reasoning for narrative
https://www.rug.nl/about-ug/work-with-us/job-opportunities/?details=00347-02S0009QSP
Deadline: 19 Feb
Tasks: ontology learning, semantic relation identification, extracting structured information from narrative texts. #NLProc #DH #GNN #wikidata #fanfic #job #academic
@gronlp
#academic #Job #fanfic #wikidata #gnn #dh #nlproc
Do Transformers Really Perform Badly for Graph Representation Learning?
https://www.microsoft.com/en-us/research/project/graphormer/
https://arxiv.org/abs/2106.05234
https://github.com/Microsoft/Graphormer
#GNN, #AI, #ML, #DeepLearning, #Graphs
#gnn #ai #ml #deeplearning #Graphs
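As I read the paper, the key trick is that graph structure enters through the attention itself: every pairwise attention score gets a learnable bias indexed by the shortest-path distance between the two nodes (alongside centrality and edge encodings). Below is a self-contained toy sketch of just that spatial-encoding idea; it is not the official Graphormer code, and all names and sizes are made up.

```python
# Toy sketch of Graphormer-style spatial encoding: a learnable per-head attention bias
# indexed by shortest-path distance (SPD) between node pairs.
# Illustrative only; NOT the official Graphormer implementation.
import torch
import torch.nn as nn

class SPDBiasedAttention(nn.Module):
    def __init__(self, dim: int, num_heads: int, max_spd: int = 20):
        super().__init__()
        self.num_heads = num_heads
        self.head_dim = dim // num_heads
        self.qkv = nn.Linear(dim, 3 * dim)
        self.proj = nn.Linear(dim, dim)
        # One learnable scalar per (SPD value, head); the last index can stand for "unreachable".
        self.spd_bias = nn.Embedding(max_spd + 1, num_heads)

    def forward(self, x: torch.Tensor, spd: torch.Tensor) -> torch.Tensor:
        # x: (N, dim) node features, spd: (N, N) integer shortest-path distances.
        n, dim = x.shape
        spd = spd.clamp(max=self.spd_bias.num_embeddings - 1)
        q, k, v = self.qkv(x).chunk(3, dim=-1)
        q = q.view(n, self.num_heads, self.head_dim).transpose(0, 1)   # (H, N, d)
        k = k.view(n, self.num_heads, self.head_dim).transpose(0, 1)
        v = v.view(n, self.num_heads, self.head_dim).transpose(0, 1)
        scores = q @ k.transpose(-2, -1) / self.head_dim ** 0.5        # (H, N, N)
        bias = self.spd_bias(spd).permute(2, 0, 1)                     # (H, N, N)
        attn = torch.softmax(scores + bias, dim=-1)
        out = (attn @ v).transpose(0, 1).reshape(n, dim)
        return self.proj(out)
```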
Ok, finally found some time to play with #ModelAngelo
@SjorsScheres I’m assuming that for homooligomers I just repeat the seq in the fasta file? #cryoEM #GNN
You can sample nodes for scalable #GNN #training. But how do you do #scalable #inference?
In our latest paper (Oral at the #LogConference) we introduce influence-based mini-batching (#IBMB) for both fast inference and training, achieving up to 130x and 17x speedups, respectively! (For contrast, a generic neighbour-sampling baseline is sketched below.)
1/8 in 🧵
#gnn #training #scalable #inference #logconference #ibmb
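For readers who haven't used node sampling at all, the generic baseline that IBMB improves on looks roughly like the sketch below, using PyTorch Geometric's NeighborLoader on Cora. This is standard neighbour sampling, not the influence-based batching from the paper, and the hyperparameters are arbitrary.

```python
# Generic neighbour-sampled mini-batch GNN training with PyTorch Geometric.
# This is the standard baseline, NOT the paper's influence-based mini-batching (IBMB).
import torch
import torch.nn.functional as F
from torch_geometric.datasets import Planetoid
from torch_geometric.loader import NeighborLoader
from torch_geometric.nn import GCNConv

data = Planetoid(root="data", name="Cora")[0]

# Sample up to 10 neighbours per node over 2 hops, batching over the training nodes.
loader = NeighborLoader(
    data,
    num_neighbors=[10, 10],
    batch_size=128,
    input_nodes=data.train_mask,
    shuffle=True,
)

class GCN(torch.nn.Module):
    def __init__(self, in_dim, hidden, out_dim):
        super().__init__()
        self.conv1 = GCNConv(in_dim, hidden)
        self.conv2 = GCNConv(hidden, out_dim)

    def forward(self, x, edge_index):
        return self.conv2(F.relu(self.conv1(x, edge_index)), edge_index)

model = GCN(data.num_features, 64, int(data.y.max()) + 1)
optimizer = torch.optim.Adam(model.parameters(), lr=0.01)

model.train()
for batch in loader:
    optimizer.zero_grad()
    out = model(batch.x, batch.edge_index)
    # Only the first `batch_size` nodes of each sampled subgraph are the seed nodes we train on.
    loss = F.cross_entropy(out[:batch.batch_size], batch.y[:batch.batch_size])
    loss.backward()
    optimizer.step()
```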
This seems like a promising approach: it's a bit physics-informed and has a flavor of DDA (discrete dipole approximation). Toy message-passing sketch below, just for intuition.
https://pubs.acs.org/doi/full/10.1021/acsphotonics.2c01019
'a Graph Neural Network (GNN) architecture which learns to model #electromagnetic scattering, can be applied to metasurfaces of arbitrary sizes. Most importantly, it takes into account the coupling between scatterers. Using this approach, near-fields of #metasurfaces with dimensions spanning hundreds of times the wavelength can be obtained in seconds.' #photonics #GNN #MachineLearning
#machinelearning #gnn #photonics #metasurfaces #electromagnetic
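Just for intuition (this is not the authors' architecture): the "coupling between scatterers" is exactly what a message-passing layer expresses, since each scatterer updates its state from its neighbours' states and their relative positions. A toy sketch with PyTorch Geometric, with made-up feature sizes:

```python
# Toy message-passing layer: each node (scatterer) aggregates messages from its neighbours,
# conditioned on the relative position, so pairwise coupling enters the update.
# Illustrative only; NOT the architecture from the linked paper.
import torch
import torch.nn as nn
from torch_geometric.nn import MessagePassing

class CouplingLayer(MessagePassing):
    def __init__(self, dim: int):
        super().__init__(aggr="add")
        self.msg_mlp = nn.Sequential(nn.Linear(2 * dim + 3, dim), nn.ReLU(), nn.Linear(dim, dim))
        self.upd_mlp = nn.Sequential(nn.Linear(2 * dim, dim), nn.ReLU(), nn.Linear(dim, dim))

    def forward(self, h, pos, edge_index):
        # h: (N, dim) scatterer states, pos: (N, 3) positions, edge_index: (2, E) coupling graph.
        m = self.propagate(edge_index, h=h, pos=pos)
        return self.upd_mlp(torch.cat([h, m], dim=-1))

    def message(self, h_i, h_j, pos_i, pos_j):
        # The message from scatterer j to i depends on both states and their relative position.
        return self.msg_mlp(torch.cat([h_i, h_j, pos_i - pos_j], dim=-1))
```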
Presenting a hybrid algorithm that finds groups of users who can influence as many people as possible: "Boosting a Genetic Algorithm with Graph Neural Networks for Multi-Hop Influence Maximization in Social Networks" by Camilo Chacón Sartori and Christian Blum, Proceedings of the 17th Conference on Computer Science and Intelligence Systems; appeared in: M. Ganzha, L. Maciaszek, M. Paprzycki, D. Ślęzak (eds.), ACSIS, Vol. 30, pages 363–371 (2022). The general GA-plus-GNN-surrogate pattern is sketched below.
#GNN #BRKGA
Open Access: https://lnkd.in/dn5QZGqf
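The general pattern, independent of the specifics of the paper, is a genetic algorithm searching over seed sets while a learned surrogate (here, the GNN) scores the expected influence spread instead of running costly simulations. A hypothetical sketch of that pattern is below; the actual method in the paper is a BRKGA variant, and `surrogate_score` is a stand-in for the trained GNN.

```python
# Hypothetical sketch: a genetic algorithm over seed sets whose fitness comes from a
# learned surrogate (e.g. a trained GNN predicting influence spread) instead of
# Monte-Carlo simulations. NOT the BRKGA-based method proposed in the paper.
import random

def evolve_seed_sets(num_nodes, k, surrogate_score, pop_size=50, generations=100):
    """surrogate_score(seed_set) -> float, e.g. a GNN's predicted influence spread."""
    population = [random.sample(range(num_nodes), k) for _ in range(pop_size)]
    for _ in range(generations):
        ranked = sorted(population, key=surrogate_score, reverse=True)
        elite = ranked[: pop_size // 5]                 # keep the best fifth unchanged
        children = []
        while len(elite) + len(children) < pop_size:
            a, b = random.sample(elite, 2)
            pool = list(set(a) | set(b))                # crossover: union of the two parents
            random.shuffle(pool)
            child = pool[:k]
            if random.random() < 0.2:                   # mutation: swap in a random new node
                candidate = random.randrange(num_nodes)
                if candidate not in child:
                    child[random.randrange(k)] = candidate
            children.append(child)
        population = elite + children
    return max(population, key=surrogate_score)
```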
Towards Geometric Deep Learning, article series by Michael Bronstein, Oxford.
#gnn #neuralnetworks #geometricdeeplearning #deeplearning