DeepMind Presents Graph Nets: A Library for Graph Networks in TensorFlow and Sonnet

Oct. 24, 2018, 9:38 p.m. By: Kirti Bakshi


Neural networks that operate on graphs, and structure their computations accordingly, have been developed and explored for more than a decade under the umbrella of "graph neural networks", but have seen rapid growth in scope and popularity only in recent years.

DeepMind believes that the ability to reason over graphs and relations could be a significant step toward the goal of artificial general intelligence (AGI), and has therefore presented Graph Nets (GN), a library that enables the use of graph networks in TensorFlow and Sonnet.

What is Graph Nets?

Graph Nets is a machine learning library that implements the graph network framework published by researchers from DeepMind, Google Brain, and MIT. The code has now been released, roughly four months after the paper itself appeared.

DeepMind has shared the library on GitHub, and anyone can install it and use it with TensorFlow. The library also includes demos that show users how to create, manipulate, and train graph networks to reason about graph-structured data, on a sorting task, a shortest-path-finding task, and a physical prediction task.

Each demo uses the same graph network architecture, which highlights the flexibility of the approach.
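For a sense of what using the library looks like, here is a minimal sketch of building a single graph network block, modelled on the usage example in the repository's README (the library itself is installed with "pip install graph_nets"). The toy graph, feature sizes, and MLP widths are illustrative choices, and exact API details may differ between library versions:

    import numpy as np
    import sonnet as snt
    from graph_nets import modules, utils_tf

    # A toy graph in the library's data-dict format: per-node features,
    # per-edge features, sender/receiver indices for each edge, and a
    # graph-level ("globals") feature vector. Values here are arbitrary.
    data_dict = {
        "globals": np.array([0.0], dtype=np.float32),
        "nodes": np.array([[0.1], [0.2], [0.3]], dtype=np.float32),  # 3 nodes
        "edges": np.array([[1.0], [2.0]], dtype=np.float32),         # 2 edges
        "senders": np.array([0, 1], dtype=np.int32),    # edge i starts here...
        "receivers": np.array([1, 2], dtype=np.int32),  # ...and ends here
    }

    # Batch one or more data dicts into the library's GraphsTuple container.
    input_graphs = utils_tf.data_dicts_to_graphs_tuple([data_dict])

    # A full graph network block, with separate update functions
    # (small Sonnet MLPs here) for edges, nodes, and globals.
    graph_net_module = modules.GraphNetwork(
        edge_model_fn=lambda: snt.nets.MLP([32, 32]),
        node_model_fn=lambda: snt.nets.MLP([32, 32]),
        global_model_fn=lambda: snt.nets.MLP([32, 32]))

    # The block maps a GraphsTuple to a GraphsTuple with the same structure
    # but updated edge, node, and global features.
    output_graphs = graph_net_module(input_graphs)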

What are the design principles behind Graph Networks?

The graph network, a still underused deep learning building block, is a framework that unifies existing approaches operating over graphs and provides a straightforward interface for assembling graph networks into sophisticated architectures. The design principles behind graph networks are:

  • Flexible representations,

  • Configurable within-block structure,

  • Composable multi-block architectures.

Together, these three design principles make the framework extremely flexible and applicable to a wide range of domains, from perception and language to symbolic reasoning.
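To illustrate the "composable multi-block architectures" principle, the sketch below stacks library blocks in an encode-process-decode pattern, loosely in the spirit of the demo models. The block sizes, number of processing steps, and helper names are assumptions made for illustration, not the library's prescribed architecture:

    import sonnet as snt
    from graph_nets import modules

    def make_mlp():
        # Illustrative choice of update function: a small two-layer MLP.
        return snt.nets.MLP([16, 16])

    # Encoder and decoder transform edge/node/global features independently,
    # with no message passing between them.
    encoder = modules.GraphIndependent(
        edge_model_fn=make_mlp, node_model_fn=make_mlp, global_model_fn=make_mlp)
    decoder = modules.GraphIndependent(
        edge_model_fn=make_mlp, node_model_fn=make_mlp, global_model_fn=make_mlp)

    # The core is a full graph network block that performs relational
    # message passing over edges, nodes, and globals.
    core = modules.GraphNetwork(
        edge_model_fn=make_mlp, node_model_fn=make_mlp, global_model_fn=make_mlp)

    def encode_process_decode(input_graphs, num_steps=3):
        # Encode once, apply the shared core block several times, then decode.
        latent = encoder(input_graphs)
        for _ in range(num_steps):
            latent = core(latent)
        return decoder(latent)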

Combinatorial Generalization in Graph Networks:

The structure of graph networks naturally supports combinatorial generalization: they do not perform their computations strictly at the level of the whole system, but also apply shared computations across entities and across relations.

This strong relational inductive bias, which supports combinatorial generalization, makes graph networks a powerful tool in both theory and practice.
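One concrete consequence of this sharing is that a single graph network block, with a single set of weights, can be applied unchanged to graphs of different sizes. The toy sketch below illustrates this; the random-graph helper and feature sizes are invented for the example:

    import numpy as np
    import sonnet as snt
    from graph_nets import modules, utils_tf

    # One graph network block; its per-edge and per-node update functions
    # are shared across all edges and nodes, however many there are.
    gn_block = modules.GraphNetwork(
        edge_model_fn=lambda: snt.nets.MLP([8]),
        node_model_fn=lambda: snt.nets.MLP([8]),
        global_model_fn=lambda: snt.nets.MLP([8]))

    def random_graph(num_nodes, num_edges):
        # Hypothetical helper: a random graph in the library's data-dict format.
        return {
            "globals": np.zeros([1], dtype=np.float32),
            "nodes": np.random.randn(num_nodes, 2).astype(np.float32),
            "edges": np.random.randn(num_edges, 2).astype(np.float32),
            "senders": np.random.randint(num_nodes, size=num_edges).astype(np.int32),
            "receivers": np.random.randint(num_nodes, size=num_edges).astype(np.int32),
        }

    small = utils_tf.data_dicts_to_graphs_tuple([random_graph(4, 6)])
    large = utils_tf.data_dicts_to_graphs_tuple([random_graph(50, 200)])

    # The same module (and the same weights) handles both structures.
    out_small = gn_block(small)
    out_large = gn_block(large)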

To learn more about Graph Nets, its features, its limitations, and more, refer to the links below:

More Information: GitHub

Link To The PDF: Click Here