A Curated List Of Dedicated Resources: TensorFlow Papers

Dec. 3, 2017, 3:58 a.m. By: Kirti Bakshi

TensorFlow Papers

TensorFlow is an open-source software library for numerical computation using data flow graphs, and one of the most popular ways to build deep learning models. Although TensorFlow was designed primarily for conducting machine learning and deep neural network research, the system is general enough to be applicable in a wide variety of other domains as well.
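The dataflow-graph idea — describe a graph of operations first, then execute it — can be illustrated with a minimal plain-Python sketch. This is only an illustration of the concept; the `Node`, `constant`, `add`, and `mul` names are made up here and are not TensorFlow's actual API:

```python
# Minimal sketch of the dataflow-graph idea: nodes are operations,
# edges carry values; the graph is built first and executed later.
# Illustrative only -- not TensorFlow's real API.

class Node:
    def __init__(self, op, inputs):
        self.op = op          # function computing this node's value
        self.inputs = inputs  # upstream nodes feeding this one

    def run(self):
        # Evaluate upstream nodes, then apply this node's operation.
        return self.op(*(n.run() for n in self.inputs))

def constant(value):
    return Node(lambda: value, [])

def add(a, b):
    return Node(lambda x, y: x + y, [a, b])

def mul(a, b):
    return Node(lambda x, y: x * y, [a, b])

# Build the graph for (2 + 3) * 4 first, then execute it.
graph = mul(add(constant(2), constant(3)), constant(4))
print(graph.run())  # 20
```

Because the graph is a data structure separate from its execution, a runtime is free to place different nodes on different devices — which is what lets the same TensorFlow program run on a phone or a distributed cluster.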

This article highlights several papers and studies related to TensorFlow:

TensorFlow Papers:

TensorFlow: Large-Scale Machine Learning on Heterogeneous Distributed Systems:

"TensorFlow is an interface for expressing machine learning algorithms, and an implementation for executing such algorithms. A computation expressed using TensorFlow can be executed with little or no change on a wide variety of heterogeneous systems, ranging from mobile devices such as phones and tablets up to large-scale distributed systems."

This paper describes the TensorFlow interface and an implementation of that interface built at Google.

Link: Click Here

TF.Learn: TensorFlow's High-level Module for Distributed Machine Learning:

TF.Learn is a high-level Python module for distributed machine learning inside TensorFlow. It provides users with an easy-to-use, Scikit-learn-style interface that simplifies creating, configuring, training, and evaluating machine learning models, and experimenting with them.

Built in a general-purpose high-level language, the module focuses on bringing machine learning to non-specialists, as well as to researchers who wish to implement, benchmark, and compare their new methods in a structured environment. Emphasis is put on ease of use, performance, documentation, and API consistency.
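The Scikit-learn style the module mimics is a fit/predict/evaluate pattern. A minimal plain-Python sketch of that pattern — using a made-up `MeanRegressor` toy estimator, not TF.Learn's actual classes — looks like this:

```python
# Sketch of the Scikit-learn-style fit/predict/evaluate pattern that
# TF.Learn mimics. MeanRegressor is a made-up toy estimator, not part
# of TF.Learn or Scikit-learn.

class MeanRegressor:
    def fit(self, X, y):
        # "Train": remember the mean of the training targets.
        self.mean_ = sum(y) / len(y)
        return self

    def predict(self, X):
        # Predict the stored mean for every input row.
        return [self.mean_ for _ in X]

    def evaluate(self, X, y):
        # Mean squared error of the predictions against the targets.
        preds = self.predict(X)
        return sum((p - t) ** 2 for p, t in zip(preds, y)) / len(y)

model = MeanRegressor().fit([[1], [2], [3]], [2.0, 4.0, 6.0])
print(model.predict([[4]]))  # [4.0]
```

The appeal of the pattern is that swapping in a deep neural network only means swapping the estimator class; the surrounding `fit`/`predict`/`evaluate` code stays the same.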

Link: Click Here

Comparative Study of Deep Learning Software Frameworks:

This paper presents a comparative study of five deep learning frameworks — TensorFlow, Torch, Caffe, Neon, and Theano — on three important aspects: extensibility, hardware utilization, and speed.

The study was performed on several types of deep learning architectures in order to evaluate the performance of these frameworks on a single machine, in both multi-threaded CPU and GPU (Nvidia Titan X) settings.

Link: Click Here

Distributed TensorFlow with MPI:

This paper extends the recently proposed Google TensorFlow for execution on large-scale clusters using the Message Passing Interface (MPI). The approach required minimal changes to the TensorFlow runtime, resulting in a generic implementation that is readily usable by the large and growing number of TensorFlow users.
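A core MPI primitive that distributed training commonly relies on is an allreduce: every process contributes a local value (such as a gradient) and every process receives the combined result. The following is a single-process simulation of that idea in plain Python — it makes no real MPI calls and is not the paper's actual implementation; `allreduce_sum` and the rank layout are invented for illustration:

```python
# Toy simulation of an MPI-style allreduce over gradients: each "rank"
# holds a local gradient vector; after the allreduce, every rank holds
# the element-wise sum. Real code would use MPI (e.g. via mpi4py).

def allreduce_sum(per_rank_grads):
    # Element-wise sum across all ranks' gradient vectors.
    total = [sum(vals) for vals in zip(*per_rank_grads)]
    # Every rank receives its own copy of the same reduced result.
    return [total[:] for _ in per_rank_grads]

local_grads = [[1.0, 2.0], [3.0, 4.0], [5.0, 6.0]]  # 3 simulated ranks
reduced = allreduce_sum(local_grads)
print(reduced[0])  # [9.0, 12.0] -- identical on every rank
```

In real data-parallel training, each worker computes gradients on its shard of the data, the summed gradients are averaged, and every worker applies the same update — which is why a fast allreduce matters at cluster scale.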

Link: Click Here

Globally Normalized Transition-Based Neural Networks:

This paper describes the models behind SyntaxNet and introduces a globally normalized transition-based neural network model that achieves state-of-the-art results on part-of-speech tagging, dependency parsing, and sentence compression.

Link: Click Here

TensorFlow: A system for large-scale machine learning:

This paper describes the TensorFlow dataflow model in contrast to existing systems, and demonstrates the compelling performance TensorFlow achieves for several real-world applications.

Link: Click Here

For More Information: GitHub