Computing Exponentials of Essentially Non-negative Matrices with Entry-wise Accuracy
Speaker: Qiang Ye, University of Kentucky
Abstract: A real square matrix is said to be essentially non-negative if all of its off-diagonal entries are non-negative. In this talk, I will present new perturbation results and algorithms that demonstrate that the exponential of an essentially non-negative matrix can be computed with entrywise relative accuracy.
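To make the setting concrete, the following standard observation (not necessarily the method presented in the talk) suggests why entrywise accuracy is plausible for this class of matrices: a sufficiently large diagonal shift turns an essentially non-negative matrix into an entrywise non-negative one, and the exponential factors accordingly.

Let $A$ be essentially non-negative and choose $s \ge \max_i(-a_{ii})$, so that $B = A + sI \ge 0$ entrywise. Since $A$ and $sI$ commute,
\[
  e^{A} \;=\; e^{-s}\, e^{A + sI} \;=\; e^{-s} \sum_{k=0}^{\infty} \frac{(A + sI)^{k}}{k!},
\]
a series whose terms are all entrywise non-negative. In particular $e^{A} \ge 0$ entrywise, and evaluating the shifted series involves no subtractive cancellation, which is the usual obstacle to entrywise relative accuracy.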
Learning Algorithms for Restricted Boltzmann Machines
Speaker: Devin Willmott, University of Kentucky
Abstract: Restricted Boltzmann machines (RBMs) have played a central role in the development of deep learning. In this talk, we will introduce the theoretical framework behind stochastic binary RBMs, give motivation and a derivation for the most commonly used RBM learning algorithm (contrastive divergence), and prove some analytic results related to its convergence properties.
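For readers unfamiliar with contrastive divergence, the sketch below shows a single CD-1 update for a stochastic binary RBM. It is a generic illustration in NumPy, not code from the talk; the function and variable names (cd1_update, W, b, c, v0) are chosen here for exposition.

    import numpy as np

    def sigmoid(x):
        return 1.0 / (1.0 + np.exp(-x))

    def cd1_update(W, b, c, v0, lr=0.01, rng=None):
        """One CD-1 step for a binary RBM.

        W  : (n_visible, n_hidden) weight matrix
        b  : (n_visible,) visible biases
        c  : (n_hidden,) hidden biases
        v0 : (batch, n_visible) batch of binary training vectors
        """
        if rng is None:
            rng = np.random.default_rng(0)
        # Positive phase: hidden probabilities and samples given the data.
        ph0 = sigmoid(v0 @ W + c)
        h0 = (rng.random(ph0.shape) < ph0).astype(float)
        # Negative phase: one Gibbs step back to the visibles, then the hiddens.
        pv1 = sigmoid(h0 @ W.T + b)
        v1 = (rng.random(pv1.shape) < pv1).astype(float)
        ph1 = sigmoid(v1 @ W + c)
        # CD-1 approximation to the log-likelihood gradient.
        batch = v0.shape[0]
        W += lr * (v0.T @ ph0 - v1.T @ ph1) / batch
        b += lr * (v0 - v1).mean(axis=0)
        c += lr * (ph0 - ph1).mean(axis=0)
        return W, b, c

    # Example usage on random binary data (illustrative only).
    rng = np.random.default_rng(1)
    W = 0.01 * rng.standard_normal((6, 4))
    b = np.zeros(6)
    c = np.zeros(4)
    v0 = (rng.random((8, 6)) < 0.5).astype(float)
    W, b, c = cd1_update(W, b, c, v0, rng=rng)

The key point, and the source of the convergence questions the talk addresses, is that CD-1 truncates the Gibbs chain after one step rather than sampling from the model distribution, so the update follows a biased estimate of the true gradient.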