MATH CLUB--MATH AND PIZZA SEMINAR
Math Club Meeting
Title: R as a vector space over Q, with an interesting consequence
Speaker: Dustin Hedmark
Abstract: We will look at the real numbers as a vector space over the rational numbers. After reviewing relevant linear algebra terminology, we will show that this is an infinite dimensional vector space. Next, we will use the vector space R over Q to show that there does not exist a tiling of a rectangle of dimensions 1 by x with squares, where x is an irrational number.
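For readers who want a preview of the infinite-dimensionality claim, one standard cardinality argument is sketched below in LaTeX; the talk may of course take a different route, so treat this only as an illustration.

% A common argument (assumed here, not necessarily the one the talk will use):
% if R were finite-dimensional over Q, it would be countable.
\[
\dim_{\mathbb{Q}}(\mathbb{R}) = n < \infty
\;\Longrightarrow\;
\mathbb{R} \cong \mathbb{Q}^{n} \text{ as } \mathbb{Q}\text{-vector spaces}
\;\Longrightarrow\;
|\mathbb{R}| = |\mathbb{Q}^{n}| = \aleph_{0},
\]
which contradicts the uncountability of \(\mathbb{R}\); hence \(\dim_{\mathbb{Q}}(\mathbb{R})\) is infinite.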
Math Club Meeting
David Murrugarra will be talking about some research he did over the past year with two UKY undergraduate students. The title and abstract of his talk are below. Please come and hang out with other mathematically minded students. There will be pizza.
Title: Estimating Propensity Parameters using Google PageRank and Genetic Algorithms
Abstract: Stochastic Boolean networks, or more generally stochastic discrete networks, are an important class of computational models for molecular interaction networks. The stochasticity stems from the updating schedule. The standard updating schedules are the synchronous update, where all nodes are updated at the same time and the dynamics are deterministic, and the asynchronous update, where a randomly chosen node is updated at each time step and the dynamics are stochastic. A more general stochastic setting assigns propensity parameters for updating each node. SDDS is a modeling framework that considers two propensity values for updating each node: one when the update has a positive impact on the variable, that is, when the update causes the variable to increase its value, and the other when the update is negative, that is, when it causes the variable to decrease its value. This extension complicates the estimation of the propensity parameters. This talk presents a method for estimating the propensity parameters for SDDS. The method adds noise to the system using the Google PageRank approach, which makes the system ergodic and thus guarantees the existence of a stationary distribution; the propensity parameters are then estimated with a genetic algorithm.
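As a rough illustration of the PageRank-style perturbation mentioned in the abstract (the genetic-algorithm step for fitting the propensity parameters is omitted), here is a minimal Python sketch. The 3-state transition matrix and the damping value alpha are made up for the example and are not taken from SDDS.

import numpy as np

# Sketch of a PageRank-style perturbation: mix a (possibly non-ergodic)
# row-stochastic transition matrix with the uniform matrix, then read off
# the stationary distribution by power iteration.
T = np.array([[1.0, 0.0, 0.0],        # an absorbing state: the chain is not ergodic
              [0.5, 0.0, 0.5],
              [0.0, 0.0, 1.0]])
alpha = 0.15                          # illustrative damping value
n = T.shape[0]
T_noisy = (1 - alpha) * T + alpha * np.ones((n, n)) / n   # now irreducible and aperiodic

pi = np.ones(n) / n                   # power iteration for the stationary distribution
for _ in range(1000):
    pi = pi @ T_noisy
print(pi, pi @ T_noisy)               # pi is (numerically) fixed by T_noisy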
Math and Pizza Talk
Title: Linear Algebra and Error-Correcting Codes
Abstract: Encoded data is transmitted every day. While the complete accuracy of such transmissions is obviously desired, it cannot be guaranteed. There are, however, methods for encoding data that can assist with the detection of potential errors. In this talk, we will investigate how linear algebra is used in the context of encoding data in such a way that errors can be detected, and even corrected, upon receiving a transmission.
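As a concrete instance of the kind of linear-algebra encoding the talk will discuss, the Python sketch below uses the classical (7,4) Hamming code, which the abstract does not name specifically: it encodes four data bits, corrupts one bit, and uses the syndrome to locate and correct the error.

import numpy as np

# Generator and parity-check matrices for the (7,4) Hamming code over GF(2).
# G = [I_4 | P], H = [P^T | I_3], so G @ H.T == 0 (mod 2).
P = np.array([[1, 1, 0],
              [1, 0, 1],
              [0, 1, 1],
              [1, 1, 1]])
G = np.hstack([np.eye(4, dtype=int), P])
H = np.hstack([P.T, np.eye(3, dtype=int)])

msg = np.array([1, 0, 1, 1])          # 4 data bits
codeword = msg @ G % 2                # 7-bit codeword

received = codeword.copy()
received[2] ^= 1                      # corrupt one bit in transit

syndrome = received @ H.T % 2         # a nonzero syndrome signals an error
if syndrome.any():
    # For a single-bit error, the syndrome equals the column of H at the corrupted position.
    error_pos = next(i for i in range(7) if np.array_equal(H[:, i], syndrome))
    received[error_pos] ^= 1          # flip it back

decoded = received[:4]                # the first 4 bits carry the message
print(decoded, msg)                   # should agree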
Title: Are matrices the only “things” which have eigenvalues?
Abstract: A beautiful theorem in elementary linear algebra asserts that any symmetric matrix $A \in \mathbb{R}^{n \times n}$ will have $n$ real eigenvalues $\lambda_1, \lambda_2, \ldots, \lambda_n$ with corresponding eigenvectors $v_1, v_2, \ldots, v_n$ ($A v_i = \lambda_i v_i$). Furthermore, these eigenvectors $v_k \in \mathbb{R}^n$ may be chosen so that they are unit-length and mutually orthogonal (perpendicular): $\|v_i\|^2 = v_i \cdot v_i = 1$ for each $i$, and $v_i \cdot v_j = 0$ when $i \neq j$. A nice consequence of this result is that any vector $u \in \mathbb{R}^n$ can be uniquely expressed in terms of these eigenvectors in a simple way: $u = (u \cdot v_1)v_1 + (u \cdot v_2)v_2 + \cdots + (u \cdot v_n)v_n$. We will discuss how one can generalize such ideas and results to vector spaces of functions, with matrix multiplication replaced by differentiation. Although the focus of the talk will be on describing a way of (approximately) computing eigenvalues and eigenfunctions in this broader context, we will also give some intuition as to why such problems are of physical interest. The approximations themselves will bring us right back to where we started: eigenvalue problems for (really huge!) symmetric matrices.
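To give a flavor of the "really huge" symmetric matrices the abstract alludes to, here is a small Python sketch that discretizes the eigenvalue problem $-u'' = \lambda u$ on $(0, 1)$ with zero boundary values by finite differences; the choice of operator, grid size, and boundary conditions is illustrative and not necessarily that of the talk.

import numpy as np

# Finite-difference approximation of -u'' = lambda*u on (0, 1) with u(0) = u(1) = 0.
# Exact eigenvalues: (k*pi)^2 for k = 1, 2, ...; eigenfunctions sin(k*pi*x).
n = 200                               # number of interior grid points (assumed size)
h = 1.0 / (n + 1)

# Symmetric tridiagonal matrix: 2 on the diagonal, -1 off-diagonal, scaled by 1/h^2.
A = (np.diag(2.0 * np.ones(n)) - np.diag(np.ones(n - 1), 1)
     - np.diag(np.ones(n - 1), -1)) / h**2

eigvals, eigvecs = np.linalg.eigh(A)  # eigh exploits the symmetry of A
exact = (np.arange(1, 6) * np.pi)**2
print(np.round(eigvals[:5], 2))       # smallest computed eigenvalues...
print(np.round(exact, 2))             # ...should be close to the exact ones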