Math 280 - Spring 2019: Entropy and the mathematics of evolution

Instructor: Ted Chinburg

Lectures: TTh 12 - 1:30 PM, room 4C8 in DRL

Office: DRL 4E4, Ext. 8-8340.
Office hours: Tuesdays 1:30 - 2:30 and via Skype
E-mail: ted@math.upenn.edu

Math. Dept. Office: DRL 4W1, Ext. 8-8178.

Math. Dept. Undergraduate Program Information

Announcements as of 1/12/19

Current homework and lecture schedule

Homework

  • Homework assignment 1 (to be turned in Jan. 24 )
  • Homework assignment 2 (to be turned in Feb. 5 )
  • Homework assignment 3 (to be turned in Feb. 14 )
  • Homework assignment 4 (to be turned in Feb. 28 )

    Course Guide

    Course Goals:

    Entropy is a basic concept in information theory, thermodynamics and other fields. One definition of entropy is that it is the rate of information per second required to describe a phenomenon. Another definition is that it is a measure of the uncertainty surrounding such a phenomenon. This course will begin with a discussion of entropy and its applications. The eventual goal of the course is to discuss a new approach to the theory of evolution developed by L. Demetrius based on the laws of thermodynamics. Roughly speaking, the theory suggests that life exists because it increases local entropy faster than other processes. Further, creatures whose tolerance for entropy matches their environment will be favored by natural selection; conversely, creatures will try to shape the environment so that it matches their entropy tolerance. The course will also consider applications to other subjects, e.g. to politics. One topic of interest is the evolution of new political styles as a result of a media environment based on the sale of bits per second of information. Another basic question is the entropy represented by different political regimes, e.g. democratic governments versus authoritarian regimes.
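
    To make the first definition concrete, here is a small illustrative sketch (not part of the official course materials) that computes the Shannon entropy of a discrete probability distribution in Python; the example distributions are made up for illustration.

        import math

        def shannon_entropy(probs):
            # Shannon entropy in bits: H = -sum(p * log2(p)) over outcomes with p > 0.
            return -sum(p * math.log2(p) for p in probs if p > 0)

        # A fair coin is maximally uncertain: 1 bit of information per flip.
        print(shannon_entropy([0.5, 0.5]))   # 1.0
        # A heavily biased coin is more predictable, so fewer bits are needed on average.
        print(shannon_entropy([0.9, 0.1]))   # about 0.47

    Roughly, the fewer bits per observation needed to describe a source, the lower its entropy, which is the sense in which entropy measures uncertainty.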

    Texts and other Source material:

    1. This online text about probability theory, by H. Pishro-Nik
    2. Information Theory and Statistical Mechanics, by E. T. Jaynes
    3. An Introduction to Information Theory (revised edition), by Fazlollah M. Reza
    4. Statistical Thermodynamics, by E. Schrödinger
    5. Heat and Thermodynamics, by M. W. Zemansky and Richard H. Dittman
    6. Boltzmann, Darwin and Directionality Theory, by L. Demetrius

    Course Notes

    1. A proof of Shannon's Theorem
    2. Transmission Rates

    Some interesting articles related to the course

    1. Why the tree of life is not a tree
    2. Genetic changes in hemp and marijuana resulting from viruses

    Syllabus:

    The first part of the course will be about information theory and Shannon entropy.
    We will then discuss enough of the theory of thermodynamics to be able to study parts of Demetrius's article on evolution and natural selection.
    The last part of the course will be about applications of these ideas in a variety of fields.

    How to make attending lectures efficient:

    Before each lecture, check the current lecture schedule and read the appropriate texts.

    Homework, work in class and the final project:

    I will be assigning homework periodically to be written up and handed in. Groups of students will also present their solutions to the homework from time to time. Teams of up to three students will work on a final project on the application of ideas from the course to topics that interest them. The final project will involve a written paper as well as a presentation of the project in class.

    Exams:

    There will be one mid-term exam in the course but no final exam. The exam will be on March 21, 2019.

    Getting help:

    You are very welcome to arrange a time to meet with me either in the math department or online.

    Approximate Grading Weights:

  • 50% -- Homework (written and in-class presentations)
  • 15% -- Midterm exam (March 21)
  • 35% -- Final project

    Last updated: 1/12/19
    Send e-mail comments to: ted@math.upenn.edu