Some ideas from theoretical computer science lead to an equational theory of "interval algebras", whose intended model is the closed real interval. A major theorem is that every algebra of this finitely presented equational theory has a simple quotient, and that every simple algebra appears uniquely as a subalgebra of the intended model. This serves as a powerful completeness theorem and allows one to recover, using algebraic techniques, what appears to be a good part of real analysis. I'll report on a number of recent results in this project.
Logic and Computation Seminar
Monday, October 11, 2004 - 4:30pm
Peter Freyd
University of Pennsylvania