Classification and Regression Trees
The methodology used to construct tree-structured rules is the focus of this monograph. Unlike many other statistical procedures, which moved from pencil and paper to calculators, tree methods were unthinkable before computers. The authors have developed both the practical and theoretical sides of the subject in their study of tree methods. Classification and Regression Trees reflects these two sides, covering the use of trees as a data analysis method and, in a more mathematical framework, proving some of their fundamental properties. Topics covered include an introduction to tree classification, right sized trees and honest estimates, splitting rules, and mass spectra classification.
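The splitting rules the book formalizes choose, at each node, the split that most reduces an impurity measure such as the Gini index. A minimal sketch of that idea for a single numeric feature (an illustration only, not the authors' implementation; the data and function names are invented):

```python
from collections import Counter

def gini(labels):
    """Gini impurity of a set of class labels."""
    n = len(labels)
    return 1.0 - sum((c / n) ** 2 for c in Counter(labels).values())

def best_split(xs, ys):
    """Return the threshold on one feature that minimizes the
    weighted Gini impurity of the two resulting child nodes."""
    best = (None, float("inf"))
    for t in sorted(set(xs)):
        left = [y for x, y in zip(xs, ys) if x <= t]
        right = [y for x, y in zip(xs, ys) if x > t]
        if not left or not right:
            continue  # skip degenerate splits with an empty child
        score = (len(left) * gini(left) + len(right) * gini(right)) / len(ys)
        if score < best[1]:
            best = (t, score)
    return best

xs = [1.0, 2.0, 3.0, 10.0, 11.0, 12.0]
ys = ["a", "a", "a", "b", "b", "b"]
print(best_split(xs, ys))  # splits cleanly at 3.0 with child impurity 0.0
```

A full CART tree applies this search recursively over all features and then prunes, which is where the book's "right sized trees and honest estimates" come in.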
Conformal Mapping
Combined theoretical and practical approach covers harmonic functions, analytic functions, the complex integral calculus, families of analytic functions, conformal mapping of simply-connected domains, mapping properties of special functions, and conformal mapping of multiply-connected domains. Only prerequisite: working knowledge of advanced calculus.
Asymptotic Methods in Analysis
"A reader looking for interesting problems tackled often by highly original methods, for precise results fully proved, and for procedures fully motivated, will be delighted." -- Mathematical Reviews

Asymptotics is not new. Its importance in many areas of pure and applied mathematics has been recognized since the days of Laplace. Asymptotic estimates of series, integrals, and other expressions are commonly needed in physics, engineering, and other fields. Unfortunately, for many years there was a dearth of literature dealing with this difficult but important topic. Then, in 1958, Professor N. G. de Bruijn published this pioneering study. Widely considered the first text on the subject -- and the first comprehensive coverage of this broad field -- the book embodied an original and highly effective approach to teaching asymptotics. Rather than trying to formulate a general theory (which, in the author's words, "leads to stating more and more about less and less") de Bruijn teaches asymptotic methods through a rigorous process of explaining worked examples in detail.

Most of the important asymptotic methods are covered here with unusual effectiveness and clarity: "Every step in the mathematical process is explained, its purpose and necessity made clear, with the result that the reader not only has no difficulty in following the rigorous proofs, but even turns to them with eager expectation." (Nuclear Physics)

Part of the attraction of this book is its pleasant, straightforward style of exposition, leavened with a touch of humor and occasionally even using the dramatic form of dialogue. The book begins with a general introduction (fundamental to the whole book) on O and o notation and asymptotic series in general.
Subsequent chapters cover estimation of implicit functions and the roots of equations; various methods of estimating sums; extensive treatment of the saddle-point method with full details and intricate worked examples; a brief introduction to Tauberian theorems; a detailed chapter on iteration; and a short chapter on asymptotic behavior of solutions of differential equations. Most chapters progress from simple examples to difficult problems; and in some cases, two or more different treatments of the same problem are given to enable the reader to compare different methods. Several proofs of the Stirling theorem are included, for example, and the problem of the iterated sine is treated twice in Chapter 8. Exercises are given at the end of each chapter.

Since its first publication, Asymptotic Methods in Analysis has received widespread acclaim for its rigorous and original approach to teaching a difficult subject. This Dover edition, with corrections by the author, offers students, mathematicians, engineers, and physicists not only an inexpensive, comprehensive guide to asymptotic methods but also an unusually lucid and useful account of a significant mathematical discipline.
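The Stirling theorem proved several ways in the book asserts the asymptotic estimate n! ~ sqrt(2*pi*n) * (n/e)^n. A quick numerical check of the leading term (an illustration, not taken from the text) shows how rapidly the relative error shrinks as n grows:

```python
import math

def stirling(n):
    """Leading term of Stirling's asymptotic series for n!."""
    return math.sqrt(2 * math.pi * n) * (n / math.e) ** n

for n in (5, 10, 50):
    exact = math.factorial(n)
    rel_err = abs(exact - stirling(n)) / exact
    print(n, rel_err)  # relative error behaves like 1/(12n)
```

The next terms of the series (the 1/(12n) correction and beyond) reduce the error further, which is exactly the kind of refinement de Bruijn's worked examples pursue.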
Vector and Tensor Analysis
"Remarkably comprehensive, concise and clear." -- Industrial Laboratories

"Considered as a condensed text in the classical manner, the book can well be recommended." -- Nature

Here is a clear introduction to classic vector and tensor analysis for students of engineering and mathematical physics. Chapters range from elementary operations and applications of geometry, to application of vectors to mechanics, partial differentiation, integration, and tensor analysis. More than 200 problems are included throughout the book.
Probability Theory
This book, a concise introduction to modern probability theory and certain of its ramifications, deals with a subject indispensable to natural scientists and mathematicians alike. Here readers with some knowledge of mathematics will find an excellent treatment of the elements of probability together with numerous applications. Professor Y. A. Rozanov, an internationally known mathematician whose work in probability theory and stochastic processes has received wide acclaim, combines succinctness of style with a judicious selection of topics. His book is highly readable, fast-moving, and self-contained.

The author begins with basic concepts and moves on to combination of events, dependent events, and random variables. He then covers Bernoulli trials and the De Moivre-Laplace theorem, which involve three important probability distributions (binomial, Poisson, and normal or Gaussian). The last three chapters are devoted to limit theorems, a detailed treatment of Markov chains, and continuous Markov processes. Also included are appendixes on information theory, game theory, branching processes, and problems of optimal control. Each of the eight chapters and four appendixes has been equipped with numerous relevant problems (150 of them), many with hints and answers.

This volume is another in the popular series of fine translations from the Russian by Richard A. Silverman. Dr. Silverman, a former member of the Courant Institute of Mathematical Sciences of New York University and the Lincoln Laboratory of the Massachusetts Institute of Technology, is himself the author of numerous papers on applied probability theory. He has heavily revised the English edition and added new material. The clear exposition, the ample illustrations and problems, the cross-references, index, and bibliography make this book useful for self-study or the classroom.
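The De Moivre-Laplace theorem mentioned above says that for large n the binomial probability P(S_n = k) is well approximated by a normal density evaluated at k. A small numerical sketch of that approximation (an illustration, not material from the book):

```python
import math

def binom_pmf(n, k, p):
    """Exact binomial probability P(S_n = k)."""
    return math.comb(n, k) * p**k * (1 - p) ** (n - k)

def de_moivre_laplace(n, k, p):
    """Local normal approximation to the binomial pmf."""
    mu = n * p
    sigma = math.sqrt(n * p * (1 - p))
    return math.exp(-((k - mu) ** 2) / (2 * sigma**2)) / (sigma * math.sqrt(2 * math.pi))

n, k, p = 100, 50, 0.5
print(binom_pmf(n, k, p))         # exact value
print(de_moivre_laplace(n, k, p)) # normal approximation, already close at n = 100
```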
About Vectors
From his unusual beginning in "Defining a vector" to his final comments on "What then is a vector?" author Banesh Hoffmann has written a book that is provocative and unconventional. In his emphasis on the unresolved issue of defining a vector, Hoffmann mixes pure and applied mathematics without using calculus. The result is a treatment that can serve as a supplement and corrective to textbooks, as well as collateral reading in all courses that deal with vectors.

Major topics include vectors and the parallelogram law; algebraic notation and basic ideas; vector algebra; scalars and scalar products; vector products and quotients of vectors; and tensors. The author writes with a fresh, challenging style, making all complex concepts readily understandable. Nearly 400 exercises appear throughout the text.

Professor of Mathematics at Queens College at the City University of New York, Banesh Hoffmann is also the author of The Strange Story of the Quantum and other important books. This volume provides much that is new for both students and their instructors, and it will certainly generate debate and discussion in the classroom.
Elementary Introduction to the Theory of Probability
This compact volume equips the reader with all the facts and principles essential to a fundamental understanding of the theory of probability. It is an introduction, no more: throughout the book the authors discuss the theory of probability for situations having only a finite number of possibilities, and the mathematics employed is held to the elementary level. But within its purposely restricted range it is extremely thorough, well organized, and absolutely authoritative. It is the only English translation of the latest revised Russian edition; and it is the only current translation on the market that has been checked and approved by Gnedenko himself.

After explaining in simple terms the meaning of the concept of probability and the means by which an event is declared to be, in practice, impossible, the authors take up the processes involved in the calculation of probabilities. They survey the rules for addition and multiplication of probabilities, the concept of conditional probability, the formula for total probability, Bayes's formula, Bernoulli's scheme and theorem, the concept of random variables, insufficiency of the mean value for the characterization of a random variable, methods of measuring the variance of a random variable, theorems on the standard deviation, the Chebyshev inequality, normal laws of distribution, distribution curves, properties of normal distribution curves, and related topics.

The book is unique in that, while there are several high school and college textbooks available on this subject, there is no other popular treatment for the layman that contains quite the same material presented with the same degree of clarity and authenticity. Anyone who desires a fundamental grasp of this increasingly important subject cannot do better than to start with this book. New preface for Dover edition by B. V. Gnedenko.
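The Chebyshev inequality the authors survey bounds the probability of a large deviation: P(|X - mu| >= k*sigma) <= 1/k^2, for any distribution with finite variance. An empirical sanity check using simulated fair-die rolls (a hypothetical example, not from the book):

```python
import random

# Simulate a fair six-sided die and compare the observed tail
# frequency against the Chebyshev bound 1/k^2.
random.seed(0)
rolls = [random.randint(1, 6) for _ in range(100_000)]

mu = sum(rolls) / len(rolls)
sigma = (sum((x - mu) ** 2 for x in rolls) / len(rolls)) ** 0.5

k = 1.2
tail = sum(1 for x in rolls if abs(x - mu) >= k * sigma) / len(rolls)
print(f"observed tail: {tail:.3f}, Chebyshev bound: {1 / k**2:.3f}")
```

For the die only the outcomes 1 and 6 deviate by at least 1.2 standard deviations, so the observed tail frequency sits near 1/3, comfortably under the bound of about 0.69; the inequality is deliberately crude because it assumes nothing beyond a finite variance.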
Mathematical Foundations of Statistical Mechanics
The translation of this important book brings to the English-speaking mathematician and mathematical physicist a thoroughly up-to-date introduction to statistical mechanics. It offers a precise and mathematically rigorous formulation of the problems of statistical mechanics, as opposed to the non-rigorous discussion presented in most other works. It provides analytical tools needed to replace many of the cumbersome concepts and devices commonly used for establishing basic formulae, and it furnishes the mathematician with a logical step-by-step introduction, which will enable him to master the elements of statistical mechanics in the shortest possible time. After a historical sketch, the author discusses the geometry and kinematics of the phase space, with the theorems of Liouville and Birkhoff; the ergodic problem (in the sense of replacing time averages by phase averages); the theory of probability; central limit theorem; ideal monatomic gas; foundation of thermodynamics, and dispersion and distribution of sum functions. "An excellent introduction to the difficult and important discipline of Statistical Mechanics. It is clear, concise, and rigorous. There is a very good chapter on the ergodic theorem (with a complete proof!) and . . . a highly lucid chapter on statistical foundations of thermodynamics . . . useful to teachers . . . and to mathematicians." ― M. Kac, Quarterly of Applied Mathematics.
Mathematical Foundations of Information Theory
The first comprehensive introduction to information theory, this book places the work begun by Shannon and continued by McMillan, Feinstein, and Khinchin on a rigorous mathematical basis. For the first time, mathematicians, statisticians, physicists, cyberneticists, and communications engineers are offered a lucid, comprehensive introduction to this rapidly growing field.

In his first paper, Dr. Khinchin develops the concept of entropy in probability theory as a measure of uncertainty of a finite "scheme," and discusses a simple application to coding theory. The second paper investigates the restrictions previously placed on the study of sources, channels, and codes and attempts "to give a complete, detailed proof of both ... Shannon theorems, assuming any ergodic source and any stationary channel with a finite memory."

Partial Contents: I. The Entropy Concept in Probability Theory -- Entropy of Finite Schemes. The Uniqueness Theorem. Entropy of Markov chains. Application to Coding Theory. II. On the Fundamental Theorems of Information Theory -- Two generalizations of Shannon's inequality. Three inequalities of Feinstein. Concept of a source. Stationarity. Entropy. Ergodic sources. The E property. The martingale concept. Noise. Anticipation and memory. Connection of the channel to the source. Feinstein's Fundamental Lemma. Coding. The first Shannon theorem. The second Shannon theorem.
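The entropy of a finite scheme that Khinchin develops is H = -sum(p_i * log2(p_i)) over the scheme's probabilities. A direct computation (an illustration of the definition, not the book's notation):

```python
import math

def entropy(probs):
    """Shannon entropy (in bits) of a finite scheme with probabilities probs."""
    assert abs(sum(probs) - 1.0) < 1e-9, "probabilities must sum to 1"
    # Terms with p = 0 contribute nothing, by the convention 0 * log 0 = 0.
    return -sum(p * math.log2(p) for p in probs if p > 0)

print(entropy([0.5, 0.5]))   # 1.0 bit: a fair coin is maximally uncertain
print(entropy([0.9, 0.1]))   # ~0.469 bits: a biased coin is more predictable
print(entropy([0.25] * 4))   # 2.0 bits: uniform scheme on four outcomes
```

Khinchin's uniqueness theorem shows this is essentially the only function of the p_i satisfying a few natural axioms for a measure of uncertainty.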