<p><b>About the Book</b></p>
<p>This updated new edition includes a wealth of additional material. Alongside its integration of mathematical theory and numerical algorithm development, it features new chapters on topics such as the calculus of variations, integration, and block relaxation.</p>
<p><b>Book Synopsis</b></p>
<ul>
<li>Elementary Optimization</li>
<li>The Seven C's of Analysis</li>
<li>The Gauge Integral</li>
<li>Differentiation</li>
<li>Karush-Kuhn-Tucker Theory</li>
<li>Convexity</li>
<li>Block Relaxation</li>
<li>The MM Algorithm</li>
<li>The EM Algorithm</li>
<li>Newton's Method and Scoring</li>
<li>Conjugate Gradient and Quasi-Newton</li>
<li>Analysis of Convergence</li>
<li>Penalty and Barrier Methods</li>
<li>Convex Calculus</li>
<li>Feasibility and Duality</li>
<li>Convex Minimization Algorithms</li>
<li>The Calculus of Variations</li>
<li>Appendix: Mathematical Notes</li>
<li>References</li>
<li>Index</li>
</ul>
<p><b>From the Back Cover</b></p>
<p>Finite-dimensional optimization problems occur throughout the mathematical sciences. The majority of these problems cannot be solved analytically. This introduction to optimization strikes a balance between the presentation of mathematical theory and the development of numerical algorithms. Building on students' skills in calculus and linear algebra, the text provides a rigorous exposition without undue abstraction. Its stress on statistical applications will be especially appealing to graduate students of statistics and biostatistics. The intended audience also includes students in applied mathematics, computational biology, computer science, economics, and physics who want to see rigorous mathematics combined with real applications.</p>
<p>In this second edition, the emphasis remains on finite-dimensional optimization. New material has been added on the MM algorithm, block descent and ascent, and the calculus of variations. Convex calculus is now treated in much greater depth. Advanced topics such as the Fenchel conjugate, subdifferentials, duality, feasibility, alternating projections, projected gradient methods, exact penalty methods, and Bregman iteration will equip students with the essentials for understanding modern data mining techniques in high dimensions.</p>
<p><b>About the Author</b></p>
<p>Kenneth Lange is the Rosenfeld Professor of Computational Genetics at UCLA. He is also Chair of the Department of Human Genetics and Professor of Biomathematics and Statistics. At various times during his career, he has held appointments at the University of New Hampshire, MIT, Harvard, the University of Michigan, the University of Helsinki, and Stanford. He is a fellow of the American Statistical Association, the Institute of Mathematical Statistics, and the American Institute for Medical and Biological Engineering. His research interests include human genetics, population modeling, biomedical imaging, computational statistics, and applied stochastic processes. Springer previously published his books <i>Mathematical and Statistical Methods for Genetic Analysis</i>, <i>Numerical Analysis for Statisticians</i>, and <i>Applied Probability</i>, all in second editions.</p>