Book Synopsis

Many machine learning tasks involve solving complex optimization problems, such as those with non-differentiable, non-continuous, or non-unique objective functions; in some cases it can even be difficult to define an explicit objective function at all. Evolutionary learning applies evolutionary algorithms to address optimization problems in machine learning, and has yielded encouraging outcomes in many applications. However, due to the heuristic nature of evolutionary optimization, most outcomes to date have been empirical and lack theoretical support. This shortcoming has kept evolutionary learning from being well received in the machine learning community, which favors solid theoretical approaches.

Recently there have been considerable efforts to address this issue. This book presents a range of those efforts, divided into four parts. Part I briefly introduces readers to evolutionary learning and provides some preliminaries, while Part II presents general theoretical tools for the analysis of running time and approximation performance in evolutionary algorithms. Based on these general tools, Part III presents a number of theoretical findings on major factors in evolutionary optimization, such as recombination, representation, inaccurate fitness evaluation, and population. In closing, Part IV addresses the development of evolutionary learning algorithms with provable theoretical guarantees for several representative tasks in which evolutionary learning offers excellent performance.

Review Quotes

"The book is clearly and nicely written and is recommended for everyone interested in the new development in evolutionary learning."
(Andreas Wichert, zbMATH 1426.68004, 2020)

About the Author

Zhi-Hua Zhou is a Professor, founding Director of the LAMDA Group, and Head of the Department of Computer Science and Technology at Nanjing University, China. He authored the books "Ensemble Methods: Foundations and Algorithms" (2012) and "Machine Learning" (in Chinese, 2016), and has published many papers in top artificial intelligence and machine learning venues. His H-index is 89 according to Google Scholar. He founded ACML (the Asian Conference on Machine Learning), has chaired many prestigious conferences (e.g., as AAAI 2019 Program Chair and ICDM 2016 General Chair), and has served as action/associate editor for prestigious journals such as PAMI and the Machine Learning journal. He is a Fellow of the ACM, AAAI, AAAS, IEEE, and IAPR.

Yang Yu is an Associate Professor at Nanjing University, China. His research interests are in artificial intelligence, including reinforcement learning, machine learning, and derivative-free optimization. He was named one of "AI's 10 to Watch" by IEEE Intelligent Systems in 2018, and has received several awards and honors, including the PAKDD Early Career Award, an IJCAI'18 Early Career Spotlight talk, the National Outstanding Doctoral Dissertation Award, the China Computer Federation Outstanding Doctoral Dissertation Award, the PAKDD'08 Best Paper Award, and the GECCO'11 Best Paper Award (Theory Track). He is a Junior Associate Editor of Frontiers of Computer Science, and has served as an Area Chair of ACML'17, IJCAI'18, and ICPR'18.

Chao Qian is an Associate Researcher at the University of Science and Technology of China. His research interests are in artificial intelligence, evolutionary computation, and machine learning. He has published over 20 papers in leading international journals and conference proceedings, including Artificial Intelligence, Evolutionary Computation, IEEE Transactions on Evolutionary Computation, Algorithmica, NIPS, IJCAI, and AAAI. He won the ACM GECCO 2011 Best Paper Award (Theory Track) and the IDEAL 2016 Best Paper Award, and has chaired the IEEE Computational Intelligence Society (CIS) Task Force on "Theoretical Foundations of Bio-inspired Computation".