Stephen J. Wright

About Stephen J. Wright
Steve Wright is a Professor of Computer Sciences at the University of Wisconsin-Madison. He does research in computational optimization and its applications to many other areas of science and engineering. He has also been active in professional roles, most notably as a recent chair of the Mathematical Optimization Society, the leading professional society in optimization. During his career, he has been excited to witness the increasing vitality of optimization and its growing visibility across the whole scientific enterprise. He looks forward to many more years of enjoyable collaborations with excellent colleagues.
Books By Stephen J. Wright
Numerical Optimization (Springer Series in Operations Research and Financial Engineering)
11-Dec-2006
$114.35
$125.50
Optimization is an important tool in decision science and in the analysis of physical systems in engineering. Its roots can be traced to the calculus of variations and the work of Euler and Lagrange. This book takes a natural and reasonable approach to mathematical programming, covering numerical methods for finite-dimensional optimization problems. It begins with very simple ideas and progresses through more complicated concepts, concentrating on methods for both unconstrained and constrained optimization.
Optimization for Data Analysis
21-Apr-2022
$54.10
$56.95
Optimization techniques are at the core of data science, including data analysis and machine learning. An understanding of basic optimization techniques and their fundamental properties provides important grounding for students, researchers, and practitioners in these areas. This text covers the fundamentals of optimization algorithms in a compact, self-contained way, focusing on the techniques most relevant to data science. An introductory chapter demonstrates that many standard problems in data science can be formulated as optimization problems. Next, many fundamental methods in optimization are described and analyzed, including: gradient and accelerated gradient methods for unconstrained optimization of smooth (especially convex) functions; the stochastic gradient method, a workhorse algorithm in machine learning; the coordinate descent approach; several key algorithms for constrained optimization problems; algorithms for minimizing nonsmooth functions arising in data science; foundations of the analysis of nonsmooth functions and optimization duality; and the back-propagation approach, relevant to neural networks.
Other Formats:
Hardcover
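As an illustration of the first technique mentioned in the description above, and not material taken from the book itself, here is a minimal sketch of plain gradient descent applied to a smooth convex least-squares objective. The problem data, step-size choice, and function names are invented for the example.

```python
# A minimal sketch (illustrative only): gradient descent on f(x) = 0.5 * ||A x - b||^2,
# the kind of smooth convex objective that gradient methods are designed for.
import numpy as np

def gradient_descent(A, b, step_size, num_iters):
    """Minimize 0.5 * ||A x - b||^2 with a fixed-step gradient method."""
    x = np.zeros(A.shape[1])
    for _ in range(num_iters):
        grad = A.T @ (A @ x - b)   # gradient of the least-squares objective
        x = x - step_size * grad   # fixed-step gradient update
    return x

# Example usage on a small random problem.
rng = np.random.default_rng(0)
A = rng.standard_normal((50, 5))
b = rng.standard_normal(50)
# A standard safe step size is 1 / L, where L is the largest eigenvalue of A^T A.
L = np.linalg.eigvalsh(A.T @ A).max()
x_hat = gradient_descent(A, b, step_size=1.0 / L, num_iters=500)
print(np.linalg.norm(A.T @ (A @ x_hat - b)))  # gradient norm; small when near a minimizer
```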
Numerical Optimization (Springer Series in Operations Research and Financial Engineering)
28-Apr-2000
$117.06
Presents a comprehensive and current description of the most effective methods in continuous optimization, responding to the growing interest in optimization in engineering, science, and business by focusing on the methods best suited to practical problems.
Other Formats:
Hardcover