It is almost impossible to miss the fact that “big data” is a topic du jour, and that numerous conferences, talks, and publications feature the specific combination of optimization with big data. Hence some have asked whether continuous optimization problems without big data have ceased to be interesting.
We try to address this question from the perspective of a numerical analyst/computer scientist by examining recent work on several related and overlapping questions, with a particular interest in "nasty" instances: (i) What makes an optimization problem hard? (ii) How should the size and complexity of an optimization problem be defined? (iii) What advice can reliably be provided about the best methods for solving a given optimization problem (or family of problems)? (iv) And what about the best software?
Margaret H. Wright is the Silver Professor of Computer Science and former Chair of the Computer Science Department at the Courant Institute of Mathematical Sciences, New York University, with research interests in optimization, linear algebra, and scientific computing. She developed an interest in mathematics at an early age and studied the subject at Stanford University, where she received a B.S. in Mathematics, and an M.S. and eventually a Ph.D. in Computer Science. She is a member of the National Academy of Sciences and the National Academy of Engineering. She has served as president of the Society for Industrial and Applied Mathematics (SIAM) and is senior editor of the SIAM Review. In 2009 she became a Fellow of SIAM, and in 2012 a Fellow of the American Mathematical Society.