Say you’re starting a business to make and sell, I don’t know, cardboard boxes. You have some big decisions to make, like: where should you put your factories, and your warehouses, and your distribution centers? This might not sound like a mind-bending problem, but it can be ridiculously complex.
You clearly want to be close to your customers and to your suppliers to keep shipping and transportation costs down. But you also have to account for those suppliers’ prices; the demand for each type of box; how much material each box uses; setup and maintenance costs; production capacity; the costs of power, and rent, and real estate; and, if you’re a good citizen, the environmental impact of every choice.
It’s a /lot/ to keep track of! This type of situation is called a constrained optimization problem: You’re looking for the best balance of competing factors /while/ making sure your solution is actually possible. Constrained optimization doesn’t exactly get much press; you’d probably never heard of it before. But it’s the hidden architecture beneath practically every human endeavor you can think of, from cardboard manufacturing to the power grid, airline routing, and even molecular biology.
And one of the most popular ways to solve problems like these takes inspiration from an unusual source: the way metals cool. Now, when I say “business decisions,” metallurgy probably isn’t the first thing that comes to your mind. And it also wasn’t, like, computer scientists’ first thought.
They initially designed lots of efficient algorithms to find the absolute best solutions to constrained optimization problems. But those methods work best when the problem has a nice, friendly mathematical form, like when the algorithm just has to find the mathematical equivalent of the highest spot on a hill.
When the problem involves lots of variables related in more complicated ways, even a computer would take ages to find the best solution! So eventually, researchers looked for inspiration in nature, which is great at finding solutions that might not be perfectly optimal, but come pretty close. For example, in 1983, three IBM researchers drew an analogy between optimization and, of all things, cooling metal. See, when a metal is heated and then slowly cooled, its atoms tend to settle into the arrangement with the lowest possible energy.
In other words, the arrangement of the atoms gets naturally optimized. That’s convenient in metalworking, because if you want to work some metal into a certain shape, it’s easiest if the atoms are all neatly arranged in a crystal structure. But normally, there are defects, like atoms out of place and fault lines between different sections of the crystal.
So metalworkers use a technique called annealing: They heat up the metal really hot, then slowly cool it down. When the metal is hot, its atoms move around randomly. But as they cool, their movements become less random: With less energy, they become less likely to jump from low-energy positions into high-energy ones. So they naturally settle into the lowest-energy configuration, which is a defect-free crystal. Because the atoms cool so slowly, they have the opportunity to find low-energy positions before they get frozen in place. For the IBM researchers, this natural process wasn’t just a neat fact about metalworking.
It was also inspiration for an optimization algorithm: in other words, a procedure computers could follow to solve complicated optimization problems. They called it simulated annealing, and the idea was to imitate the natural process. Just like physical annealing goes through different arrangements of atoms, the algorithm goes through different solutions to a constrained optimization problem.
This might not sound like an obvious comparison, but here’s how it works: In the example of the cardboard business, you’d start by generating some random choices of products, facilities, and locations. Then you’d keep making random changes. Initially, you allow even some changes that make the profit margin or environmental impact /worse/, like moving a warehouse far from its nearest factory.
That’s because at this point, allowing detrimental changes is like letting an atom bounce into a position that temporarily puts the metal structure into a higher-energy state. Ultimately, that high-energy state might turn out to be a step on the way to the lowest-energy one. It’s kind of like giving something a good shake to get it out of a rut.
Similarly, when the algorithm suggests a /bad/ solution, that can eventually lead to an even /better/ option than it would have found otherwise. Over time, though, the algorithm becomes less and less willing to accept changes that don’t improve the result, just like, as the metal cools, it keeps getting harder for atoms to bounce into higher-energy states. Eventually, you converge on something very close to the best possible set of products, facilities, and locations. Simulated annealing is extremely popular.
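For the curious, that accept-worse-moves-then-cool-down loop can be sketched in a few lines of Python. This is a minimal illustration, not the 1983 IBM algorithm itself; the toy cost function, the step size, and the geometric cooling schedule are all made-up choices for this sketch:

```python
import math
import random

def simulated_annealing(cost, initial, neighbor, t_start=10.0, t_end=1e-3, cooling=0.95):
    """Minimize `cost` by randomly perturbing a solution, sometimes
    accepting worse moves while the "temperature" is high, and
    becoming pickier as the temperature drops."""
    current = initial
    current_cost = cost(current)
    best, best_cost = current, current_cost
    t = t_start
    while t > t_end:
        candidate = neighbor(current)
        candidate_cost = cost(candidate)
        delta = candidate_cost - current_cost
        # Always accept improvements; accept a worse move with
        # probability exp(-delta / t), which shrinks as t cools.
        if delta < 0 or random.random() < math.exp(-delta / t):
            current, current_cost = candidate, candidate_cost
            if current_cost < best_cost:
                best, best_cost = current, current_cost
        t *= cooling  # geometric cooling schedule
    return best, best_cost

# Toy example: a bumpy one-dimensional "cost landscape" with
# more than one valley, so a pure downhill search can get stuck.
f = lambda x: x**4 - 14 * x**2 + 5 * x
step = lambda x: x + random.uniform(-1, 1)

random.seed(0)
x, fx = simulated_annealing(f, initial=5.0, neighbor=step)
```

In a real application like the cardboard business, `neighbor` would make a random change to the plan (move a warehouse, swap a supplier) and `cost` would score the plan’s profit and constraints.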
It’s been used in over a /million/ research papers and real-world projects, and not just for cardboard boxes. Say you work for an airline, planning flight routes. Deciding where to fly and which planes should fly each route is tricky, because you want to optimize profit per passenger and minimize wasted fuel. Meanwhile, you’re constrained by things like travel time and each aircraft’s capacity.
How do you balance all those factors? How do you make decisions in that wild world? Maybe simulated annealing! Or maybe you’re a molecular biologist trying to work out the structure of a protein: say, a protein that some deadly virus uses to replicate. You want your structure to match your experimental data as closely as possible, but it’s constrained by the fact that the structure has to be physically possible.
Again, you could try simulated annealing! This method has also been used to schedule university exams, to decide when and where to build power generators, to help robots find the best route from point A to point B, and on and on. And it’s just one of dozens of popular optimization algorithms, each of which works for a different class of problems.
So really, we’ve just scratched the surface of how constrained optimization powers modern society, and one of the best methods comes from studying nature itself! Thanks for watching this episode of SciShow! And a special thank you to this month’s President of Space, Matthew Brant, who is one of the amazing supporters making it possible for us to keep doing episodes like this.