
Economies of Scale Are Nice, But Rigidities of Scale?

Written by Lean Center of Excellence | Oct 20, 2022 8:31:07 PM


The effect of economies of scale is well-established and important for businesses. It is the principle behind the “learning curve effect” and the strategic value of gaining market share.

Economies of scale are essentially the conjunction of several mechanisms:

  1. The burden of fixed costs diminishes with scale: spreading fixed costs over more units increases both the profit margin and the total profit.
  2. With scale, labor and machines can be more specialized and therefore more efficient (that is the economic theory, anyway).
  3. There is a “learning curve effect”: direct labor hours per unit drop by a fixed percentage with each doubling of cumulative quantity produced, and this percentage remains stable regardless of the quantity. The table below shows typical learning rates for various industries, as reported in the literature (a short worked example follows the table). Note that the learning curve effect has been validated not in mass production, but in industries where operators perform complex tasks or produce a relatively low number of units.

Typical Learning Curves

  Industry                           Learning Rate
  ---------------------------------  -------------
  Aerospace                          15%
  Shipbuilding                       15%-20%
  Machine Tools (New Models)         15%-25%
  Electronics (Repetitive)           5%-10%
  Electrical Wiring (Repetitive)     15%-25%
  Machining                          5%-10%
  Manual Assembly + 25% Machining    20%
  Manual Assembly + 50% Machining    15%
  Manual Assembly + 75% Machining    10%
  Punch Press                        5%-10%
  Raw Materials                      5%-7%
  Purchased Parts                    12%-15%
  Welding (Repetitive)               10%
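
To make the arithmetic concrete, here is a minimal sketch of Wright’s classic learning curve model in Python (the function name and the 1,000-hour starting point are illustrative assumptions, not figures from this article). A 15% rate means each doubling of cumulative output multiplies unit hours by 0.85:

```python
import math

def unit_hours(first_unit_hours: float, n: int, reduction_per_doubling: float) -> float:
    """Wright's learning curve: direct labor hours for the n-th unit.

    Each doubling of cumulative output cuts unit hours by
    reduction_per_doubling (e.g. 0.15 for the aerospace rate above).
    """
    b = math.log(1.0 - reduction_per_doubling) / math.log(2.0)
    return first_unit_hours * n ** b

# A 15% curve starting at 1,000 hours for unit #1:
for n in (1, 2, 4, 8, 100):
    print(f"unit {n:>3}: {unit_hours(1000, n, 0.15):6.1f} hours")
# unit 1: 1000.0, unit 2: 850.0, unit 4: 722.5, unit 8: 614.1, unit 100: ~339.7
```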


What This Means for Operational Strategy

An important part of business strategy is determining the right capacity and the right timing for investment. To set goals that maximize financial performance, businesses need to understand how costs will be affected by the volumes they produce.

As shown below, the relationship between average cost per unit and quantity produced can be approximated with Short Run Average Cost (SRAC) and Long Run Average Cost (LRAC) curves.

[Figure: Short Run Average Cost (SRAC) and Long Run Average Cost (LRAC) curves]

In economics, “long-run” refers to the period where a firm can vary all of its factors of production. In other words, in the long run, all inputs are variable. 

The Short Run cost curves have a U-shape: the cost per unit of output starts high because the fixed-cost resources are underutilized and must be spread over a smaller number of units. Cost reaches its theoretical minimum when utilization is high, but not so high that chaos starts disrupting the value stream. It’s important to understand that aiming for 100% utilization is inefficient if there is variation, and the more variation, the more inefficient high utilization becomes. Wherever a process combines variability and dependency, its overall efficiency drops at high utilization levels because there is not enough spare capacity to absorb disruptions, which then snowball and overwhelm the system.
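
Why utilization hurts so much under variation can be seen with Kingman’s classic queueing approximation. The sketch below is a minimal illustration, with invented parameter values:

```python
def expected_wait(utilization: float, cv_arrivals: float, cv_service: float,
                  mean_service_time: float) -> float:
    """Kingman's (VUT) approximation of the mean wait at a single-server station:
    wait = (utilization term) x (variability term) x (service time)."""
    rho = utilization
    variability = (cv_arrivals ** 2 + cv_service ** 2) / 2.0
    return (rho / (1.0 - rho)) * variability * mean_service_time

# Moderate variability (coefficients of variation = 1), 10-minute jobs:
for u in (0.70, 0.85, 0.95, 0.99):
    print(f"utilization {u:.0%}: average wait ~ {expected_wait(u, 1, 1, 10):6.1f} min")
# 70% -> ~23 min, 85% -> ~57 min, 95% -> ~190 min, 99% -> ~990 min
```

The queue (and therefore lead time and cost) does not grow linearly with utilization; it explodes as utilization approaches 100%, and the more variation, the earlier the explosion starts.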

The Long Run cost curve indicates the minimum cost of production and can never be higher than the Short Run costs. It is derived by taking the optimal, lowest point of the various short-run cost curves at different levels of inputs or capacity. It assumes that a firm can set its capacity at any arbitrary point it wishes.
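
As a toy illustration of that envelope construction (all cost parameters below are invented for the example), the long-run curve can be computed as the pointwise minimum over a family of short-run curves:

```python
import numpy as np

def srac(q, fixed_cost, unit_var_cost, capacity):
    """Short-run average cost for one plant size: fixed cost spread over the
    volume, plus a variable cost that climbs as utilization nears capacity."""
    congestion = 1.0 + (q / capacity) ** 4  # crude stand-in for overload waste
    return fixed_cost / q + unit_var_cost * congestion

q = np.linspace(1, 600, 600)    # output volumes
plants = [(500, 2.0, 100),      # (fixed cost, variable cost, capacity)
          (1500, 1.8, 250),
          (4000, 1.7, 500)]
srac_curves = np.array([srac(q, f, v, cap) for f, v, cap in plants])

# Long-run average cost: at each volume, the firm picks the cheapest plant size.
lrac = srac_curves.min(axis=0)
assert (lrac <= srac_curves).all()  # the LRAC never exceeds any SRAC
```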

What’s interesting is that economists also draw the long-run cost curve as U-shaped. Since firms are not constrained by any limitations on inputs in the long run, why would there be an optimal point beyond which scale becomes less efficient? One reason is that the deadly combination of variation and dependency also exists in the long run. The second is the cost of bureaucracy: overhead expands and becomes harder to optimize the bigger it gets. The third is the cost of complexity. Complexity increases transaction costs, and more importantly, it makes the cost of incorrect forecasts and incorrect information much higher. If you want an example, consider what it means for General Motors to make a wrong decision in its electric vehicle (EV) business, compared to what it means for a start-up to make the exact same mistake. Even though the start-up is 100% EV, it suffers less from its mistakes, because being smaller, it is more nimble and learns from them faster. This is what I call the “rigidities of scale”: greater size usually means greater rigidity, which makes learning slower and much more costly.

This blog often talks about the effects of digitalization, and this post is no exception. Here, we’ll explore two main effects: the added flexibility obtained through the elimination of transaction costs, and the flattening of the “rigidities of scale” through more efficient markets.


What Does the New Economics of Digital Mean For Economies of Scale?

First, as digital platforms for businesses become more prevalent, more and more factors of production are becoming flexible. Remember the Long Run cost curve? Until now, this curve was a purely theoretical construct, because capacity was (and still is) seen as a discrete, stair-step function, not a continuous one. Digitalization is changing this paradigm and is bringing the capacity function ever closer to a smooth, continuous one. Veryable is a good example of how labor can now be transformed from a fixed cost into a true variable cost.

You might ask: “isn’t labor already a variable cost, since companies routinely lay off and hire employees?” It’s a question of timescale, as we saw in the distinction between short run and long run above. If a company adjusts its workforce no more than once a month (which by traditional standards is considered “often”), its labor costs are fixed for a month, hence the staircase shape of the capacity curve. This is unsatisfactory because, in the end, everything is about the customer, and since customer demand changes every single day, such a company is not flexible enough.

Look again at the illustration: the short-run curves represent how unit costs change with volume when capacity is inflexible. By contrast, businesses using the Veryable platform can adjust their labor inputs every day, and even at the hour timescale, since they can post ops for a precise number of hours or for the precise quantity of parts that needs to be shipped that day. That is represented by the long-run, or “ideal,” cost curve, which is better because its use of resources is much more efficient.
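
To see how the adjustment timescale changes costs, here is a back-of-the-envelope simulation (the demand, productivity, and wage figures are invented for illustration; this is not a model of Veryable’s actual economics):

```python
import random

random.seed(7)
daily_demand = [random.randint(60, 140) for _ in range(30)]  # units per day
UNITS_PER_WORKER_DAY = 10
COST_PER_WORKER_DAY = 150.0

def workers_needed(units: int) -> int:
    return -(-units // UNITS_PER_WORKER_DAY)  # ceiling division

# Staircase capacity: staff once for the month, sized for peak demand.
monthly_crew = workers_needed(max(daily_demand))
staircase_cost = monthly_crew * COST_PER_WORKER_DAY * len(daily_demand)

# Daily flexing: match labor to each day's demand exactly.
flexed_cost = sum(workers_needed(d) * COST_PER_WORKER_DAY for d in daily_demand)

print(f"staircase staffing: ${staircase_cost:,.0f}   flexed daily: ${flexed_cost:,.0f}")
# The gap between the two numbers is the price of capacity rigidity.
```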

Here, we’re talking about how reducing or eliminating transaction costs allows actors to make decisions more often and at a smaller scale, and therefore to reduce waste by being more flexible and “truer” to actual demand. This is the effect on the Short Run cost curves.

The other main effect is on the Long Run cost curve: the digital transformation of businesses replaces the complexity and rigidity that grow with scale with the automated efficiency of the marketplace. By giving more flexibility to economic actors, the digital economy extends the reach of market dynamics, leading to a more efficient allocation of resources and higher productivity. With flexibility, there is less need for planning, budgeting, monitoring, controlling, and so on. In other words, if you want to reduce bureaucracy, invest in flexibility. And because this flexibility is rewarded by market forces rather than by compliance with plans, it is self-regulating and self-improving. Finally, because digital promises to make it easier for information systems to talk to each other, loosening top-down command-and-control methods doesn’t come at the expense of being blind to what happens.

Reducing the cost of complexity shouldn’t be confused with reducing the number of parts, customers, or suppliers, as is often the case with the so-called 80/20 method. As its name implies, 80/20 is based on the Pareto principle: 80% of the output is attributable to 20% of the inputs. In 80/20, once you’ve identified your “best” inputs (the 20%), you focus your efforts on them and raise prices on the others so that they either become more profitable or drop out. Applied for the first time, 80/20 may be useful for “trimming the fat,” i.e., removing inactive customers or obsolete products. Fundamentally, however, it doesn’t reduce complexity; it reduces the size of the system. That is not a great way to grow!

Reducing the cost of complexity means being able to handle a large number of things without incurring waste. Typically, this is achieved by moving decisions closer to where the consequences are felt; in other words, decentralized decision-making. Lean is a good example of decentralized decision-making. Digitalization makes decentralization an even more viable option by improving data flow and providing high-level visibility into the effects of low-level decisions.
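
For concreteness, here is a minimal sketch of the identification step that 80/20 starts from (the SKU volumes below are hypothetical):

```python
def pareto_share(volumes: list[float], top_fraction: float = 0.2) -> float:
    """Share of total output contributed by the top `top_fraction` of items."""
    ranked = sorted(volumes, reverse=True)
    k = max(1, int(len(ranked) * top_fraction))
    return sum(ranked[:k]) / sum(ranked)

# Hypothetical SKU volumes: a few big sellers and a long tail.
sku_volumes = [5000, 3200, 2100, 900, 400, 250, 180, 120, 80, 40]
print(f"top 20% of SKUs -> {pareto_share(sku_volumes):.0%} of volume")  # ~67%
```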