Micron Investment Thesis

Published on 01/19/21 | Saurav Sen | 6,098 Words

The BuyGist:

  • This Thesis was published on June 11, 2018. 
  • For 2.5 years, we've watched this stock languish. 
  • The stock is finally getting some love. 
  • Some of our thesis is playing out. Some of it still isn't.

Micron suddenly finds wings.

For most of our existence, Micron has been a portfolio laggard. We had bought into the stock in the summer of 2018, and until recently it had almost always traded below our purchase price (around $60 per share). It was frustrating to watch this stock go nowhere for 2.5 years. Why did this happen? Micron’s stock seems to move in tandem with DRAM memory prices. Now, DRAM has become a commodity, which is why most DRAM companies have gone out of business over the last decade. Three companies remain standing – Samsung, SK Hynix and Micron – and now it’s a triopoly. This was a big part of our thesis – that supply-demand imbalances would be less severe with only 3 large players.

Our Micron Thesis was also based on:

  1. Volume: the exponentially greater need for memory with the advent of AI and the data explosion.
  2. Product Differentiation: the bet that the lines between Memory and Logic in semiconductor chips will start blurring, thereby making Memory technology less commoditized.

The first part played out as per script. The second part is still unclear. We believe that as more AI-heavy technologies take center-stage in our lives, Memory will become a more specialized technology. But nobody knows when that will happen.

Having said all this, Micron’s stock has found enthusiastic buyers in the last couple of months. What has changed? It’s the same old cyclical datapoint: DRAM pricing. Forecasts are rosy today. They won’t be next month or next quarter. We’ve seen these cycles for the last 2.5 years.

Despite Mr. Market’s constant swings from enthusiasm to despair about Micron’s prospects, we’ve been steadfast in our valuation estimate of Micron at $100 per share. We will probably sell if the stock gets close to that level unless the Product Differentiation part of our thesis materializes in a big way. If that happens, we’ll re-evaluate Micron. If not, we’ll probably exit at around $100.

Competitive Advantage: The Castle

Core Competency? Memory for Cloud Datacenters, Mobile, IoT devices and Autonomous Cars.

We can think of Memory in computers as “short-term memory” and “long-term memory”. A computer needs both, just like our brains do. Short-term Memory, which I referred to as “hot memory” in my AI Hardware article, is instrumental in what a computer does – compute. That’s because data in hot memory needs to be accessed quickly, for real-time computation. Long-term Memory, which I had referred to as “Cold Storage” in that article, is where data is stored for the, well, long term. This type of Memory is a little more removed from the computer’s engine – its processor. This data doesn’t need to be accessed as fast or as frequently. This is your hard-disk, if you will.

Micron makes both kinds of Memory – hot and cold, short-term and long-term. That’s all they do, and that’s something I like in a company – focus. Their products are good at one thing – storing data. They offer two main types of products – DRAM (for hot memory) and NAND Flash (for cold memory). Lately, however, NAND Flash has been making its way into some hot memory applications as well.

As data has been growing exponentially, especially over the last few years, Micron and its few competitors are front-and-center in this new information revolution. And it shows in the numbers – their (and their competitors’) revenues have seen tremendous growth over the last couple of years. The reason is Data. But the catalyst is AI. All that good data can finally be used for profitable insights – that’s what AI is good for.

It used to be that Hot Memory and Cold Storage were used mostly in PCs and Servers. And so, the Memory companies’ fates were inextricably tied to the fates of these PC and Server companies. The pattern was cyclical. The world has changed. Now the “use-cases” are multitudes more. One of the reasons we’re at a grand inflection point is because we’re right at the beginning of this amazing confluence of Cloud Computing, Artificial Intelligence (AI), Internet-of-Things (IoT), Virtual Reality (VR) and a faster 5G network. We’ve never been here before. Among the things I just mentioned, Cloud Computing has taken off and flourished. AI has just begun. We’re not even in the early innings of IoT, VR and 5G; the game hasn’t even begun.

The point is that the “end-use” cases of Memory are far more numerous than in the PC-Server era of just 3 years ago. In just these last 3 years, several reinforcing trajectories point to a sustainable growth scenario for Memory companies:

  1. Exponential growth in Data, because…
  2. Many more sources of Data, which can finally be used for insights (read as making money), because…
  3. Artificial Intelligence, which can finally be democratized, because…
  4. Growth in Cloud Computing, but…
  5. This increases the complexity of computing in a way we’ve never seen before, which…
  6. Increases the need for more Memory, and Memory that doesn’t slow down this new, complicated computing.

Innovation in Memory is more important now than ever before.


Products: Better or Cheaper? Moving from commoditized products to Memory “solutions”.

Memory has traditionally been a commodity. Memory companies would make components for PC companies, who would tack them onto a computer chassis. The relative importance of hot memory and cold memory changed over time, but not by much. Increases in memory capacity were really the biggest driver of change. PC and Server makers could tout these increases as “differentiators” for which they would charge a higher price. In the PC world, however, many of these memory increases were nice to have but made only small incremental changes to our lives. For gamers and IT professionals (who manage Servers for companies), these increases meant something more. But even then, Memory was a commoditized, cyclical product. Many companies made it. Innovation happened, but mostly in terms of fitting more memory into a PC or a Server. When that innovation took place, Memory companies would make money for a while. And then the laws of competitive destruction would ensure that supply increased and prices fell, as did profits, until the Memory companies released some more memory to pack into a computer. This cycle went on and on for roughly 20 years, with its ebbs and flows. This was the old paradigm.

That paradigm doesn’t apply any more. Here’s why:

  1. It’s not about just PCs and Servers anymore, as mentioned in the section above.
  2. The new complexities in computing demand more innovation – in both processing and memory, because…
  3. The two functions are more intertwined than ever before, as we enter this new computing paradigm of Cloud/AI/IoT/VR/5G.

In this new paradigm, the Memory architectures that were built for the PC-era aren’t cutting it. As mentioned in the previous section, there are 2 main types of Memory in use today – DRAM and Flash. Both have advantages and disadvantages. Neither one is ideal for this new era.

There are 2 main variables that computer makers need to worry about in both Processing and Memory: Speed and Power Efficiency. Obviously, we want both. In the “Cold Memory” part of the equation, Flash is clearly the winner. That’s because of a 3rd variable: Volatility. Data needs to remain intact when power is switched off – in industry parlance, such Memory is Non-Volatile. Flash Memory checks that box. Volatile Memory – like DRAM – doesn’t hold on to data when the power is switched off. But in terms of the first two criteria, here’s the tradeoff:

  1. DRAM is fast; much faster than the ironically named Flash. But it’s very power-hungry.
  2. Flash is power efficient, but slower, partly because it holds on to data, which involves slower read-write procedures.

DRAM, because it’s fast, is usually used in conjunction with the Processor. Flash, because it holds on to data, is usually used for “Cold Storage”. I’ve gone through this in some detail in the AI Hardware article, but the main point is this: DRAM is too power-hungry and Flash is too slow. As data and the use-cases of all that data increase, these constraints become a problem. They already are a problem at the scale at which Cloud infrastructure is growing.
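If it helps to keep the two product families straight, here is a toy sketch in Python – purely illustrative, encoding only the qualitative tradeoff described above rather than any benchmark data – that maps each role in the system to the memory type that has traditionally filled it:

```python
# Toy sketch of the DRAM vs. NAND Flash tradeoff described in the text.
# The attributes are qualitative rankings, not measured benchmarks.
MEMORY_TYPES = {
    "DRAM": {"speed": "fast", "power": "hungry", "volatile": True},
    "NAND Flash": {"speed": "slower", "power": "efficient", "volatile": False},
}

def pick_memory(role: str) -> str:
    """Map a role in the system to the memory traditionally used for it."""
    if role == "hot":   # working memory next to the processor: speed wins
        return "DRAM"
    if role == "cold":  # long-term storage: persistence and power-efficiency win
        return "NAND Flash"
    raise ValueError(f"unknown role: {role!r}")

for role in ("hot", "cold"):
    name = pick_memory(role)
    print(f"{role} -> {name}: {MEMORY_TYPES[name]}")
```

The point of the toy is simply that neither row has everything – which is why the new products discussed below try to break the tradeoff.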

Memory companies now have a renewed sense of urgency. They need to come up with products that solve the Power Efficiency and Speed problems. This is a major shift – the Memory guys are now trying to find solutions to new problems – problems that are bound to get worse in the next few years.

So, what is Micron doing about it? A few things:

  1. In DRAM, they (and their competitors) are now facing some version of Moore’s Law. They are facing diminishing marginal returns with each new innovative move. In industry-parlance, the rate of bit-growth is slowing down with each new product. There seem to be 2 ways to counter the Memory version of Moore’s Law: 1) A New Architecture and 2) New Materials, for more power-efficiency. Micron is working on both and has 4 new versions of DRAM in the pipeline (1Y, 1Z, 1-alpha, 1-Beta). 1Y will be released this year.
  2. In the Flash world, the way to work around the Memory version of Moore’s Law is to go three-dimensional. This increases the complexity of Flash technology significantly. Micron recently released 3D QLC NAND Flash. They were first to market with this. They claim it has 30% more bandwidth and uses 40% less energy than the previous generation 3D NAND Flash product. They claim this is industry-leading.
  3. 3D XPoint: This is Micron’s (and their partner Intel’s) first cut at solving the DRAM-Flash tradeoff problem for the new computing world. According to Micron, it’s 1000 times faster than Flash, has 1000 times the endurance of Flash, and is 10 times denser than DRAM. It’s a novel solution that comes about halfway towards solving the speed issue of Flash. Management suggests there is significant interest from customers. But it’ll only be released in 2019. Until then, the jury is still out.
  4. New Non-Volatile Memory products like NOR and NVDIMM. These are also aimed at solving the Speed/Power-efficiency tradeoff problem.

The point is this: The prevailing Memory solutions are not satisfactory for the new era of AI and Big Data. The speed/power-efficiency tradeoff may have been fine for the PC-era, but it doesn’t suffice in this new era. Things have to change, and technological innovation, as opposed to higher volume of production or packing more bits in a wafer, is the way to do it.


Evidence: Profitability? Revenues and EBITDA increasing fast due to growth in end-markets.

Micron has had almost 2 blockbuster years. All the stuff I’ve mentioned above is more than evident in their numbers. Revenue grew by more than 60% in the 2017 fiscal year ending August 2017, compared to the previous year. In the latest quarter (ending February 2018), Revenue was up more than 90% compared to the same quarter last year! Price growth and volume growth are reinforcing each other. Sequentially, however, revenue growth was about 15% from the previous quarter (ending November 2017), which is still impressive. But there are some signs of NAND Flash price declines. DRAM prices, while continuing their upward trajectory, can’t keep growing forever. This is why the market is skeptical of this story. Sure, the stock price has skyrocketed over the last few years, reflecting the awesome revenue and EBITDA growth rates. But Micron is trading at a very modest multiple, whether measured in terms of earnings (which I don’t care for) or free cash flow (the much more logical measure).

The “market” seems to think that there is massive risk of Oversupply in DRAM, which constitutes at least two-thirds of Micron’s revenue. I’ve alluded to this in the first section on Core Competency – in the PC Era, memory was commoditized, which means that there was almost no difference between Micron’s products and those of its competitors. So, as PC and Server sales grew, demand grew, and revenues grew. That was quickly followed by an industry-wide ramp-up in Memory production, which led to oversupply, which was quickly followed by steep declines in price. The result was extreme volatility in revenues, which eventually led to many Memory companies going out of business.

This research piece from McKinsey highlights the problem throughout the history of the Memory industry. But this paper also highlights another problem – and that has to do with market perception. The fears of a massive supply-demand imbalance have two underlying assumptions:

  1. Memory is still as commoditized as it was 3 years ago.
  2. We still live in an era where PCs are the major demand driver of Memory.

For reasons mentioned in the previous sections, these don’t apply any more. The McKinsey paper had three mentions of “Cloud” and ZERO mentions of “Artificial Intelligence”. This paper was written in March 2016, just over 2 years ago. They’re supposed to be “thought-leaders” and they were still thinking in the PC-era not so long ago. I believe much of the market is still thinking in the PC-era.

Having said that, supply-demand imbalances can still happen. But the demand-side is much more diverse now and has a massive growth trajectory in front of it. The probability of oversupply is much lower than it was just 2 or 3 years ago.

Durability of Competitive Advantage: The Moat

Competition? Heavy. Samsung, SK Hynix and new Chinese upstarts. And new technology.

There are now 6 companies that make Flash Memory. And 3 of those also make DRAM. They are Samsung, SK Hynix, and Micron. Among these, Micron is the smallest by revenue. But Micron claims it’s gaining market share. I can’t find data on that. But if I compare revenue growth rates of Micron and SK Hynix (Samsung isn’t a pure Memory company), Micron’s claims seem to have some merit.

The threat of competition from Samsung and SK Hynix is on-going. So far, in this 2-year bull run for Memory companies, Micron has more than held its own. On DRAM, it started off as the least cost-competitive player. Micron claims to have closed the gap with Samsung and SK Hynix. On Flash, Micron claims to have outdone its rivals. This brings me to the 2 main criteria for judging the competitiveness of Memory companies:

  1. Cost-effectiveness
  2. Solutions to the DRAM/Flash tradeoff problem.

On #1, I would peg Micron as “average”. It’s hard to see Micron having a sustainable cost advantage over its Korean rivals. On #2, I believe Micron has an advantage that can snowball into something sustainable as we move through the new inflection point in AI/Big Data. Here’s why:

  1. Micron seems to have the most comprehensive suite of Memory products among the 3 biggies. I say this based on what they advertise on their website. Apart from DRAM and Flash, they will soon release their 3D XPoint product. At the moment, they are the only company with a third alternative to DRAM and Flash that is in production.
  2. Micron seems to have changed its self-perception. They claim to be a “Solutions” company now, as opposed to a Components company. What does this mean? It’s not just the volume of data that’s increasing (at an exponential pace, no less), but it’s also the sources of data that are increasing (and will increase) at an unprecedented pace.

On #2, Micron’s theory is that different end-use cases will require different Memory solutions and architectures. It’s not just the PC and the Server any more. It’s those, and Cars, Robots, Smart Homes, Virtual Reality, Smart Cities, etc. Most of the things mentioned here aren’t even that smart yet. But with the democratization of AI via Cloud, that will change fast. And it will require memory that works optimally with its specific processing architecture, to save both money and power.

In Autonomous Cars, which seem to be an inevitability, Micron claims to have the #1 market share. Again, I couldn’t find any data to support their claim. But it may have to do with Micron’s realization that it needed to get out of the “one product to rule them all” mentality that dominated the Memory world. I was encouraged by Nvidia CEO Jensen Huang’s statement at Micron’s Investor Day in May 2018: Nvidia (the leading maker of Graphics Processing Units that are often used in Machine Learning and AI) starts working with Micron on a “blank sheet of paper”, as and when they are at the initial design phases of their products. This type of deep cooperation never really happened in the PC-era. As computing architectures change (and they are changing fast) to accommodate AI and the unprecedented volume of data, they must take into account the best available Memory architectures for each application. And that goes both ways. Memory companies need to find solutions to enable AI on the Cloud and on Edge Devices like phones and IoT machines.

But Micron and its 2 main rivals face other threats:

  1. China is keen to get into the Memory game.
  2. New, emerging technologies that may leapfrog DRAM, Flash, 3D XPoint etc.

Can the Big 3 protect themselves from this onslaught?


Protection? Low. But focus is now on technology innovation (more IP).

There isn’t much Micron (or Samsung or SK Hynix) can do about new Chinese factories that will churn out DRAM and Flash products. And there isn’t anything they can do about some startup coming up with a new Memory technology that has DRAM-type speed with Flash-type persistent memory and power-efficiency. But there are 2 reasons to be optimistic:

  1. Memory is more about technology innovation now, than about volume. It’s more about Intellectual Property now. We’ve discussed this at length.
  2. It’s likely that one of the 3 biggies will find more solutions to bridge the gap between DRAM and NAND Flash, because it takes scale and know-how to execute such a plan. Micron has a head-start over its rivals with 3D XPoint. I’m sure Samsung and SK Hynix will come up with something else. And that’s because they have the ability to scale up their innovative designs, which startups won’t have. It doesn’t do the Cloud/AI market any good if it can’t get enough of the new innovative product. It’s a major investment for them, and the Googles and Amazons of the world won’t change their Datacenter architectures for a marginal deployment of some experimental Memory technology.

The sum-total of these points is this: Memory is a much more capital-intensive business now. It’s much harder for new entrants to enter and hit the ground running. And it’s much riskier for someone with deep pockets (like the Chinese) to come and assume that the prevailing technology will remain intact long enough for them to get a decent return on their capital investment.


Resiliency of Cash Flows? Fast Cycle product. But less volatile in the post-PC world.

All the points discussed above should provide some immunity from the massively volatile cash flow profile of the PC-era. Let’s recap:

  1. Exponential growth in Data, because…
  2. Many more sources of Data, which can finally be used, because…
  3. Artificial Intelligence, which can finally be democratized, because…
  4. Cloud Computing, but…
  5. This increases the complexity of computing in a way we’ve never seen before, which…
  6. Increases the need for more Memory, and Memory that doesn’t slow down this new, complicated computing.

Because…

  1. It’s not about just PCs and Servers anymore.
  2. The new complexities in computing demand more innovation – in both processing and memory, because…
  3. The two functions are more intertwined than ever before, as we enter this new computing paradigm of Cloud/AI/IoT/VR/5G.

More Data. More types of Hardware. More complicated Software. More use-cases. In a weird way:

  1. The Economic Time of Cash Flows is decreasing.
  2. But product specialization is increasing, which…
  3. Decreases the volatility in cash flows caused by long periods of oversupply.

The key to a Memory company surviving and thriving is that it must innovate and release a new solution for the myriad of use-cases every couple of years. It must be proactive now, instead of being reactive, as it was during most of the PC-era.

Management Quality: The Generals

Strategy & Action? Positive. Investing in R&D; not capacity growth. But heavy share buybacks.

Micron’s Management is relatively new. Under new-ish CEO Sanjay Mehrotra (from SanDisk), Micron has ridden an amazing Memory bull-run. The upswing in demand certainly got Micron out of a horrible 2016 and has now given them enough cash to work with. Micron’s Management believes that this time the demand-surge is sustainable, for all the reasons we’ve discussed. So, with this newfound energy in the form of cash, what are Mehrotra and Team doing with it?

Micron’s Management is committed to reinvesting about 30-35% of its revenues as Capital Expenditure. And they plan on reinvesting that cash mostly in what they call “technology transition” projects as opposed to “wafer growth” projects. This is a positive. Micron has decided to become a Solutions company instead of a Components company, and it must put money where its mind is. Technology transition refers to everything we’ve been talking about – all the steps from now until the time we have Memory technology that has the speed of DRAM and the power-efficiency and persistence of NAND Flash. So, I like the decisiveness of Mehrotra and team. But I have some questions.

Why return cash to shareholders in the form of the newly announced $10 billion buyback program? If Memory is not what it used to be, and if “technology transition” is needed, why not go all in and beat Samsung and SK Hynix on innovation?

Having said that, I must give Management (present and past) credit because they’ve used a lot of this cash windfall over the last couple of years to pay down debt. They are inching closer to a ZERO Net Debt profile, which means they’ll be able to pay down their entire debt with cash on the Balance Sheet, should they choose to do so.


Alignment of Incentives? Average. Stock-based compensation, but no direct link to ROE/ROIC.

I don’t have major complaints with Micron’s Executive Compensation plans. As I read the Proxy Report, I got the sense that they were trying to just shoot for “average” within their peer group. Stock awards seemed to feature a lot in their plans. But I would rather have a big component linked to Free Cash Flow and ROIC. That would be more directly tied to the results of Management’s capital deployment actions, which is its most important job.


Financial Productivity? High ROE/ROIC company with low debt.

Micron’s ROE is in the mid-teens, which should be sustainable in this new era. Again, I wish Management was incentivized to keep it sustainable, especially in an industry that’s plagued with cash flow volatility and sudden dips in ROE.

Micron’s Debt is now low, thanks to the recent cash windfall, which they used to pay down debt. So, there isn’t a whole lot of leverage in the ROE number. The big question is whether margins will hold up. If Management is right, and Micron succeeds in becoming predominantly a Memory Solutions company because of their investments in “technology transition”, this ROE profile should hold up. And that should create some value over and above whatever “cost of capital” analysts slap on to a company like Micron.
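To make the “not much leverage in the ROE number” point concrete, here is the standard DuPont identity with purely illustrative inputs – these are placeholders, not Micron’s reported figures:

```python
# DuPont identity: ROE = net margin x asset turnover x equity multiplier (leverage).
# The inputs below are illustrative placeholders, not Micron's reported figures.
def roe(net_margin: float, asset_turnover: float, equity_multiplier: float) -> float:
    return net_margin * asset_turnover * equity_multiplier

# With little net debt, the equity multiplier sits close to 1x,
# so a mid-teens ROE has to come mostly from margins and asset turnover.
low_leverage = roe(net_margin=0.20, asset_turnover=0.70, equity_multiplier=1.1)   # ~15%
high_leverage = roe(net_margin=0.20, asset_turnover=0.70, equity_multiplier=2.5)  # ~35%
print(f"low-leverage ROE:  {low_leverage:.1%}")
print(f"high-leverage ROE: {high_leverage:.1%}")
```

The takeaway is the same as in the text: with net debt near zero, a mid-teens ROE is being earned by the business itself, not manufactured with financial leverage.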

This brings me to a pet peeve of mine, which is the whole concept of “Cost of Equity”. A company like Micron, which has shown a lot of volatility in the past (and thus carries a high Beta), is bound to “justify” a high cost of equity in analysts’ calculations. I’ve mentioned the differences between the PC-era (when Micron exhibited this high volatility) and this new computing era. On top of that, the whole concept of volatility and Beta is based on a flawed theory anyway. I’ll stop ranting now and get to the point: I believe this flawed measure of risk (the denominator in their valuation models) partly explains why Micron is trading at such a low valuation.

“Beta and Modern Portfolio Theory and the like – none of it makes any sense to me…how can professors spread this? I’ve been waiting for this craziness to end for decades. It’s been dented and it’s still out there.” – Charlie Munger


Sustainable Free Cash Flow? Roughly $5.8 billion. Translates to roughly $100/share.

That Micron’s valuation works out to a round $100/share is pure coincidence. What it represents is this:

  1. Assumed that DRAM and NAND prices fall sharply from current levels. But this is partly offset by continued growth in volume, based on all the factors discussed above. Overall, I’ve assumed a 10% decline in revenue from current levels.
  2. Assumed EBITDA margins drop to 50% from current levels of about 60%. This is because a lot of the operating costs are fixed.
  3. Assumed a lower cash paid for interest bill, because of a big paydown of debt in the last couple of quarters and the continued commitment to pay down debt going forward.

Other than that, there isn’t much science to the valuation. I’ve assumed my normal 20X multiple that I’m comfortable paying for what I consider “good” companies (a back-of-the-envelope sketch of this arithmetic follows the thesis statement below). In the end, the thesis is essentially this:

Micron is transforming from a Memory Component to a Memory Solutions company for the Cloud/AI/Big Data era with innovative products. Revenues, margins and free cash flow should be more stable in this new, post-PC era.
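For reference, here is a minimal back-of-the-envelope sketch of the valuation arithmetic above. The free cash flow figure and the 20X multiple come straight from the thesis; the share count is an assumption implied by those numbers and the ~$100/share answer, not a figure pulled from Micron’s filings:

```python
# Back-of-the-envelope version of the valuation above.
# FCF and the multiple come from the thesis text; the share count is an
# assumption implied by those numbers, not a figure from Micron's filings.
sustainable_fcf = 5.8e9      # sustainable free cash flow, ~$5.8 billion
fcf_multiple = 20            # the "normal" 20X multiple for "good" companies
shares_outstanding = 1.16e9  # ASSUMED share count implied by ~$100/share

implied_value = sustainable_fcf * fcf_multiple        # ~$116 billion
value_per_share = implied_value / shares_outstanding  # ~$100

print(f"Implied value: ${implied_value / 1e9:.0f}B, or ~${value_per_share:.0f} per share")
```

Nothing here is sophisticated; the answer moves one-for-one with both the free cash flow estimate and the multiple.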

