Understanding the Total Addressable Market Research Paper

TAM is defined as the revenue a company could realize if it had 100 per cent market share while still creating shareholder value. The ability to calibrate the total addressable market (TAM) is a major part of anticipating value creation.

This report provides a framework for estimating TAM through triangulation, using three methods. The first is based on population, product and conversion. The second analyzes a diffusion model, which addresses both the absolute size of the market and the rate of adoption. The third uses base rates as a reality check.

Categorizing New Products

To assess market size, categorizing the product is a logical starting point. This allows us to appraise a company's strategy for promoting the product. Companies can influence their TAM through pricing strategy and product enhancement. Pricing strategy is when a company sells products at a discount in order to gain market share; product enhancement reflects improvement in the product itself.

Current technologies with applications for new customers, and new technologies used by current customers, are the categories where a TAM analysis is most relevant. TAM is tricky to analyze when both the technology and the customers are new.

Some specific ways to estimate TAM

Market Size – Population, Product and Conversion

The first approach is to estimate absolute market size, in which the number of potential customers is multiplied by the expected revenue per customer. There are three parts to this analysis: population, in which we estimate the population that can use the product; product, in which we estimate the share of that population likely to use the product; and conversion, in which we analyze the potential revenue per customer. Factors that shape demand and supply can be considered for a more in-depth analysis of absolute market size.
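As a rough illustration of the population, product and conversion arithmetic, here is a minimal Python sketch. The population size, adoption share and revenue per customer below are hypothetical inputs for illustration, not figures from the paper.

```python
# Hypothetical inputs for a population / product / conversion TAM estimate
population = 50_000_000        # people who *could* use the product (hypothetical)
adoption_share = 0.20          # fraction of that population likely to use it (hypothetical)
revenue_per_customer = 1_200   # expected annual revenue per paying customer, INR (hypothetical)

potential_customers = population * adoption_share
tam = potential_customers * revenue_per_customer

print(f"Potential customers: {potential_customers:,.0f}")
print(f"Estimated TAM: INR {tam:,.0f} per year")
```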

Factors to consider in assessing demand are financial resources, physical limitations, elasticity of demand, cyclicality, and substitution threats.

Factors to consider in assessing supply are the ability to supply, unit growth and pricing, regulatory constraints, incentives, scale, and niches.

TAM and the Bass Model

The Bass model allows for a prediction of the number of purchasers in a period, say each year, as well as the total number of purchasers. The model relies on three parameters –

  1. The coefficient of innovation (p) – This captures mass-market influence.
  2. The coefficient of imitation (q) – This reflects interpersonal influence.
  3. An estimate of the number of eventual adopters (m) – A parameter that determines the size of the market.

The equation for the Bass model is: N(t) − N(t−1) = [p + qN(t−1)/m] × [m − N(t−1)]

This formula simply says that the number of new adopters equals the adoption rate multiplied by the number of potential adopters who have not yet adopted.

N(t) − N(t−1) is the number of adopters during a period, i.e. simply the difference between the users now, N(t), and the users in the prior period, N(t−1). The first term on the right side of the equation spells out the rate of adoption. The second term on the right side is the number of users who have not yet adopted the product.
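The recursion above can be simulated directly. Below is a minimal sketch; the values of p, q and m are hypothetical, chosen only to show the familiar S-shaped diffusion path, not parameters taken from the paper.

```python
# Simple Bass diffusion simulation: N(t) - N(t-1) = [p + q*N(t-1)/m] * [m - N(t-1)]
p = 0.03        # coefficient of innovation (hypothetical)
q = 0.38        # coefficient of imitation (hypothetical)
m = 1_000_000   # eventual number of adopters (hypothetical)

cumulative = 0.0
for year in range(1, 11):
    new_adopters = (p + q * cumulative / m) * (m - cumulative)
    cumulative += new_adopters
    print(f"Year {year:2d}: new adopters = {new_adopters:10,.0f}, cumulative = {cumulative:10,.0f}")
```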

Investor applications of the Bass model –

The first is to estimate product potential based on parameters from historical diffusions of comparable products. The second is to start with a company's stock price and work backward to the adoption assumptions the price implies.

The Bass model also allows solving for the size of peak sales, using this equation –

Size of peak sales = m[(p + q)² / 4q]
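Using the same hypothetical p, q and m as in the simulation sketch above, the closed-form expression can be evaluated directly; it lands close to the largest single-period adoption figure the simulation produces.

```python
# Peak new adopters in a single period under the Bass model: m * (p + q)^2 / (4 * q)
p, q, m = 0.03, 0.38, 1_000_000   # same hypothetical parameters as the simulation above
peak_sales = m * (p + q) ** 2 / (4 * q)
print(f"Peak new adopters in a single period: {peak_sales:,.0f}")
```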

There are typically three stages in industry evolution. During the first stage, the number of competitors grows. In the second stage there is a large shakeout, as a high number of firms exit. In the third stage, the number of competitors and their market shares stabilize. In stage two a company's sales can grow faster because the number of competitors is shrinking.

Limitations of the Bass model –

This model can be helpful in judging TAM, but there are some limitations that prevent it from capturing certain considerations.

The first limitation is the replacement cycle. The Bass model is used primarily for forecasting the adoption of a product; very little attention is given to the replacement cycle, i.e. the replacement of a product after the initial purchase.

The second limitation is economies of scale. Economies of scale arise when a company's fixed costs are spread over higher sales, i.e. when the company sells more, the fixed cost is spread over a larger base. This limits the level of TAM because companies fail to create value beyond a certain size. A separate issue has similar implications: companies can overshoot their markets. Two symptoms of an overshot market are that customers use only a fraction of the functionality the product offers and that they are not paying for new features.

The third limitation is network effects. A network effect exists when the value of a product increases as more people use it. The telephone is an example: if only one person uses it, it has very little value, but as more people start using it, the value increases. In businesses with strong network effects, the market shares of the leading companies are higher. Companies spend heavily in the hope that their product will become the product of choice, but in sectors with strong network effects most companies fail to make the jump from early adopters to mainstream users.

Base rates as a reality check –

The third method in the triangulation process for estimating TAM is careful consideration of base rates. The main idea is to refer to what happened to other companies when they were at the same stage. For example, if a company's management says it can grow at a 10% CAGR over the coming five years, the base rate method checks what growth comparable companies actually achieved at the same stage. This serves as a reality check, and base rates provide a check on the output of the first two approaches to estimating TAM.

TAM and Ecosystems

Generally, businesses fall into three categories – Physical, Service and Knowledge. The main objective is to understand the characteristics of these categories and to consider how companies can expand across them.

Physical – The main source of cash flow for these businesses is tangible assets such as manufacturing facilities, stores and inventory. Sales growth is tied to asset growth.

Service – The main source of competitive advantage for service businesses is people.

Knowledge – These businesses also rely on people as their main source of competitive advantage.

These categories differ in their economic characteristics. Some considerations are –

Source of advantage – Physical companies depend on tangible assets, while service and knowledge businesses depend on people.

Investment trigger – Physical and service businesses can grow by adding capacity, either in the form of CAPEX or new employees. Knowledge businesses generally invest to deal with obsolescence.

Products and protecting capital – Rival goods are those where one person's consumption decreases the consumption available to others. Non-rival goods are generally difficult to protect because they are relatively easy and cheap to replicate, so the creator of a non-rival good often has difficulty capturing the value. One strategy companies use to increase TAM is to extend the business into new business categories. This extension can pose challenges such as trade-offs between open and closed systems, placing functionality in the product or in the cloud, determining which party owns the data, and whether or not to monetize data by selling it to outside parties.

Understanding the Value Migration Research Paper

This research paper discusses value migration in four industries.

What is value migration?

Value migration is the flow of economic value from old business models to new business models that are better able to satisfy customer needs. For example, the shift of people from black-and-white to colour television was value migration: for the colour television industry it was value inflow, while for the black-and-white television industry it was value outflow.

Value migration happens in three stages –

Value Inflow – A company or industry captures value from other companies or industries because it offers superior value.

Stability – Competitive equilibrium is established.

Value Outflow – Value moves toward new industries, or industries that are better able to cater to the needs of customers.

The four sectors covered in this paper are BFSI, Information Technology, Oil and Gas, and Consumer (Jewelry).

BFSI –

A shift is happening from public sector banks to private sector banks, which are more customer friendly. The corporate banking segment is facing challenges, and public sector banks will face challenges on capitalization and growth, while private sector banks can emerge stronger because they have built digital capabilities and also expanded their branch networks. They also have strong traction in their CASA mix.

Private sector banks have invested heavily in technology and have come up with various innovative products. Their CASA grew at a fast pace, they have good digital architecture, and they have a higher share of digital transactions. Private sector banks have also taken a balanced approach: through strategic partnerships, costs are under control and profitability has been maintained.

For public sector banks, the government announced a recapitalization plan, but most of the money will be used to meet provisioning requirements. With FY19 requiring full compliance with Basel III regulations, there was pressure on public sector banks to meet the capital norms.

Thus, value migration is accelerating in BFSI.

IT –

Over the last decade, the Indian IT industry was on a high growth trajectory driven by cost-led value migration. The industry reached the stability phase once market share gains and profit margins started to settle. Now, changing priorities are clearly visible: the next level of savings is offered by automation, which will make location irrelevant; clients' technology systems are changing; and digital transformation is needed to survive the threat from born-in-the-cloud organizations. The Indian IT industry needs to replace traditional revenue streams with new ones.

Oil and Gas –

In recent years, rising pollution has been a key concern in policy making. According to the WHO, half of the 20 most polluted cities globally are Indian. Various policies such as the Hydrocarbon Exploration Licensing Policy (HELP) and the Open Acreage Licensing Policy (OALP) are expected to boost domestic gas production.

There is a lack of infrastructure, but the government's focus on battling pollution through initiatives in the gas sector can help increase demand. Small-scale LNG is yet to take off in India; companies have announced their intentions, but with enabling policies, improvement in pipeline infrastructure and so on, the whole gas sector is likely to benefit. Importers would be the biggest beneficiaries, as demand will increase while domestic gas production is unlikely to keep pace.

Consumer – Jewelry

The core drivers for jewelry, such as rising disposable incomes and changing consumer preferences, remain relevant. Other drivers have also emerged: GST implementation has tilted the balance in favor of organized players; unorganized players will be at a further disadvantage because of more stringent rules being introduced; and unorganized players also have lower credit availability after the Nirav Modi scam broke out. Companies are also taking initiatives to explore previously unexplored segments of the business. According to the World Gold Council (WGC), gold demand was flat or declining for the past three quarters. Overall there was no increase in market size, but Titan Company's market share increased.

Understanding Covid-19 Impact on Cement Industry

What I Understood

Due to Covid, the government imposed a lockdown in March and companies were unable to operate for a few days. During this lockdown, many labourers migrated to their hometowns, so when factories restarted there was a shortage of labour. Construction in metro cities came to a halt due to the lockdown, and demand for cement dried up. Even after the lockdown was lifted, construction has not resumed at the same pace, so demand in these cities is yet to recover.

On the other hand, as the labourers migrated to their hometowns, they did not have much work to do, so they took up the work of constructing or repairing their own houses. People who had planned to start new construction held off, but those who had already started began finishing the work. Demand can also increase before the monsoon, because people will try to finish some work before the rains.

So in the current situation, rural areas led the demand. As labourers migrate back to cities around Diwali, cement demand can increase at a substantial rate.

On the cost front, some companies saw an increase in freight cost while others managed it by selling higher volumes. Overall, according to the companies, there were no major changes in cost.

On CAPEX, companies are delaying their CAPEX plans. One major CAPEX item is the installation of WHRS (waste heat recovery systems), which will help reduce fuel cost. Other CAPEX programmes such as expansion of grinding units and maintenance work will be delayed, and some CAPEX will be cut, varying from company to company.

Sources used: https://drive.google.com/drive/u/2/folders/1BGzJuQW8pvCkiiImNXMCRYy3yrSrfRsb

Price to Book Ratio – A Wrong Metric for Service Businesses

What is Price to Book Ratio?

This ratio compares the market value of a share to its book value. Book value is the net assets of the company; market price is the current price of the share.

Formula – Market Price per Share / Book Value per Share

This ratio is used by investors to check whether they are overpaying for a particular company's stock.

This ratio can be misleading for the service industry because in service-sector companies the main assets are the employees, and the cost of those employees is expensed in the profit and loss account. Hence, the main assets are not on the balance sheet, which keeps reported assets low.

Let’s take two examples to interpret this.

We will take two companies: one is asset heavy and the other is engaged in services. For one company the main assets are on the balance sheet, while for the other the main assets flow through the profit and loss account.

All amounts are in INR. (Data Source – Annual Report)

As we can see, the price to book ratio of HDFC AMC comes out higher, but it cannot be said that investors are overpaying just by looking at this ratio. As the company is mainly engaged in providing services, the cost of its major asset, its employees, is expensed in the profit and loss account, so its reported assets and net assets are lower. UltraTech Cement, which is an asset-heavy company, has its major assets on the balance sheet and thus higher net assets.
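A minimal sketch of the mechanics with purely hypothetical numbers (not the actual HDFC AMC or UltraTech figures, which come from their annual reports): two firms with the same market capitalisation end up with very different P/B ratios simply because the service firm's "assets" are expensed as employee cost.

```python
# Hypothetical illustration: same market cap, very different book values
def price_to_book(market_cap, net_assets):
    return market_cap / net_assets

service_firm = {"market_cap": 60_000, "net_assets": 4_000}    # employees expensed, small balance sheet
asset_heavy  = {"market_cap": 60_000, "net_assets": 30_000}   # plants and inventory sit on the balance sheet

print("Service firm P/B:", price_to_book(**service_firm))     # 15.0
print("Asset-heavy  P/B:", price_to_book(**asset_heavy))      # 2.0
```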

So, the P/B ratio is not a great ratio for analyzing service-sector companies.

Operating Leverage Research Paper by Michael Mauboussin

“Operating leverage measures the change in operating profit as a function of the change in sales.” To gauge a company's operating leverage, one can check the ratio of its fixed to variable costs. Fixed costs are not affected by the company's sales, so if sales are low, fixed costs dampen profit; if sales are higher, profit is higher too. Generally, companies with a higher fixed assets to total assets ratio also have higher fixed costs, so there is a positive correlation between the fixed assets to total assets ratio and operating leverage.
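A minimal sketch of the measure with hypothetical numbers: the degree of operating leverage can be approximated as the percentage change in operating profit divided by the percentage change in sales, and a meaningful fixed-cost base makes the ratio exceed one.

```python
# Degree of operating leverage = % change in operating profit / % change in sales
def pct_change(new, old):
    return (new - old) / old

sales_old, sales_new = 1_000, 1_200   # hypothetical sales
fixed_cost = 400                      # unchanged by volume
variable_cost_ratio = 0.5             # variable cost as a fraction of sales

profit_old = sales_old - fixed_cost - variable_cost_ratio * sales_old   # 100
profit_new = sales_new - fixed_cost - variable_cost_ratio * sales_new   # 200

dol = pct_change(profit_new, profit_old) / pct_change(sales_new, sales_old)
print(f"Operating leverage ≈ {dol:.1f}")   # 1.0 / 0.2 = 5.0
```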

Sales Growth

Sales growth is forecast using economic growth and industry growth. Industry growth follows an S-curve, and analysts tend to make mistakes in the middle of the S-curve. To assess industry size, the number of potential customers can be multiplied by revenue per customer. Mergers and acquisitions also need to be analyzed carefully, as they can change the nature of a company's operating leverage; the evidence shows that it is challenging to create value through mergers and acquisitions. Increasing market share can also increase profitability, as there is a positive correlation between market share and profitability.

Sales growth is an important value driver because it is the source of cash and affects the value factors. Sales growth creates value only if the company is earning more than its cost of capital; otherwise it destroys value.

The threshold margin is the level of operating profit margin at which a company earns its cost of capital. A company with higher capital intensity requires a higher operating profit margin to break even in terms of economic value. The threshold margin can therefore be used to connect sales growth, profit and value creation.
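One way to see the link, as a simplified sketch rather than the paper's exact formula: under a perpetuity approximation with hypothetical inputs, incremental sales create value only if the after-tax operating profit they generate, capitalised at the cost of capital, at least covers the incremental investment needed to support them.

```python
# Simplified threshold-margin sketch (perpetuity approximation, hypothetical inputs):
# value created per rupee of new sales = margin * (1 - tax) / wacc - incremental_investment_rate
# setting this to zero gives the margin needed just to earn the cost of capital
wacc = 0.10                          # cost of capital
tax_rate = 0.25
incremental_investment_rate = 0.60   # capex + working capital per rupee of new sales

threshold_margin = incremental_investment_rate * wacc / (1 - tax_rate)
print(f"Threshold operating margin ≈ {threshold_margin:.1%}")   # 8.0%
```

Note how a higher incremental investment rate (higher capital intensity) pushes the threshold margin up, which is the point made in the paragraph above.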

Value Factors

Operating profit margin can vary based on sales. To sort out the cause and effect of changes, the value factors below can be considered.

Volume – Volume changes lead to sales changes and thus can influence operating profit margin through operating leverage and economies of scale.

Price and Mix – A change in price can impact margins, i.e. if a company sells the same units at higher prices, margins rise, and vice versa. Warren Buffett argued that “the single most important decision in evaluating a business is pricing power.” To assess pricing power, price elasticity can be used.

Operating Leverage – Preproduction costs, i.e. investments made before generating sales and profits, are capitalized on the balance sheet and depreciated later. These costs lower the operating profit margin in the short run, but as sales rise, the operating margin increases because the incremental investment is small. Capacity utilization can be used to assess operating leverage.

Economies of Scale – Economies of scale mean that a company is able to lower its cost per unit by producing in higher quantities; they lead to greater efficiency as volume increases.

Cost Efficiencies – These efficiencies can come in two ways: a company either reduces cost within an activity or reconfigures its activities.

Financial Leverage

Debt increases the volatility of earnings because interest has to be paid regardless of sales. A higher debt to total capital ratio is consistent with higher financial leverage, but holding a substantial amount of cash distorts this relationship.

Credit ratings are a proxy for financial leverage. Companies with higher ratings generally have lower debt, higher margins, and a good interest coverage ratio.

Threshold and Incremental Threshold Operating Profit Margin

The threshold margin is the level of operating profit margin at which the company earns its cost of capital; if the company earns more than its cost of capital, it creates value. The incremental threshold margin captures the margin required on new sales to earn the cost of capital on the incremental investment.

Summary of “How to Lie with Statistics” by Darrell Huff

This book tells us how we can be tricked using different tools of statistics. We see statistics in our daily life: in toothpaste advertisements, in companies' annual reports, in different kinds of surveys, and so on. But if we don't understand the essence of the data, we can easily be fooled by it. This book gives an idea of how to interpret such data more correctly.

Sample with the Built-in Bias

When data is studied in statistics, it is based on a sample, and this sample can be anything. If we interpret the data without knowing the details of the actual sample, we can go wrong. The best example of this is average income. In many surveys we see the line “The average income of this group of people is Rs. X.” If we blindly follow this figure we can be misled, because we know little about the sample: who participated, whether they belonged to the same group, and, the biggest gamble of all, whether they were telling the truth. If they were lying, the data is of no use. In the same way, if the sample does not comprise an appropriate group of people, it can mislead anyone reading the data. The author gives different examples to explain this. Take the marks of students: if someone wants to prove that a particular class has higher average marks than others, they can include only those students in the sample who have good scores.

Different Kinds of Averages

When anyone talks about an average, most people think of the simple average (the mean). But there are two more averages: the median and the mode. The same data, with no change, can be shown in three different forms: the mean will tell one story, the median another, and the mode yet another. A statistician, or anyone presenting data, will use the average that best supports their case. Although the meanings of these three averages are different, they are all called averages.

Take the example of students' marks. The mean, median and mode can all be different depending on the distribution of marks. So when someone quotes the average marks of the students, it is necessary to know which average was used.
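A small sketch with made-up marks showing how the three "averages" of the same data tell different stories:

```python
from statistics import mean, median, mode

# Hypothetical marks of ten students in a class
marks = [35, 40, 40, 42, 45, 48, 55, 60, 90, 95]

print("Mean:  ", mean(marks))    # 55.0 - pulled up by the two very high scores
print("Median:", median(marks))  # 46.5 - middle of the distribution
print("Mode:  ", mode(marks))    # 40   - most frequent mark
```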

Missing the little ones

While studying data in statistics there are a lot of factors to look at. Some of these figures can be small, and many people ignore them; this is where they are tricked. Suppose a company wants to show that the product it manufactures is effective. It will conduct surveys, ask people to use the product, and later report the results. Since the company wants the result to look favourable, the survey may be conducted on a very small number of people until the results turn out in the company's favour; even if things go wrong, the survey can simply be repeated, because small surveys don't cost too much. Take the example of tossing a coin. If we toss a coin ten times, heads may come up eight times, suggesting the probability of heads is 80%; but if the coin is tossed 100 times, the result will be quite different. One should also not follow a single number blindly. Say you went camping and selected the place by looking at its average temperature; it is necessary to look at the range as well, because the temperature may swing from very low to very high. Missing these small but important points can easily mislead anyone, because they are not given much importance.
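A quick simulation (illustrative only) of why small samples mislead: the share of heads in 10 tosses swings far more than in thousands of tosses.

```python
import random

random.seed(1)  # fixed seed so the illustration is reproducible

def heads_share(n_tosses):
    # Each toss is a fair coin; count the fraction that land heads
    return sum(random.random() < 0.5 for _ in range(n_tosses)) / n_tosses

for n in (10, 100, 1_000, 10_000):
    print(f"{n:6d} tosses -> share of heads = {heads_share(n):.2f}")
```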

Ignoring the Errors

While calculating any data point in statistics, there is usually something that has not been considered, so the figure cannot be fully trusted; there could be an error in it. For example, when surveys are conducted it is not necessary that everyone is telling the truth, so there is always a margin of error. This error can come from ignoring qualitative factors, from people lying, and so on. While calculating the IQ of students, qualitative attributes such as leadership skills and creative skills are ignored, so the IQ figure can also contain error. Hence, the possibility of error should be considered while studying any data point.

Playing with the Graphs

Many people use charts or graphs to present data in a better way; a trend, for instance, can be observed easily. But a chart can also be easily manipulated. The data it depicts may be correct, but if the way of presentation is changed, the story changes. Let's take an example. A person wants to show the increase in cases of a particular disease over a period of one year, where cases started from zero and went up to 1,000 by the end of the year. On the x-axis he plots months and on the y-axis he plots cases. If he takes the y-axis scale as 100 cases per centimetre, the observer will see a steep rise in cases; but if the scale is taken in thousands, it will look as if there was hardly any rise at all. By playing with the scale, charts can be manipulated.

One-Dimensional Picture

Apart from line charts, bar charts can also be used to manipulate the presentation. Let's take an example. Suppose you are showing the number of coronavirus patients at two different points in time, and the cases have doubled: the first bar shows 1,500 cases and the second 3,000. As the cases have doubled, the second bar is also double in height, so the viewer understands that the cases have doubled. But if the same thing is done with pictorials, it tells a whole new story. The author uses the example of cows: if the number of cows in a country has doubled and the pictorial shows two cows in which the second one is drawn roughly double in size, a person may think that the size of the cows increased rather than their number.
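The distortion can be put in numbers with a simple sketch: assuming the second picture is drawn twice as tall and twice as wide, its area on the page is four times as large even though the underlying quantity only doubled.

```python
# Doubling both height and width of a picture quadruples its printed area
height, width = 1.0, 1.0
small_area = height * width
big_area = (2 * height) * (2 * width)

print("Quantity ratio:", 2)                      # the cows actually doubled
print("Area ratio:    ", big_area / small_area)  # but the picture looks 4x bigger
```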

Semi-Attached Figure

In statistics there are numerous ways to misrepresent data, and one of them is the semi-attached figure. The figure that was counted and the one that is reported sound the same but are not. For example, a report says that “X” number of people died in rail accidents. People will believe that all these persons were travelling on the train, but the figure also includes people who were in their car or on a two-wheeler and collided with a train. The number sounds like "this many people were killed while travelling by rail," but it is not the same thing. Such semi-attached figures can easily mislead anyone. Advertisements use this kind of number too, such as a packet of chips labelled "10% extra" – extra compared to what? One must be aware of this kind of trick and not fall for it. A number can be presented in many ways, and its actual gist may not be grasped by readers.

Post Hoc Rides Again

A lot of presented data relies on correlation between different things. On seeing such a correlation for the first time, it may look perfectly appropriate: a correlation suggests that one factor is responsible for the other. But the implied causation can be wrong. Say a study shows that people who smoke tend to have lower test scores. It can be read both ways: the student scores low and that is why he smokes, or he smokes very often and that is why he scores less. So the data can be interpreted wrongly, and in some cases the correlation is simply a coincidence.

Statistical Manipulation

The author focuses on the ways in which data can be manipulated, and shares various examples to prove this. Manipulation can be done by using different kinds of averages, i.e. mean, median and mode: different surveys can give you different "averages" for the same data. Percentages can also be used to mislead readers; if someone wants a figure to look higher, they can play with the base used to calculate the percentage. Double-sided charts can also be used to mislead readers.

How to Talk Back to a Statistic

There are many ways in which statisticians can lie to us or mislead us, but we can test the data up to a point. There are a few questions with whose help we can decide whether the data is believable or not.

  1. Who Says So?

Check for conscious and unconscious bias. Data could be suppressed to show a specific result, or the units could be switched. Charts can be manipulated, so special attention needs to be given here.

  2. How Does He Know?

As surveys are conducted over a large population, not everyone participates in them. Check whether the sample is large enough to describe the data precisely. And since correlations are common, check whether the correlation is significant or not.

  3. What’s Missing?

Look for the things missing from it. For instance, if a correlation is given, check whether a measure of reliability is given. If an average is given, check which kind of average it is. Sometimes the factor that actually caused a change is also missing.

  4. Did Somebody Change the Subject?

Check whether the subject has changed, for example whether the definition of the subject has changed: earlier it meant one thing and now it means another. The author gives the example of farms: the reported number of farms changed simply because the definition of a farm changed, altering how many farms qualified under it.

  5. Does it Make Sense?

Many numbers are just assumptions, or are derived from a formula that is not precise. In that case the figure is just a number, not a meaningful average.

Thus, Darrell Huff tries to give readers an idea of how we can be fooled using different kinds of statistical tools, and how we can tackle them and interpret data in a smarter way.

Equity Dilution vs. EPS Dilution

Equity dilution means a change in the percentage ownership of existing shareholders. As the percentage ownership changes, the profit attributable to each shareholder changes too. Let's see this with a hypothetical example.

Assume there is a company with 10 shares outstanding and only 2 shareholders, each holding 5 shares. We also assume the company earns a profit of Rs. 1,000.

This is the shareholding pattern (before equity dilution).

As assumed, the company earns a profit of Rs. 1,000. Let's see how this profit will be distributed among the shareholders.

This is how profit will be distributed among both shareholders.

In this case, both shareholders are earning profit of Rs.500.

Now assume the company needs more capital and issues 10 new shares. We also assume the new capital was used efficiently and generated an additional profit of Rs. 1,200.

This is the shareholding pattern (after equity dilution).

Now, let us see how the profit will be distributed.

This is how the profit will be distributed (after new shareholder enters)

So in this case, earlier there were only 2 shareholders, each with 50% ownership, and the profit was divided equally between them. After the company raised new capital, total profit went up and their earnings increased (from Rs. 500 to Rs. 550). Hence their percentage ownership decreased but their earnings increased. This is called EPS accretion, i.e. equity dilutes but EPS increases.

Hence, equity dilution may not lead to EPS dilution every time.

Wait – what is EPS dilution? Let's take the same example again to understand it.

The case is the same as above, i.e. there are two shareholders with 50% ownership each. Earlier the profit was Rs. 1,000, distributed equally between them. They then raised additional capital by issuing new shares, so their equity was diluted.

But this time the capital was not used efficiently, and the company ended up earning a profit of only Rs. 700 on the additional capital.

So let us see how the profit will be distributed now.

This is the new scenario where there is EPS dilution.

In this case, as the capital was not used efficiently, the profit on the new capital was lower, which led to lower earnings for both 'A' and 'B' (from Rs. 500 to Rs. 425 each). This is called EPS dilution, i.e. earnings per share decrease.
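Here is a minimal sketch reproducing both scenarios in code, using the same hypothetical numbers as in the example above:

```python
def eps_after_issue(old_shares, old_profit, new_shares, profit_on_new_capital):
    # EPS on the enlarged share count, including profit earned on the new capital
    total_shares = old_shares + new_shares
    total_profit = old_profit + profit_on_new_capital
    return total_profit / total_shares

eps_before = 1_000 / 10                                 # Rs. 100 per share, Rs. 500 per holder of 5 shares

eps_accretive = eps_after_issue(10, 1_000, 10, 1_200)   # Rs. 110 -> Rs. 550 per original holder
eps_dilutive  = eps_after_issue(10, 1_000, 10, 700)     # Rs. 85  -> Rs. 425 per original holder

print("EPS before issue:  ", eps_before)
print("EPS accretion case:", eps_accretive, "-> earnings per original holder:", eps_accretive * 5)
print("EPS dilution case: ", eps_dilutive,  "-> earnings per original holder:", eps_dilutive * 5)
```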

Hence, two things matter in the equity dilution vs. EPS dilution framework. The first is RoIIC, i.e. Return on Incremental Invested Capital, which shows the return the company generates on newly invested capital. The second is a high P/E: if the company's P/E is high, equity dilution can be profitable because the company raises more money for each share it issues.

Operating Leverage

Operating leverage means the company's profit increases more than the increase in revenue (in percentage terms). Example – revenue increases by 20% and profit increases by 40%.

Here we will take a hypothetical example of company 'A' to understand its good and bad sides.

In this case the company's revenue and variable cost are both increasing. But as the fixed cost has not changed, the overall percentage increase in total cost is lower than the percentage increase in revenue. Hence, the profit margin increases.

In this case the operating leverage is 2, i.e. for a 1% increase in revenue, profit will increase by 2%, and vice versa.

But this could turn out to be a bad scenario if the market is not in a good position. Let's see how.

In the bad scenario, the company's revenue decreased by 25% and variable cost also decreased by 25%. But since the fixed cost did not change, total cost fell by less than 25%, i.e. revenue decreased more (in percentage terms) than total cost, so the profit margin was hit.

Hence, operating leverage can be good when the market is in a good position, but if the market takes a U-turn then operating leverage works in the opposite direction: if in good times profit rose 2% for every 1% increase in revenue (assuming operating leverage of 2), then in bad times profit will fall 2% for every 1% decrease in revenue.
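A short sketch of this symmetric effect with hypothetical numbers for company 'A' (the fixed cost is held constant while variable cost moves with revenue; the figures are illustrative, not from the original tables):

```python
def operating_profit(revenue, variable_cost_ratio, fixed_cost):
    return revenue - revenue * variable_cost_ratio - fixed_cost

fixed_cost = 300
variable_cost_ratio = 0.5
base_revenue = 1_000

base_profit = operating_profit(base_revenue, variable_cost_ratio, fixed_cost)          # 200
good_profit = operating_profit(base_revenue * 1.25, variable_cost_ratio, fixed_cost)   # 325
bad_profit  = operating_profit(base_revenue * 0.75, variable_cost_ratio, fixed_cost)   # 75

print("Base profit:", base_profit)
print("Revenue +25% -> profit", good_profit, f"({good_profit / base_profit - 1:+.1%})")  # +62.5%
print("Revenue -25% -> profit", bad_profit,  f"({bad_profit / base_profit - 1:+.1%})")   # -62.5%
```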

Industries Beating Cost of Capital

Analyzing the industry should be the first step in investing, and many iconic investors have quotes on this.

If an investor finds a good industry, he has mitigated some of his risk. How? Because if we analyze any industry over a longer horizon, say 10 or 15 years, we can get a decent idea of how the industry behaves (in terms of profitability). Also, by taking a longer horizon we can have more confidence in the result, because any economy will see different phases of the economic cycle over 10 or 15 years.

The nature of an industry doesn't change overnight. Example – the tyre industry's nature will not change in the short term (until there is something better than the tyre). Tyres are an essential part of vehicles, so we can say that the tyre industry's business can't go out of fashion (although companies within the industry can).

Charlie Munger, often called the wisest man alive, also quipped, "Fish where the fish are." Suppose there are two ponds (take these as industries), each with 10 fish (take the fish as companies), but one pond has 7 rotten fish and the other has 3. Where would you fish? Obviously the second one, because the probability of getting a rotten fish there is lower.

In the same way, if an industry has companies that have been earning profits above their cost of capital for a long time, it can be taken as a good industry, and vice versa.

So, we have analyzed almost 80 sectors of the Indian economy. We have taken RoCE, i.e. Return on Capital Employed, for the calculation of economic profit; RoIC, i.e. Return on Invested Capital, would have been a better measure, but we have used RoCE as a substitute. We have also taken a weighted average, because some companies had extremely high returns but a low overall weight in the industry, which portrayed a misleading image of the industry; the weighted average normalizes the return.
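A minimal sketch of the weighting step (the company names, capital employed and RoCE values below are hypothetical, not from the actual workbook):

```python
# Weight each company's RoCE by its capital employed so small outliers don't distort the sector figure
companies = [
    # (name, capital employed in INR crore, RoCE)
    ("Company A", 10_000, 0.12),
    ("Company B",  6_000, 0.09),
    ("Company C",    200, 0.60),   # tiny company with an extreme return
]

total_capital = sum(capital for _, capital, _ in companies)
weighted_roce = sum(capital * roce for _, capital, roce in companies) / total_capital
simple_average = sum(roce for _, _, roce in companies) / len(companies)

print(f"Simple average RoCE:   {simple_average:.1%}")   # 27.0% - pulled up by the outlier
print(f"Weighted average RoCE: {weighted_roce:.1%}")    # 11.5% - closer to the sector's reality
```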

Here is the excel file where all the calculations are done.

Done by Sanjeel Kothari and Kusum Chaudhary

Summary of “How the Economic Machine Works” by Ray Dalio

Here is the link to the video https://youtu.be/PHe0bXAIuk0

The economy looks complex, but it works in a simple, mechanical way. It is made up of simple transactions that are repeated over and over again, and transactions are done by everyone. A transaction can be settled in two ways: by paying cash or on credit. Adding both together gives total spending. Transactions are the building blocks of the economic machine. The economy consists of different markets, and every market has buyers and sellers doing transactions. Every bank, institution and individual transacts in the same way, i.e. either by paying cash or on credit.

The government is the biggest buyer and seller in an economy, and it can be divided into two institutions: the central government and the central bank. The government collects taxes and spends, while the central bank manages the money in the economy through interest rates and by printing new money.

Credit is the most important part of an economy. It helps both lenders and borrowers fulfil their needs: borrowers borrow money with a promise to repay the principal with interest. When interest rates are high, borrowing is lower because it is expensive (more interest has to be paid); when interest rates are low, borrowing is higher because it is cheap (less interest has to be paid).

Why is credit important?

Credit is important because when people borrow money they spend it, and this spending is income for someone else. That person will spend further, becoming income for yet another person, and this is how the cycle works. A person with good income has good creditworthiness, i.e. banks are ready to lend to him because he has the ability to repay or can put up assets as collateral. This is how cycles are created in the economy. The more a person earns, the more he can raise his living standards: this is productivity growth. Productivity matters most in the long run because it doesn't fluctuate much. Debt is a driver of the economy because it allows us to consume more when we borrow, and when we pay it back we spend less.

Let's assume two cases: one economy without credit and one with credit. In the economy with no credit, the only way spending can increase is by increasing income, which requires producing more. In an economy where credit is available, there will be cycles, and these cycles arise from the nature of credit: every time a person takes credit he spends more now, but in the future, when he has to repay it, he spends less. Credit is different from money: with money the transaction is settled immediately, while with credit it is settled in the future. Credit can be good or bad depending on its use. Example – credit used to buy a TV will not generate any return, but credit used to buy a tractor helps in farming, through which you can earn money.

Credit leads to the short-term debt cycle. This is how it works: a person earns, say, $100,000 with no debt, so he can borrow $10,000 and spend a total of $110,000. This $110,000 becomes another person's income; with no debt, that person can borrow $11,000 and spend $121,000, which later becomes yet another person's income, and the cycle builds up. But what goes up must come down. As people spend more, there is an expansion, and as spending increases, prices rise too, which we call inflation. In this scenario the central bank raises interest rates, which leads to less new borrowing and makes existing borrowings costlier. With less borrowing and higher debt payments, spending falls and eventually incomes fall, prices drop, and the economy faces a recession. When inflation is no longer a problem, the central bank lowers interest rates so that the economy picks up again. Overall, when credit is easily available there is an expansion, and when it is not there is a recession; this cycle is managed by the central bank, generally by changing interest rates.

Then comes the long-term debt cycle. At the end of each short-term debt cycle, both growth and debt rise, because people spend more rather than paying down debt. Even though people carry rising debts, lenders lend freely because things are going great: incomes are rising, spending is rising, asset prices are rising, the stock market is rising, and overall there is a boom. When this happens in huge volumes it is called a bubble. Rising incomes and asset values help borrowers remain creditworthy, but as the debt burden builds up slowly over time, debt repayments grow faster than incomes. Spending is then cut, other people's incomes fall, and they become less creditworthy; to pay back the debt, spending has to be cut even further. The economy begins deleveraging. In a deleveraging, incomes fall, asset prices fall, credit dries up, and the stock market crashes. As incomes drop and debt repayments press down, people are forced to sell assets, so the stock market and the real estate market crash. This appears to be a recession but is not: in a recession, the central bank can lower interest rates to stimulate borrowing, but in a deleveraging interest rates are already at or near zero. Four ways the debt burden can be brought down in a deleveraging are: cutting spending, reducing debt through defaults and restructuring, redistributing wealth, and the central bank printing new money.
But it is not that easy. As spending is cut, other people's incomes fall. Businesses cut costs, which means laying off employees; with no income, those employees cannot repay their loans, banks then struggle to pay depositors, and there are defaults all around. When credit is extended, both assets and liabilities arise, and lenders don't want to see their assets disappear, so debt restructuring comes in: the lender accepts less money, or gets paid over a longer time, or at a lower interest rate than originally agreed. With lower incomes and higher unemployment, the government's source of income, tax collection, is hurt; at the same time the government has to spend on those who lost their jobs. In short, the government spends more than it earns and has to borrow, but from whom? The government raises taxes on wealthy people, redistributing wealth from the "haves" to the "have-nots." Then the central bank steps in: it prints more money and buys assets, so asset prices start rising, but this helps only those who hold financial assets. So the government and the central bank cooperate: the central bank buys government bonds, giving the government money to buy goods and services and eventually put money into people's hands. If all the levers of deleveraging are pulled in a balanced way, the deleveraging can be beautiful: growth will be slow but the debt burden will be reduced, and overall the economy will start to rise again.
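The income-becomes-spending chain described above can be sketched numerically; the $100,000 starting income and 10% borrowing limit are the same illustrative figures used in the summary.

```python
# Each person's spending (income plus 10% borrowed) becomes the next person's income
income = 100_000
borrowing_ratio = 0.10

for step in range(1, 6):
    spending = income * (1 + borrowing_ratio)
    print(f"Step {step}: income = {income:,.0f}, spending = {spending:,.0f}")
    income = spending   # one person's spending is the next person's income
```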

Few points to remember:

  • Don’t have debt rise faster than income.
  • Don’t have income rise faster than productivity.
  • Do all that you can to raise your productivity.