The Cloud Needs Land

The massive rise of the cloud ecosystem has created a supply and demand strain on a very un-cloud resource – land. Put simply, physical data centers are needed to power the cloud computing world, and more are needed. Because of this, investments in data centers and their operational ecosystems have enjoyed enormous success and are likely to continue to do so, despite their high fundamental valuations. I should know. Over a year ago, I built a data center real estate portfolio that has generated a realized return of 56%. Encouragingly, sophisticated investors in the space are betting this growth will only continue.

In late July, Iconiq Capital, an investment firm that represents some of Silicon Valley’s most notable tech titans, registered an investment subsidiary called Iconiq DC Management, LLC. The company will operate funds exclusively focused on data center real estate and data center assets. This niche investment thesis includes the acquisition of direct interests in real property, the formation of joint ventures, the acquisition of securities in entities that own or invest in data center-related assets, investment in real estate investment trusts (REITs), and the issuance or participation in financial instruments designed to further catalyze the data center ecosystem.

What’s special about Iconiq? As outlined in a recent Bloomberg article by Miles Weiss, the Divesh Makan-led firm manages seed- to late-stage investment capital for the likes of Mark Zuckerberg, Sheryl Sandberg, Reid Hoffman, Sean Parker, and Jack Dorsey. This rare combination of capital, control, and industry IQ creates a significant proprietary advantage for long plays in technology. Forget industry speculation – this group of household names comprises the de facto market-makers in high-tech real estate.

Despite the reassuring move by Iconiq, financial rewards for investors at this stage may harbor risks – namely, steep prices relative to the current bottom line. The high valuation multiples, calculated as price over net income for data center REITs, should prompt caution for any investor with a sense of tech history. High multiples and bubbles are often synonymous, and at times a rising trend is hard to distinguish from investor herding or deliberate momentum trading. The dot-com boom and bust shook industry experts and casual market participants alike – acknowledging the absence of market certainty, this could be no different. Even with superstar backing, the data center REIT market is still subject to a pricing correction if the market deems current levels of projected growth unrealistic. Furthermore, a rate hike or other unforeseen macroeconomic events can dramatically alter investor liquidity or perceptions of value.
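For reference, the price/net income multiple is simple division. The sketch below uses made-up figures for a hypothetical REIT and a hypothetical market benchmark, purely to illustrate how an elevated multiple is read – none of these numbers describe any actual security:

```python
def valuation_multiple(price_per_share: float, net_income_per_share: float) -> float:
    """Price / net income – the P/E-style multiple discussed above."""
    return price_per_share / net_income_per_share

# Illustrative, made-up figures (not real market data):
reit_multiple = valuation_multiple(price_per_share=90.0, net_income_per_share=2.5)
market_multiple = valuation_multiple(price_per_share=100.0, net_income_per_share=5.0)
print(f"REIT: {reit_multiple:.0f}x vs. market: {market_multiple:.0f}x")
# prints: REIT: 36x vs. market: 20x
```

A multiple well above the broad market’s means investors are paying a steep premium today for growth they expect tomorrow – exactly the situation that warrants caution.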

To warrant a high valuation multiple, data center REITs inherently need to offer a market opportunity that matches lofty return expectations. The thesis of my data center REIT portfolio, and my assumption regarding Iconiq DC Management’s long-term strategy, is to capture growth generated by the data center market’s shift from pure tech plays (i.e., Amazon, Facebook, and Google) to the rest of the Fortune 500. Large corporations in every industry stand to make tremendous gains from the cloud and the robust ecosystem of analytics, operational controls, and customer resources growing within it – and when there are gains or savings to be made, executives will seek them out. As the corporate world shifts from on-premises systems to the cloud, more and more data centers will need to be built to accommodate this post-tech-industry wave of demand.

Perhaps no example of the corporate transition from on-premises IT solutions to the cloud is stronger than SAP. Not only are 87% of the Forbes Global 2000 SAP customers, but SAP’s own cloud revenue has grown 30%+, excluding acquisitions, for 12 consecutive quarters. With 110 million subscribers in the company’s growing cloud user base, the operational foundation for the enterprise cloud is clearly here to stay. Placing this information within the context of a long-term investment strategy for the corporate world’s own big data needs, it’s reasonable to conclude the data center real estate market will continue its sharp growth – Iconiq Capital and I are counting on it.

What Singapore Can Teach Business

There’s an overwhelming stream of published information on how companies can improve performance, grow, and increase their bottom line. Naturally, the vast majority of these insights are derived directly from the business world. But is business the only tributary for strategic information? From a credibility standpoint, it’s a problematic resource. Historically, the long-term health of large corporations has been extremely difficult to manage. As the Fortune 500 list indicates, only 57 companies remain from the original 500 that started in 1955. Perhaps even more disconcerting, the rate at which companies fall off the list is now greater than ever before. If big business cannot serve as a definitive model for long-term growth, is there another option?

Yes. The city-state of Singapore.

Since 1960, only 5 years after the birth of the Fortune 500 list, Singapore has generated growth that would leave any shareholder in awe. Given such a brief time period, the country’s meteoric rise from humble beginnings to a 21st century powerhouse is nothing short of extraordinary.

When compared to other countries, there are key themes which Singapore has pursued with laser-like tenacity. The country’s uncompromising attitude toward high-level ideals and big-picture strategy has been economic rocket fuel. Boiled down, Singapore’s hyper-focus centers on internal investment in human capital, collective motivation, and the use of trade and strategic alliances as instruments of growth.

The Fortune 500 is more volatile than ever due to the speed with which modern markets can develop and implement disruptive technology. Much like a progressive company devoted to continuous innovation for offensive and defensive purposes, Singapore views technological advancement as a national priority. The government also has a firm view of what the future of technology looks like – and it has made bold moves to transition the country into an information- and knowledge-based economy. As Clayton Christensen outlines in The Innovator’s Dilemma, companies often put too much emphasis on customers’ current needs and fail to adopt new technology or business models that will meet their customers’ unstated or future needs. Developing and maintaining a long-term view has yielded tremendous results for Singapore – an approach which contradicts the short-term, EPS-driven focus of corporate America.

From the ground up, the country has built an educational, political, and business infrastructure that propels technological research and development forward. This foundation has also served as a useful vehicle to attract and support high value industry. Within a corporate context, a similar approach would be the creation of a company culture focused on employee education, management structures, and partner strategies that drive long-term innovation and growth. Can we say that most Fortune 500 companies represent these strategic values? Not a chance. 

For Singapore-like growth, what level of dedication to continuous innovation is required? To start, the country’s gross expenditure on R&D is constantly rising. From 2011 to 2015, the government invested S$16.1 billion in research, innovation, and enterprise. In 2011 alone, the country increased R&D spending by 14.8%; by contrast, public R&D grew just 0.82% during the same year. From these numbers, it is clear there’s a sense of urgency in Singapore’s leadership to invest in research that will yield further long-term economic and social improvements.

Culturally, there is a heavy focus on educational excellence in Singapore. The country is home to some of the best universities in the world, and its concentration of researchers per capita is second only to Finland’s. The corporate equivalent would be Google-like standards for employee education and development. Through exhaustive recruitment efforts and very high standards for new talent, companies can cement a culture of sustainable excellence – a feature that is notoriously difficult to implement at later stages of a company’s growth.

Within a strategic alliances context, Singapore is a very business-friendly economy, ranking second in the Heritage Foundation and the Wall Street Journal’s Index of Economic Freedom. Determined through metrics that rank trade, investment, and labor freedoms, the ranking highlights Singapore’s willingness to treat foreign business on par with local companies.

It’s common for large companies to grow in unnecessary complexity – building internal teams and capabilities for needs that stretch far beyond their established core competencies. As Singapore indicates, a company shouldn’t attempt to do it all. Through cultural cross-pollination and the gains generated via trade, Singapore has created a transactional hub of enormous economic value. It’s not sustainable for any company to operate in isolation. Like Singapore, companies should constantly assess their comparative advantages and seek ways to gain further economic efficiencies.

The country is currently ranked first in the World Bank’s standings for “ease of doing business” and has been dubbed the “world’s most innovative country” by the Global Innovation Index. From a corporate perspective in 2015, we would likely give that title to Google or Apple. However, despite the astounding success both these tech giants have had, they are far too young for us to determine how their policies, management structures, and strategic planning shape sustainable, long-term growth. Fortunately, Singapore fills that referential void and provides a useful indicator that innovation-based economic structures do yield sustained growth.

 Investing in Second Movers

In entrepreneurial ventures, being first to market is considered an enormous advantage and increases the likelihood of outside investment. Not only is this first-mover position coveted by venture capital; many entrepreneurs pursue only business models that will create new markets. Conventional wisdom and widespread hype stress that being first to market is critical to securing market share and generating early sales.

However, depending on the industry, product, and business model, being the second to market can actually provide an upper hand over those eager first movers. In slower-paced industries, first mover advantage can exist when the rate of technological innovation is low, but for high-speed technologies, especially those digital in nature, second movers hold a positional edge. 

The power of the second mover advantage can be witnessed in the rise of social media. As recent history has shown, Friendster and Myspace were the first major social networks to market, but they did not hold a strategic edge over Facebook. On the contrary, there is compelling evidence that Facebook held a distinct set of advantages in the social media arms race.

As venture capitalists are painfully aware, there is never a guarantee that new markets will yield paying customers. With VCs profiting from only 10% of the companies they invest in, new ventures must be assessed within frameworks that carefully evaluate the risks of new markets. Extensive customer validation is one such method, and by letting Myspace conduct this valuable validation process, Facebook was able to avoid the costs associated with risky and time-intensive market testing. After witnessing Myspace succeed, Facebook had the luxury of pursuing a proven concept.

In addition to added risk, first movers also face significant costs to educate customers about a new market. At its inception, social media was an unfamiliar concept – even to many progressive technology enthusiasts. By entering the market as a second mover, Facebook was able to benefit from Myspace covering the cost and responsibility of the market’s social media education.

As the first mover, the probability that Myspace had its social media strategy optimized was extremely low – near zero. Unlike Facebook, it didn’t have a useful point of reference to track or test without incurring internal development expenses. Furthermore, as Myspace users familiarized themselves with an unoptimized design, they became entrenched in the status quo. This limited Myspace’s ability to truly innovate or implement design modifications without risking the alienation of a fragile base of new users.

As the second mover, Facebook had a proverbial blank slate. By observing where Myspace was effective and where user needs weren’t being adequately addressed, Facebook developed a website that leveraged the cumulative insights gained from a year of observing social media – market research paid for entirely by Myspace. Furthermore, it was much clearer to Facebook that it had to develop a superior user experience to overthrow the market incumbent. Regardless of what changes Myspace made during its first year of operations, Facebook still had the opportunity to decide whether its design should be a strategic alternative, a functional clone, or a complete change of market direction.

When assessing first or second mover advantage within an investment context, the industry, product-type, and business model are critical components to consider. Because digital technology has such low switching costs and rapid rates of product development and implementation – no company is safe. As second movers capitalize on the numerous competitive advantages offered in fast-moving markets, they must remain cognizant of their vulnerability to even newer market entrants.

To create sustainable growth over the long term, investors who focus on second movers must ensure the management team and business model can accommodate continuous innovation in anticipation of future threats - or they risk being a Myspace.

 The Implications of “More”

More is better, right? Well, not exactly.

Whether it’s new product features or burger selections on a menu, businesses around the world routinely take on unnecessary expenses to develop products they don’t need. Line expansions bring increased costs for design, manufacturing, distribution, and promotion – expenses which may carry significant economic risk. What may surprise you, however, is that larger offerings can also hurt sales – which raises the question: are you offering your customers too many choices?

Thoughtful pricing and portfolio strategies are critical elements for any business model, but to truly optimize sales, a deep dive into the counter-intuitive is necessary. Consumer purchasing behavior is deeply irrational, so when planning your next great product iteration, it’s important to be cautious of the non-obvious.

Small is Big

If you've always assumed more customer choices are better, you’re not alone. However, there’s a nasty little phenomenon known as decision paralysis – and there’s a good chance it will affect your customers if product range restraint isn't taken seriously. Barry Schwartz has written extensively about this effect, and in a 2005 TED talk, he detailed research conducted on Vanguard and its selection of mutual funds. Schwartz revealed that for every 10 funds the company offered, client participation rates went down 2%. That is, if the company offered 50 mutual fund options instead of 5, it experienced roughly a 10% decline in participation.
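Schwartz’s rule of thumb is effectively a linear model. The sketch below encodes only the 2%-per-10-funds slope from the talk; everything else is illustrative:

```python
def participation_drop(extra_funds: int, drop_per_10: float = 0.02) -> float:
    """Estimated loss in participation as a fund menu grows.

    Encodes Schwartz's reported Vanguard figure: roughly a two-point
    drop in participation for every ten additional funds offered.
    """
    return (extra_funds / 10) * drop_per_10

# Growing the menu from 5 funds to 50 adds 45 options:
print(f"{participation_drop(50 - 5):.1%}")  # prints: 9.0%
```

That lines up with the roughly 10% decline Schwartz described for the 5-versus-50 comparison.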

In a similar study, psychologists Mark Lepper of Stanford and Sheena Iyengar of Columbia examined the implications of choice for gourmet jams. In one condition, 6 varieties of jam were available for customer sampling, and in the other, shoppers had 24 options to choose from. Both versions of the test had roughly the same number of people taste the jams, but when it came to actually purchasing the product, 30% of tasters exposed to the small set opted to buy, but a mere 3% of tasters opted to purchase when faced with 24 varieties.

Sheena Iyengar has also examined this effect with mass-market products. Head & Shoulders was offering 26 different shampoos to customers, but when Procter & Gamble cut the line down to 15 options, brand sales increased by 10 percent. Iyengar noted that when line reductions result in both increased sales and lower infrastructure costs, margins can explode. For instance, the Golden Cat Corporation eliminated 10 products from its portfolio and saw profit increase a staggering 87%.

A different phenomenon related to the power of small sizes was observed in sports card bundling. In a perplexing study conducted by John List of the University of Chicago, two decks of sports cards were auctioned off. One deck had 10 cards, and the other included those same 10 along with 3 cards of lesser quality. Despite the fact that the 10-card deck was worth $15 and the 13-card deck was worth $18, respondents consistently bid higher prices for the 10-card deck.

Yes, you read that correctly. Even though the 13-card deck contained all 10 cards, it was still viewed as less desirable and garnered less money.

Wise Choices

Understand that “MORE” can backfire. There are many situations where increased product selection is completely warranted, but arbitrarily increasing product choice will not guarantee an increase in sales. As we've learned, sales can actually shrink if there are too many options.

Structure for Simplicity. If you have an abundance of product options, have no fear; your portfolio can still be organized to reduce the effects of decision paralysis. For instance, the 50 mutual funds in the Vanguard study could be broken down into three sub-categories: Low Risk, Moderate Risk, and High Risk. The 24 jams could be separated on the display table by core fruit type. These simple organizational mechanisms will greatly increase a customer’s ability to pursue a purchasing decision.

Underline your opportunity cost. Beyond the plethora of fixed and variable expenses new products bring, examine other ways you can invest your company’s time and capital. Instead of extending a product line, are there ways to operationally increase gross margin for existing products? Can you enhance production lines or re-engineer an expensive component to save money in the long run? Alternatives for time and resource allocation need to be a central part of the product planning process, and the corresponding risk of each action should be closely considered. The full costs of implementing new choices need to be weighed against the returns the new option is expected to bring. Critically, sales cannibalized from your own product lines shouldn't count toward this equation unless you can generate a higher margin with the new item or plan to phase out an existing one.

Let other companies do the heavy lifting. Are there points of market reference (even in outside industries) that you can lean on for product line direction? What colors are leading electronic companies using in their designs? How many product and color options are they offering in total? For food & beverage, how many menu options do the most popular restaurants or bars in town offer? While not definitive, there’s a world of analogs we can sample from if research budgets or human resources are limited.

And lastly…

Realize that consumer irrationality is testable, repeatable, and predictable. Analyze the limits of what your customers can mentally juggle and determine what product mix will generate the highest total ROI. Regarding the jams, it is completely plausible that 10 varieties would be more effective than 6, but as the data indicated, 24 options was simply too much jam.

Whether it’s standard strawberry or a table full of marmalade, find that sweet spot.

 Time for a New Roman

Straining my eyes late at night, I was nearly done proofreading my analysis of a medical technology valuation. To calculate the financial risk, my graduate program at the McCombs School of Business had me building Monte Carlo simulations – a mathematical technique used to estimate the probability of future scenarios.
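For readers unfamiliar with the technique, a minimal Monte Carlo sketch is below. The cash flows, volatility, and discount rate are invented for illustration – they do not come from the actual assignment:

```python
import random
import statistics

def simulate_npv(n_trials: int = 10_000, discount_rate: float = 0.10,
                 seed: int = 42) -> tuple[float, float]:
    """Monte Carlo NPV: sample uncertain annual cash flows, discount them,
    and report the mean NPV plus the probability of losing money.

    All financial inputs below are illustrative assumptions.
    """
    rng = random.Random(seed)
    initial_cost = 500.0                    # upfront investment, $k (assumed)
    mean_flows = [120, 150, 180, 200, 210]  # expected yearly cash flow, $k (assumed)
    npvs = []
    for _ in range(n_trials):
        npv = -initial_cost
        for year, mean in enumerate(mean_flows, start=1):
            flow = rng.gauss(mean, 0.25 * mean)  # 25% volatility (assumed)
            npv += flow / (1 + discount_rate) ** year
        npvs.append(npv)
    prob_loss = sum(npv < 0 for npv in npvs) / n_trials
    return statistics.mean(npvs), prob_loss

mean_npv, prob_loss = simulate_npv()
print(f"mean NPV ~ ${mean_npv:.0f}k, P(NPV < 0) ~ {prob_loss:.1%}")
```

Running thousands of randomized trials turns a single point estimate into a distribution of outcomes – which is what makes the “probability of future scenarios” quantifiable.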

Just as I was about to call it a day, I realized one of my charts had two different typefaces. Clearly, this was a minor issue, but after a day of simulation building, I couldn't help but ponder: “Can I statistically determine what typeface will give me the best grade?” This prompted another line of questioning: “Have previous typeface selections impacted my grades? If so, by how much?”

To my surprise (and fascination), I learned that yes, there does seem to be a correlation between typefaces and the perceived level of informational integrity that accompanies them. That is, the exact same content written in different type will lead to different levels of trust in said information.

Phil Renaud, a Halifax-based web developer and typography enthusiast, provided some of the first anecdotal evidence of such an effect. In college, Phil wrote 52 academic essays set in three different fonts. When categorized by font selection alone, he found the average grades differed, especially between Georgia and Trebuchet:

  • Georgia, 23 Essays, A Average

  • Times New Roman, 11 Essays, A- Average

  • Trebuchet, 18 Essays, B- Average

While thought-provoking, this was an uncontrolled experiment with too small a sample size to serve as any real proof of the font effect. More statistical meat was needed, and that’s where Errol Morris, Benjamin Berman, and famed Cornell psychology professor David Dunning come in.

In a New York Times op-ed, Morris, Berman, and Dunning agreed that more analysis was required beyond Renaud’s findings – so they proceeded to gather and interpret 150,000 data points concerning the impact of font selection. The team concluded that Baskerville carried a 1.5% advantage over the other tested typefaces (Helvetica, Computer Modern, Trebuchet, Georgia, and Comic Sans) with respect to perceived information credibility. These results were highly statistically significant, with less than a 1% chance that they arose from random variation.
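To see how a claim like that gets checked, here is a sketch of a standard two-proportion z-test. The reader counts and agreement rates below are illustrative assumptions – they are not the op-ed’s raw data:

```python
import math

def two_proportion_z(hits_a: int, n_a: int, hits_b: int, n_b: int) -> tuple[float, float]:
    """Two-sided two-proportion z-test; returns (z statistic, p-value)."""
    p_a, p_b = hits_a / n_a, hits_b / n_b
    p_pool = (hits_a + hits_b) / (n_a + n_b)              # pooled proportion
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_a - p_b) / se
    p_value = math.erfc(abs(z) / math.sqrt(2))            # two-sided normal tail
    return z, p_value

# Illustrative: suppose 75,000 readers saw each typeface and Baskerville's
# agreement rate was 1.5 points higher (52.0% vs. 50.5%) – assumed numbers.
z, p = two_proportion_z(39_000, 75_000, 37_875, 75_000)
print(f"z = {z:.2f}, p = {p:.2e}")
```

At sample sizes this large, even a 1.5-point gap is overwhelmingly unlikely to be random noise – consistent with the sub-1% chance the authors reported.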

While 1.5% may seem insignificant at first, that percentage could very well dictate the difference between an A- and a B+ – a distinction Phil Renaud understood intimately. Within a business context, an advantage or disadvantage of this magnitude would likely have a direct effect on sales and marketing efforts. For high-volume plays in particular, font choice could equate to thousands, or even millions, of dollars lost or gained.

Whether you’re trying to improve brand messaging, sales conversions, or simply maximize your odds of receiving a better grade, it’s clear that typeface selection is an area that shouldn't be overlooked. Furthermore, when small considerations regarding statistical advantage are stacked upon one another, the compound effect can have dramatic implications for an individual or organization. For example, alternate selections of typeface + color + size would have a greater range of advantage/disadvantage than typeface selection alone. In this regard, it would appear that crossing your t’s and dotting your i’s may have a whole new meaning.

I'll be starting with Baskerville.