Clear Predictions For Enterprise IT In 2017 — Chris Rust

Happy New Year everyone. Last year was a period of great innovation and activity for the information technology (IT) industry. I saw powerful mega trends such as:

  1. A global data center build-out arms race to support strong growth in cloud computing.
  2. Dramatic growth in the number of Internet-connected devices, now numbering in the high billions.
  3. An explosion of interest in applied big data and analytics powered by machine learning and artificial intelligence.
  4. The use of computer warfare as a weapon against industries, governments, and societies, giving rise to innovation across data/information security, application security, network security, encryption, threat assessment, breach analysis, compliance enforcement, automation, and other sub-fields within CyberSecurity.

Here are my predictions for what lies ahead for enterprise IT in 2017:

President-elect Trump Will Govern As He Campaigned — The Affordable Care Act (ACA) will be repealed and re-branded with something that retains key elements of the ACA. Campaign promises such as the trillion-dollar infrastructure re-build will be ratcheted back by economic realities and a Republican-controlled Congress that wants to reduce the federal deficit. Lower corporate tax rates (and maybe a one-time “tax holiday”) will help repatriate IT industry profits that are currently held offshore. Immigration reform will focus on illegal immigration and bad actors, but will not derail legal immigration of skilled technology workers.

Innovative IT Startups Will Produce Great Companies And Great Exits — Large IT industry leaders such as Apple, Cisco, Facebook, Google, and Microsoft will use their large free-cash war chests to continue acquiring VC-backed IT startups that deliver best-in-class products and people. Repatriation of offshore profits may provide additional resources for IT M&A. The IPO window will open for high-growth, high-gross-margin IT companies. Following record VC distributions in 2013–2014, large war chests from successful VC capital formations in 2015–2016 will be thoughtfully deployed.

Public Cloud Will Become A “Game Of Thrones” Between House Amazon, House Microsoft, and House Google — Now heading into its second decade, the mega trend of public cloud computing will continue. However, the dominance of Amazon Web Services (AWS) will be increasingly challenged by Microsoft Azure and Google Cloud Platform (GCP). Customers (many of whom compete with Amazon in eCommerce) will migrate to Azure and GCP to leverage expanded product suites, which both vendors will continue to fill in as they did in 2016. Hundreds of small cloud providers will fight among themselves for the crumbs around the edges. Some great businesses will emerge from specialty clouds that address specific industry verticals (e.g., cloud security service providers, genomics/bioinformatics).

AWS’s High Operating Income Will Drive Private Cloud Investment — AWS has become the profit center for Amazon’s thin-margin eCommerce business. In contrast, Microsoft and Google have much greater resources from their core franchises to invest aggressively in their public cloud offerings, and both have demonstrated a strong will to do so. AWS’s impressive 3Q16 revenues of $3.2 billion and operating income of $861 million (up 55% over 3Q15) will fuel growth in private clouds by large enterprise customers who want lower costs, greater agility, and more control of their apps.

Containers Will Challenge Hypervisors — VMware’s first product, VMware Workstation, was introduced in May of 1999. VMware created the server virtualization industry in 2001 with the release of ESX. By enabling one physical server to look like multiple “virtual machines” (each with its own guest OS), hypervisor-based virtualization has been a key enabler of IT infrastructure efficiency gains for the past 15 years. However, modern distributed computing applications need a different approach: making many machines look like one massively scalable process. Linux LXC containers and operating systems that support process isolation in a shared user space predate Docker by many years, but nearly four years have passed since Docker first released its container solution in March of 2013 to help DevOps with software portability. This year containers will emerge as an important IT infrastructure building block that avoids the performance, cost, and resource penalties associated with hypervisors. Container-based clouds will challenge hypervisor-based clouds on fundamentals such as cost, agility, and application performance. Container orchestration will improve due to competition between venture-backed companies. The container-based on-demand clouds from Robin Systems will, for the first time, provide robust application-to-spindle quality-of-service from multiple virtual private clusters running on a shared multi-tenant cloud infrastructure. Container-based clouds will differentiate from hypervisor-based clouds with robust support for both stateless and (arguably the more important) stateful enterprise applications. Warehouse-scale container-based enterprise clouds sans hypervisors will address the virtualization “IO blender” problem with storage tier innovations.

Improved Container Security Will Accelerate MicroServices — Containers will help address growing cybersecurity concerns by enabling complex services to be broken into microservices — modular pieces that perform a component of the workflow needed to realize the end service. VC-backed startups will pioneer new solutions for application security and container-based microservices.

Edge Computing Will Drive Distributed Data Centers — Conventional wisdom for the past few years has been that the future of IT would belong to a relatively small number of warehouse-scale computing facilities strategically sprinkled around the globe. These would be interconnected with a vast global fiber backbone and connected to the Internet of Everything (IoE) by ubiquitous broadband mobile communications networks. As Akamai has demonstrated with its content distribution network over the past 18 years, latency matters. Many important applications are emerging that cannot tolerate the latency required to backhaul data, perform complex computations, and return the required response. Examples include image and sensor processing for autonomous vehicle navigation, where near-real-time computing is essential for safe operation; real-time remote control of drones, long-haul trucks, and industrial equipment; and a long list of industrial IoE use cases. A hierarchy of computing, storage, caching, and networking will emerge from the core to the edge.
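To see why backhaul latency forces computation toward the edge, a rough back-of-the-envelope budget helps. The distances, processing times, and speed-of-fiber constant below are my own illustrative assumptions, not figures from any vendor:

```python
# Illustrative latency budget: round trip to a distant warehouse-scale
# data center vs. a nearby edge site. All numbers are assumptions.
C_FIBER_KM_PER_MS = 200.0  # light in optical fiber covers roughly 200 km per ms

def round_trip_ms(distance_km: float, processing_ms: float) -> float:
    """Two-way propagation delay plus server-side processing time (ms)."""
    return 2.0 * distance_km / C_FIBER_KM_PER_MS + processing_ms

core = round_trip_ms(distance_km=2000, processing_ms=20)  # distant core DC
edge = round_trip_ms(distance_km=50, processing_ms=20)    # nearby edge site

print(f"core: {core:.1f} ms, edge: {edge:.1f} ms")
# core: 40.0 ms, edge: 20.5 ms
# A vehicle at highway speed (~30 m/s) moves over a meter during a 40 ms
# round trip, before any queuing or radio-access delay is even counted.
```

Propagation delay alone is a hard floor that no amount of server horsepower in the core can remove, which is the essence of the edge-computing argument.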

The Hyperconvergence Thesis Will Be Challenged — Popularized in 2015 and 2016, hyperconvergence is a type of IT infrastructure system with a software-centric architecture that tightly integrates compute, storage, networking, virtualization, and other technologies from scratch in a commodity hardware box supported by a single vendor. IT infrastructure solutions that actively de-couple compute and storage resources will challenge the central premise of hyperconverged infrastructure, which closely couples compute and storage into an appliance. The basis for this prediction is the practical challenge of getting the ratios between compute/storage/networking, and the capabilities and types of each, correct ahead of time, for all time. Infrastructure requirements to support modern data stack applications continuously change, and compute/storage/networking technologies are on different cost and performance curves. Some applications are compute-heavy and data-light. Others are compute-light and data-heavy. Some require both. Actively de-coupling compute and storage provides future-proofing and flexibility. Specific use cases with relatively stable application requirements (e.g., virtual desktop) will continue to be strong fits for hyperconverged clouds that offer extreme ease-of-use via clever management software and good UI design, but many of the most important modern data stack applications will migrate to architectures with decoupled compute, storage, and network tiers.

Open Source Will Transition To Freemium Model For A Stronger P/L — From a capital formation perspective, the last few years have been an incredible run for VC-backed open source IT operations management (ITOM) infrastructure software companies. Big-dollar funding rounds for open source companies will become much more difficult as these companies begin to be measured less on developer downloads and more on their ability to deliver growth AND profit. In several cases the financial performance of VC-backed open source companies that have achieved liquidity via IPO or acquisition has been inferior to those using a freemium model (e.g., Splunk). Large corporations will seek alternatives to do-it-yourself efforts based solely on open source software due to the true costs of DIY complexity and instability.

Real Threats Will Drive Spending On CyberSecurity Solutions — Over the past decade CyberSecurity has been a darling for VC investment, and with good reason. Venture-backed success stories like Palo Alto Networks (IPO PANW), Fortinet (IPO FTNT), FireEye (IPO FEYE), Trusteer/IBM, and others have been strong earners for their investors. High-profile thefts such as 1 billion subscriber profiles from Yahoo, emails from the Democratic National Committee, 31 million accounts from Ashley Madison, 152 million accounts from Adobe, and 21.5 million profiles from the US Government have transformed a hypothetical risk into a clear and present danger. Risks will go beyond data loss/exfiltration to data manipulation, which poses far scarier consequences. There will be a big push to protect data at the canonical file level with encryption and robust identity management. Keep an eye on Vera Security in this area. They have an important role to play in the integrity game — ensuring that only the right people have the right to manipulate critical data. Given the pervasiveness of ransomware attacks in 2016, there will be a raft of innovation to help defend against them. Applied big data using machine learning on massive real-time data sets will power new approaches to CyberSecurity based on behavioral analytics. Cyphort is doing some great work here.

CyberSecurity Concerns Will Go Mass Market — Spurred on by the October 2016 attacks that took down Twitter, Spotify, Reddit, and others by using compromised smart home devices such as video surveillance cameras as distributed denial-of-service weapons, consumers will seek protection of their personal digital assets (photos, files, identity), accounts, and home networks. New forms of encryption, machine authorization, and multi-factor authentication will be adopted. Mass-market encryption will give rise to debates about national security versus personal privacy.

AI, M/L, And D/L Will Continue To Be Darlings Of VC & M&A — Much has been written over the past year about the importance of Artificial Intelligence (AI) and its sub-domains Machine Learning (M/L) and Deep Learning (D/L). As background, a “machine” is considered to be “learning” if, given a task, experience performing that task, and a result produced by the machine from doing the task, the result improves with experience without human guidance. Many of the algorithms (e.g., logistic regression, gradient descent) and foundational concepts powering AI, M/L, and D/L today have been around for decades in the fields of computer programming, statistical analysis, and linear algebra. The reality of near-infinite compute and storage at the fingertips of cloud-native developers has been a game changer for the practical use of AI, M/L, and/or D/L as a source of strategic advantage in commercial endeavors. However, AI for the sake of AI will stall. The combination of AI, M/L, and/or D/L with deep vertical domain expertise is where opportunities to disrupt some of the world’s largest industries will be found. One example is Reflektion’s use of cutting-edge M/L to power billions of dynamically assembled web pages that help retailers dramatically increase sales via extreme content personalization. Another example is Paysa’s use of M/L to help the world’s knowledge workers know their true market value and manage their skills development based on market needs. Applied big data analytics, in combination with the new data streams enabled by instrumenting the world in novel ways via the Internet-of-Everything (IoE), will result in new levels of efficiency, visibility, and automation. Both public and private clouds will be augmented with specific capabilities to accelerate AI, M/L, and D/L workloads. Research in neural networks will be brisk. The combination of human plus machine will outperform either human or machine alone, and the volume of machine-to-machine traffic will skyrocket.
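The “improves with experience” definition above can be made concrete with the two algorithms named in passing, logistic regression trained by gradient descent. The toy task, data, and learning rate below are my own illustration, not anything from the original:

```python
import math
import random

def sigmoid(z: float) -> float:
    """Logistic function: squashes a score into a probability."""
    return 1.0 / (1.0 + math.exp(-z))

# Toy task: learn to label points by whether x > 1 (labels are the "result").
random.seed(0)
data = [(x, 1.0 if x > 1.0 else 0.0)
        for x in (random.uniform(-3, 3) for _ in range(200))]

w, b, lr = 0.0, 0.0, 0.1  # model parameters and learning rate

def loss() -> float:
    """Average cross-entropy over the data set; lower is better."""
    eps = 1e-12
    return -sum(y * math.log(sigmoid(w * x + b) + eps)
                + (1 - y) * math.log(1 - sigmoid(w * x + b) + eps)
                for x, y in data) / len(data)

before = loss()
for _ in range(500):  # "experience": repeated passes over the task
    gw = sum((sigmoid(w * x + b) - y) * x for x, y in data) / len(data)
    gb = sum((sigmoid(w * x + b) - y) for x, y in data) / len(data)
    w -= lr * gw  # gradient descent step on each parameter
    b -= lr * gb
after = loss()

print(before > after)  # the result improves with experience: True
```

No human guidance enters the loop; the only feedback is the measured loss on the task, which is exactly the definition given above.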

Customized Computing Hardware And Software Innovation Will Surge — A common meme within the IT industry for the past half-decade has been that the future would belong to clever software running on commodity hardware. The insatiable appetite for more efficient computation, higher application performance, and changing workloads is proving this hypothesis false. High-profile successes such as Microsoft using FPGA acceleration to increase the performance of Bing have driven home the value of hardware acceleration for high-value algorithms. Changing workloads from the AI, M/L, and D/L trends discussed above will require efficient linear algebra operations such as large matrix and vector multiplication and transposition. These are not well suited to general-purpose CPUs and will give rise to new classes of highly efficient specialty processors. This phenomenon occurred in the past with the rise of the network processor, the digital signal processor, and others. Founded by Dr. Jason Cong, Falcon Computing Solutions is doing exciting work in customized computing by using machines to go directly from algorithms implemented in C to highly optimized gates…without writing a line of Verilog. Other vendors will join Google, IBM, Intel (acquired Altera, Nervana, and Movidius), and NVIDIA with M/L-specific processors. NVIDIA’s new Pascal-architecture Tesla P100 GPU, announced in April of 2016, and its D/L supercomputer, the DGX-1, will motivate others to raise their game.
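The linear-algebra core of these workloads is easy to see: a fully connected neural-network layer is essentially one dense matrix multiply plus a cheap nonlinearity, which is precisely the operation GPUs and the new specialty processors accelerate. A sketch with illustrative dimensions of my own choosing:

```python
import numpy as np

rng = np.random.default_rng(0)

# One fully connected layer: y = relu(W @ x + b)
batch, d_in, d_out = 64, 1024, 512  # illustrative sizes, not from any real model
W = rng.standard_normal((d_out, d_in)).astype(np.float32)
b = np.zeros(d_out, dtype=np.float32)
x = rng.standard_normal((d_in, batch)).astype(np.float32)

y = np.maximum(W @ x + b[:, None], 0.0)  # the matmul dominates the cost

# Roughly 2 * d_out * d_in * batch floating-point operations per layer,
# a multiply and an add per weight per input column:
flops = 2 * d_out * d_in * batch
print(y.shape, f"{flops / 1e6:.0f} MFLOPs")  # (512, 64) 67 MFLOPs
```

Stacking dozens of such layers and billions of inputs is what turns this into the regular, massively parallel arithmetic that general-purpose CPUs handle poorly and purpose-built silicon handles well.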

Anything That Can Be Run In The Cloud And Consumed On Mobile Will Be — Technology-enabled services, often powered by M/L-based big data analytics, consumed on mobile as the front end and running in the [public/private/hybrid] cloud, will continue to re-imagine and disrupt large vertical industries. Software-as-a-service (SaaS) will continue to displace traditional enterprise software models. Software giants such as Oracle and SAP will continue their migration from legacy on-premises enterprise software (often running on dedicated bare metal servers) to cloud consumption models, but their cycle time in doing so will result in leakage of major accounts. The larger SaaS vendors like Salesforce and Workday will continue to build out their suites via acquisition.

Electrons In The Data Center Will Get An IQ Bump — As the construction of public and private clouds continues, and these clouds power mega trends like the IoE and microservices, limited availability of power will continue to create challenges for data center operators and end customers alike. M/L will be applied to the modern data center to actively monitor and manage the distribution of power as a scarce and valuable resource. Virtual Power Systems has built a new form of software-defined power control plane, and will emerge as a pioneer in this area.

Moore’s Law Will Surprise Us…Again — While much of the world has been focusing on application software, some of the world’s brightest minds have continued to re-define what is possible at the transistor level. In October of 2016 a research team led by faculty scientist Ali Javey at the Department of Energy’s Lawrence Berkeley National Laboratory (Berkeley Lab) created a transistor with a working 1-nanometer gate. For comparison, a strand of human hair is about 50,000 nanometers thick. The semiconductor industry has long assumed that any gate below 5 nanometers wouldn’t work, so anything below that was not seriously considered. This exciting device breakthrough was achieved using carbon nanotubes and molybdenum disulfide (MoS2). This, and many other device innovations such as new forms of memory will be advanced in 2017 to further expand the capabilities of solid state devices. Moonshot programs in quantum computing will also advance.

Thank you for reading my list of IT predictions for 2017. Given how fast things change in the world of technology innovation, odds are high that I will be wrong more than I am right. My hope is that this list provides some insight into the areas where our firm, Clear Ventures, is currently focusing its thesis-driven, early-stage technology venture investment strategy. I would be delighted to hear from you if you have a passion for building great products, if the comments above resonated with you, and if you are ready to break out on your own.


Chris Rust is a Founder and General Partner of Clear Ventures. Follow him on Twitter @chrisrust. Clear Ventures is a Silicon Valley venture capital firm that is purpose-built to help startup teams win in business technology and technology-enabled services. With $121M of committed capital from a world-class investor base, Clear provides capital, connections, and company-building assistance from the earliest stages of company creation, so bold startup teams can grow their ideas into market-leading companies.
