When Snowflake Computing was founded 10 years ago, the big data market was very different from what it is today. Momentum was building behind something called Hadoop, while cloud computing was viewed with suspicion. Despite these headwinds, the founders of Snowflake stayed true to their original vision, ultimately playing a major role in flipping the Big Data script and achieving massive success along the way. But where will this success lead over the next decade?
The first two years of Snowflake Computing were spent in stealth mode. Co-founders Benoit Dageville, Thierry Cruanes, and Marcin Zukowski were all data warehousing veterans who previously worked at Oracle, IBM, and Actian, so they had an insider’s view of the limitations of data warehousing. By running a data warehouse in the cloud and separating compute from storage, they believed they could overcome these limitations.
Snowflake’s first entry into the public eye was quite modest. On October 21, 2014, the company simultaneously emerged from stealth and announced a $26 million Series B funding round led by venture capital firm Redpoint, not a large amount by today’s standards. Former Microsoft executive Bob Muglia, chosen as its first CEO, unveiled its first commercial offering, dubbed Elastic Data Warehouse, which ran exclusively on Amazon Web Services and became generally available in 2015.
The following years were spent heads-down, iterating on this first version of the Snowflake data warehouse running on AWS and banging the drum for its style of big data processing in the cloud. The company won first place in the Strata + Hadoop World Startup Showcase in 2015 and raised a $45 million Series C round later that year (later expanded to $76 million). It partnered with BI vendors like Tableau, Looker, and MicroStrategy; railed against failed Big Data projects; and promoted its cost savings compared to other nascent cloud providers.
A fiery Muglia took shots at competitors, including Hadoop. The startup’s CEO tore the open-source software project to shreds in a 2017 Datanami interview, before Hadoop’s implosion seemed imminent. “I can’t find a satisfied Hadoop customer. It’s kind of as simple as that,” he said. “It’s very clear to me, technologically speaking, that this is not the technological foundation on which the world will be built in the future.”
Momentum started building for Snowflake in 2017 with a pair of new features. The first was Snowpipe, a streaming data ingest capability, and the second was the beginning of data sharing.
That year, Snowflake laid “the basic underlying building block that allows two separate Snowflake accounts to collaborate on shared data resources in a meaningful, secure, and well-governed way,” says Torsten Grabs, Snowflake’s director of product management for data lake, data pipelines, and data science.
The company’s growing customer base gave confidence to potential investors, and in April 2017 the company closed a $100 million Series D funding round, bringing the startup’s total funding to $205 million.
With solid growth, Snowflake expanded onto Microsoft Azure, which opened the company up to the many organizations that run on that cloud infrastructure. At the end of 2018, the company passed the 1,000-customer mark, barely three years into its commercial existence. It also started to take partnering more seriously with the launch of the Partner Connect program, “which allows users to create a Snowflake account through a partner experience and vice versa,” Grabs says. Snowflake closed two massive funding rounds in 2018: a $263 million Series E round in January and a $450 million round in October, at which point it was valued at $3.5 billion.
2019 marked a transition period for the company, which was still called Snowflake Computing and still based in San Mateo, California. Product-wise, it expanded to Google Cloud. It also launched the Snowflake Data Marketplace, as well as the cross-cloud building blocks that would eventually be known as Snowgrid. On the business side, Muglia was replaced as CEO by Frank Slootman, a former ServiceNow executive, in May 2019.
The COVID-19 pandemic was a blow to many companies’ early-2020 plans, but Snowflake seemed to roll with the punches. The company launched several new products at its annual user conference, including the availability of Snowsight, a new graphical interface designed to let users get closer to their data. It also introduced Snowpark, which would give users the ability to work with Snowflake data in languages other than SQL. Finally, in a nod to the importance of partners, it launched a new formal partner program, dubbed Snowflake Partner Connect.
And who could forget Snowflake’s big debut on the New York Stock Exchange under the ticker symbol SNOW? That September IPO raised $3.4 billion (giving the company a valuation of $33 billion) and was dubbed by the mainstream press “the biggest IPO ever for a software company” (even if it is really a cloud service provider). The company also doubled its customer base, from 1,550 in July 2019 to 3,100 in July 2020.
2021 was another busy year for the newly public company, with the launch of Snowpark for Python, which is still in public preview. Snowflake also extended support for unstructured data, which matters more for the types of AI use cases where Python would be used (whereas traditional SQL queries run on structured tabular data). Snowflake also began enabling Snowflake Marketplace participants to monetize their data. It also launched data clean rooms, as well as its first two vertical clouds, for media and financial services. And we can’t forget the brief period in early 2021 when Snowflake declared itself “headquarterless,” before Slootman relocated to Bozeman, Montana.
Snowflake’s evolution continued in 2022, with several notable unveilings at its June user conference, including Unistore, its first storage offering for transactional workloads; improvements to Snowpipe for streaming data pipelines; and a private preview of its new data application framework based on its $800 million acquisition of Streamlit in March. The company also announced support for Apache Iceberg, the growing open table format; launched new vertical clouds for healthcare and life sciences, as well as commerce; and rolled out a new security offering.
Snowflake now has 6,000 customers and, with a market capitalization of over $55 billion, is considered one of the cloud giants, a title it shares with another post-Hadoop big data star, Databricks. The company generated more than $1.2 billion in revenue in fiscal 2022, but it is struggling to please Wall Street, which has driven its stock price down to around $180 per share, less than half its all-time high of November 2021. And while some customers complain about unexpected costs, Snowflake’s customer numbers show it is doing something right.
No longer content to provide customers with instant access to unlimited SQL computation on large data sets in a data lake environment, Snowflake is playing the big data long game and positioning itself for the next big thing. For Grabs, who joined the company in 2017, it’s less about moving away from traditional data warehousing and more about continuing the company’s original journey.
“To me, it doesn’t look like such a drastic change from where we were initially, because even in the beginning, Benoit and Thierry were thinking of Snowflake as a data lake offering,” he says. “From the start, they saw Hadoop as another big data processing platform that Snowflake should compete well against.”
Hadoop was the big competitor in those early days, and Snowflake was spending as much time replacing Hadoop as it was installing entirely new enterprise data warehouses, Grabs says. The fact that Snowflake soared while Hadoop fell is definitely relevant to this conversation. “We’re the best Hadoop,” Grabs jokes.
But where will the company go next? It has invested in several areas immediately adjacent to the world of advanced analytics, including AI, streaming data, data applications, converged OLAP/OLTP, data clean rooms, and vertical data clouds. Which of these will define Snowflake in 10 years?
The answer is unclear, but one thing is certain: the company will not stand still. “We need to innovate every day,” says Grabs. “We cannot rest on the laurels of what has been done in the past.”
Grabs likes to remind his customers that, each year, the window of time they have to process new data and make a decision gets smaller and smaller. As this window shrinks, the volume of data grows and latency demands become more and more stringent. These are some of the business challenges driving many investments in streaming data analytics and real-time databases, and Snowflake is chasing them too, looking for ways to keep customers ahead.
“We are getting very creative about different storage layouts and how we physically represent storage. We are by no means just a column store,” says Grabs. “That’s also driving our investment in materialized tables, dynamic tables that essentially update as new data comes in, and also, quite frankly, hybrid tables, with our Unistore workloads, which give a different latency profile, a response-time profile, than a typical Snowflake table does.”