Top 17 Public Cloud Platforms
Last updated: June 24, 2022
Public cloud platforms provide on-demand storage and compute resources for enterprise data and applications, allowing companies to save money and enhance data security.
Amazon Web Services (AWS) gives you access to a reliable, on-demand infrastructure to power your applications, from hosted internal applications to SaaS offerings. Scale to meet your application demands, whether you need one server or a large cluster. Leverage scalable database solutions, and use cost-effective storage for storing and retrieving any amount of data, any time, anywhere.
Microsoft Azure is an open and flexible cloud platform that enables you to quickly build, deploy and manage applications across a global network of Microsoft-managed datacenters. You can build applications using any language, tool or framework, and you can integrate your public cloud applications with your existing IT environment.
Google Cloud Platform is a set of modular cloud-based services that provide the building blocks to quickly develop everything from simple websites to complex applications. Explore how you can make Cloud Platform work for you.
Heroku is the leading platform as a service in the world, supporting Ruby, Java, Python, Scala, Clojure, and Node.js. Deploying an app is simple and easy: no special tools needed, just a plain git push. Deployment is instant, whether your app is big or small.
DigitalOcean is the developer cloud helping millions of developers easily build, test, manage, and scale applications of any size, faster than ever before.
Dell's Virtustream Enterprise Class Cloud provides secure, highly available Infrastructure as a Service (IaaS) to enterprise and government customers.
Rackspace Cloud offers four hosting products: Cloud Servers for on-demand computing power; Cloud Sites for robust web hosting; Cloud Load Balancers for easy, on-demand load balancing and high availability; and Cloud Files for elastic online file storage and CDN delivery. Rackspace Cloud hosting customers never need to worry about buying new hardware to meet increasing traffic demands or huge traffic spikes.
Oracle Public Cloud provides customers and partners with a high-performance, reliable, elastic, and secure infrastructure for their critical business applications. It offers a complete range of business applications and technology solutions, avoiding the data and business-process fragmentation that comes with using multiple siloed public clouds.
Get the best of both worlds – the power of real time + the simplicity of the cloud – with our cloud-based deployment option for SAP Business Suite powered by SAP HANA, SAP NetWeaver BW powered by SAP HANA, and the SAP HANA platform.
Salesforce Lightning Platform is a proven, powerful, scalable and secure cloud platform for automating and extending your business. It delivers a complete technology stack, covering the ground from database and security to workflow and user interface, so you can build the social, mobile apps you need to power your social enterprise.
IBM Cloud offers open cloud infrastructure services for IT operations. The IBM Cloud gives you the flexibility to have public, private or hybrid clouds, depending on your business needs. With the IBM Cloud you can unlock more value in your business and in the technology you already have. It’s the cloud that can integrate enterprise-grade services and help speed up the way you innovate.
Skytap provides Environments as a Service to the enterprise, removing the biggest constraints and inefficiencies that slow development teams down within the software development lifecycle.
Alibaba Cloud offers an integrated suite of reliable, secure cloud products and services to help you build cloud infrastructure; its data centers across multiple regions empower your global business.
Joyent is a high-performance cloud computing infrastructure and big data analytics platform, offering organizations of any size public and hybrid cloud infrastructure for today's demanding real-time web and mobile applications.
CloudShare provides a secure, self-service public cloud that extends internal IT capabilities. CloudShare enables you to build complete production-like environments in minutes. Deliver fully-functional demos, proof-of-concepts, and evaluation environments on demand and online. Provide effective hands-on technical training to employees, partners, and customers. Create virtual environments while gaining access to IT resources at the speed of agile development.
AppHarbor is a fully hosted .NET Platform as a Service. AppHarbor can deploy and scale any standard .NET application to the cloud.
SuiteCloud is a comprehensive offering of cloud development tools, applications and infrastructure that enables customers and software developers to maximize the benefits of cloud computing. SuiteCloud comprises a multi-tenant cloud platform that consists of Infrastructure-as-a-Service (IaaS), Platform-as-a-Service (PaaS), and Software-as-a-Service (SaaS). The SuiteCloud Developer Tools are uniquely built on NetSuite's leading cloud business management suite.
Latest news about Public Cloud Platforms
2022. Google expands Vertex, its managed AI service, with new features
Roughly a year ago, Google announced the launch of Vertex AI, a managed AI platform designed to help companies accelerate the deployment of AI models. Today the company announced new features heading to Vertex, including a dedicated server for AI system training and “example-based” explanations. As Google has historically pitched it, the benefit of Vertex is that it brings together Google Cloud services for AI under a unified UI and API. Customers including Ford, Seagate, Wayfair, Cashapp, Cruise and Lowe’s use the service to build, train and deploy machine learning models in a single environment, Google claims — moving models from experimentation to production.
2021. Microsoft launches Azure Container Apps, a new serverless container service
Microsoft today announced the preview launch of Azure Container Apps, a new fully managed serverless container service that complements the company’s existing container infrastructure services like the Azure Kubernetes Service (AKS). Microsoft notes that Azure Container Apps was specifically built for microservices, with the ability to quickly scale based on HTTP traffic, events or long-running background jobs. In many ways, it’s probably most like AWS App Runner, one of Amazon’s small fleet of serverless container services, with App Runner also specifically focused on microservices. Google meanwhile also offers a set of container-centric services, including Cloud Run, its serverless platform for running container-based applications.
2021. IBM Cloud Satellite brings IBM public cloud on premises
While IBM's primary cloud message over the past few years has been about multi-cloud, it has continued to offer its own public cloud and has targeted it at complex enterprise workloads that typically run on mainframes or other systems. The new platform IBM Cloud Satellite is intended as an extension of the IBM Public Cloud that can run inside the customer's data center or out at the edge. Like IBM's other hybrid cloud offerings, underneath the hood IBM Cloud Satellite runs on Red Hat OpenShift, the Kubernetes management environment. It works by adding the mechanism of a Location that signifies an instance of IBM Public Cloud outside IBM's data centers.
2020. Koyeb raises $1.6M for its serverless data-processing engine
French startup Koyeb, which focuses on data-processing workflows across multiple cloud providers, has raised a $1.6M pre-seed round. Koyeb believes that companies will take advantage of the best cloud-native APIs and storage services going forward. In order to mix-and-match those various providers, Koyeb provides the serverless glue that ties everything together. For instance, you can store videos on an object storage managed by DigitalOcean, transcribe the audio from those video files on Google Cloud using Google’s speech-to-text API and save the results on another object storage bucket.
2020. Cloudflare launches Workers Unbound, the next evolution of its serverless platform
Cloudflare has launched Workers Unbound, the latest step in its efforts to offer a serverless platform that can compete with the likes of AWS Lambda. The company first launched its Workers edge computing platform in late 2017. Today it has “hundreds of thousands of developers” who use it, and in the last quarter alone, more than 20,000 developers built applications based on the service, according to the company. Cloudflare also uses Workers to power many of its own services, but the first iteration of the platform had quite a few limitations. The idea behind Workers Unbound is to do away with most of those and turn it into a platform that can compete with the likes of AWS, Microsoft and Google.
2020. Cloud for developers DigitalOcean raises $50M
DigitalOcean, the cloud for developing modern apps, today announced it has closed a $50 million Series C funding round led by Access Industries, with participation from Andreessen Horowitz. DigitalOcean simplifies modern app creation for new generations of developers, from individual developers to startups and SMBs. Its infrastructure and platform-as-a-service (IaaS and PaaS) solutions provide a “no DevOps required” experience, allowing developers to focus their energy on creating innovative software. The new $50 million round values the company at $1.15 billion, meaning it was worth $1.1 billion pre-money.
2020. CloudShare extends virtual IT Labs solutions to Google Cloud Platform
CloudShare, a provider of specialized cloud environments, announced that it is an official Google Cloud Partner and that its scalable, hands-on virtual training solution is now available on Google Cloud Platform (GCP). Software companies can now run comprehensive training on top of their commodity cloud. CloudShare is also developing transparent training solutions for AWS and Azure, which are scheduled for release in the coming months.
2020. Cloud infrastructure provider DigitalOcean raises $100M
DigitalOcean, a cloud infrastructure provider targeting smaller businesses and younger companies, announced today that it has secured $100 million. Because DigitalOcean is a self-serve SaaS business, customers can show up and get started without hand-holding from sales; sales cycles are expensive and slow. But while allowing small companies to sign up on their own sounds attractive, companies that lean on this acquisition method often struggle with churn. DigitalOcean is working to carve out an SMB- and developer-focused cloud infrastructure niche, keeping its economics in a good place through low-CAC, self-serve revenue generation. The margins from that are paying for the company’s development, and its overall economics are good enough to allow it to leverage debt to invest in itself instead of equity.
2019. Google Cloud gets a new family of cheaper general-purpose compute instances
Google Cloud announced the launch of its new E2 family of compute instances. These new instances, which are meant for general-purpose workloads, offer a significant cost benefit, with savings of around 31% compared to the current N1 general-purpose instances. The new system is also smarter about where it places VMs, with the added flexibility to move them to other hosts as necessary. To achieve all of this, Google built a custom CPU scheduler. Google says that “unlike comparable options from other cloud providers, E2 VMs can sustain high CPU load without artificial throttling or complicated pricing.” It’ll be interesting to see some benchmarks that pit the E2 family against similar offerings from AWS and Azure.
2019. Google Cloud adds a managed service for Microsoft’s Active Directory
Microsoft’s Active Directory remains one of the most-used identity services in the enterprise. Google Cloud Platform has long allowed you to manually set up an Active Directory deployment, but today, Google is taking this a step further by announcing the beta of a managed service. As the name implies, Google will manage this service and automate everything from server maintenance to security configurations. Given Google’s recent focus on hybrid-cloud deployments, you also can use this service to extend your existing on-premises Active Directory domains to the cloud.
2019. Google launched its coldest storage service yet
Google launched a new archival cold storage service. This new service, which doesn’t seem to have a fancy name, will complement the company’s existing Nearline and Coldline services for storing vast amounts of infrequently used data at an affordable low cost. The new archive class takes this one step further, though. It’s cheap, with prices starting at $0.0012 per gigabyte per month, or about $1.23 per terabyte per month. What makes Google's cold storage different from the likes of AWS S3 Glacier, for example, is that the data is immediately available, with millisecond latency. Glacier and similar services typically make you wait a significant amount of time before the data can be used. The new service will become available later this year.
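The per-terabyte figure follows directly from the published per-gigabyte rate. A quick sketch of the arithmetic in Python (the helper name is ours, not Google's):

```python
# Published archive-class rate: $0.0012 per gigabyte per month.
GB_PER_TB = 1024
RATE_PER_GB_MONTH = 0.0012

def monthly_archive_cost(terabytes: float) -> float:
    """Approximate monthly storage cost in USD at the archive-class rate."""
    return terabytes * GB_PER_TB * RATE_PER_GB_MONTH

print(round(monthly_archive_cost(1), 2))  # prints 1.23
```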
2019. AWS launches fully-managed backup service for business
Amazon’s AWS cloud platform has added a new service, AWS Backup, that allows companies to back up their data from various AWS services and from their on-premises apps. To back up on-premises data, businesses can use the AWS Storage Gateway. The service allows users to define their various backup policies and retention periods, including the ability to move backups to cold storage (for EFS data) or delete them completely after a certain time. By default, the data is stored in Amazon S3 buckets. Most of the supported services, except for EFS file systems, already feature the ability to create snapshots. Backup essentially automates that process and creates rules around it, so it’s no surprise that the pricing for Backup is the same as for using those snapshot features (with the exception of the file system backup, which will have a per-GB charge).
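The retention behavior described above, moving a backup to cold storage after one period and deleting it after another, can be sketched in plain Python. The day thresholds and function name here are illustrative assumptions, not AWS Backup's actual defaults or API:

```python
from datetime import date

def backup_state(created: date, today: date,
                 cold_after_days: int = 30,
                 delete_after_days: int = 365) -> str:
    """Classify a backup under a simple lifecycle rule (illustrative thresholds)."""
    age = (today - created).days
    if age >= delete_after_days:
        return "deleted"   # retention period exceeded
    if age >= cold_after_days:
        return "cold"      # moved to cold storage
    return "warm"          # still in standard storage

print(backup_state(date(2019, 1, 1), date(2019, 2, 15)))  # 45 days old, prints "cold"
```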
2018. Microsoft Azure gets new high-performance storage options
Microsoft Azure is getting a number of new storage options that mostly focus on use cases where disk performance matters. The first of these is Azure Ultra SSD Managed Disks, which are now in public preview. Microsoft says that these drives will offer “sub-millisecond latency,” which unsurprisingly makes them ideal for workloads where latency matters. Standard SSD Managed Disks are now generally available after only three months in preview. To top things off, all of Azure’s storage tiers (Premium and Standard SSD, as well as Standard HDD) now offer 8, 16 and 32 TB storage capacity. Also new today is Azure Premium Files, which is now in preview. This, too, is an SSD-based service. Azure Files itself isn’t new, though. It offers users access to cloud storage using the standard SMB protocol. This new premium offering promises higher throughput and lower latency for these kinds of SMB operations.
2018. Rackspace acquired Salesforce specialist RelationEdge
Rackspace has acquired RelationEdge, a Salesforce implementation partner. Rackspace is still best known for its hosting and managed cloud and infrastructure services, so the company clearly wants to expand its portfolio and add managed services for SaaS applications to its lineup. It made the first step in this direction with last year's acquisition of TriCore, another company in the enterprise application management space. Today’s acquisition builds upon this theme.
2018. Google Cloud adds simple machine learning service
Google launched AutoML, a new Google Cloud service that helps developers (including those with no machine learning expertise) build custom image recognition models. It’s no secret that it’s virtually impossible for businesses to hire machine learning experts and data scientists these days: there is simply too much demand and not enough supply. The new service allows virtually anybody to upload their images (and import their tags or create them in the app) and then have Google’s systems automatically create a custom machine learning model for them. The whole process, from importing data to tagging it and training the model, is done through a drag-and-drop interface. We’re not talking about something akin to Microsoft’s Azure ML Studio.
2017. AWS launched browser-based IDE for cloud developers
2017. Kubernetes comes to Amazon Web Services
Amazon Web Services added long-awaited support for the Kubernetes container orchestration system on top of its Elastic Container Service (ECS). Kubernetes has become something of a de facto standard for container orchestration, with the backing of Google (which incubated it), Microsoft and virtually every other major cloud player. AWS is relatively late to the party, but it already has over 100,000 active container clusters on its service, and these users already spin up millions of containers. AWS’s users are clearly interested in running containers, and indeed, many of them already ran Kubernetes on top of AWS, but without direct support from AWS. With this new service, AWS will manage the container orchestration system for its users. ECS for Kubernetes will support the latest versions of Kubernetes, and AWS will handle upgrades and all of the management of the service and its clusters.
2017. Google Cloud Platform cuts the price of GPUs by up to 36 percent
Google is cutting the price of using Nvidia’s Tesla GPUs through its Compute Engine by up to 36 percent. In U.S. regions, using the somewhat older K80 GPUs will now cost $0.45 per hour while using the newer and more powerful P100 machines will cost $1.46 per hour (all with per-second billing). Thus Google is aiming this feature at developers who want to run their own machine learning workloads on its cloud, though there also are a number of other applications — including physical simulations and molecular modeling — that greatly benefit from the hundreds of cores that are now available on these GPUs.
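Per-second billing means the cost of a short run is just the hourly rate prorated to the second. A minimal sketch using the rates quoted above (the function name is illustrative):

```python
# Quoted hourly GPU rates in US regions, USD.
K80_HOURLY = 0.45
P100_HOURLY = 1.46

def gpu_cost(hourly_rate: float, seconds: int) -> float:
    """Cost of a run under per-second billing at a given hourly rate."""
    return hourly_rate * seconds / 3600

# A 10-minute training run on a P100:
print(round(gpu_cost(P100_HOURLY, 600), 4))  # prints 0.2433
```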
2017. Cloud Foundry adds native Kubernetes support
Cloud Foundry, the open-source platform-as-a-service (PaaS) offering for the enterprise, made an early bet on Docker containers. With Kubo, which Pivotal and Google donated to the project last year, it gained a tool for quickly deploying and managing Kubernetes clusters (Kubernetes being the Google-backed open-source container orchestration tool that is itself becoming the de facto standard for managing containers). The project is now renaming Kubo to “Cloud Foundry Container Runtime” (because who needs cute names, after all) and making it a core part of the Cloud Foundry platform. Unsurprisingly, Google and Pivotal worked with Cloud Foundry on building this integration.
2017. Following AWS, Google Compute Engine also moves to per-second billing
A week ago Amazon Web Services added per-second billing for users of its EC2 service. And Google today announced a very similar move. Google Compute Engine, Container Engine, Cloud Dataproc, and App Engine’s flexible environment virtual machines (VMs) will now feature per-second billing. This new pricing scheme extends to preemptible machines and VMs that run premium operating systems, including Windows Server, Red Hat Enterprise Linux and SUSE Enterprise Linux Server. With that, it one-ups AWS, which only offers per-second billing for basic Linux instances and not for Windows Server and other Linux distributions on its platform that currently feature a separate hourly charge. Like AWS, Google will charge for a minimum of one minute.
2017. AWS introduced per-second billing for EC2 instances
Over the last few years, some cloud platforms moved to more flexible billing models (mostly per-minute billing), and now AWS is one-upping many of them by moving to per-second billing for its Linux-based EC2 instances. This new per-second billing model will apply to on-demand, reserved and spot instances, as well as provisioned storage for EBS volumes. Amazon EMR and AWS Batch are also moving to this per-second model. It’s worth noting, though, that there is a one-minute minimum charge per instance and that this doesn’t apply to machines that run Windows or some of the Linux distributions that have their own separate hourly charges.
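The one-minute minimum means very short-lived instances are billed as if they ran for a full 60 seconds. A small sketch of that billing rule (the names are illustrative, not an AWS API):

```python
def billed_seconds(runtime_seconds: int, minimum: int = 60) -> int:
    """Per-second billing with a one-minute minimum charge per instance."""
    return max(runtime_seconds, minimum)

def instance_cost(hourly_rate: float, runtime_seconds: int) -> float:
    """Cost of one instance run under the per-second model."""
    return hourly_rate * billed_seconds(runtime_seconds) / 3600

# A 45-second job is billed as 60 seconds; a 2-minute job as 120 seconds.
print(billed_seconds(45), billed_seconds(120))  # prints 60 120
```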
2017. AWS offers a virtual machine with over 4TB of memory
Amazon’s AWS launched its largest EC2 machine (in terms of memory size) yet: the x1e.32xlarge instance with a whopping 4.19TB of RAM. Previously, EC2’s largest instance featured just over 2TB of memory. These machines feature quad-socket Intel Xeon processors running at 2.3 GHz, up to 25 Gbps of network bandwidth and two 1,920GB SSDs. Only a few applications need this kind of memory. It’s no surprise, then, that these instances are certified to run SAP’s HANA in-memory database and its various tools, and that SAP will offer direct support for running these applications on them. It’s worth noting that Microsoft Azure’s largest memory-optimized machine currently tops out at just over 2TB and that Google already calls it quits at 416GB of RAM.
2017. Rackspace acquires multi-platform hybrid IT management solution Datapipe
Rackspace is acquiring Datapipe, one of its largest competitors in the managed public and private cloud services business. While Datapipe has been extremely successful in the enterprise and with government customers, Rackspace has traditionally focused more on the mid-market segment. The two companies rarely competed on the same deals, and their product portfolios are quite different, too. While Rackspace could have gained similar technical capabilities by making a number of smaller acquisitions, that process would have taken much longer and wouldn’t necessarily have given Rackspace access to the kind of customers that Datapipe currently works with. Those customers include a large number of public-sector organizations, including the U.S. departments of defense, energy and justice, as well as the U.K.’s cabinet office, ministry of justice and department of transportation.
2017. VMware Cloud is now live on Amazon Web Services
Last fall VMware announced a partnership with AWS, and now the two companies have unveiled their combined enterprise solution: VMware Cloud on AWS. VMware Cloud on AWS gives customers a seamlessly integrated hybrid cloud that delivers the same architecture, capabilities and operational experience across both their vSphere-based on-premises environment and AWS. While AWS runs its own VMs, they aren’t the same as those that VMware runs in a data center, and that creates a management headache for companies trying to run both. By letting companies move to AWS and continue to run their VMware VMs in the public cloud, they get the best of both worlds without the management problems.
2017. Google App Engine gets a firewall
Google App Engine is finally getting a fully featured firewall. Until now, developers couldn’t easily restrict access to their applications on the service to only a small set of IP addresses or address ranges for testing, for example. Instead, they had to hard-code a similar solution into their applications and — because those requests would still hit their applications in some form — even those rejected requests would still incur costs. Now, they’ll be able to use the Google Cloud Console, App Engine Admin API or even the gcloud command-line tool to set up access restrictions that block or allow specific IP addresses. Because the firewall obviously sits in front of the application, rejected requests never touch the application and App Engine never needs to spin up an idle resource only to then reject the request.
2017. Microsoft launched new archival storage option for Azure
Microsoft introduced a new storage option for its Azure cloud computing platform: Azure Archive Blob Storage. This will give developers a cheaper alternative for the long-term storage of large amounts of archival data like logs, raw camera footage, audio recordings, transcripts and medical documents and images. The main difference between the cool and archive tiers is that while archival storage is cheaper, the data retrieval costs are higher. Data that’s stored in the archive tier is also not immediately available for retrieval. The blobs first have to be “rehydrated,” and that can take up to 15 hours for blobs that hold less than 50GB of data. It’s worth noting, though, that competing cold storage services like Amazon Glacier and Google Nearline have been around for years now.
2017. Google Cloud Platform gets a cheaper, lower-performance networking tier
Google is giving its Cloud Platform users a new, cheaper networking option. Developers can now choose between a premium tier, which routes traffic to their users over Google’s own high-speed networks for as long as possible to minimize hops and distance, and a standard tier, which routes traffic over the public internet, with all the potential slowdowns and extra hops this entails. Pricing for the standard tier is 24-33 percent lower than for the premium tier in North America and Europe. Google uses different pricing models for these two tiers, though. Prices for premium traffic are based on the traffic’s source and destination, so you pay for the distance your traffic travels over Google’s network, while the standard tier’s prices are only based on where the source is.
2017. Skytap raises $45M for its enterprise cloud
Enterprise cloud provider Skytap has raised a $45 million Series E funding round led by Goldman Sachs Private Capital Investing, with participation from existing investors. This brings Skytap’s total funding to date to $109 million. Skytap is going up against all of the major cloud service providers like AWS, Azure and Google Cloud Platform, but Skytap CEO Thor Culverhouse argues that his competitors’ clouds were built for greenfield applications while Skytap was specifically built to serve Fortune 500 companies that want to slowly modernize their enterprise applications. The idea, of course, is for these enterprises to then stay on the Skytap cloud as they adopt new development paradigms like microservices and technologies like software containers.
2017. GoDaddy kills off its AWS-style cloud services
Web hosting and domain registration business GoDaddy is making some moves to reorganize its business. It will sell its European PlusServer business to London-based private equity firm BC Partners for $456 million. GoDaddy is also shutting down Cloud Servers, a business it launched in March 2016 as an AWS-style service for building, testing and scaling cloud solutions on GoDaddy’s infrastructure. The idea was to tap into the recent vogue for cloud services, capturing new business from existing customers who were considering or starting to make early moves into cloud-based apps and services, before they made the leap to AWS, Google or Microsoft. Now GoDaddy will focus on competing with Wix, Jimdo and other alternatives.
2017. Microsoft launches new tools to help enterprises move to its Azure cloud
Microsoft says that 80 percent of the companies it talks to want to use a hybrid cloud approach, and to help them move to its Azure cloud platform, the company is launching a number of new tools. The most important of these is the new Cloud Migration Assessment service. With this, companies can scan their existing IT infrastructure and get an estimate for what it would cost to move these services to Azure (and how much they could save in the process). Azure users can now also get a discount for moving their Windows Server licenses (with Software Assurance) to Azure. This new Azure Hybrid Use Benefit can save them up to 40 percent and is obviously meant to make it more attractive for existing Windows Server users to move their workloads to the cloud. For those who want to make that move, the Azure Site Recovery (ASR) tool is also getting a minor update. This service is mostly meant to help enterprises orchestrate their disaster recovery plans; however, it can also be used to migrate existing virtual machines to Azure.