Amazon Web Services vs Microsoft Azure
Last updated: November 02, 2021
Amazon Web Services provides reliable, on-demand infrastructure to power your applications, from hosted internal applications to SaaS offerings. Scale to meet your application demands, whether you need one server or a large cluster. Leverage scalable database solutions, and use cost-effective storage for any amount of data, any time, anywhere.
Microsoft Azure is an open and flexible cloud platform that enables you to quickly build, deploy and manage applications across a global network of Microsoft-managed datacenters. You can build applications using any language, tool or framework, and you can integrate your public cloud applications with your existing IT environment.
Amazon Web Services vs Microsoft Azure in our news:
2021. Microsoft launches Azure Container Apps, a new serverless container service
Microsoft today announced the preview launch of Azure Container Apps, a new fully managed serverless container service that complements the company’s existing container infrastructure services like the Azure Kubernetes Service (AKS). Microsoft notes that Azure Container Apps was specifically built for microservices, with the ability to quickly scale based on HTTP traffic, events or long-running background jobs. In many ways, it’s probably most like AWS App Runner, one of Amazon’s small fleet of serverless container services, with App Runner also specifically focused on microservices. Google meanwhile also offers a set of container-centric services, including Cloud Run, its serverless platform for running container-based applications.
2020. AWS launches Amazon AppFlow, its new SaaS integration service
AWS launched Amazon AppFlow, a new integration service that makes it easier for developers to transfer data between AWS and SaaS applications like Google Analytics, Marketo, Salesforce, ServiceNow, Slack, Snowflake and Zendesk. As with similar services, such as Microsoft’s Power Automate, developers can trigger these flows based on specific events, at pre-set times or on demand. Unlike some of its competitors, though, AWS is positioning this service more as a data-transfer service than as a way to automate workflows, and, while the data flow can be bi-directional, AWS’s announcement focuses mostly on moving data from SaaS applications into other AWS services for further analysis. For this, AppFlow also includes a number of tools for transforming the data as it moves through the service.
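A flow definition of the kind described above can be sketched with boto3. This is only a sketch: the flow name, Salesforce object and S3 bucket are hypothetical placeholders, and the request shape reflects the AppFlow CreateFlow API as best understood, so verify it against the current boto3 documentation before use.

```python
# Hedged sketch of an on-demand AppFlow flow: Salesforce -> S3.
# All names below are placeholders, not real resources.
flow_request = {
    "flowName": "salesforce-accounts-to-s3",       # hypothetical name
    "triggerConfig": {"triggerType": "OnDemand"},  # event/schedule triggers also exist
    "sourceFlowConfig": {
        "connectorType": "Salesforce",
        "sourceConnectorProperties": {
            "Salesforce": {"object": "Account"}    # placeholder object
        },
    },
    "destinationFlowConfigList": [{
        "connectorType": "S3",
        "destinationConnectorProperties": {
            "S3": {"bucketName": "my-analytics-bucket"}  # placeholder bucket
        },
    }],
    # "Map_all" passes every source field through unchanged; AppFlow's
    # transformation tasks (masking, filtering, etc.) would go here too.
    "tasks": [{"sourceFields": [], "taskType": "Map_all", "taskProperties": {}}],
}

# Requires boto3 and AWS credentials to actually run:
# import boto3
# appflow = boto3.client("appflow")
# appflow.create_flow(**flow_request)
```

The request is kept as a plain dict so the trigger type can be swapped from "OnDemand" to a scheduled or event-based trigger without restructuring the flow.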
2019. Microsoft launched cloud APIs for form and handwriting recognition
Microsoft introduced several new cognitive services on its Azure Machine Learning cloud platform. The first are aimed at companies dealing with documents, forms and office notes containing handwritten text: the Ink Recognizer and Form Recognizer services turn such paper documents into digital text and data. The Conversation Transcription service converts phone conversations into text and attributes each phrase to its speaker. Another new service, Personalizer, provides personalized recommendations for website or online-store visitors based on behavioral signals. In addition, Microsoft introduced a new visual interface for creating machine learning models, so even marketers can experiment with ML: you just load a dataset and specify which parameter you want to predict.
2019. Microsoft launched own Windows Virtual Desktop service
Virtual desktop services have long been provided by Microsoft's numerous cloud partners, and now the company has decided it can offer one itself. The new Windows Virtual Desktop service (now available to companies on the Microsoft Azure cloud platform) hosts Windows, Office and other software licenses in the cloud rather than on employees' computers, and employees work with that software through a virtual desktop. What is the point of this? First, even an old Windows 7 computer can run fast and present a Windows 10 desktop. Second, it is easier for the administrator to create new workplaces, maintain them and ensure security. The service itself is free; you pay only for the additional Azure resources (memory, CPU time) that you consume.
2019. AWS launches fully-managed backup service for business
Amazon’s AWS cloud platform has added a new service, AWS Backup, that lets companies back up their data from various AWS services and from their on-premises apps. To back up on-premises data, businesses can use the AWS Storage Gateway. The service lets users define backup policies and retention periods, including the ability to move backups to cold storage (for EFS data) or delete them completely after a certain time. By default, the data is stored in Amazon S3 buckets. Most of the supported services, except for EFS file systems, already feature the ability to create snapshots; Backup essentially automates that process and creates rules around it, so it’s no surprise that the pricing for Backup is the same as for using those snapshot features (with the exception of file system backup, which has a per-GB charge).
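The policy-and-retention model described above can be sketched as a backup plan for boto3. The vault name and schedule are placeholders, and the lifecycle numbers are illustrative; note that AWS requires deletion to be at least 90 days after the move to cold storage.

```python
# Hedged sketch of an AWS Backup plan. Vault name and schedule are
# placeholders; the Lifecycle block expresses the cold-storage move
# and eventual deletion described in the announcement.
plan = {
    "BackupPlanName": "daily-efs-backups",           # hypothetical name
    "Rules": [{
        "RuleName": "daily",
        "TargetBackupVaultName": "Default",          # placeholder vault
        "ScheduleExpression": "cron(0 5 * * ? *)",   # daily at 05:00 UTC
        "Lifecycle": {
            "MoveToColdStorageAfterDays": 30,        # cold storage (EFS data)
            "DeleteAfterDays": 120,                  # must be >= move + 90 days
        },
    }],
}

# Requires boto3 and AWS credentials to actually run:
# import boto3
# backup = boto3.client("backup")
# backup.create_backup_plan(BackupPlan=plan)
```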
2018. Microsoft Azure gets new high-performance storage options
Microsoft Azure is getting a number of new storage options that mostly focus on use cases where disk performance matters. The first of these is Azure Ultra SSD Managed Disks, which are now in public preview. Microsoft says that these drives will offer “sub-millisecond latency,” which unsurprisingly makes them ideal for workloads where latency matters. Standard SSD Managed Disks are now generally available after only three months in preview. To top things off, all of Azure’s storage tiers (Premium and Standard SSD, as well as Standard HDD) now offer 8, 16 and 32 TB capacities. Also new today is Azure Premium Files, now in preview. This, too, is an SSD-based service. Azure Files itself isn’t new; it offers users access to cloud storage using the standard SMB protocol. The new premium offering promises higher throughput and lower latency for these kinds of SMB operations.
2017. AWS launched browser-based IDE for cloud developers
2017. AWS introduced per-second billing for EC2 instances
Over the last few years, some alternative cloud platforms moved to more flexible billing models (mostly per-minute billing) and now AWS is one-upping many of them by moving to per-second billing for its Linux-based EC2 instances. This new per-second billing model will apply to on-demand, reserved and spot instances, as well as provisioned storage for EBS volumes. Amazon EMR and AWS Batch are also moving to this per-second model. It’s worth noting, though, that there is a one-minute minimum charge per instance and that this doesn’t apply to machines that run Windows or some of the Linux distributions that have their own separate hourly charges.
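The arithmetic behind the change can be sketched as follows. The one-minute minimum from the announcement is applied; the hourly rate used in the example is hypothetical, not a published AWS price.

```python
def ec2_linux_cost(runtime_seconds: int, hourly_rate: float) -> float:
    """Estimated on-demand cost under per-second billing.

    Per the announcement, Linux instances bill by the second with a
    one-minute minimum charge per instance.
    """
    billable_seconds = max(runtime_seconds, 60)  # one-minute minimum
    return billable_seconds * hourly_rate / 3600.0

# With a hypothetical $0.10/hour rate, a 5-minute run bills 300 seconds:
per_second = ec2_linux_cost(300, 0.10)   # about $0.0083
# Under the old hourly model the same run would have billed a full hour: $0.10.
```

The savings are largest for short-lived workloads: a 10-second job bills 60 seconds instead of 3,600.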
2017. AWS offers a virtual machine with over 4TB of memory
Amazon’s AWS launched its largest EC2 machine (in terms of memory size) yet: the x1e.32xlarge instance with a whopping 4.19TB of RAM. Previously, EC2’s largest instance featured just over 2TB of memory. These machines feature quad-socket Intel Xeon processors running at 2.3 GHz, up to 25 Gbps of network bandwidth and two 1,920GB SSDs. There are obviously only a few applications that need this kind of memory. It’s no surprise, then, that these instances are certified to run SAP’s HANA in-memory database and its various tools and that SAP will offer direct support for running these applications on these instances. It’s worth noting that Microsoft Azure’s largest memory-optimized machine currently tops out at just over 2TB and that Google already calls it quits at 416GB of RAM.
2017. Microsoft launched new archival storage option for Azure
Microsoft introduced a new storage option for its Azure cloud computing platform: Azure Archive Blob Storage. This gives developers a cheaper alternative for the long-term storage of large amounts of archival data like logs, raw camera footage, audio recordings, transcripts and medical documents and images. The main difference between the cool and archive tiers is that while archival storage is cheaper, the data retrieval costs are higher. Data that’s stored in the archive tier is also not immediately available for retrieval: the blobs first have to be “rehydrated,” and that can take up to 15 hours for blobs that hold less than 50GB of data. It’s worth noting, though, that competing cold storage services Amazon Glacier and Google Nearline have been around for years now.
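The archive-and-rehydrate lifecycle can be sketched in Python. This assumes the azure-storage-blob v12 SDK (which post-dates this 2017 announcement); the connection string, container and blob names are placeholders, so the SDK calls are shown commented, with a small runnable helper for the archive-status values the service reports during rehydration.

```python
# Hedged sketch of archiving and rehydrating a blob with the
# azure-storage-blob v12 SDK. Names are placeholders.
#
# from azure.storage.blob import BlobClient, StandardBlobTier
# blob = BlobClient.from_connection_string(conn_str, "logs", "2017-10.log")
# blob.set_standard_blob_tier(StandardBlobTier.Archive)  # move to archive tier
# blob.set_standard_blob_tier(StandardBlobTier.Hot)      # start rehydration
#
# While rehydration is in progress (up to ~15 hours for sub-50GB blobs),
# get_blob_properties().archive_status reports a pending state:

def is_rehydrating(archive_status):
    """True while an archived blob is being copied back to an online tier."""
    return archive_status in ("rehydrate-pending-to-hot",
                              "rehydrate-pending-to-cool")
```

A client would poll the blob's properties with this check before attempting to read archived data, since reads fail until rehydration completes.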