Azure Cosmos DB Conf

Live Stream Schedule

Azure Cosmos DB Conf - Keynote

Join Kirill Gavrylyuk, Director of Product Management for Azure Cosmos DB, to kick off Azure Cosmos DB Conf. Kirill will highlight key Cosmos DB capabilities and recent announcements. He will also be joined by notable guests who will share their stories of using Cosmos DB: Anand Krishnamurthy, Group Engineering Manager, Microsoft Teams; Guillermo Rauch, CEO of Vercel; and Meg Holzinger, Software Design Engineer at Yammer.

Kirill Gavrylyuk

Online and Real-Time Database Migration to Azure Cosmos DB with Striim

There is significant demand for data movement and continuous data integration as workloads shift to the cloud. Modernizing databases by offloading workloads to the cloud requires building real-time data pipelines from legacy systems. The Striim® platform is an enterprise-grade cloud data integration solution that continuously ingests, processes, and delivers high volumes of streaming data from diverse sources, on-premises or in the cloud. Cloud architects, data architects, and data engineers can use Striim to move data into Azure Cosmos DB in a consumable form, quickly and with sub-second latency, to run critical transactional and analytical workloads in Microsoft Azure. We will provide a live demo showing how Striim is used to migrate enterprise databases to Azure continuously, in real time, and with no downtime. In this session you will learn how to:
- Prepare Azure Cosmos DB for data integration, with automatic creation of target schemas and tables that reflect the source database
- Set up in-flight transformations right in the GUI to minimize end-to-end latency and enable real-time analytics and operational reporting
- Deploy zero-downtime migrations to Azure Cosmos DB from existing enterprise databases anywhere
- Increase IT productivity and reduce cost of ownership by integrating and enriching data from multiple sources

Andrew Liu

Alok Pareek

Data and Different System Architectures to Build a Truly Unified View

Digital enterprises need to develop a 360-degree view of their customers, yet the technologies available to help them do that come with inherent capabilities and limitations. In this session, Sandeep Nawathe will cover a new approach, one that unifies disparate data sources and system architectures to develop a Unified Profile that is actionable in real time.

Sandeep Nawathe

Bridge to Azure: Streaming data into Cosmos DB with Confluent

Confluent enables big data pipelines that automate real-time data movement across any systems, applications, and architectures at massive scale, empowering organizations to aggregate, transform, and move data from on-premises legacy services, private clouds, or public clouds into Azure. Confluent and Microsoft's Commercial Software Engineering group have worked together to build a self-managed connector: the Azure Cosmos DB Connector provides a new integration capability between Azure Cosmos DB and Confluent Platform, and Microsoft provides enterprise support for it. Want to know more about your options for Bridge to Cloud with Confluent? During this session, you will learn more about:
- Getting started with Confluent on Azure
- Design patterns for Bridge to Cloud with Confluent
- Quickly installing the Azure Blob Storage Connector without having to spin up additional infrastructure
- Tips and tricks for installing and configuring the Cosmos DB self-managed connector

Alicia Moniz

Data Modeling and Partitioning in Azure Cosmos DB

For many newcomers to Cosmos DB, the learning process starts with data modeling and partitioning. How should you structure your model? When should you combine multiple entity types in a single container? Should you de-normalize your entities? What's the best partition key for your data? In this session, we discuss the key strategies for modeling and partitioning data effectively in Cosmos DB. Using a real-world NoSQL example based on the AdventureWorks relational database, we explore the key Cosmos DB concepts of request units (RUs), partitioning, and data modeling, and how understanding them guides the path to a data model that yields the best performance and scalability. Attend this session to acquire the critical skills you'll need to design the optimal database for Cosmos DB.
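
To give a flavor of the decisions the session covers, here is a minimal sketch using the @azure/cosmos JavaScript SDK; the AdventureWorks-style names are illustrative, not taken from the talk:

```typescript
// Hypothetical sketch with the @azure/cosmos SDK; not the session's own code.
import { CosmosClient } from "@azure/cosmos";

const client = new CosmosClient(process.env.COSMOS_CONNECTION_STRING!);

async function createSalesContainer(): Promise<void> {
  const { database } = await client.databases.createIfNotExists({ id: "AdventureWorks" });

  // One container holding multiple entity types, partitioned on /customerId so a
  // customer and their orders share a logical partition and can be read together
  // in a single, low-RU query.
  const { container } = await database.containers.createIfNotExists({
    id: "customers",
    partitionKey: { paths: ["/customerId"] },
  });

  // De-normalized documents; a "type" property discriminates the entity kinds.
  await container.items.create({
    id: "cust-1", customerId: "cust-1", type: "customer", name: "Jane Doe",
  });
  await container.items.create({
    id: "order-100", customerId: "cust-1", type: "salesOrder", total: 129.99,
  });
}

createSalesContainer().catch(console.error);
```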

Leonard Lobel

Developing HTAP Analytical Solutions with Azure Cosmos DB and Azure Synapse Analytics

Using Azure Synapse Link for Azure Cosmos DB, we can enable cloud-native hybrid transactional and analytical processing (HTAP) on our Cosmos DB data. This enables data engineers, analysts, and data scientists to perform advanced analytics on data living in Cosmos DB without affecting the performance of transactional workloads inside Cosmos DB. In this session, I'll show you how you can use Azure Synapse Link for Cosmos DB to get near real-time insights into operational data, with no ETL required and no impact on operational workloads. By the end of this session, you will know what Synapse Link for Cosmos DB is, how it works with the Cosmos DB analytical store, how to enable Synapse Link for your Cosmos DB containers, how to analyze your data using Synapse Apache Spark or SQL Serverless, and in what situations to use Synapse Link for Cosmos DB.
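
As a taste of the setup involved, here is a hedged sketch of creating a container with the analytical store enabled so Synapse Link can sync it; it assumes an @azure/cosmos SDK version that exposes analyticalStorageTtl, an account with Synapse Link already enabled, and illustrative names:

```typescript
// Hypothetical sketch; assumes an @azure/cosmos version exposing analyticalStorageTtl
// and an account that already has Synapse Link enabled.
import { CosmosClient } from "@azure/cosmos";

const client = new CosmosClient(process.env.COSMOS_CONNECTION_STRING!);

async function createHtapContainer(): Promise<void> {
  const { database } = await client.databases.createIfNotExists({ id: "Operational" });
  await database.containers.createIfNotExists({
    id: "orders",
    partitionKey: { paths: ["/customerId"] },
    // -1 retains analytical-store data indefinitely; a positive value is a TTL in
    // seconds. Once enabled, Synapse Spark or SQL Serverless can query the
    // container's analytical store without touching the transactional row store.
    analyticalStorageTtl: -1,
  });
}

createHtapContainer().catch(console.error);
```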

Will Velida

Cosmos DB for the SQL Professional

Knowing your data is crucial when it comes to design, and SQL professionals have this skill well refined. But knowing the implementation details is also important. While there are similarities between SQL Server and Cosmos DB, there are also some large differences, some of which may surprise the SQL professional. An implementation oversight can bite you hard at the wrong time, when correcting it may be costly and problematic. Come along to this session and find out why a primary key is not what you think, why tables are not equal to containers, and a few other details that might surprise you. This short and sharp session should set you on your way to becoming as good at Cosmos DB as you are at SQL.
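
To preview one of those surprises, here is a minimal sketch (names illustrative) of the primary-key point: in Cosmos DB, an item's id only needs to be unique within its logical partition, so the effective primary key is the (partition key, id) pair:

```typescript
// Hypothetical sketch: "id" is only unique per logical partition in Cosmos DB,
// so the effective primary key is (partition key, id).
import { CosmosClient } from "@azure/cosmos";

const client = new CosmosClient(process.env.COSMOS_CONNECTION_STRING!);

async function primaryKeySurprise(): Promise<void> {
  const { database } = await client.databases.createIfNotExists({ id: "Demo" });
  const { container } = await database.containers.createIfNotExists({
    id: "things",
    partitionKey: { paths: ["/tenant"] },
  });

  // Same "id", different partition key values: both inserts succeed.
  await container.items.create({ id: "42", tenant: "contoso", note: "first" });
  await container.items.create({ id: "42", tenant: "fabrikam", note: "second" });

  // Repeating either insert with the same (tenant, id) pair throws a 409 Conflict.
}

primaryKeySurprise().catch(console.error);
```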

Martin Catherall

GraphQL over Cosmos DB: where to start

The popularity of GraphQL can't be denied, and it fits really nicely with a schemaless data model like the one we can achieve with Cosmos DB. But how do we get started with it? What options do we have when it comes to running GraphQL in Azure? How can we get rich type safety between our GraphQL schema and Cosmos DB models? In this session we'll go from zero to hero and get you running your first GraphQL server underpinned by Cosmos DB, ready for your next application.
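
As a flavor of that zero-to-hero path, here is one possible minimal setup, not necessarily the stack used in the session: Apollo Server resolving a query against Cosmos DB, with a shared TypeScript interface providing the type safety between schema and model (all names are illustrative):

```typescript
// Hypothetical sketch: Apollo Server resolvers backed by @azure/cosmos, with a
// shared TypeScript interface tying the GraphQL schema to the Cosmos DB model.
import { ApolloServer, gql } from "apollo-server";
import { CosmosClient } from "@azure/cosmos";

const client = new CosmosClient(process.env.COSMOS_CONNECTION_STRING!);
const container = client.database("Trivia").container("questions");

// One type shared by the resolver and the stored documents.
interface Question {
  id: string;
  question: string;
}

const typeDefs = gql`
  type Question {
    id: ID!
    question: String!
  }
  type Query {
    questions: [Question!]!
  }
`;

const resolvers = {
  Query: {
    questions: async (): Promise<Question[]> => {
      const { resources } = await container.items
        .query<Question>("SELECT c.id, c.question FROM c")
        .fetchAll();
      return resources;
    },
  },
};

new ApolloServer({ typeDefs, resolvers })
  .listen()
  .then(({ url }) => console.log(`GraphQL server ready at ${url}`));
```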

Aaron Powell

Real-Time Scoring to Improve Personalization with Adobe Experience Platform

For an insurance company, we generated real-time sales predictions for anonymous website visitors using Adobe Experience Platform. We discuss how we used Adobe Experience Platform and Data Science Workspace services to give brands the ability to use ML models to score customer behavior in real time, personalizing and contextualizing experiences in order to increase conversions.

Kalaiyarasan Venkatesan

Design and implementation of Cosmos DB Change Feed-centric architecture

This session presents real-world development cases: designs based on the Lambda architecture using the Cosmos DB change feed, and implementation patterns using Azure serverless technologies. It will be delivered by two Microsoft MVPs for Azure, drawing on their hands-on experience. (Presented in Japanese.)
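
For readers unfamiliar with the pattern, one common serverless way to consume the change feed, which may or may not match the presenters' implementations, is an Azure Functions Cosmos DB trigger; binding and container names below are illustrative:

```typescript
// Hypothetical sketch of the speed layer: an Azure Functions Cosmos DB trigger
// consuming the change feed. The trigger is configured in function.json, e.g.:
//
// {
//   "bindings": [{
//     "type": "cosmosDBTrigger",
//     "name": "documents",
//     "direction": "in",
//     "connectionStringSetting": "CosmosConnection",
//     "databaseName": "Operational",
//     "collectionName": "orders",
//     "leaseCollectionName": "leases",
//     "createLeaseCollectionIfNotExists": true
//   }]
// }
import { AzureFunction, Context } from "@azure/functions";

const onChangeFeed: AzureFunction = async (
  context: Context,
  documents: { id: string }[]
): Promise<void> => {
  // Each invocation delivers a batch of created or updated documents.
  for (const doc of documents) {
    context.log(`Change feed delivered document ${doc.id}`);
    // e.g. project into a materialized view, or forward to Event Hubs.
  }
};

export default onChangeFeed;
```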

Kazuyuki Miyake

Tatsuro Shibamura

Modern infrastructure for modern apps with Cosmos DB, AKS and Pulumi

Building scalable applications begins with adopting scalable architecture patterns in Azure. Pulumi lets you define and deploy resources in Azure using .NET, Node.js, Python, and Go. The next-generation Azure provider gives you access to 100% of Azure, with same-day support for all new features. Join Azure MVP Mikhail Shilkov as he shows you how to build a foundation for your modern apps with Cosmos DB, AKS, and Pulumi.
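
As a hedged sketch of what such a program can look like with Pulumi's azure-native provider (resource and region names are illustrative, not from the session):

```typescript
// Hypothetical Pulumi program using the next-generation azure-native provider.
import * as pulumi from "@pulumi/pulumi";
import * as azure from "@pulumi/azure-native";

const rg = new azure.resources.ResourceGroup("modern-apps");

const account = new azure.documentdb.DatabaseAccount("app-cosmos", {
  resourceGroupName: rg.name,
  databaseAccountOfferType: "Standard",
  locations: [{ locationName: "westeurope", failoverPriority: 0 }],
  consistencyPolicy: { defaultConsistencyLevel: "Session" },
});

const db = new azure.documentdb.SqlResourceSqlDatabase("app-db", {
  resourceGroupName: rg.name,
  accountName: account.name,
  resource: { id: "app-db" },
});

// The endpoint can be fed into AKS workloads as configuration.
export const cosmosEndpoint: pulumi.Output<string> = account.documentEndpoint;
```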

Mikhail Shilkov

Cosmos DB + Azure Functions: Serverless database processing

With the native integration between Cosmos DB and Azure Functions, you can create database triggers, input bindings, and output bindings directly from your account. Using Azure Functions and Cosmos DB, you can create and deploy event-driven serverless apps with low-latency access to rich data for a global user base. In this session, we'll explore what it takes to set up a serverless environment capable of performing CRUD operations on a Cosmos DB account, as well as some recommendations and use cases.
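
For example, a minimal sketch of the "C" in CRUD, an HTTP-triggered function writing through a Cosmos DB output binding, might look like the following (binding, database, and container names are illustrative):

```typescript
// Hypothetical sketch: an HTTP-triggered function writing through a Cosmos DB
// output binding. The bindings live in function.json, e.g.:
//
// {
//   "bindings": [
//     { "type": "httpTrigger", "direction": "in", "name": "req", "methods": ["post"] },
//     { "type": "cosmosDB", "direction": "out", "name": "newItem",
//       "connectionStringSetting": "CosmosConnection",
//       "databaseName": "Todo", "collectionName": "items" },
//     { "type": "http", "direction": "out", "name": "res" }
//   ]
// }
import { AzureFunction, Context, HttpRequest } from "@azure/functions";

const createItem: AzureFunction = async (context: Context, req: HttpRequest): Promise<void> => {
  // Whatever is assigned to the output binding is written to the container;
  // Cosmos DB generates an id when none is supplied.
  context.bindings.newItem = { text: req.body?.text ?? "(empty)" };
  context.res = { status: 201, body: "created" };
};

export default createItem;
```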

Luis Beltran

Data modeling and schema design for Cosmos DB

Facilitate migration from an RDBMS to Cosmos DB through denormalization, leveraging the scalability, flexibility, and easy schema evolution of Cosmos DB. In this session, we will demonstrate how data modeling helps harness the power of Cosmos DB, resulting in reduced development time, increased application quality, and lower execution risk across the enterprise. The Hackolade data modeling tool also supports forward- and reverse-engineering use cases with the Core SQL API, the MongoDB API, and the Gremlin API.
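
To illustrate the denormalization being described (this example is ours, not Hackolade's): relationships that would be separate normalized tables in an RDBMS are embedded in a single Cosmos DB document so they can be read in one request:

```typescript
// Hypothetical denormalized document shape for a migrated customer record.
interface CustomerDocument {
  id: string;
  type: "customer";
  name: string;
  // 1:few relationship embedded directly (formerly an Address table).
  addresses: { kind: "home" | "work"; city: string; country: string }[];
  // Summary of a 1:many relationship; unbounded order histories would instead
  // live in their own documents sharing the customer's partition key.
  recentOrders: { orderId: string; total: number }[];
}

const example: CustomerDocument = {
  id: "cust-1",
  type: "customer",
  name: "Jane Doe",
  addresses: [{ kind: "home", city: "Ghent", country: "BE" }],
  recentOrders: [{ orderId: "order-100", total: 129.99 }],
};

console.log(JSON.stringify(example, null, 2));
```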

Pascal Desmarets

Integrating Azure Cosmos DB in your Cloud Solution

When there is a NoSQL requirement in your cloud solution on the Azure platform, there is a managed service available: Azure Cosmos DB, Microsoft's offering for a distributed NoSQL service in the cloud. As a cloud architect, I have designed solutions leveraging Azure Cosmos DB, and I'd like to share my experience in this session. You can create cloud solutions with Azure Cosmos DB by integrating it with other services such as Azure Search, Functions, and Logic Apps. In the session, I will show this integration, combined with real-world use cases and demos.

Steef-Jan Wiggers

Vercel + Cosmos DB: The Journey

In this session I'd like to go through the most important challenges we had to solve at Vercel using Cosmos DB. Over an almost four-year journey we faced a variety of challenges that could be of interest to attendees, such as:
- Tracing operations
- Custom metrics reporting
- Repartitioning migrations
- Custom indexing across partitioned collections
- Many-to-many relationships
- Performance optimizations
- Helper functions
I'll mostly focus on the features we built that have worked well for us, and share the stunning throughput we have achieved with Cosmos DB.

Javi Velasco

Operational Triumph with Cosmos DB

Since its official release in 2017, ASOS has been using Cosmos DB to reimagine our commerce platform. Let's take a look at how Cosmos DB has been vital in conquering our operational data needs while remaining flexible for peak periods such as Black Friday. By combining Cosmos DB and Event Hubs we have developed robust, hassle-free big data ingestion pipelines. We'll finish by showing how we have adopted Data Mesh principles and developed a distributed Lambda architecture.

Gary Strange

Amrish Patel