Azure Cosmos DB – Multi-model, Globally Distributed Database Service


Introduction to Azure Cosmos DB

Azure Cosmos DB is a multi-model database service by design that can easily be distributed globally. The database engine supports storing data as documents, key-value pairs and even graphs. Scaling Cosmos DB across any number of available locations is extremely easy – just press the appropriate button in the Azure portal. End users of modern web-based applications expect low latency, and with Cosmos DB you can store data closer to them. A database can be distributed and available in 50+ regions, which creates enormous opportunities, and regions can be added or removed at any point in the application lifecycle.

Based on the above, global distribution of data with Cosmos DB provides a set of benefits, such as:

  • support for the NoSQL approach to data management,
  • easy management of massive amounts of data (read/write operations close to end users),
  • simple integration with mobile, web and even IoT solutions,
  • low latency,
  • high throughput and availability.

For development purposes, Microsoft provides the Azure Cosmos DB Emulator, whose functionality is close to the native cloud version of Cosmos DB. Developers can create and query JSON documents, work with collections, and test stored procedures or triggers at the database level. Keep in mind that some features, among others multi-region replication and scalability, are not fully supported locally.

Later in this post I will explain the supported data models in more detail. All of them build on the main features provided by Azure Cosmos DB.

Supported data models

1. SQL API

This API provides data-manipulation capabilities to users who are familiar with SQL query standards. Data is stored as JSON, but it can be queried easily with SQL-like queries. Communication is handled by HTTP/HTTPS endpoints that process the requests. Microsoft provides dedicated SDKs for this API for the most popular programming languages, such as .NET, Java, Python and JavaScript. Developers can load the dedicated library into their application and very quickly start reading from and writing to Cosmos DB directly. A sample flow is shown below.
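As a sketch of the idea (the documents, field names and query below are made up for illustration), the SQL string is what you would send to the SQL API, and the list comprehension shows the equivalent filter and projection applied locally:

```python
import json

# Hypothetical JSON documents, as they might live in a Cosmos DB container:
docs = [
    {"id": "1", "name": "Alice", "city": "Poznan"},
    {"id": "2", "name": "Bob", "city": "Helsinki"},
]

# The SQL-like query you would send to the SQL API:
query = "SELECT c.name FROM c WHERE c.city = 'Poznan'"

# The same filter and projection, applied locally for illustration:
result = [{"name": d["name"]} for d in docs if d["city"] == "Poznan"]
print(json.dumps(result))  # [{"name": "Alice"}]
```

With the Python SDK (the azure-cosmos package), the same query string would be passed to `container.query_items(query=...)` against a live account or the emulator.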

 

2. MongoDB API

Existing MongoDB instances can be migrated to Azure Cosmos DB without huge effort, because the two are wire-protocol compatible. When a new environment is created, switching between a native MongoDB instance and a Cosmos DB instance (via the MongoDB API) comes down to changing a connection string in the application. Existing drivers written for MongoDB applications are fully supported, and by design all properties within documents are automatically indexed.

Let's check what simple queries look like against the same document collection as in the previous point:
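As a sketch with made-up field names: the filter document below is what you would pass to `find()` (e.g. `collection.find({"city": "Poznan"})` with pymongo), and the plain-Python loop shows the matching it performs:

```python
# A MongoDB-style filter document, as you would pass it to find():
filter_doc = {"city": "Poznan"}

# Hypothetical documents in the collection:
docs = [
    {"_id": "1", "name": "Alice", "city": "Poznan"},
    {"_id": "2", "name": "Bob", "city": "Helsinki"},
]

# Locally, find() with that filter is equivalent to matching every
# key/value pair of the filter document against each document:
matches = [d for d in docs if all(d.get(k) == v for k, v in filter_doc.items())]
print([d["name"] for d in matches])  # ['Alice']
```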

 

 

As a result, the matching sub-JSON data will be returned. If the query doesn't match anything, an empty result will be sent as the response.

3. Table API

This API can be used by applications written natively against Azure Table storage. Of course, Cosmos DB provides some premium capabilities compared to Table storage, e.g. high availability and global distribution. Migrating an application to the new data source doesn't require code changes. Users can query data in several ways, and a lot of SDKs are provided out of the box. The sample below shows how to query data with the .NET SDK and LINQ; during execution, the LINQ query is translated into an OData query expression.
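As a sketch (the entity and property names are made up): a LINQ predicate such as `e.City == "Poznan"` is translated into the OData filter string shown below before being sent to the service, and the list comprehension mirrors the filtering the service performs:

```python
# Table entities are flat property bags identified by PartitionKey + RowKey:
entities = [
    {"PartitionKey": "people", "RowKey": "1", "City": "Poznan"},
    {"PartitionKey": "people", "RowKey": "2", "City": "Helsinki"},
]

# The OData filter expression a LINQ query like `where e.City == "Poznan"`
# is translated into:
odata_filter = "City eq 'Poznan'"

# The equivalent filter applied locally:
matches = [e for e in entities if e["City"] == "Poznan"]
print([e["RowKey"] for e in matches])  # ['1']
```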

 

4. Cassandra API

The Azure Cosmos DB Cassandra API is a dedicated data store for applications created for Apache Cassandra. Users interact with data via CQL (the Cassandra Query Language). In many cases, switching the data source from Apache Cassandra to Azure Cosmos DB's Cassandra API is just a matter of changing a connection string in the application. From the code perspective, integration with Cassandra is realized via a dedicated SDK (NuGet -> Install-Package CassandraCSharpDriver). Sample code for connecting to a Cassandra cluster from a .NET application is presented below.
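As a sketch with hypothetical keyspace and table names, these are the kinds of CQL statements you would execute against the Cassandra API; the commented lines show one possible connection using the DataStax Python driver (cassandra-driver), which requires a live cluster or a Cosmos DB Cassandra endpoint and is therefore only sketched:

```python
# CQL statements you would run against the Cassandra API
# (keyspace and table names are hypothetical):
create_table = (
    "CREATE TABLE IF NOT EXISTS app.users (id int PRIMARY KEY, name text)"
)
insert_row = "INSERT INTO app.users (id, name) VALUES (1, 'Alice')"
select_row = "SELECT name FROM app.users WHERE id = 1"

# With the DataStax Python driver the flow would look roughly like:
# from cassandra.cluster import Cluster
# session = Cluster(["<account>.cassandra.cosmos.azure.com"], port=10350).connect()
# session.execute(create_table)
# session.execute(insert_row)
# rows = session.execute(select_row)
print(select_row)
```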

 

5. Gremlin API

The last API provided by Azure Cosmos DB (as of the day of writing this article 😉) is the Gremlin API. This interface can be used for storing and operating on graph data. The API natively supports graph modelling and traversal: we can query graphs with millisecond latency and easily evolve the graph structure and schema. For queries we use Gremlin, the graph traversal language from Apache TinkerPop. The step-by-step process, from NuGet package installation to running the first query, is shown below.
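As a sketch over a made-up graph: the strings below are Gremlin traversals as you would send them through the Gremlin API, and the list comprehension shows what the query traversal evaluates to on this tiny graph:

```python
# Gremlin traversals (vertex labels and properties are made up):
add_alice = "g.addV('person').property('id', 'alice').property('city', 'Poznan')"
find_poznan = "g.V().hasLabel('person').has('city', 'Poznan').values('id')"

# A tiny in-memory stand-in for the graph's vertices:
vertices = [
    {"label": "person", "id": "alice", "city": "Poznan"},
    {"label": "person", "id": "bob", "city": "Helsinki"},
]

# What the find_poznan traversal evaluates to on this graph:
result = [v["id"] for v in vertices
          if v["label"] == "person" and v["city"] == "Poznan"]
print(result)  # ['alice']
```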

Summary

From the developer's perspective, Azure Cosmos DB is a very interesting service. The huge range of available APIs allows the database to be used in various scenarios. Below you can find information from the official Azure Cosmos DB site about the availability of APIs per programming language.

Source: Azure Cosmos DB Documentation

***

This post is the last part in our Azure DevOps series. Check out the previous posts:

#1: Azure DevOps Services – cloud based platform for collaborating on code development from Microsoft

#2: Web application development with .NET Core and Azure DevOps

 









Global Azure Bootcamp in Poland


Global Azure Bootcamp is organized all over the world, this time in 5 cities in Poland, including Poznań, where our Polish office is located and where we hire Azure geeks like Sławek Stanek, who co-organizes the event – so we are more than happy to support the bootcamp!

Come and learn more about the public cloud: Azure Active Directory, DevOps on Azure, Artificial Intelligence & Big Data, Power BI in Azure and SAP on Azure!

Register here: https://gabc2019poznan.evenea.pl/

Venue: Centrum Wykładowe Politechniki Poznańskiej – Sala CW8, Piotrowo 2, Poznań

Date: 27.04.2019

Time: 09:00 – 16:00









Web application development with .NET Core and Azure DevOps


Initial setup of the environment

In my previous post, “Azure DevOps Services – cloud based platform for collaborating on code development from Microsoft”, I presented basic information about Azure DevOps capabilities. Now I will focus on the details: I will build a full CI/CD pipeline for automated builds and deployments of a sample .NET Core MVC web application to an Azure App Service.

I assume that, based on my previous post, a new project has been created in Azure DevOps and you have access to it :). Before the fun begins, let's set up a service connection between our CI/CD server and Azure. Navigate to the Azure portal; service connections can be scoped to the subscription or resource group level, and in our sample case we will scope it to the resource group. First, create a new resource group in Azure:


OK, the resource group is ready. Now navigate to the settings panel in Azure DevOps, select the “Service connections” sub-category, and add a new “Azure Resource Manager” service connection:


After confirmation, the provided values will be validated. If everything was entered correctly, the newly created service connection will be saved and ready to use in the CI/CD pipeline configuration.

The first important thing is ready; now let's focus on access to the Azure Repos Git repository. In your Azure DevOps user profile you can easily add an existing SSH public key: navigate to the “SSH public keys” panel in your profile and add your key:


Next, clone our new, empty repository to the local computer. In the “Repos” section of the Azure DevOps portal you can find the “Clone” button, which shows a direct link for cloning. Copy the link to the clipboard, open your local Git Bash console and run the “git clone” command:

       


One warning occurs, but that's OK – our repository is new and empty 🙂 Our environment is now ready, so next we will create a new project and commit its code to our repo.

       

Creation of a development project and repository initialization

I will use Visual Studio 2017, but we are not limited to this specific version. When the IDE is ready, navigate to the project creation action, then select “Cloud” and “ASP.NET Core Web Application”:


In the next step, we can specify the framework version and the target template. I will base the project on ASP.NET Core 2.1 and the MVC template:


After a few seconds, a new sample ASP.NET Core MVC project will be generated. For our purposes this is enough to start the fun with CI/CD pipeline configuration on the Azure DevOps side; this application will be used as a working example, since dealing with the code and implementing magic stuff inside the application is not the purpose of this post. When the project is ready, you can run it locally and, after testing, commit the code to our repository:


Building the pipeline (CI) with the visual designer

Our source code is in the repository, so let's create a build pipeline for our solution. Navigate to the “Pipelines” section, then “Builds”, and click “Create new”. In this example we will create our build pipeline from scratch with the visual designer. If the pipeline were committed to the repo as a YAML file (pipeline as code), we could also load the ready pipeline from that file.

       


The full process contains three steps. At the beginning, we must specify where our code is located; the default option is the internal Git repository of our Azure DevOps project, which is correct in our case. In the second step, we can use one of the pre-defined build templates. Our application is based on the .NET Core framework, so I will select the ASP.NET Core template. A pre-defined agent job will then be created for us, with steps such as restoring NuGet packages, building the solution, running tests (our app doesn't have any yet) and finally publishing the solution as a package that will be used in the release pipeline. Here we can also export our pipeline to a YAML file:

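As a rough sketch, the same pipeline expressed as YAML could look like this (the task versions, trigger branch and agent image are assumptions, not an export of the actual pipeline):

```yaml
# azure-pipelines.yml – rough YAML equivalent of the designer-built CI pipeline
trigger:
  - master            # continuous integration on commits to master

pool:
  vmImage: 'ubuntu-latest'

steps:
  - task: DotNetCoreCLI@2
    displayName: 'Restore NuGet packages'
    inputs:
      command: 'restore'
  - task: DotNetCoreCLI@2
    displayName: 'Build solution'
    inputs:
      command: 'build'
  - task: DotNetCoreCLI@2
    displayName: 'Publish web app as a package'
    inputs:
      command: 'publish'
      publishWebProjects: true
      arguments: '--output $(Build.ArtifactStagingDirectory)'
  - task: PublishBuildArtifacts@1
    displayName: 'Publish build artifacts for the release pipeline'
```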

On the “Triggers” tab, the “Continuous integration” feature is not activated by default, so let's enable it manually:

       


After saving all changes in the pipeline, our CI build pipeline is ready. Below you can see a sample run triggered by committing new changes to the repository: the new CI build runs immediately after changes to the application code are committed, and email notifications are sent to all subscribers. In the reason field we can see “Continuous integration”. The icons in the email and in the build panel in the portal make the status of the build easy to see:


Release pipeline (CD)

Now we will take care of the automatic release of our application to an App Service on Azure. At the beginning, we need to create an App Service instance in the resource group that was selected for the service connection:

       


OK, when the deployment is finished, we can start creating the release pipeline in Azure DevOps. First we have to decide which template from the pre-defined list to use; for our solution, “Azure App Service deployment” is a good choice:

       


Next, we must specify the source of the artifacts. In our case, we will use the results of the build pipeline created in the previous steps:


Stages in the pipeline can also be renamed. This is good practice in huge pipelines, as it lets us observe what happens in each stage:


In the general settings for our new stage agent, we must fill in three very important parameters: first, the service connection the release pipeline will use; second, the type of application deployed to the App Service; and third, the name of the existing App Service instance, which we already created at the beginning of this part of the post.

       

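As a rough sketch, the same deployment step expressed in YAML would use the Azure App Service deployment task with those three parameters (all names below are placeholders):

```yaml
# Rough YAML equivalent of the release stage (placeholder names):
steps:
  - task: AzureWebApp@1
    displayName: 'Deploy to Azure App Service'
    inputs:
      azureSubscription: '<service-connection-name>'   # the service connection
      appType: 'webApp'                                # type of the application
      appName: '<app-service-instance-name>'           # existing App Service
      package: '$(System.DefaultWorkingDirectory)/**/*.zip'
```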

Next in the configuration, we select the location of the published application package. In our case it looks like this:


Similarly to the CI pipeline, we need to enable the trigger in our newly created CD pipeline. To do that, click the lightning icon on the artifact source and enable the continuous deployment trigger for the build:


That was the last step of the configuration. It looks like we are ready to commit some changes and check the final functionality 🙂

       

Final check

Let's test our new full CI/CD pipeline:

1. Add new code to the application, in the controller and in the view.

2. Commit the changes to the repository.

3. The CI build pipeline starts automatically and completes.

4. An email arrives confirming a successful build.

5. After the successful CI build, the release (CD) pipeline starts automatically and the deployment completes.

6. The changes have been deployed to the Azure App Service and are visible.

As we can see, configuring a full CI/CD pipeline for an ASP.NET Core MVC web application is pretty easy. However, you need some knowledge of how to configure all of this on the Azure DevOps side.

We hope you enjoy this post and that this step-by-step guide will be useful in your future work with Azure DevOps!

***

This post is the 2nd part in our Azure DevOps series. Check out the other posts:

#1: Azure DevOps Services – cloud based platform for collaborating on code development from Microsoft

#3: Azure Cosmos DB – Multi-model, Globally Distributed Database Service
