Implementing an Azure Data Solution (DP-203)

Duration: 4 days

Teaching method: live classroom and virtual classroom

Objectives | Audience | Prerequisites | Topics | Schedule

In this course, candidates design and implement data platform technologies in solutions that meet business and technical requirements. This can include on-premises, cloud, and hybrid data scenarios that incorporate relational, NoSQL, or data warehouse data. Candidates also learn how to design processing architectures using a range of technologies for both streaming and batch data.

Candidates also explore how to design data security, including data access, data policies, and standards. They define, design, and implement Azure data solutions, covering the optimization, availability, and disaster recovery of big data, batch processing, and streaming data solutions. Finally, they learn to manage and troubleshoot those solutions.

OBJECTIVES

This course prepares you for exam DP-203: Data Engineering on Microsoft Azure. Passing this exam earns you the role-based Microsoft Certified: Azure Data Engineer Associate certification.

AUDIENCE

Azure Data Engineers help stakeholders understand the data through exploration, and they build and maintain secure and compliant data processing pipelines by using different tools and techniques. These professionals use various Azure data services and languages to store and produce cleansed and enhanced datasets for analysis.

Azure Data Engineers also help ensure that data pipelines and data stores are high-performing, efficient, organized, and reliable, given a set of business requirements and constraints. They deal with unanticipated issues swiftly and minimize data loss. They also design, implement, monitor, and optimize data platforms to meet data pipeline needs.

PREREQUISITES

Microsoft Azure Fundamentals (AZ-900) or equivalent knowledge.
Candidates for this course (and the related exam) should have subject matter expertise integrating, transforming, and consolidating data from various structured and unstructured data systems into a structure that is suitable for building analytics solutions. In addition to their professional experience, candidates who take this course should have technical knowledge equivalent to the DP-900 Microsoft Azure Data Fundamentals course.

TOPICS

  • Module 1: Azure for the Data Engineer
    • Explain the evolving world of data
    • Survey the services in the Azure Data Platform
    • Identify the tasks that are performed by a Data Engineer
    • Describe the use cases for the cloud in a Case Study
  • Module 2: Working with Data Storage
    • Choose a data storage approach in Azure
    • Create an Azure Storage Account
    • Explain Azure Data Lake storage
    • Upload data into Azure Data Lake
  • Module 3: Enabling Team Based Data Science with Azure Databricks
    • Explain Azure Databricks and Machine Learning Platforms
    • Describe the Team Data Science Process
    • Provision Azure Databricks and workspaces
    • Perform data preparation tasks
  • Module 4: Building Globally Distributed Databases with Cosmos DB
    • Create an Azure Cosmos DB database built to scale
    • Insert and query data in your Azure Cosmos DB database
    • Provision a .NET Core app for Cosmos DB in Visual Studio Code
    • Distribute your data globally with Azure Cosmos DB
  • Module 5: Working with Relational Data Stores in the Cloud
    • SQL Database and SQL Data Warehouse
    • Provision an Azure SQL database to store data
    • Provision and load data into Azure SQL Data Warehouse
  • Module 6: Performing Real-Time Analytics with Stream Analytics
    • Explain data streams and event processing
    • Query streaming data using Stream Analytics
    • Process data with Azure Blob storage and Stream Analytics
    • Process data with Event Hubs and Stream Analytics
  • Module 7: Orchestrating Data Movement with Azure Data Factory
    • Explain how Azure Data Factory works
    • Create Linked Services and datasets
    • Create pipelines and activities
    • Azure Data Factory pipeline execution and triggers
  • Module 8: Securing Azure Data Platforms
    • Configuring Network Security
    • Configuring Authentication
    • Configuring Authorization
    • Auditing Security
  • Module 9: Monitoring and Troubleshooting Data Storage and Processing
    • Data Engineering troubleshooting approach
    • Azure Monitoring Capabilities
    • Troubleshoot common data issues
    • Troubleshoot common data processing issues
  • Module 10: Integrating and Optimizing Data Platforms
    • Integrate data platforms
    • Optimize data stores
    • Optimize streaming data
    • Manage disaster recovery
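To give a flavor of the querying covered in Module 4, a Cosmos DB SQL API query might look like the sketch below. The container and property names are illustrative, not taken from the course materials:

```sql
-- Hypothetical 'products' container: fetch electronics, priciest first
SELECT p.id, p.name, p.price
FROM products p
WHERE p.category = "electronics"
ORDER BY p.price DESC
```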
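Module 6 centers on the Stream Analytics Query Language, a SQL dialect with windowing functions. A minimal sketch, assuming hypothetical input and output names, could aggregate sensor events over tumbling windows:

```sql
-- Average temperature per device over 30-second tumbling windows
-- [sensor-input] and [temperature-output] are hypothetical aliases
SELECT
    DeviceId,
    AVG(Temperature) AS AvgTemperature,
    System.Timestamp() AS WindowEnd
INTO
    [temperature-output]
FROM
    [sensor-input] TIMESTAMP BY EventTime
GROUP BY
    DeviceId,
    TumblingWindow(second, 30)
```

`TIMESTAMP BY` tells the engine to window on the event's own timestamp rather than its arrival time, which matters whenever events can arrive out of order.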
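The pipelines and activities of Module 7 are defined as JSON in Azure Data Factory. As a rough sketch, a pipeline with a single Copy activity moving delimited text from Blob storage into Azure SQL could look like this (the pipeline and dataset names are hypothetical):

```json
{
  "name": "CopyBlobToSqlPipeline",
  "properties": {
    "activities": [
      {
        "name": "CopySalesData",
        "type": "Copy",
        "inputs": [ { "referenceName": "BlobSalesDataset", "type": "DatasetReference" } ],
        "outputs": [ { "referenceName": "SqlSalesDataset", "type": "DatasetReference" } ],
        "typeProperties": {
          "source": { "type": "DelimitedTextSource" },
          "sink": { "type": "AzureSqlSink" }
        }
      }
    ]
  }
}
```

The datasets referenced here would in turn point at linked services holding the connection details, which is the layering the module's "Create Linked Services and datasets" topic covers.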

SCHEDULE
