Data source connectivity & API integration
Formatting and cleaning source code
About this gig
Professional Azure Data Engineer with 1 year of experience in hands-on development of complex ETL pipelines.
- Data ingestion, transformation and pipeline orchestration.
- Experience migrating data from legacy (on-premises) systems to Azure Data Lake, transforming it in Databricks (Python/Apache Spark), loading it into Azure SQL Database (SSMS), and writing SQL stored procedures that transform the data and perform calculations based on business logic.
- Experience developing and maintaining Delta Live Tables architectures using Databricks, taking advantage of ADLS to store the underlying data.
- ETL orchestration using Azure Data Factory.
- Custom dashboards to visualize the data using Power BI.
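As a minimal sketch of the kind of transformation step described above, a cleaning-and-calculation routine of the sort run in a Databricks notebook before loading into Azure SQL Database might look like this. The column names and the business rule (net amount = gross minus discount) are invented for illustration only:

```python
# Hypothetical transformation step: clean raw rows and compute a derived
# column, before the result is written to Azure SQL Database.
# Field names and the net-amount rule are illustrative assumptions.

def transform_rows(rows):
    """Drop rows with missing required fields and compute net_amount."""
    cleaned = []
    for row in rows:
        # Basic data cleaning: skip rows missing the inputs we need.
        if row.get("gross") is None or row.get("discount") is None:
            continue
        cleaned.append({
            "id": row["id"],
            # Business-logic calculation (invented for this sketch).
            "net_amount": round(row["gross"] - row["discount"], 2),
        })
    return cleaned
```

In practice, the same logic would typically be expressed as PySpark DataFrame operations over files stored in ADLS, with the equivalent calculation also available as a SQL stored procedure on the Azure SQL side.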
Technology:
Apache Kafka • Apache Spark • Azure Data Factory • Python • SQL