Priyanka Mishra
@priyanka_mishra
Senior Data Engineer at IBM India Pvt Ltd
Bangalore, India
Professional with around 7.5 years of IT experience and business expertise in data migration and data processing. Proficient with Microsoft Azure products such as Azure Data Factory, Azure Databricks, Azure Data Lake, Azure Database, and Azure Analysis Services, performing end-to-end Extract, Transform, Load (ETL) processes.
Experience
Senior Data Engineer
IBM India Pvt Ltd
Working in an agile Scrum methodology, using Jira for story tracking. As an individual developer, my responsibilities are to:
- Perform ETL operations on source data: write transformation logic, read and write CSV and Parquet files in Databricks, and schedule Data Factory pipelines per business requirements.
- Perform unit testing, provide testing support, and carry out production deployments.
- Attend daily scrum calls, update tasks on the Jira board, create pull requests, and maintain code versions.
- Ingest data from various sources (customer data, customer transaction data, site data, site employee data, etc.) with Azure Data Factory pipelines, saving both raw and transformed data to Azure Data Lake Storage via scheduled pipelines.
- Maintain two data layers: the first deduplicates the data and standardizes column names; the second applies business logic and transformations using PySpark, Spark SQL, Python, and Azure Databricks.
- Use the transformed data to derive insights and build interactive dashboards and reports in Power BI; the same data is also analyzed to send targeted offers to customers.
- Maintain code versions and merge the latest code changes using Azure DevOps and GitHub.
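The two-layer flow described above can be sketched in plain Python (a minimal, illustrative stand-in for the actual PySpark/Databricks notebooks; the field names, dedupe key, and business rule are hypothetical examples, not the real project schema):

```python
# Sketch of the two-layer curation flow: layer 1 deduplicates and renames
# columns; layer 2 applies business logic. In production this runs as
# PySpark in Azure Databricks; all names here are illustrative.

RENAMES = {"cust_id": "customer_id", "txn_amt": "transaction_amount"}

def layer1(records):
    """Layer 1: deduplicate raw records and standardize column names."""
    seen, out = set(), []
    for rec in records:
        key = rec["cust_id"]  # hypothetical dedupe key
        if key in seen:
            continue
        seen.add(key)
        out.append({RENAMES.get(k, k): v for k, v in rec.items()})
    return out

def layer2(records):
    """Layer 2: apply business logic, e.g. flag high-value customers."""
    return [
        {**rec, "high_value": rec["transaction_amount"] > 1000}
        for rec in records
    ]

raw = [
    {"cust_id": 1, "txn_amt": 1500},
    {"cust_id": 1, "txn_amt": 1500},  # duplicate, dropped in layer 1
    {"cust_id": 2, "txn_amt": 200},
]
curated = layer2(layer1(raw))
```

In the actual PySpark setting, layer 1 would correspond to `dropDuplicates` plus `withColumnRenamed`, and layer 2 to `withColumn` expressions or Spark SQL.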
Sr Software Engineer (Data Engineer)
Covalense Global Technologies Pvt. Ltd
Working in an agile Scrum methodology, using Jira for story tracking. As an individual developer, my responsibilities are to:
- Perform ETL operations on source data: write transformation logic, read and write CSV and Parquet files in Databricks, and schedule Data Factory pipelines per business requirements.
- Perform unit testing, provide testing support, and carry out production deployments.
- Attend daily scrum calls, update tasks on the Jira board, create pull requests, and maintain code versions.
- Perform requirement analysis, gathering all KPI-related requirements from the client side, and implement the KPI logic in Databricks notebooks using PySpark and ADF pipelines.
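A KPI notebook of the kind described might, reduced to plain Python for brevity, look like the following (the KPI itself and the field names are hypothetical; the real implementation uses PySpark aggregations in a Databricks notebook):

```python
# Illustrative KPI computation: average order value per country.
# Hypothetical example standing in for a PySpark groupBy/agg in Databricks.
from collections import defaultdict

def avg_order_value_by_country(orders):
    totals = defaultdict(lambda: [0.0, 0])  # country -> [sum, count]
    for o in orders:
        t = totals[o["country"]]
        t[0] += o["amount"]
        t[1] += 1
    return {country: s / n for country, (s, n) in totals.items()}

orders = [
    {"country": "IN", "amount": 100.0},
    {"country": "IN", "amount": 300.0},
    {"country": "US", "amount": 50.0},
]
kpis = avg_order_value_by_country(orders)
```

The PySpark equivalent would be a `groupBy("country").agg(avg("amount"))`, scheduled for execution by an ADF pipeline.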
Senior Data Engineer
Capgemini Technology Services India Limited
Working in an agile Scrum methodology, using Jira for story tracking. As an individual developer, my responsibilities are to:
- Perform ETL operations on source data: write transformation logic, read and write CSV and Parquet files in Databricks, and schedule Data Factory pipelines per business requirements.
- Perform unit testing, provide testing support, and carry out production deployments.
- Attend daily scrum calls, update tasks on the Jira board, create pull requests, and maintain code versions.
- Work with two data layers, called UDL and BDL. The UDL team makes raw data available to us in Azure Data Lake Gen2 storage via Azure Data Factory pipelines; this data is Unilever's worldwide, country-wise product data, employee data, sales data, etc.
- Pull the data, in CSV or Parquet format, from UDL; transform it per business requirements using Azure Databricks notebooks, PySpark, and Spark SQL; and save the result to the BDL layer in Azure Data Lake Gen2. These Databricks notebooks are executed by scheduled Data Factory pipelines.
- The BDL data is further used by the Power BI team for monitoring and for creating interactive dashboards.
- Use Azure DevOps for maintaining source code version control.

Project: Meijer (e-commerce). Environment: C#.NET. Meijer is a traditional grocery store with general merchandise such as clothing, electronic goods, sporting goods, pharmacy, and others.
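The scheduled notebook execution described in this role can be sketched as an ADF pipeline definition with a Databricks notebook activity (the pipeline, linked service, and notebook names below are hypothetical placeholders, not the actual project resources):

```json
{
  "name": "pl_udl_to_bdl_transform",
  "properties": {
    "activities": [
      {
        "name": "RunTransformNotebook",
        "type": "DatabricksNotebook",
        "linkedServiceName": {
          "referenceName": "ls_databricks",
          "type": "LinkedServiceReference"
        },
        "typeProperties": {
          "notebookPath": "/notebooks/udl_to_bdl_transform",
          "baseParameters": {
            "run_date": "@formatDateTime(utcNow(), 'yyyy-MM-dd')"
          }
        }
      }
    ]
  }
}
```

A schedule trigger attached to such a pipeline runs the notebook on the agreed cadence, passing the run date as a base parameter.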
Software Developer
Compassites Software Solution Pvt. Ltd
Online OMH Tool is a finance management system that provides daily transaction reports for individual users across client organizations. Environment: C#.NET
Education
Biju Patnaik University
Master's Degree in Computer Applications
Computer Applications