Sriramenane Chowdary
@sriramenanechowdary
Data Engineer at Royal Cyber Pvt Ltd
Bangalore
Sriramenane is an adaptable, results-oriented IT professional with 4 years of experience in data engineering. He has a proven ability to design and implement efficient data solutions, drawing on strong leadership and problem-solving skills, and is experienced in managing complex projects and collaborating with cross-functional teams to advance data-driven business objectives.
Experience
Data Engineer
Royal Cyber Pvt Ltd
- Designed and developed utility tools based on business requirements to streamline data engineering processes.
- Created mock-up files and implemented Python scripts to replicate VB script functionality while maintaining mapping requirements.
- Built dynamic pipelines in Azure Data Factory (ADF) to ingest and process data from diverse source systems into Azure Data Lake Storage (ADLS), ensuring compliance with business requirement mock-ups.
- Designed and executed Extract, Transform, and Load (ETL) processes to transform source data and load it into target tables for the creation of data marts.
- Developed and optimized Data Flows in Azure Data Factory to implement complex business transformation rules.
- Supported UAT and UI teams, ensuring proper testing and seamless integration of data pipelines with front-end applications.
- Configured and managed Azure Databricks clusters (Spark and Standard) to enhance data processing and analytics capabilities.
- Created and maintained Azure Databricks notebooks for processing large volumes of data (e.g., point-of-sale files) and loading them efficiently into DBFS.
- Migrated Azure SQL Databases from development to production environments, ensuring minimal downtime and data integrity.
- Automated the preparation of high-quality data pipelines to support downstream reporting and analytics.
- Collaborated with cross-functional teams to identify and resolve data discrepancies, optimize pipeline performance, and meet SLAs.
- Monitored and fine-tuned pipeline execution and data storage costs to improve system efficiency and cost-effectiveness.
- Ensured adherence to data governance, compliance, and security standards during all stages of data processing and transformation.
- Documented processes, best practices, and technical solutions to support knowledge sharing and ongoing improvements.
Advanced Info-scan Private Limited
- Proficient in Azure technologies including Azure Data Factory (ADF), Azure Databricks (ADB), Azure Synapse, Azure Active Directory, Azure Storage, Azure Data Lake Storage (ADLS), Azure Key Vault, and Azure SQL Database.
- Used Azure Data Factory to perform incremental loads from Azure SQL Database to Azure Synapse.
- Good knowledge of creating triggers for scheduling and monitoring pipelines.
- Experienced in implementing Azure Data Factory components such as pipelines, linked services, and datasets, along with activities including Lookup, Get Metadata, and ForEach.
- Good experience working with Azure Blob and Data Lake storage, loading data into Azure SQL Database.
- Extensive experience developing Spark applications using PySpark/Spark SQL on Azure Databricks to extract and transform data from multiple file formats for analysis.
- Extracted data from various sources such as SQL Server, Oracle, SFTP, manual files, and Blob storage; used Azure Data Lake Storage Gen2 for storing data.
- Working experience with the PySpark programming language.
- Used Logic Apps to send email notifications to the development team and users on pipeline failures.
- Good exposure to the Agile SDLC; participated in sprint ceremonies (sprint planning, sprint review, sprint retrospective, and daily scrum).
- Extensively used parameterization in datasets, linked services, and pipelines.
- Troubleshooting experience with pipeline run failures and linked services.
- Created dynamic pipelines to ingest data from different source systems into ADLS per business requirements.
- Implemented optimization techniques and best practices in Azure Data Factory.
- Integrated child pipelines into the main pipeline.
- Created Data Flows in Azure Data Factory to implement business transformation rules.
- Used the ForEach loop activity to iterate over multiple loads as per business requirements.
- Created dynamic stored procedures for ETL to load data dynamically, and called these stored procedures from ADF.
Education
Priyadarshini Degree & PG College, Kakatiya University
B.Com
Computers