
Laxman Dixit

@laxmandixit

Data Platform Engineer at Publicis Sapient

Bengaluru, India

Publicis Sapient • R.V. College of Engineering

Laxman Dixit is a Data Engineer with experience in deploying and managing scalable applications using Java Spring Boot on AWS EC2. He specializes in data pipelining, utilizing tools like Airflow and Dremio to manage data flow and create VDS/CTAS. His technical expertise spans multiple cloud platforms (AWS, Azure, GCP) and includes proficiency in Python, Java, and various SQL/NoSQL databases.

Experience

Data Platform Engineer

Publicis Sapient

Bengaluru, India

For this asset and trading management project, we hosted the application on AWS infrastructure. To streamline data flow, we implemented Kafka and Dremio for data pipelining, and stored and managed the data in PostgreSQL to support analysis and decision-making.
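The consume-and-store stage of a pipeline like this can be sketched roughly as follows. This is a minimal stand-in, not the project code: an in-memory queue plays the role of a Kafka topic, SQLite plays the role of PostgreSQL, and the message fields are hypothetical.

```python
import json
import queue
import sqlite3

# Stand-in for a Kafka topic: trade messages serialized as JSON.
topic = queue.Queue()
for msg in [
    {"trade_id": 1, "symbol": "AAPL", "qty": 100},
    {"trade_id": 2, "symbol": "MSFT", "qty": 250},
]:
    topic.put(json.dumps(msg))

# Stand-in for PostgreSQL: an in-memory SQLite database.
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE trades (trade_id INTEGER, symbol TEXT, qty INTEGER)")

# Consume until the topic is drained, inserting each message as a row.
while not topic.empty():
    record = json.loads(topic.get())
    db.execute(
        "INSERT INTO trades (trade_id, symbol, qty) VALUES (?, ?, ?)",
        (record["trade_id"], record["symbol"], record["qty"]),
    )
db.commit()

rows = db.execute(
    "SELECT trade_id, symbol, qty FROM trades ORDER BY trade_id"
).fetchall()
print(rows)  # [(1, 'AAPL', 100), (2, 'MSFT', 250)]
```

With a real broker and database, the queue would be replaced by a Kafka consumer and the inserts by parameterized PostgreSQL statements, but the loop shape stays the same.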

Engineering Intern

Publicis Sapient


TSD is an application that displays aggregated information across all WMC trading desks (cross-asset view), provides Key Risk Indicator (KRI) reports for trading supervisors, and performs near real-time alert detection for Wellington trading policies. TSD gave supervisors oversight of Wellington trading desks, trading teams, and individual traders; it raised alerts based on a predefined rule set and/or custom rules written by supervisors, let supervisors log comments against exceptions/alerts, and showed a historical log of comments and historical trend analysis. Supervisors could also run, view, and export predefined or ad hoc reports.

• Deployed and managed Java Spring Boot applications on AWS EC2, ensuring high availability and scalability.
• Configured Airflow DAG logs in AWS CloudWatch for comprehensive application monitoring and logging.
• Used Airflow DAGs to initiate Dremio reflection refreshes, creating dependencies among the VDS based on the upstream/downstream graph.
• Created VDS and CTAS on source datasets and modified existing VDS as per requirements.
• Worked closely with Business Analysts in the team to clarify business questions on the various alert logic in TSD, and modified existing alerts based on business requirements.
• Exposed new TSD datasets in Dremio, and worked on minor AWS infrastructure changes and defects in the application.
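Refreshing each VDS only after its upstream datasets, as described above, amounts to a topological sort of the dependency graph. A minimal sketch of that ordering, with a hypothetical graph and a placeholder function standing in for the actual Dremio reflection-refresh call:

```python
from graphlib import TopologicalSorter

# Hypothetical VDS dependency graph: each key lists the upstream
# datasets it depends on (names are illustrative, not from TSD).
dependencies = {
    "vds_trades_enriched": {"vds_trades_raw", "vds_ref_data"},
    "vds_desk_summary": {"vds_trades_enriched"},
    "vds_trades_raw": set(),
    "vds_ref_data": set(),
}

def refresh_reflection(vds_name: str) -> None:
    # Placeholder for the real call that triggers a Dremio
    # reflection refresh (e.g. via its REST API from an Airflow task).
    print(f"refreshing {vds_name}")

# static_order() yields nodes so that every VDS appears only after
# all of its upstream dependencies.
order = list(TopologicalSorter(dependencies).static_order())
for vds in order:
    refresh_reflection(vds)
```

In Airflow the same ordering is usually expressed by wiring one task per VDS and letting the scheduler enforce the upstream/downstream edges, rather than sorting explicitly.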

Education

R.V. College of Engineering

Bachelor of Engineering

Electronics and Telecommunication Engineering

Jan 2022 • Grade: 7.8 CGPA

Secured a 7.8 CGPA overall. Represented the college in VTU fests and other inter-college fests. Member of the music club "ALAAP".

Skills

Python
Java
CSS
HTML
SQL
C++
MySQL
Oracle SQL
PostgreSQL
MongoDB
Spring Boot
Pandas
NumPy
Matplotlib
Airflow
PySpark (Apache Spark)
Hadoop
Mage
S3
EC2
Lambda
CloudWatch
RDS
CloudFormation
Azure Databricks
Data Lake Storage
Blob Storage
Pipelines
DataFlows
Google Cloud Storage
Google Compute Engine
BigQuery
GitHub
Git
Jenkins
Power BI
Tableau
Dremio
Snowflake
Data modelling