Suraj Shinde
@surajshinde
Sr. Software Developer at e-Zest Solutions
Aurangabad
Suraj Shinde is an experienced Software Developer specializing in Big Data solutions. He possesses strong skills in Java, Python, and various big data technologies, including Kafka, Spark, and Snowflake. He has a proven history of building scalable data pipelines, optimizing data transformation processes, and working with global tech teams to solve complex business problems.
Experience
Sr. Software Developer
e-Zest Solutions
Executes on Big Data requests to improve the accuracy, quality, completeness, and speed of data, and of the decisions made from Big Data analysis.
Worked on a Kafka Streams application that transforms real-time events on the fly.
Built an analytical dashboard to analyze Pax travel data and personalize the experience.
Works closely with global tech, product, and data science teams to develop new ideas, implement and test them, and measure success.
Manages various Big Data analytics tool development projects with midsize teams.
Works with the architecture team to define conceptual and logical data models.
Identifies and develops Big Data sources and techniques to solve business problems.
Contributes design, code, configuration, and documentation for components that manage data ingestion, real-time streaming, batch processing, and data extraction and transformation.
Cross-trains other team members on technologies being developed, while continuously learning new technologies from them.
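To illustrate the kind of on-the-fly event transformation described above, here is a minimal Python sketch of a stateless per-event transform such as a streaming job might apply; the event schema (`user_id`, `ts`, `page`) and the enrichment rules are hypothetical, standing in for the actual Kafka Streams topology.

```python
from datetime import datetime, timezone

def transform_event(event: dict) -> dict:
    """Stateless per-event transform, as would run inside a streaming
    job: normalize fields and enrich with metadata. The schema and
    rules here are hypothetical."""
    out = dict(event)
    # Normalize the page path: lowercase, no trailing slash.
    out["page"] = event.get("page", "/").lower().rstrip("/") or "/"
    # Convert an epoch-seconds timestamp to an ISO-8601 string.
    out["ts_iso"] = datetime.fromtimestamp(
        event["ts"], tz=timezone.utc
    ).isoformat()
    # Flag internal traffic (hypothetical rule).
    out["is_internal"] = event.get("user_id", "").startswith("emp-")
    return out

events = [
    {"user_id": "emp-42", "ts": 1700000000, "page": "/Home/"},
    {"user_id": "u-7", "ts": 1700000060, "page": "/flights"},
]
transformed = [transform_event(e) for e in events]
```

In a real Kafka Streams deployment the equivalent logic would sit in a `mapValues` step of the topology; keeping the transform a pure function like this makes it easy to unit-test independently of the broker.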
Software Developer
Datametica
Built a data migration pipeline from on-premise to cloud infrastructure.
Wrote Spark applications to meet business transformation needs.
Experienced in fixing and monitoring production data issues.
Software Developer
Netcracker
Built Airflow DAGs and scheduled ETL pipelines. Built utility packages.
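A minimal sketch of the kind of scheduled ETL DAG described above, assuming Airflow 2.x; the DAG id, task names, and the extract/transform/load callables are hypothetical placeholders.

```python
# Minimal Airflow 2.x DAG sketch for a daily ETL pipeline.
# All identifiers here are hypothetical, not the actual pipeline.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def extract():
    """Pull source data (placeholder)."""

def transform():
    """Apply business transformations (placeholder)."""

def load():
    """Load results into the target store (placeholder)."""

with DAG(
    dag_id="daily_etl",
    start_date=datetime(2023, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    t_extract = PythonOperator(task_id="extract", python_callable=extract)
    t_transform = PythonOperator(task_id="transform", python_callable=transform)
    t_load = PythonOperator(task_id="load", python_callable=load)
    # Linear dependency chain: extract -> transform -> load.
    t_extract >> t_transform >> t_load
```

The `>>` operator declares task ordering; the scheduler then runs one DAG instance per day and retries or backfills according to the DAG's settings.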
Software Developer
e-Zest Solutions
Built an end-to-end data pipeline connecting different systems using Kafka.
Hands-on knowledge of Kafka Streams applications.
Hands-on knowledge of Spark and Spark Streaming.
Analyzed clickstream data from the Snowflake data warehouse.
Built an ETL application performing complex transformation logic on large (TB+) data sets.
Experience with serialization frameworks such as Avro and Protocol Buffers.
Built highly scalable, robust, and fault-tolerant systems.
Familiarity with data loading tools like Sqoop.
Pre-processed data using Hive.
Hands-on experience building REST APIs.
Good knowledge of design patterns.
Hands-on knowledge of building Spring Boot applications.
Good knowledge of relational databases like Oracle.
Worked on Jasper Reports, creating PDF and Excel versions of website and reporting-engine output.
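As a toy illustration of the clickstream analysis mentioned above, the Python sketch below aggregates distinct sessions per user over in-memory rows rather than Snowflake; the column names are hypothetical.

```python
from collections import defaultdict

def sessions_per_user(rows):
    """Count distinct sessions per user from clickstream rows.
    Mirrors a GROUP BY user_id / COUNT(DISTINCT session_id) query
    that would run against the warehouse. Column names are
    hypothetical."""
    sessions = defaultdict(set)
    for row in rows:
        sessions[row["user_id"]].add(row["session_id"])
    return {user: len(s) for user, s in sessions.items()}

clicks = [
    {"user_id": "u1", "session_id": "s1", "page": "/home"},
    {"user_id": "u1", "session_id": "s1", "page": "/search"},
    {"user_id": "u1", "session_id": "s2", "page": "/home"},
    {"user_id": "u2", "session_id": "s3", "page": "/deals"},
]
counts = sessions_per_user(clicks)
```

At warehouse scale the same aggregation would be pushed down to Snowflake as SQL; the in-memory version is useful for validating the logic on a sample.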
Education
M.I.T
Bachelor of Engineering
A.I.T
Diploma in Computer Engineering
St. Lawrence School
SSC (10th)