Resume 1

Objective:

A highly motivated and determined individual, I wish to work for a company that would apply my academic and professional experience to organizational realities, creating a motivated workforce and an environment that encourages learning, creativity, and growth for both myself and the organization.

Professional Summary:
• 4.5 years of IT experience as a Technology Analyst, with expertise in solving complex business problems through analytics and a systematic approach.
• Designing, creating, testing, and maintaining complete data management and processing systems.
• Good experience with ETL, database objects, and performance tuning.
• Developing multiple metadata-driven pipelines that source data from external partners and deliver enriched products, with reference data assets, to external end customers.
• Quick to grasp new knowledge and to identify opportunities for improvement and issue resolution.
• An effective performer, skilled in enlisting the support of all team members in aligning with project and organizational goals.
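The metadata-driven pipeline pattern mentioned above can be sketched in plain Python. This is an illustrative sketch only, not the actual ADF implementation: in ADF this would typically be a Lookup activity feeding a ForEach loop, and all source names, paths, and the `build_copy_jobs` helper here are hypothetical.

```python
# Illustrative sketch of a metadata-driven ingestion loop.
# All source names, paths, and table names are hypothetical.

SOURCES = [
    {"name": "partner_a", "format": "csv",  "path": "landing/partner_a/", "target": "stg.partner_a"},
    {"name": "partner_b", "format": "json", "path": "landing/partner_b/", "target": "stg.partner_b"},
]

def build_copy_jobs(sources):
    """Turn source metadata rows into copy-job descriptions.

    The point of the pattern: onboarding a new feed means adding a
    metadata row, not writing a new pipeline.
    """
    jobs = []
    for src in sources:
        jobs.append({
            "activity": "Copy",
            "source_path": src["path"],
            "source_format": src["format"],
            "sink_table": src["target"],
        })
    return jobs

jobs = build_copy_jobs(SOURCES)
print(len(jobs))  # one copy job per metadata row
```

The same idea scales from two sources to dozens without code changes, which is what makes the approach attractive for partner-data ingestion.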

Technical Skills:
Primary skills: Azure, Big Data, Azure Data Factory, SQL
Azure tech: Azure Data Lake, SQL DW, Azure Functions

Career Highlights:
• Project Details:
• Client:
• Duration: May ’19 to date
• Role: Data Engineer, responsible for implementing the architecture, setting up the Azure SQL DW, ensuring the smooth running of scheduled ADF jobs, and providing end-user assistance.
Responsibilities:
• Media Data Store: responsible for implementing the architecture and setting up the Azure SQL Data Warehouse. Using Azure Functions (C#), logs from media sources such as Google Campaign Manager, Facebook Ads, and Amazon are ingested into the Azure SQL Data Warehouse via Azure Data Factory for the data science team.
• As part of the data management track, responsible for analyzing and improving the performance of existing jobs, ensuring the smooth running of scheduled ADF jobs, and providing end-user assistance.
• Creating packages as per business requirements to load data into the downstream database.
• Incorporated pipelines to import files (JSON, CSV, or TSV), uploaded by users or generated as logs, into new or existing SQL DW tables.
• Experienced in creating pipelines in Azure Data Factory v2 (ADFv2) using activities such as Move & Transform, Control Flow, and Data Transformation.
• Supply Chain Data Store (Canada): designed and built a scalable data store solution on Azure by leveraging Azure Synapse Analytics to ingest SAP APO data residing in Teradata into SQL DW using ADF, enabling self-service reporting in Power BI for Sales and CFR reports for the data science team.
• First Party Data Store: responsible for implementing the architecture and setting up the Azure SQL Data Warehouse. Data from different sources is ingested into the Azure SQL Data Warehouse using Azure Data Factory for the data science team.
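The file-import pipeline described above (JSON, CSV, or TSV into SQL DW tables) essentially amounts to format detection plus a per-format parser. A minimal Python sketch, with the actual table-loading step out of scope; the real work was done with ADF copy activities, and the `parse_upload` helper is hypothetical:

```python
import csv
import io
import json

def parse_upload(filename, text):
    """Parse an uploaded JSON, CSV, or TSV file into a list of row dicts.

    Illustrative only: the dispatch on file extension is the point;
    loading the parsed rows into a SQL DW table is not shown.
    """
    if filename.endswith(".json"):
        rows = json.loads(text)
    elif filename.endswith(".csv"):
        rows = list(csv.DictReader(io.StringIO(text)))
    elif filename.endswith(".tsv"):
        rows = list(csv.DictReader(io.StringIO(text), delimiter="\t"))
    else:
        raise ValueError(f"unsupported file type: {filename}")
    return rows

print(parse_upload("ads.csv", "id,clicks\n1,10\n2,7"))
```

Centralizing the dispatch like this means a single pipeline handles all three formats, which matches the "new or existing SQL DW tables" requirement above.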
• Project Details:
• Client:
• Duration: Apr’17 to Apr’19
Role: Worked with the WAS deployment team, involved in deploying updates and operating applications hosted on WebSphere Application Server.
Responsibilities:
• Provided 24x7 support on a rotational basis, monitoring the applications hosted on WAS.
• Worked closely with the development, messaging, network support, and database teams, as the application was integrated with other products.
• Extensively involved in troubleshooting issues and performing root cause analysis.
• Configured JDBC connections, JMS servers, and security settings for WAS.
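Configuring a JDBC connection on WAS is normally done through the admin console or wsadmin scripting. As a hedged illustration of the kind of datasource definition involved, here is a plain-Python validation sketch; the field names are hypothetical and do not correspond to the real WAS configuration model:

```python
# Illustrative check of a JDBC datasource definition.
# Field names are hypothetical, not actual WAS configuration attributes.

REQUIRED_FIELDS = {"jndi_name", "driver_class", "url", "auth_alias"}

def validate_datasource(cfg):
    """Verify that a datasource definition supplies the fields a WAS
    admin would typically need before creating the JDBC resource."""
    missing = REQUIRED_FIELDS - cfg.keys()
    if missing:
        raise ValueError(f"missing fields: {sorted(missing)}")
    return True

ds = {
    "jndi_name": "jdbc/appDB",
    "driver_class": "com.ibm.db2.jcc.DB2Driver",
    "url": "jdbc:db2://dbhost:50000/APPDB",
    "auth_alias": "appDBAuth",
}
print(validate_datasource(ds))  # True
```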

Academics and Projects:


M.Tech from Delhi Technological University, Delhi, with a CGPA of 8.15 in 2016.
Certifications & Publications:
• AZ-900 : Microsoft Azure Fundamentals
• DP-900 : Microsoft Azure Data Fundamentals
• DP-200: Implementing an Azure Data Solution
• DP-201: Designing an Azure Data Solution

Personal Details:

Declaration:
I hereby confirm that the above information is true to the best of my knowledge.

Date:
