Explore your next opportunity at a Fortune Global 500 organization. Envision innovative possibilities, experience our rewarding culture, and work with talented teams that help you become better every day. We know what it takes to lead UPS into tomorrow: people with a unique combination of skill + passion. If you have the qualities and drive to lead yourself or teams, there are roles ready to cultivate your skills and take you to the next level.
Job Description:
JOB SUMMARY
This position supervises and participates in the development of batch and real-time data pipelines, utilizing various data analytics processing frameworks in support of data science and machine learning practices. This position participates in and supports the integration of data from various internal and external data sources. This position performs extract, transform, load (ETL) data conversions and facilitates data cleansing and enrichment. This position performs full systems life cycle management activities, such as analysis, technical requirements, design, coding, testing, and implementation of systems and applications software. This position participates in and contributes to synthesizing disparate data sources to create reusable and reproducible data assets. This position contributes to the data science community by working through analytical model feature tuning.
RESPONSIBILITIES
- Supervises and supports data engineering projects and builds solutions by leveraging a strong foundational knowledge in software/application development.
- Develops and delivers data engineering documentation.
- Gathers requirements, defines the scope, and performs the integration of data for data engineering projects.
- Recommends analytic reporting products/tools and supports the adoption of emerging technology.
- Performs data engineering maintenance and support.
- Provides the implementation strategy and executes backup, recovery, and technology solutions to perform proof of concept (POC) analysis.
- Uses ETL tools to pull data from various sources and load the transformed data into a database or business intelligence platform.
- Builds data APIs to enable data scientists and business intelligence analysts to query the data.
- Codes in programming languages used for statistical analysis and modeling, such as Python, Java, Scala, or C++.
QUALIFICATIONS
Requirements:
- Proficient in the programming languages used for statistical modeling and analysis, data warehousing and cloud solutions, and building data pipelines.
- Strong understanding of database systems and data warehousing solutions.
- Strong understanding of the data interconnections between organizations' operational and business functions.
- Strong understanding of the data life cycle stages: data collection, transformation, analysis, secure storage, and data accessibility.
- Strong understanding of the data environment and how it must scale to meet demands such as increasing data pipeline throughput, analysis of large amounts of data, real-time predictions, insights and customer feedback, data security, and data regulations and compliance.
- Strong knowledge of algorithms and data structures, as well as data filtering and data optimization.
- Strong understanding of analytic reporting technologies and environments (e.g., Power BI, Looker, Qlik).
- Strong understanding of a cloud services platform (e.g., GCP, Azure, or AWS) across all the data life cycle stages.
- Understanding of machine learning algorithms, which help data scientists make predictions based on current and historical data.
- Understanding of distributed systems and the underlying business problem being addressed, with the ability to guide team members on how their work contributes by performing data analysis and presenting findings to stakeholders.
- Knowledge of algorithms and data structures with the ability to organize the data for reporting, analytics, and data mining and perform data filtering and data optimization.
- Bachelor's degree in MIS, mathematics, statistics, or computer science (or international equivalent), or equivalent job experience.
This role is classified at the 20I grade level and Supervisor management level at UPS Capital.
This is a hybrid position based out of our Sandy Springs office (onsite Tuesday, Wednesday, and Thursday; remote Mondays and Fridays only). Relocation is not available.
We do not offer visa transfer or sponsorship for this position, now or in the future.
Employee Type:
Permanent
UPS is committed to providing a workplace free of discrimination, harassment, and retaliation.
Other Criteria:
UPS is an equal opportunity employer. UPS does not discriminate on the basis of race/color/religion/sex/national origin/veteran/disability/age/sexual orientation/gender identity or any other characteristic protected by law.
Basic Qualifications:
Must be a U.S. Citizen or National of the U.S., an alien lawfully admitted for permanent residence, or an alien authorized to work in the U.S. for this employer.
Company: UNITED PARCEL SERVICE
Category: Info Services Management
Requisition Number: R24020570
Location: Atlanta, Georgia