Big Data Architect – DHL Hiring



Position Summary:

As a leader on Data Architecture topics for DHL eCommerce, you will provide best-fit architectural solutions to manage our global data ecosystem. You will work with key stakeholders, business partners, IT counterparts, and the Enterprise Architect function to define requirements, identify source data, assist with detailed data models, analyze and validate data, troubleshoot data issues, and promote data quality. You will design, develop, and support data services that ensure the fulfillment of business information needs and support our digitalization journey.

Key Responsibilities:

  • Design new data movements: having documented the status quo (the sources of data and how it moves between systems), identify how this movement can be improved.
  • Keep current on big data and data visualization technology trends; evaluate technologies, work on proofs of concept, and make recommendations based on their merits.
  • Define integrative views of data: these views will draw together data from across the domains. Some views will use a database of extracted data; others will bring together data in "near real time." Work with business users, data engineers, and application designers to identify and model these integrative views and determine the quality-of-service requirements: data currency, availability, response times, and data volumes.
  • Create the Master Data Management strategy as well as the data governance and data security policies.
  • Act as a subject matter expert for technical guidance, solution design and best practices within the Business Analytics organization.
  • Map data sources: develop an understanding of where data is stored and maintained. Although the application teams are responsible for documenting the details of application data, you must know where data is maintained and accessed.
  • Design and implement redundant systems, policies, and procedures for disaster recovery and data archiving to ensure effective availability, protection, and integrity of data assets.
  • Document existing data movement: record how data currently moves. Coordinate efforts and foster consistency while documenting the frequency of movement, the source and destination of each step, how the data is transformed as it moves, and any aggregations or calculations applied.
  • Develop a technical center of excellence within the analytics organization through training, mentorship, and process innovation.

Required Education & Experience:

  • Bachelor’s degree in Computer Science, Information Systems, Engineering, or Mathematics.
  • 5 to 8 years of professional experience with data management technologies, including a minimum of 3 years with Hadoop, NoSQL, and open-source data management technologies.
  • Ability to elicit requirements and communicate clearly with non-technical individuals, development teams, and other ancillary project members.
  • 10+ years of experience in IT platform implementation in a highly technical and analytical role.
  • Desire to mentor junior team members and develop their skills.
  • Excellent written and oral communication skills.
  • Experience working on multiple concurrent projects.
  • Experience leading customer workshop sessions to educate customers on the latest technology trends and best practices.
  • Proven track record in leadership roles delivering solutions within defined timeframes.
  • 5 to 8 years of experience designing Big Data pipelines supporting Data Science use cases.
  • Ability to author, edit, and present technical documents.
  • 3 to 5 years of Hadoop/NoSQL project implementations covering the full lifecycle (discover, design, implement, and optimize).
  • Experience working with a globally distributed team and managing offshore teams.
  • Demonstrated success delivering and managing complex and/or large consulting projects.
  • Ability to communicate effectively with technical and non-technical staff.