Data Engineer - Technopolis

IT / Electronics - IT Sector

  • 5 to 10 years of experience
  • 1 position in Rabat and region - Morocco
  • Bac +4 - Master's degree


  • Permanent contract (CDI)
  • Position with management responsibilities
Posted 26 days ago on ReKrute.com - Apply before 27/10/2021

Company:

DXC Technology was born in April 2017 from the merger of CSC (Computer Sciences Corporation), a consulting giant, and HP Enterprise Services, a leader in IT services.

The DXC Group records $25 billion in revenue, draws on 60 years of experience, and employs 170,000 people in more than 70 countries, serving more than 6,000 clients.

DXC Technology in Morocco, established since 2007, is a joint venture between DXC Technology, a world leader in integration and outsourcing services, and the CDG Group (Caisse de Dépôt et de Gestion).

With 1,200 employees across 2 production sites (Technopolis and Casa NearShore), DXC Technology in Morocco supports very large public and private accounts in their digital transformation, addressing the full range of IT challenges: Hosting, Outsourcing, Application Modernization, Business Intelligence & Analytics, Workplace & Mobility Services, Business Process Services, Cyber Security, and Consulting.

Position:

• We are looking for an experienced Data Engineer to join our Data Analytics team and take on the Dev Tech Lead role, supporting the development activities of the client's data platform.
• The core of our team's work is to provide data engineering, development, testing, platform support, and application support services for the data platform.
• The focus of the data engineering and development team is to build data pipelines that ingest data from source systems into the data platform, apply the necessary data transformations (ELT), and store and expose the data for consumption (reporting, self-serve BI, AI/ML models).
• The data platform is fully based on Azure Cloud, and Databricks is heavily used for processing and transforming the data.
• The data platform contains all kinds of (business) data, forming an advanced analytics platform aimed at delivering better insights and applications to the business.
• The platforms are continuously enhanced to support (additional) CI/CD and a validated-learning environment for data science, machine learning, and AI capabilities, covering customer-facing areas (digital omni-channel interaction and commerce, commerce relevance, personalization, loyalty, and marketing) as well as non-customer-facing areas (assortment optimization, supply chain optimization, external parties, and IoT).
• We will work on end-to-end functionality, including architecture, data preparation, processing, and consumption by downstream systems.
• All development is done by agile teams working within the SAFe framework.
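The ingest → transform (ELT) → consume flow described above can be sketched in miniature. This is an illustrative plain-Python stand-in, not the team's actual Databricks code; the record shapes, field names, and the bronze/silver/gold layer framing are assumptions made for the example:

```python
from datetime import date

# Hypothetical raw records as they might arrive from a source system ("bronze" layer).
raw_orders = [
    {"order_id": "A1", "amount": "19.99", "ts": "2021-10-01"},
    {"order_id": "A2", "amount": "5.00",  "ts": "2021-10-02"},
]

def to_silver(record):
    """Transform a raw record into a typed, query-ready shape ("silver" layer)."""
    return {
        "order_id": record["order_id"],
        "amount": float(record["amount"]),               # cast string -> numeric
        "order_date": date.fromisoformat(record["ts"]),  # cast string -> date
    }

def serve(records):
    """Aggregate for consumption ("gold" layer): total revenue per day."""
    totals = {}
    for r in records:
        totals[r["order_date"]] = totals.get(r["order_date"], 0.0) + r["amount"]
    return totals

silver = [to_silver(r) for r in raw_orders]
gold = serve(silver)
print(gold)  # one revenue total per order date
```

In a Databricks setting the same three stages would typically be Spark reads, DataFrame transformations, and Delta Lake writes rather than in-memory dicts.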

Responsibilities:
• Work in collaboration with each development team to:
o provide guidance, consultation, best practices, and a standard approach for development
o when necessary, deep-dive into the code and provide hands-on support to data engineering teams to overcome any roadblocks
o be the single point of contact across the development teams
• Work closely with the platform architecture team to:
o be the link between architectural design and day-to-day development
o define the technical backlog items to improve/enhance the platform
• The scope of the development teams can be summarized as:
o Designing and implementing batch and real-time data ingestion pipelines for Azure Data Lake
o Implementing data quality checks
o Implementing data lineage and reconciliation mechanisms (including monitoring dashboards and alerts)
o Implementing data cataloguing, archiving, and disaster recovery
o Working with data source and data consumption teams to align on data structures and schemas
• Participate in client sessions for business and technical backlog refinement, and take part in PI planning sessions
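The reconciliation mechanisms mentioned in the scope can be illustrated with a minimal check: comparing row counts and keys between a source extract and what landed in the target table. A plain-Python sketch with made-up data; in practice such checks would run inside the pipeline and feed the monitoring dashboards and alerts:

```python
def reconcile(source_rows, target_rows, key="id"):
    """Basic reconciliation: verify that row counts match and no keys were
    lost or invented during the load. Returns a dict of check results
    suitable for driving a dashboard or alert."""
    source_keys = {row[key] for row in source_rows}
    target_keys = {row[key] for row in target_rows}
    return {
        "row_count_match": len(source_rows) == len(target_rows),
        "missing_in_target": sorted(source_keys - target_keys),
        "unexpected_in_target": sorted(target_keys - source_keys),
    }

# Hypothetical source extract vs. what landed in the data lake table.
source = [{"id": 1}, {"id": 2}, {"id": 3}]
target = [{"id": 1}, {"id": 3}]

result = reconcile(source, target)
print(result)
# {'row_count_match': False, 'missing_in_target': [2], 'unexpected_in_target': []}
```

A failing check like the one above (row 2 never arrived) is exactly the kind of condition that would trigger an alert in the monitoring setup the posting describes.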
 

Desired profile:

  • Mandatory Skills:
    • Mastery in modern, cloud based data/BI technologies (data lakes, delta lakes, data warehouses, ETL/ELT, Spark)
    • 7+ years of experience implementing data ingestion pipelines / ETL / ELT
    • In-depth knowledge of the Azure platform and ecosystem - Storage Account, CosmosDB (SQL & Gremlin API), Event Hub, App Services
    • Experience with the following Azure data technologies:
    o Azure SQL
    o Azure Data Lake
    o Azure Databricks (SQL or Python, Delta Lake)
    o Azure Data Factory
    o Azure Analysis Services
    • Experience with SQL
    • Good communication skills
    • Client-facing experience
  • Nice-to-Have Skills:
    • Python, C#/.NET
    • Azure Stream Analytics

Address:

Technopolis-Rabatshore, Building B9, 11100 Sala Al Jadida

Desired personality traits:

Novelty seeking, Ambition, Need for reflection, Need for autonomy, Commitment to work
