AWS Data Ops Specialist

Company:

End Customer


Education:

Bac + 3 (Bachelor's level)


Experience:

0-2 years


Work location:

Beirut


Start date:

16.11.2020


Duration:

Unlimited


Openings:

1


Posted on:

16.11.2020


Our client is a leading worldwide shipping group operating a large fleet of vessels and serving more than 400 ports on 5 continents. With a presence in 150+ countries and a network of 750+ agencies, the Group employs 100,000+ people worldwide.

The mission is to ensure data completeness and quality across all IT systems. It covers Data Migration, Data Integrity, Data Quality and the Cloud Data Platform.


The AWS Data Ops Specialist will be part of the Digital Centre in Beirut. The Beirut team will support the Cloud Data Platform (CDP) owner to ensure platform quality and evolution. The CDP is based on AWS and will be used for Analytics & Machine Learning purposes. The AWS Data Ops Specialist will operate data ingestion, implement data processing (ETL) and manage user data access, and will work with various teams (business & users, architects, DevOps, DataOps, Cloud team).

Responsibilities
  • You implement Data Ingestion from various sources and Data Processing (incl. controls, catalog, security access) based on architecture guidance & available mechanisms (a sketch of this kind of job follows this list).
  • You work closely with other team members and interact with other teams to ensure that data is well defined and of sufficient quality.
  • You provide support for any issue related to data management (outside of AWS infrastructure issues). This activity will most likely require on-duty assignments in order to analyze and solve data scripting problems. The on-duty periods will be defined and implemented later. Note that infrastructure support (with on-duty) will be handled by another team.
  • You work in a continuous improvement mode and propose new processes or tools to increase efficiency and quality.
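
As an illustration only, here is a minimal sketch of the kind of ingestion and processing job described above, written in PySpark. The bucket names, paths and column names are hypothetical and not part of the client's actual platform.

    # Illustrative sketch: ingest raw CSV files from S3, apply basic quality
    # controls, and write curated, partitioned Parquet. All names are hypothetical.
    from pyspark.sql import SparkSession
    from pyspark.sql import functions as F

    spark = SparkSession.builder.appName("cdp-ingest-example").getOrCreate()

    # Ingest: read raw CSV files landed in a (hypothetical) S3 bucket.
    raw = (spark.read
           .option("header", "true")
           .csv("s3://example-cdp-landing/shipments/"))

    # Controls: drop duplicates and rows missing the key field, stamp a load date.
    clean = (raw.dropDuplicates(["shipment_id"])
                .filter(F.col("shipment_id").isNotNull())
                .withColumn("load_date", F.current_date()))

    # Write curated Parquet partitioned by load date, so Athena or
    # Redshift Spectrum can query it efficiently.
    (clean.write
          .mode("overwrite")
          .partitionBy("load_date")
          .parquet("s3://example-cdp-curated/shipments/"))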

Skills & Qualifications
  • You hold a Bachelor's degree in computer science or engineering.
  • You have strong knowledge of Data Modeling and ETL and very good SQL skills.
  • You are experienced in Python, especially PySpark & SparkSQL.
  • You have a minimum of 2 years of experience with AWS tools including DMS, Lambda, Glue, Athena SQL, Spectrum, CloudWatch and, above all, AWS S3 (a typical Lambda-to-Glue pattern is sketched after this list).
  • You have skills and experience using open-source tools such as Elasticsearch.
  • Experience with Hadoop, Hive, Hortonworks, Redshift, GitLab, JIRA is a plus.
  • You are a team player with a good collective spirit and empathy, demonstrating autonomy, listening, analysis and communication skills.
  • You are at ease in English
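
As an illustration of the AWS tooling listed above, here is a minimal sketch of a Lambda handler that starts a Glue ETL job whenever a new file lands on S3. The job name, bucket and event wiring are assumptions for the example, not the client's actual setup.

    # Illustrative sketch: a Lambda triggered by an S3 put event starts a
    # (hypothetical) Glue job on the newly landed object.
    import boto3

    glue = boto3.client("glue")

    def lambda_handler(event, context):
        # S3 put events carry the bucket and key of each newly landed file.
        for record in event["Records"]:
            bucket = record["s3"]["bucket"]["name"]
            key = record["s3"]["object"]["key"]
            # Hand the new object to the Glue ETL job via job arguments.
            glue.start_job_run(
                JobName="cdp-ingest-job",  # hypothetical Glue job name
                Arguments={"--source_path": f"s3://{bucket}/{key}"},
            )
        return {"status": "ok"}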

Our selection criteria
  • Very curious
  • Team spirit
  • Open to discussions and new ideas
  • Oral & written communication skills
CV + cover letter to info {at} net - recrute . com