WE ARE LOOKING FOR A DATA INTEGRATOR WILLING TO RELOCATE TO MALAGA, SPAIN.
MISSION
The data engineer/integrator is a key role in any of the data squads in charge of the integration and
maintenance of new or existing big data platforms and data products. Their main mission is to
configure and maintain the scheduling of data pipelines, manage the integration of data from files, and
maintain data storage.
ROLES / RESPONSIBILITIES
Maintain big data storage solutions in the platform (HDFS, IBM Cloud Storage, Parquet, SQL/NoSQL
databases)
Maintain scheduling of file exchanges and data pipelines with a combination of shell scripting and
Autosys, Oozie, or Airflow
Maintain Dremio data virtualization to interface with Parquet or to expose the data in the
different data products
Maintain Dataiku pipelines as a data preparation tool for loading and transforming data from our
big data platform into specific datamarts for data or business products
Create file exchange flows using PGP, CFT, and Artemis
Provide N3 (level 3) support to end users
TECHNICAL REQUIREMENTS
IT TOOLS
Maintenance of legacy big data storage solutions (HDFS and Parquet)
Maintenance of big data storage solutions (IBM Cloud Object Storage and Parquet)
Maintenance of SQL/NoSQL database schemas and queries (MongoDB, Oracle, Postgres)
Ksh, Autosys, and Oozie as the legacy data pipeline scheduling solution
Ksh and Airflow as the current data pipeline scheduling solution
Dremio as data virtualization tool
Dataiku as data preparation tool