Full-time, Remote
Data Engineer with Python/Databricks
UPD: This position is closed.
Requirements

  • 8+ years of IT experience, including 4+ years of extensive experience in designing and building Data Engineering solutions using Azure Data Services
  • Designing and building end-to-end data engineering solutions using Python, Databricks, ADF, Synapse, and SQL
  • Designing and building data ingestion and transformation pipelines for large-scale datasets from multiple internal/external source systems
  • Designing and developing data models; extracting and cleansing data using data quality, cleansing, and transformation functions
  • Experience modelling a data layer on top of a data lake for analytics and reporting use cases
  • Knowledge of handling large datasets and experience with performance tuning of data engineering processes during data extraction and transformation
  • Experience working with, and peer reviewing, data engineering pipelines (Databricks, Python, Synapse), automation technologies, and platform engineering
  • Experience with CI/CD processes using Azure DevOps
  • Experience with, and a desire to work in, a global delivery environment with teams spread across different locations
  • Ability to communicate complex technical solutions to diverse audiences: technical, business, and management teams
  • Experience managing a team of 6–8 people building and operating ingestion and transformation pipelines in a DevOps environment
  • Comfortable working in an Agile model, with a good understanding of data testing approaches
Skills

Agile, Azure DevOps, Microsoft Azure, Python, SQL
Location

EU, Georgia, Asia, Ukraine, South America