Technical Skills:
Data Warehousing: Snowflake, Data Lake
Programming: SQL, Python, Shell Scripting
Core Competencies:
Design, develop, and maintain robust, scalable data pipelines to support business operations and analytics using Python.
Integrate APIs using Python to fetch, process, and transform data for various use cases.
Work with a variety of SQL databases, including PostgreSQL, MySQL, MS SQL Server, and Oracle, to perform data transformations, migrations, and optimizations.
Utilize Snowflake to build and manage data warehouses, ensuring high performance and scalability.
Implement and manage workflows using Apache Airflow to automate data pipeline orchestration (see the sketch after this list).
Develop and maintain Shell scripts to automate tasks and streamline processes.
Ensure data quality, consistency, and security across all platforms.
Collaborate with data analysts and other stakeholders to understand data requirements and deliver solutions.
Monitor, troubleshoot, and optimize data pipelines for efficiency and reliability.
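As a rough illustration of the pipeline and orchestration work described above, here is a minimal Apache Airflow sketch in Python: a daily DAG that fetches records from a REST API, reshapes them, and loads them into a Snowflake staging table. The endpoint URL, table name, and connection ID are hypothetical placeholders, not details taken from this posting.

from datetime import datetime

import requests
from airflow.decorators import dag, task


@dag(schedule="@daily", start_date=datetime(2024, 1, 1), catchup=False)
def api_to_snowflake():
    @task
    def fetch() -> list[dict]:
        # Pull raw records from a hypothetical REST endpoint.
        resp = requests.get("https://api.example.com/v1/orders", timeout=30)
        resp.raise_for_status()
        return resp.json()

    @task
    def transform(records: list[dict]) -> list[tuple]:
        # Keep only the fields the staging table expects.
        return [(r["id"], r["amount"], r["created_at"]) for r in records]

    @task
    def load(rows: list[tuple]) -> None:
        # Bulk-insert into Snowflake via the provider hook; the connection
        # ID "snowflake_default" is assumed to be configured in Airflow.
        from airflow.providers.snowflake.hooks.snowflake import SnowflakeHook

        SnowflakeHook(snowflake_conn_id="snowflake_default").insert_rows(
            table="STAGING.ORDERS", rows=rows
        )

    load(transform(fetch()))


api_to_snowflake()

In practice the fetch and load steps would be split per source and parameterized, with retries and alerting configured on the DAG; the point here is only the shape of an Airflow-orchestrated Python pipeline.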
Qualifications and Skills:
Programming: Strong proficiency in Python with experience in API integration and data processing.
Data Warehousing: Hands-on experience with Snowflake, including schema design, data modeling, and performance optimization.
Workflow Automation: Proficiency in Apache Airflow for managing and orchestrating workflows.
Scripting: Strong skills in Shell scripting for automation and system management.
ETL/ELT Processes: Solid understanding of designing and maintaining ETL/ELT pipelines (a minimal sketch follows this list).
Problem-Solving: Strong analytical and problem-solving skills to handle complex data challenges.
Communication: Excellent communication skills to work effectively with cross-functional teams.
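To make the ETL/ELT expectation concrete, here is a minimal sketch of an incremental ELT merge using the snowflake-connector-python package, where the transformation step runs as SQL inside Snowflake itself, which is the usual ELT pattern. The account settings, warehouse, and table names are assumptions for illustration; real credentials would come from a secrets manager.

import os

import snowflake.connector

# Fold staged rows into the reporting table. Both table names below are
# hypothetical and match the staging table used in the earlier DAG sketch.
MERGE_SQL = """
MERGE INTO ANALYTICS.ORDERS AS tgt
USING STAGING.ORDERS AS src
    ON tgt.id = src.id
WHEN MATCHED THEN UPDATE SET
    tgt.amount = src.amount,
    tgt.created_at = src.created_at
WHEN NOT MATCHED THEN INSERT (id, amount, created_at)
    VALUES (src.id, src.amount, src.created_at)
"""


def run_incremental_merge() -> int:
    """Apply staged changes to the reporting table and return the row count."""
    with snowflake.connector.connect(
        account=os.environ["SNOWFLAKE_ACCOUNT"],
        user=os.environ["SNOWFLAKE_USER"],
        password=os.environ["SNOWFLAKE_PASSWORD"],
        warehouse="TRANSFORM_WH",
        database="PROD",
    ) as conn:
        with conn.cursor() as cur:
            cur.execute(MERGE_SQL)
            return cur.rowcount


if __name__ == "__main__":
    print(f"rows merged: {run_incremental_merge()}")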
Full Time