Azure Data Engineer
Location: On-Site, New Jersey (NJ)

Job Description:

Position: Senior Azure Data Engineer with Retail & Python Programming Experience

Location: On-site; positions may go hybrid next year

Duration: 12+ Months

Must have 15+ years of total IT experience.

Required Skills and Qualifications:

  • Strong proficiency in the Python programming language.
  • Experience in building data pipelines and ETL processes.
  • Familiarity with Snowflake and SAP systems, including data extraction methods (e.g., APIs, database connectors).
  • Knowledge of data transformation techniques and tools.
  • Proficiency in Python libraries and frameworks for data manipulation (e.g., Pandas, NumPy, PySpark).
  • Understanding of database systems and SQL queries.
  • Experience with data integration and synchronization.
  • Familiarity with data governance and compliance principles.
  • Strong problem-solving and analytical skills.
  • Excellent communication and collaboration abilities.
  • Attention to detail and ability to work with large datasets efficiently.
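To make the "database systems and SQL queries" expectation concrete, here is a minimal, illustrative sketch using Python's built-in sqlite3 module. The table and column names are hypothetical stand-ins for a warehouse table; in the actual role the same aggregation pattern would run against Snowflake via its Python connector.

```python
import sqlite3

# In-memory SQLite stands in for a warehouse table (illustrative only);
# the GROUP BY aggregation is the kind of query the role calls for.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (store TEXT, units INTEGER)")
conn.executemany(
    "INSERT INTO sales VALUES (?, ?)",
    [("NJ01", 12), ("NJ01", 5), ("NJ02", 7)],
)

# Aggregate units sold per store, ordered for deterministic output.
rows = conn.execute(
    "SELECT store, SUM(units) AS total FROM sales GROUP BY store ORDER BY store"
).fetchall()
print(rows)  # → [('NJ01', 17), ('NJ02', 7)]
```

The same SQL runs unchanged against most warehouses; only the connection object differs.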

Responsibilities:

  • Data Pipeline Development: Design, develop, and maintain robust data pipelines in Python to extract data from Snowflake and SAP systems and transform it into the format required by the target system.
  • Implement efficient data ingestion, transformation, and loading processes to ensure accurate and reliable data transfer between systems.
  • Collaborate with stakeholders to understand data requirements, source systems, and target systems, and design appropriate data pipelines accordingly.
  • Write clean, optimized, and scalable code to handle large volumes of data and ensure efficient data flow throughout the pipeline.
  • Monitor and optimize data pipeline performance, identifying and resolving issues, bottlenecks, and data quality problems.
  • Data Transformation: Define and implement data transformation rules and logic to clean, filter, aggregate, and transform data from Snowflake and SAP into the required format for the target system.
  • Leverage Python libraries and frameworks such as Pandas, NumPy, or PySpark to manipulate and process data efficiently during the transformation process.
  • Ensure data quality and integrity by applying data validation, normalization, and standardization techniques.
  • Develop data mapping and conversion scripts to handle schema differences and ensure data consistency across systems.
  • Collaborate with data analysts and business stakeholders to understand data semantics and requirements for accurate transformations.
  • Data Integration and System Connectivity: Establish connectivity with Snowflake and SAP systems, extracting data through APIs, database connectors, or other relevant methods.
  • Integrate and synchronize data from multiple sources, ensuring data consistency and coherence.
  • Collaborate with IT teams to implement secure and efficient data transfer mechanisms, adhering to data governance and compliance policies.
  • Develop error handling and exception management strategies to handle data transfer failures and ensure data integrity during the integration process.
  • Documentation and Collaboration: Document the data pipeline design, architecture, and implementation details, including data source specifications, transformation rules, and target system requirements.
  • Collaborate with cross-functional teams, including data analysts, data scientists, and business stakeholders, to understand their data needs and provide necessary support.
  • Participate in meetings and discussions to align data engineering initiatives with business goals.
  • Stay up-to-date with emerging technologies, tools, and best practices in data engineering and make recommendations for process improvements.
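The extract-transform-load and error-handling duties above can be sketched in a few lines of Python. This is a minimal illustration, not the employer's actual pipeline: the sample rows stand in for a Snowflake extract (a real pipeline would fetch them via snowflake-connector-python or an SAP connector, and would likely use Pandas or PySpark at volume), and all record/field names are hypothetical.

```python
from dataclasses import dataclass

# Hypothetical raw rows standing in for a Snowflake extract.
RAW_ROWS = [
    {"sku": " ab-100 ", "store": "NJ01", "units_sold": "12", "revenue": "143.88"},
    {"sku": "AB-101", "store": "NJ01", "units_sold": "", "revenue": "0"},  # bad row
    {"sku": "ab-100", "store": "NJ02", "units_sold": "5", "revenue": "59.95"},
]

@dataclass
class SaleRecord:
    sku: str
    store: str
    units_sold: int
    revenue: float

def transform(row: dict) -> SaleRecord:
    """Validate and normalize one raw row: trim whitespace, upper-case the
    SKU, and cast string fields to typed values."""
    units = row["units_sold"].strip()
    if not units:
        raise ValueError(f"missing units_sold for sku={row['sku']!r}")
    return SaleRecord(
        sku=row["sku"].strip().upper(),
        store=row["store"].strip(),
        units_sold=int(units),
        revenue=float(row["revenue"]),
    )

def run_pipeline(rows):
    """Transform rows, quarantining failures instead of aborting the load —
    the error-handling strategy the posting asks for in miniature."""
    loaded, rejected = [], []
    for row in rows:
        try:
            loaded.append(transform(row))
        except (ValueError, KeyError) as exc:
            rejected.append((row, str(exc)))  # keep bad records for review
    return loaded, rejected

loaded, rejected = run_pipeline(RAW_ROWS)
print(len(loaded), len(rejected))  # → 2 1
```

The reject queue pattern keeps one malformed record from failing the whole transfer while preserving it for data-quality follow-up.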

Key Skills:

  • Azure, Retail, Cloud