Technology · Remote - LatAm · Fully Remote
Senior Data Engineer (Cloud)
The Role:
The Senior Cloud/Data Engineer will build, optimize, and maintain our modern data infrastructure. The ideal candidate will have hands-on experience with cloud-native data platforms, particularly Snowflake and AWS services, and will develop and maintain scalable data pipelines and feature engineering pipelines. This role will also diagnose and fix software errors in a timely and accurate fashion and provide status reports where required. The position responsibilities outlined below are not all-encompassing; other duties, responsibilities, and qualifications may be required and/or assigned as necessary.
Responsibilities:
- Work with the Product team to determine requirements and propose approaches that address users' needs.
- Analyze requirements to determine approach/proposed solution.
- Design and build solutions using relevant programming languages.
- Thoroughly test solutions using relevant approaches and tools.
- Bring out-of-the-box thinking and solutions to address challenging issues.
- Effectively prioritize and execute tasks in a fast-paced environment.
- Work both independently and in a team-oriented, collaborative environment.
- Remain highly self-motivated and self-directed.
- Demonstrate a commitment to our core values.
Requirements:
- Strong verbal and written communication skills.
- Experience designing and developing enterprise applications.
- Hands-on software troubleshooting and problem-solving abilities.
- Experience across the software development lifecycle, including requirements gathering, analysis, design, development, and testing.
- Experience with cloud-native data platforms, especially Snowflake and AWS services.
- Willingness to continuously learn and adapt to new technologies.
Must Haves:
- Proficiency with databases (e.g., Snowflake, DB2, Redshift) and dimensional modeling.
- Hands-on experience with AWS architecture and services (Lambda, Glue, Kinesis, Firehose, Athena, S3, CloudWatch, DynamoDB, API Gateway).
- Proficient in Python, SQL, and scripting (e.g., Unix shell scripts).
- Experience building feature engineering pipelines.
- Experience with CI/CD tools such as GitHub, GitHub Actions, CodePipeline, and CloudFormation.
- Knowledge of user authentication and authorization across systems, servers, and environments.
- Experience with Tecton, SageMaker, or similar feature stores.
- Experience with NoSQL databases.
- Ability to take ownership and proactively ensure delivery timelines are met.
Great to Haves:
- Experience in data pipeline development using modern ETL tools, specifically Informatica PowerCenter and/or Informatica IICS.
- Experience with Airflow.