Senior Data Engineer - Cloud
The Role:
We’re working to advance care through data-driven decisions and automation. This mission shapes every decision as we build the future of travel and hospitality. We can’t do that without the best talent: innovative, curious individuals driven to create exceptional experiences for guests, customers, owners, and colleagues.
We’re seeking an experienced Senior Cloud Data Engineer to join our growing Data Platform Architecture team. In this role, you’ll collaborate closely with data engineering, data product management, and data science teams to meet the data requirements of various strategic initiatives.
As a Senior Cloud Data Engineer, you’ll contribute to the creation of an Enterprise Data Platform, helping to shape and build new analytical solutions that support internal stakeholders. You’ll recommend tools and capabilities based on your research and understanding of on-premises, cloud-based, and hybrid environments.
You’ll work with engineering, architecture, and migration teams to inform strategy and design. This is an opportunity to stay at the forefront of cloud technologies, prototype with emerging tools, and expand your skillset in automation, scripting, and containerization—all while developing critical systems for internal clients.
Join a hands-on, highly visible team that’s collaborative, passionate about data, and positioned for growth.
We’re looking for a highly motivated AWS expert who’s excited to apply their skills and strategic thinking to improve mission execution, and who is passionate about creating user experiences that delight end users and simplify their work.
The ideal candidate builds strong relationships across all levels of the organization and is recognized as a problem solver who elevates the work of those around them.
Responsibilities:
- Provide expert guidance to ensure project deliverables align with target state architecture.
- Design and develop enterprise data solutions using advanced data technologies and architectural best practices.
- Implement solutions for data migration, delivery, and machine learning pipelines.
- Configure Identity and Access Management (IAM) roles and policies.
- Build resilient, reliable, performant, and secure data platforms and applications.
- Collaborate with application development and data engineering teams on daily tasks and project execution.
- Automate deployments of AWS services and BI applications.
Requirements:
- 6+ years of experience in application/platform engineering, business intelligence, or analytics.
- 4+ years of experience in AWS cloud data engineering, architecture, and implementation.
- Experience with data warehousing platforms such as Snowflake, Redshift, or similar.
- Strong knowledge of AWS services: S3, EC2, RDS, Lambda, CloudFormation, Kinesis, Data Pipeline, EMR, Step Functions, VPC, IAM, Security Groups.
- Proficiency in SQL and Python, and experience with database technologies such as PostgreSQL, Aurora, MongoDB, and Redis.
- Experience with CI/CD tools and scripting: GitHub Actions, Jenkins, AWS CodePipeline, CloudFormation, Terraform.
- Expertise in IAM roles and policies.
- Strong scripting skills in PowerShell and/or Python.
- Familiarity with cloud monitoring and alerting for resource availability.
- Understanding of PaaS/SaaS application performance and enterprise-level architecture and security.
Skills:
- Comfortable solving problems in dynamic, ambiguous environments.
- Tenacity to thrive in fast-paced settings and inspire change.
- Experience designing scalable, robust data pipelines for business decision-making.
- Strong analytical and problem-solving skills; ability to manage multiple projects and stakeholders.
- Attention to detail and accuracy.
- Demonstrated troubleshooting ability.
- Passion for programming and learning new technologies.
- Experience with on-premises to AWS migrations.
- BA/BS in Computer Engineering, Computer Science, or related field.
- Strong verbal and written communication skills.
Great to Have:
- Large-scale enterprise streaming services (e.g., Kafka).
- Kubernetes, Docker, or AWS Fargate.
- Applications on both Windows and Linux server OS.
- Networking, security groups, and policy management across UNIX, Linux, and Windows.
- ETL development using Informatica PowerCenter and/or Informatica IICS.
- Experience with Airflow.
- Advanced CS degree.
Must Haves:
- Proficiency with databases (Snowflake, DB2, Redshift) and dimensional modeling.
- Hands-on experience with AWS services: Lambda, Glue, Kinesis, Firehose, Athena, S3, CloudWatch, DynamoDB, API Gateway.
- Proficient in Python, SQL, and Unix shell scripting.
- Experience building feature engineering pipelines.
- CI/CD tools: GitHub, GitHub Actions, CodePipeline, CloudFormation.
- Knowledge of user authentication and authorization across systems.
- Experience with feature stores such as Tecton or Amazon SageMaker Feature Store.
- Experience with NoSQL databases.
- Ownership mindset and proactive delivery.
Category: Technology
Locations: Remote - LatAm
Remote status: Fully Remote