We think you also hate it when a travel app gives you a headache, right? Even a slight piece of misinformation can ruin a trip.
That is exactly what we are tackling as t-fam: making sure that our 50+ million users have the best experience in crafting their own adventures.
Your main duties in flying with us:
Data Infrastructure Design and System Automation
- Design, implement, optimize, and automate systems for large-scale data architectures, ensuring scalability, availability, and reliability.
- Work with teams to build and manage data lakes, data warehouses, and ETL pipelines.
- Architect systems that handle batch, real-time, and streaming data processing.
- Build tools and frameworks to streamline data ingestion, transformation, and storage processes.
- Optimize and monitor data workflows to ensure high performance, reliability, and fault tolerance.
- Automate repetitive tasks such as testing, deployment, and monitoring of data systems.

Data Pipeline Development
- Build and optimize data pipelines to move data across systems and transform it into usable formats.
- Implement data governance policies to ensure consistency, security, and compliance.

Database Management
- Manage relational (e.g., PostgreSQL, MySQL), NoSQL (e.g., MongoDB), and distributed data platforms (e.g., Hadoop, Spark, BigQuery).

Cloud Infrastructure & DevOps
- Work with cloud platforms such as Google Cloud, AWS, and Azure for data storage, compute, and processing.
- Use containerization (Docker) and orchestration tools (Kubernetes, Airflow).
- Build CI/CD pipelines for deploying data infrastructure code.

Collaboration
- Work closely with data engineers, data analysts, and DevOps teams to implement scalable automation solutions.
- Mentor junior engineers, review code, and provide guidance on best practices for coding and infrastructure design.

Performance Monitoring & Troubleshooting
- Monitor and optimize data infrastructure for performance, cost, and resource utilization.
- Troubleshoot performance bottlenecks and data inconsistencies across systems.
Mandatory belongings that you must prepare:
- Bachelor’s or Master’s degree in Computer Science, Data Engineering, or a related field.
- 3 years of experience in data infrastructure or similar roles.
- Proven track record of managing complex data systems at scale.
- Strong communication skills for collaborating with other teams and stakeholders.
- Stays up to date with new technologies, including AI, machine learning, and emerging trends in data infrastructure.
- Proficiency in Python, Java, SQL, and Bash, depending on the technology stack.
- Familiarity with data systems such as Hadoop, Spark, and Kafka.
- Experience with ETL tools such as Apache Airflow, Apache NiFi, or custom Python-based pipelines.
- Strong understanding of both OLTP (Online Transaction Processing) and OLAP (Online Analytical Processing) systems.
- Hands-on experience with Google Cloud Platform (GCP), particularly BigQuery and Pub/Sub.
- Proficiency in working with Docker and Kubernetes.
- Familiarity with monitoring tools such as Prometheus and Grafana.
- Experience with Terraform, AWS CloudFormation, or other infrastructure-as-code tools.

If you haven’t received any updates after 3 weeks, your data will be kept and we may contact you for another career destination. Meanwhile, discover more about tiket.com on Instagram, LinkedIn, or YouTube.