Bachelor’s degree in Computer Science, Engineering, or a related field.
3-5 years of experience in backend development and data processing with Java and Scala.
Experience in developing RESTful APIs.
Experience with the Hadoop ecosystem (e.g., HDFS, Hive, Spark).
Familiarity with GCP services (e.g., BigQuery, GCS, Cloud Run).
Basic understanding of ETL tools and data integration techniques.
Experience with CI/CD tools such as Jenkins.
Familiarity with Confluence/Jira and Agile/Scrum practices.
Strong problem-solving skills and attention to detail.
Fluency in English for technical communication.
Preferred Qualifications:
Knowledge of other cloud platforms (e.g., AWS, Azure).
Familiarity with monitoring and logging tools (e.g., Prometheus, Grafana).
Ability to collaborate and share knowledge with team members.
Experience with containerization tools like Docker.
Fundamental knowledge of AI concepts.