Scala Software Engineer
Ref. no.: 237/5/2025/AD/91151
At Antal, we have been working in recruitment for over 20 years. Because we operate in 10 specialised divisions, we have an excellent grasp of current industry trends. We precisely define the nature of each role, identifying the key skills and necessary qualifications. Our mission is not only to find a candidate whose competences match the requirements of a given job advertisement, but above all to find a position that meets the candidate’s expectations. Employment agency registration number: 496.
Scala Software Engineer – Scala + Java + Spark
📍 Location: Poland (2 days per month in the Kraków office)
Are you passionate about big data, building core components, and developing complex business logic with modern technologies like Spark and cloud platforms?
We’re looking for a Scala Software Engineer to join a global team that’s shaping a next-generation analytics and surveillance platform within a highly regulated environment. If you enjoy hands-on development, end-to-end ownership, and working with both Scala and Java, this is the role for you.
🔍 Key Responsibilities:
- Design and implement core components for large-scale data processing using Scala and Java
- Build and optimize data pipelines to consolidate and transform data from multiple sources into meaningful business datasets
- Leverage Apache Spark (in both Scala and Java) to develop scalable processing solutions
- Onboard new data sources and develop ingestion and transformation logic
- Collaborate with Data Scientists to productionize analytics models
- Ensure development aligns with CI/CD and TDD best practices
- Cooperate with architects and business stakeholders to ensure scalable, high-quality solutions
- Participate in end-to-end delivery, including deployment and monitoring in production
✅ Requirements:
- Hands-on experience with Apache Spark using both Scala and Java (not PySpark)
- Strong background in building core applications and implementing complex data processing logic
- Solid understanding of HDFS and YARN, including their purpose and real-world usage
- Broad experience with ETL processes and Big Data technologies
- Knowledge of version control and CI/CD tools (Git, Bitbucket, Jenkins, Maven)
- Comfortable working in distributed, international teams; strong communication skills in English (min. B2)
🌟 Nice to Have:
- Experience with Google Cloud Platform (GCP) for deploying and managing data services
- Hands-on familiarity with Kubernetes and Airflow
- Exposure to the ELK stack
- Working knowledge of Linux, Bash scripting, and basic Python
- Familiarity with the Spring framework
💼 What We Offer:
- A key role in a globally important initiative focused on compliance and surveillance
- Modern tech stack with real ownership of your code in production
- A collaborative, DevOps-oriented culture
- The opportunity to shape technical decisions early in a long-term project
- Work with highly skilled professionals in Poland and globally