Hiring:
Software Developer (5+ yrs) and Lead Developer (7+ yrs) in Data Pipelines and Big Data
Mandatory skills: Java 8 and above, Kafka Streams.
The Digital Products team aims to bring successful technology-based products to market in a high-growth environment. The team's mission is to accelerate technology adoption in commercial real estate by bringing creative, innovative, and technical solutions to large, complex problems for our clients.
The Software Developer is a key position on the Digital Products team, responsible for defining the core framework components of our cloud environment.
Key Responsibilities:
As a Software Developer, you will be responsible for:
Building highly scalable data processing pipelines for use cases such as bulk ingestion, real-time streaming applications, and AI/ML-based insights
Building configurable data pipelines based on Kafka Streams/KSQL to ingest bulk and real-time streaming data
Building Kafka Streams applications that meet high quality standards and scale horizontally in a cloud environment (see the sketch after this list)
Collaborating with the Dev Leads and Product Manager to ensure that requirements are met
Ensuring applications are tested using automated test tools
Delivering well-instrumented systems that provide insight into operational metrics and help resolve issues
Utilizing Agile practices to manage and deliver features
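For a rough sense of the kind of Kafka Streams application described above, here is a minimal sketch in Java. The topic names (raw-ingest, validated-ingest), application id, and local broker address are hypothetical placeholders, not references to the team's actual systems.

import java.util.Properties;

import org.apache.kafka.common.serialization.Serdes;
import org.apache.kafka.streams.KafkaStreams;
import org.apache.kafka.streams.StreamsBuilder;
import org.apache.kafka.streams.StreamsConfig;
import org.apache.kafka.streams.kstream.KStream;

public class IngestionTopology {

    public static void main(String[] args) {
        Properties props = new Properties();
        // Application id doubles as the consumer group id; values here are placeholders.
        props.put(StreamsConfig.APPLICATION_ID_CONFIG, "bulk-ingestion-pipeline");
        props.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
        props.put(StreamsConfig.DEFAULT_KEY_SERDE_CLASS_CONFIG, Serdes.String().getClass());
        props.put(StreamsConfig.DEFAULT_VALUE_SERDE_CLASS_CONFIG, Serdes.String().getClass());

        StreamsBuilder builder = new StreamsBuilder();

        // Read raw records, drop empty payloads, and forward the rest downstream.
        KStream<String, String> raw = builder.stream("raw-ingest");
        raw.filter((key, value) -> value != null && !value.isEmpty())
           .to("validated-ingest");

        KafkaStreams streams = new KafkaStreams(builder.build(), props);
        streams.start();

        // Close the topology cleanly on shutdown.
        Runtime.getRuntime().addShutdownHook(new Thread(streams::close));
    }
}

Because Kafka Streams instances in the same consumer group share topic partitions, this style of application scales horizontally by simply running more instances, which is why the role emphasizes it for cloud deployments.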
Qualifications:
To achieve the key objectives of this role, you will need the following skills and experience:
Deep expertise in server-side Java development (Java 8 and above) using Spring projects, particularly for building microservices
Expertise in Spring projects such as Spring Data, Spring Cloud Stream, Spring Integration, Spring for Apache Kafka, and Spring Security
Knowledge of schema definition formats such as Avro, Thrift, Protocol Buffers, and Parquet
Experience building test automation for data loading, performance, and API validation using tools such as JMeter, Apigee, SoapUI, etc.
Hands-on experience with Kafka stream processing and KSQL
Experience building large-scale distributed systems and knowledge of enterprise architecture patterns, including distributed systems concepts such as leader election, consensus, replication, and partitioning
Experience with build and deployment automation tools such as Gradle, Jenkins, Docker, and Artifactory
Experience working with container management/orchestration systems such as Kubernetes, Apache Mesos, AWS ECS, etc.
Experience with stream analytics and ML using tools such as Apache Spark, Apache Flink, etc.
Experience building instrumented, scalable, highly available, and secure systems
Experience hiring and retaining top engineers on the Integration team
Experience with one or more public clouds (AWS, GCP, Azure)
5+ years of hands-on Java development, including 2 years building data processing applications using Spring Boot (a minimal sketch follows this list)
BS in Computer Science or Electrical Engineering
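As a hedged illustration of the Spring Boot and Spring for Apache Kafka experience listed above, the following is a minimal consumer sketch. The topic name (validated-ingest), group id (insights-service), and class names are hypothetical examples only; the broker address would come from spring.kafka.bootstrap-servers in the application configuration.

import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;
import org.springframework.kafka.annotation.KafkaListener;
import org.springframework.stereotype.Component;

@SpringBootApplication
public class DataProcessingApplication {
    public static void main(String[] args) {
        SpringApplication.run(DataProcessingApplication.class, args);
    }
}

@Component
class InsightConsumer {

    // Hypothetical topic and group id; Spring Boot auto-configures the
    // underlying Kafka consumer from application properties.
    @KafkaListener(topics = "validated-ingest", groupId = "insights-service")
    public void consume(String payload) {
        // Placeholder processing step; a real application would deserialize
        // the record (e.g. from Avro) and persist or enrich it.
        System.out.println("Received record: " + payload);
    }
}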