By applying and enforcing standards and best practices, you will design and develop scalable, 24×7, real-time data and reporting services that process billions of events per day. Specific duties include developing enhancements to our Hadoop/Spark-based data infrastructure and extending it to streamed data processing; contributing to data query, access, and reporting projects; and building data distribution and reporting components, real-time optimization, and predictive analysis algorithms using a combination of Scala, Java, SQL, Ruby, R, Go, Bourne shell scripts, cron jobs, and other relevant technologies. You are expected to work in a cloud-based (AWS) environment using the latest technologies.
The ideal candidate has both strong software engineering experience in JVM-based technologies and big data / distributed processing experience. Candidates must be familiar with current distributed computing data technologies on commodity servers and have experience with 24×7 production needs.
- Develop new Data Services
- Develop new Software Services
- Use and teach best practices
- Strong Software Development and Programming experience in JVM-based languages, with recent experience in Scala.
- Strong SQL skills on large-scale databases; knowledge of one or more RDBMS such as MySQL, Postgres, Redshift, or Hive.
- Experience in the Hadoop Data Processing ecosystem, preferably including EMR.
- Experience with Linux or Unix based systems – including Bourne shell, cron and other Unix utilities.
- Experience in Cloud Development, and AWS in particular.
- Degree in Computer Science or a related field.
Desired Technology Experience
- Experience with at least one scripting language such as Ruby, Python, or Bourne shell.
- ETL experience maintaining multiple data systems.
- Experience with Luigi or other Hadoop workflow solutions, and experience developing complex data processing pipelines, including developing regression tests and deployment strategies for such environments.
- Working experience developing and supporting 24×7 production data services and pipelines on Linux systems, including experience being on-call to support such services. Experience with AWS preferred.
- Practiced in Agile processes
Comp & Benefits
- Competitive comp based on experience level
- Healthcare HMO & PPO
- Stock options and 401k
- Flexible Spending and Transit Reimbursement Accounts