Key Responsibilities:
• Help develop scalable, reliable, and highly available production-grade data platforms and data pipelines to meet various business needs.
• Design high-level and low-level software solutions for new requirements.
• Code independently and with other team members, following industry-standard software engineering best practices.
• Collaborate with data analysts and product managers to understand the data.
• Mentor junior developers and contribute to code reviews and deployments.
• Work closely with QA and deliver the product end to end.
Skills & Qualifications:
• B.E./M.Tech in Computer Science.
• 3 - 14 years of experience in software development.
• Experience building scalable products, preferably involving big data.
• Strong coding skills in Python and hands-on experience with Spark.
• Experience with data warehouses or relational databases is mandatory.
• Experience with Kafka and stream/batch processing frameworks such as Storm or Kafka Streams is a plus.
• Experience with the AWS cloud.
• Ability to work collaboratively across multiple products and application teams.
Perks and Benefits:
• Competitive compensation.
• Generous stock options.
• Medical Insurance coverage.
• Work with some of the brightest minds from Silicon Valley’s most dominant and successful companies.
Education: Bachelor's (B.E.)
Key Skills: AWS, Big Data, Kafka, Python, Spark, Data Warehouse, KStreams
Industry: IT-Software / Software Services