Director of Engineering
Graduate/Postgraduate degree in CSE or a related field
The successful candidate will lead the Big Data Engineering team for Impetus from the Gurgaon or Bengaluru (India) office, and will be fully responsible for solution design and delivery, achieving engineering excellence across services.
- 15+ years of relevant work experience and a degree in Computer Science or a related technical discipline are required
- Dynamic leader who has directly managed a team of highly competent developers in a fast-paced work environment
- Ability to work on multiple projects with complex and challenging software requirements, and a keen sense of solution design and architecture
- Working familiarity with the entire Software/Product development lifecycle including version control, build process, testing, and code release
- Development and support experience in the Big Data domain.
- Architecting, developing, implementing, and maintaining Big Data solutions using Apache Hadoop, HDFS, Sqoop, Pig, Hive, Oozie, Impala, Spark, and Kafka.
- Strong technical and architectural skills in distributed computing systems using Java/J2EE
- Strong people management skills: fairness and the ability to motivate and manage engineers and project leads/project managers to meet targets on time and within budget
- Experience in the architecture and delivery of enterprise-scale applications.
- Good perceptive and analytical skills, with a keen business sense and foresight into technology evolution
- Confidence in oneself and in the solution, along with strong negotiation and client-facing persuasion skills
- A finisher with a demonstrable track record of timely and successful execution
- Attend presales technical meetings as needed
- Create/review/edit technical proposals and RFI responses
- Inspire trust and confidence in the first meeting with the customer/prospect.
- Develop and nurture a team of developers by motivating and mentoring them, and assist in hiring critical talent.
- Inspire leadership and encourage innovative ideas.
- Help project teams in conceiving, designing, and implementing n-tier architectures
- Design, develop, and evolve highly scalable and fault-tolerant distributed components using Big Data technologies.
- Extensive experience in application development and support, integration development, and data management.
- Guide developers in day-to-day design and coding tasks, stepping in to code when needed.
- Design and implement APIs, abstractions and integration patterns to solve challenging distributed computing problems
- Experience with the Hadoop ecosystem (HDFS, MapReduce, Oozie, Hive, Impala, Spark, Kerberos, Kafka, etc.)
- Understand and own component security analysis, including code and data flow review. Collaborate with the security team to implement and verify secure coding techniques
- Ensure proper metrics instrumentation in software components to facilitate real-time and remote troubleshooting and performance monitoring
- Contribute to an efficient development process pipeline by leveraging best-in-class tools
- Experience in defining technical requirements and conceptual, logical, and system architectures.
- Identify and apply industry best practices to deliver efficient and elegant solutions while minimizing cost
- Ensure ongoing customer satisfaction