We are an
Award-Winning Workplace

Pune Recruitment Drive

Open Positions

Director Of Engineering
Location:
Gurgaon
Qualification:
BE / B.Tech / MCA / MS (IT/CS)
Experience:
14 – 18 years
Technology:
RFP / RFQ / Pre-sales / Big Data / Java / Microservices
Role/Skills:
  • 15+ years of relevant work experience and a degree in Computer Science or a related technical discipline are required
  • Dynamic leader who has directly managed a team of highly competent developers in a fast-paced work environment
  • Ability to work on multiple projects with complex and challenging software requirements, and a keen sense of solution design and architecture
  • Working familiarity with the entire software/product development lifecycle, including version control, build process, testing, and code release
  • Development and support experience in the Big Data domain
  • Architecting, developing, implementing and maintaining Big Data solutions using Apache Hadoop, HDFS, Sqoop, Pig, Hive, Oozie, Impala, Spark and Kafka
  • Strong technical and architectural skills in distributed computing systems using Java/J2EE
  • Strong people-management skills: fairness and the ability to motivate and manage engineers and project leads/project managers to meet targets on time and within budget
  • Experience in the architecture and delivery of enterprise-scale applications
  • Good perceptive and analytical skills, with a keen business sense and a forward-looking view of technology evolution
  • Confidence in oneself and in the solution, along with good negotiation and client-facing persuasion skills
  • A finisher with a demonstrable track record of timely and successful execution
  • Attend presales technical meetings on an as-needed basis
  • Create, review and edit technical proposals and RFI responses
  • Inspire trust and confidence in the very first meeting with the customer/prospect
Technical Architect
Location:
Noida/Bengaluru/Indore/Gurgaon
Qualification:
BE / B.Tech / MCA / MS (IT/CS)
Experience:
9 – 15 years
Technology:
HLD / LLD / Big Data / Java / Microservices
Role/Skills:

A successful candidate will have 10+ years of experience implementing high-end software products.

  • Provides technical leadership in the Big Data space (Hadoop stack: M/R, HDFS, Pig, Hive, HBase, Flume, Sqoop, etc.; NoSQL stores such as Cassandra and HBase) across engagements, and contributes to open-source Big Data technologies
  • Visualizes and evangelizes next-generation infrastructure in the Big Data space (batch, near-real-time and real-time technologies)
  • Passionate about continuously learning, experimenting with, applying and contributing to cutting-edge open-source technologies and software paradigms
  • Expert-level proficiency in at least one programming language, Java in particular
  • Strong understanding of and experience with distributed computing frameworks, particularly Apache Hadoop 2.0 (YARN, MapReduce and HDFS) and associated technologies (one or more of Hive, Sqoop, Avro, Flume, Oozie, ZooKeeper, etc.)
  • Hands-on experience with Apache Spark and its components (Streaming, SQL, MLlib)
  • Operating knowledge of cloud computing platforms (AWS, especially the EMR, EC2, S3 and SWF services, and the AWS CLI)
  • Experience working in a Linux environment and using command-line tools, including shell/Python scripting for automating common tasks
Technical Architect Cloud
Location:
Noida/Bengaluru/Indore/Gurgaon
Qualification:
BE / B.Tech / MCA / MS (IT/CS)
Experience:
9 – 15 years
Technology:
AWS / Cloud / Big Data / Java / EMR
Role/Skills:
  • Hands-on experience with AWS, Azure and GCP (Google Cloud Platform)
  • Hands-on experience with cloud platform evaluation, cost estimation and crafting roll-out plans
  • Hands-on experience with cloud-provided Big Data technologies, as well as non-Big-Data services
    • e.g. EC2, EMR, Redshift and S3 in Amazon Web Services
    • e.g. Azure Stack in Microsoft Azure
  • Hands-on experience in setting up cloud platforms for clients' use cases
  • Good practical programming knowledge of Java (core Java), J2EE technologies and Python
  • Good knowledge of Big Data technologies: Hadoop, Spark, Pig, Hive, HBase
  • Experience using programming frameworks: MapReduce, Spark, Scala
  • Experience with, or the ability to learn, NoSQL technologies: MongoDB, Cassandra, HBase
  • Good knowledge of Big Data, relational database and data architecture concepts
  • Strong analytical, problem-solving, data analysis and research skills
  • Good people-management skills to manage, guide and mentor a technical team
  • Demonstrable ability to interact, collaborate, and drive consensus and confidence among different groups
  • Demonstrable ability to think outside the box and not be dependent on readily available tools
  • Excellent communication, presentation and interpersonal skills are a must
  • Experience with Data Science tools and technologies on the cloud (e.g. Azure ML) is welcome but not mandatory
Cloud Engineers
Location:
Noida/Bengaluru/Indore/Gurgaon
Qualification:
BE / B.Tech / MCA / MS (IT/CS)
Experience:
4 – 9 years
Technology:
AWS / Cloud / Big Data / Java / EMR
Role/Skills:
  • Hands-on experience with AWS services across compute, storage, networking and security components
  • Deep knowledge of VPC configuration, private/public subnets, NAT Gateways, ELB with EC2, Lambda, S3, EBS, EFS, Glacier, RDS, SQS, SNS, CloudWatch and CloudFormation
  • Hands-on experience with cloud-provided Big Data technologies (e.g. EC2, EMR, Redshift, S3, Kinesis)
  • Deep understanding of IAM roles, policies, users and groups
  • Hands-on experience in setting up cloud platforms for use cases
  • Good hands-on experience with Java (core Java), J2EE technologies and Bash scripts
  • Strong analytical, problem-solving, data analysis and research skills
  • Demonstrable ability to think outside the box and not be dependent on readily available tools
  • Excellent communication, presentation and interpersonal skills are a must

Good to Have:

  • Knowledge of Big Data technologies: Hadoop, Pig, Hive and Spark
  • Experience in migrating workloads from on-premises to cloud, and in cloud-to-cloud migrations
  • Experience programming with MapReduce and Spark
Big Data Engineers
Location:
Noida/Bengaluru/Indore/Gurgaon
Qualification:
BE / B.Tech / MCA / MS (IT/CS)
Experience:
4 – 9 years
Technology:
Spark / Hive / HDFS / Cassandra / MapReduce / Java
Role/Skills:
  • Software development experience in Java/Scala/Python - Must have
  • Experience with the entire software development lifecycle, from determining requirements and concepts through evaluation, design, development, testing and integration - Must have
  • Ensure designs follow specifications - Must have
  • Exposure to build and release management - Must have
  • Build and implement the solution; this is hands-on work building quick prototypes/proofs of concept and data processing benchmarks - Must have
  • Support continuous improvement by investigating alternative technologies and presenting them for architectural review - Must have
L2/L3 Support Engineers
Location:
Noida/Bengaluru/Indore/Gurgaon
Qualification:
BE / B.Tech / MCA / MS (IT/CS)
Experience:
4 – 6 years
Technology:
Python / Java / Big Data Exposure
Role/Skills:

    Proactive monitoring of production environments

  • Providing L2 and L3 support: identifying issues and fixing them (bug and break fixes)
  • Proposing and implementing optimization solutions from a support perspective
  • Progressively reducing the number of issues by fixing root causes
  • Should be able to support applications through infrastructure upgrades
  • Mandatory skills: Core Java, Hive, MapReduce, Python, Unix, PySpark
  • Preferred offshore location is Gurgaon
  • Good communication skills
  • Of the 3 offshore resources, 1 would be a lead-level profile
  • Configure monitoring of infrastructure and applications using Splunk
  • Strong command of all Big Data components, including but not limited to Hadoop, HDFS, Hive, HBase, Kafka, MapReduce, YARN, Oozie, ZooKeeper and Spark
  • Optimize time spent on maintenance activities and automate routine tasks
  • Linux/Unix administration: ability to navigate a system, inspect logs (Hadoop job and YARN job logs), etc.
  • Experience with version control tools such as Git
  • Experience with Hadoop platform administration is a plus
  • Knowledge of networking, firewalls and load balancers is a plus
  • Familiarity with Docker containers; working experience is a plus
  • Strong understanding of Kerberos, Windows AD and security practices for Big Data technologies