Hadoop Administrator


Job Description

Accertify Inc., a wholly owned subsidiary of American Express, is a leader in providing software, tools, and strategies for preventing online fraud. Accertify is committed to providing e-commerce companies with comprehensive and cost-effective fraud prevention solutions.


Accertify is seeking a highly motivated DevOps Engineer with hands-on big data development and big data infrastructure administration experience. The incumbent will report to the Director of Big Data (DBD) and will implement initiatives proposed by the DBD pertaining to big data infrastructure, operations, and maintenance. The candidate will work on collecting, storing, processing, and analyzing very large data sets. The primary focus will be on choosing optimal solutions for these purposes, then implementing, maintaining, and monitoring them. This position will also be responsible for integrating those solutions with the architecture used across the company.


Responsibilities

  • Collaborate with internal/external business partners on big data projects
  • Utilize technical expertise in Hadoop administration
  • Evaluate new big data tools, frameworks and technologies, explore Proof of Concept (POC) to identify optimum solutions for requested capabilities
  • Maintain a holistic understanding of the big data ecosystem
  • Install, maintain, and administer software on Linux servers
  • Automate manual processes using tools such as Python, Ruby, Unix Shell (bash, ksh) etc.
  • Monitor big data application/infrastructure performance and availability
  • Implement ETL processes from various data sources to Hadoop cluster

Qualifications

  • Bachelor of Science in Computer Science or a related field
  • 4+ years’ experience in the following:
  • Automating build/deployment, software configuration, continuous integration/continuous delivery, and release engineering tasks in big data environments
  • Automating manual processes and developing with Python/Java, Unix Shell (bash, ksh), SQL, etc.
  • Big data components/frameworks such as Hadoop (MapR), Spark, YARN, Kafka, Flink, ELK, etc.
  • NoSQL databases such as HBase, Cassandra, MapR DB
  • Big data querying tools such as Drill, Presto, Hive, etc.
  • Infrastructure automation tools like Ansible
  • Monitoring tools like Grafana, Splunk, etc.
  • Monitoring application/infrastructure performance and availability.
  • Experience with, or an understanding of, developing machine/deep learning systems in a distributed environment.
  • Development tools such as Git, and familiarity with collaboration tools such as Jira and Confluence.
  • Containerization (Docker) and resource scheduling (Kubernetes).

Employment eligibility to work with American Express in the U.S. is required, as the company will not pursue visa sponsorship for this position.

ReqID: 19021018
Schedule (Full-Time/Part-Time): Full-time
Date Posted: Dec 20, 2019, 4:30:25 PM