Big Data Engineer in Atlanta, GA at HUNTER Technical Resources

Date Posted: 3/5/2019

Job Description

Job Overview

Our client is recruiting a Big Data Engineer for a newly created position within its Information Technology Department. This role is responsible for refining the client's current data platforms and building the next generation of its technology.

Responsibilities:


  • Integrate sophisticated Big Data technologies into current and future business applications
  • Lead infrastructure projects for the implementation of new Big Data solutions
  • Design and implement modern, scalable data center architectures (on-premises, hybrid, or cloud) that meet the requirements of our business partners
  • Ensure the architecture is optimized for large dataset acquisition, analysis, storage, cleansing, transformation and reclamation
  • Perform requirements analysis, platform selection, and technical architecture design
  • Develop IT infrastructure roadmaps and implement strategies around data science initiatives
  • Lead the research and evaluation of emerging technologies, industry and market trends to assist in project development and operational support activities

Qualifications:


  • Bachelor's degree in Computer/Information Science/Information Technology/Management Information System (MIS)

Must have 5 – 7 years of experience in the following:

  • Architecture, design, implementation, operation and maintenance of Big Data solutions
  • Hands-on experience with major Big Data technologies and frameworks, including Hadoop, MapReduce, Pig, Hive, HBase, Oozie, Mahout, Flume, ZooKeeper, MongoDB, and Cassandra
  • Experience with Big Data solutions deployed in large cloud computing infrastructures such as AWS, GCE and Azure
  • Strong knowledge of programming and scripting languages such as Java, Linux shell, PHP, Ruby, and Python
  • Big Data query tools such as Pig, Hive and Impala
  • Project Management Skills:
      • Ability to develop plans/projects from conceptualization to implementation
      • Ability to organize workflows, direct tasks, document milestones and ROIs, and resolve problems

Proven experience with the following:

  • Open-source software such as Hadoop and Red Hat Enterprise Linux
  • Shell scripting
  • Servers, storage, networking, and data archival/backup solutions
  • Industry knowledge and experience in areas such as Software Defined Networking (SDN), IT infrastructure and systems security, and cloud or network systems management