Big Data/Cloud Engineer in Alpharetta, GA at HUNTER Technical Resources

Date Posted: 7/9/2020

Job Snapshot

  • Employee Type:
    Contractor
  • Experience:
    Not Specified
  • Job ID:
    5057144

Job Description


Responsibilities:
  • Work with global teams on the technical design and implementation of analytical platforms, both on-premises and in GCP, using a cloud-native big data technology stack.
  • Install and update the required analytical tools (e.g., RStudio, Anaconda Notebooks, Tableau, SAS, and Spotfire).
  • Troubleshoot issues with the analytical tools and clearly explain findings to other administrators and tool vendors.
  • Document installations, user onboarding, and standard methodologies, ensuring that procedures and infrastructure details are properly recorded and shared among the team.
  • Interact with users regularly to address their issues and to identify the optimal usage patterns worth promoting.
  • Troubleshoot issues, tune performance, handle security remediation and patch management, and keep the platforms up to date.
  • Deploy new instances of our financial product (a customer-facing analytical platform) in the cloud through CloudFormation templates and/or on-premises setup.
  • Ensure that all clusters and services remain available and performant, using monitoring and alerting tools.
  • Learn all the technologies involved in the project. We expect our engineers to be well-rounded and able to support end-to-end solutions.
  • Work within a privileged-account security architecture, observing all of the security policies involved.
  • Propose new and better ways to tackle problems, from both a technology and a process perspective, and take part in the technical and procedural solutions the team is working on.

Requirements:
  • Bachelor's degree (Computer Science or a related field) and a minimum of 3 years of experience building, designing, and maintaining enterprise-class applications.
  • AWS or GCP certification is preferred.
  • Experience integrating big data tools with Hadoop and troubleshooting issues related to tools such as R, Python, Hive, Hue, Anaconda Notebooks, Tableau, and Spotfire.
  • Experience with core Java development.
  • Experience resolving query, job, and performance issues on the platform.
  • Hands-on experience configuring and administering SCM (Git, SVN), build tools (CMake, Makefiles, Maven), Nexus, CI (Jenkins), and CD automation tools.
  • Experience designing and enabling services in AWS or GCP using Terraform or CloudFormation scripts.
  • Experience integrating public cloud services with enterprise LDAP and security frameworks.