Job title: Senior Solutions Consultant (CAT1)
Job type: Permanent
Emp type: Full-time
Industry: Consultancy
Location: Singapore
Job published: 15/11/2021
Job ID: 32523

Job Description

Senior Solutions Consultant

In this role you’ll develop massively scalable solutions to complex data problems using Hadoop, NiFi, Spark, and related Big Data technologies. It is a client-facing position that combines consulting skills with deep technical design and development in the Big Data space.

Responsibilities:

  • Work directly with customers to implement Big Data solutions at scale using the Cloudera Data Platform and Cloudera DataFlow (a brief illustrative sketch follows this list)
  • Design and implement Hadoop and NiFi platform architectures and configurations for customers
  • Perform platform installation and upgrades for advanced secured cluster configurations
  • Analyze complex distributed production deployments and make recommendations to optimize performance
  • Document and present complex architectures to customers’ technical teams
  • Work closely with Cloudera teams at all levels to help ensure the success of consulting engagements with customers
  • Drive projects with customers to successful completion
  • Write technical documentation, blog posts, and knowledge-base articles, and keep current with the Hadoop Big Data ecosystem
  • Participate in the pre- and post-sales process, helping both the sales and product teams interpret customers’ requirements
  • Attend speaking engagements when needed
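
For a concrete flavor of this work, the sketch below shows the kind of batch rollup a consultant might implement with Spark on a customer’s Cloudera Data Platform cluster. It is a minimal, hypothetical example: the HDFS path and table names are invented for illustration, and a real engagement would layer security, tuning, and monitoring around such a job.

    from pyspark.sql import SparkSession

    # Start a Spark session with Hive support so results can land in the
    # cluster's Hive metastore (table names below are hypothetical).
    spark = (
        SparkSession.builder
        .appName("daily-events-rollup")
        .enableHiveSupport()
        .getOrCreate()
    )

    # Read raw event data from HDFS (illustrative path) and count
    # events per day.
    events = spark.read.parquet("hdfs:///data/raw/events")
    daily = (
        events.groupBy("event_date")
        .count()
        .withColumnRenamed("count", "events")
        .orderBy("event_date")
    )

    # Persist the rollup as a Hive table for downstream consumers.
    daily.write.mode("overwrite").saveAsTable("analytics.daily_events")
    spark.stop()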


Qualifications:

  • 10+ years of experience in information technology and systems architecture
  • 5+ years of professional services (customer-facing) experience architecting large-scale storage, data center, and/or globally distributed solutions
  • 5+ years designing and deploying 3-tier architectures or large-scale Hadoop solutions
  • Ability to understand big data use cases and recommend standard design patterns commonly used in Hadoop-based and streaming data deployments
  • Knowledge of the data management ecosystem, including concepts of data warehousing, ETL, and data integration
  • Ability to understand and translate customer requirements into technical requirements
  • Experience designing queries against data in HDFS using tools such as Apache Hive (see the sketch after this list)
  • Experience setting up multi-node Hadoop clusters
  • Experience configuring cluster security (LDAP/AD, Kerberos/SPNEGO)
  • Strong experience implementing software and/or solutions in enterprise Linux environments
  • Strong understanding of enterprise security solutions such as LDAP and/or Kerberos
  • Strong understanding of network configuration, devices, protocols, speeds, and optimizations
  • Strong understanding of the Java ecosystem, including debugging, logging, monitoring, and profiling tools
  • Familiarity with scripting and automation tools such as bash, Python and/or Perl, Ansible, Chef, and Puppet
  • Experience architecting data center solutions, including selecting server and storage hardware based on performance, availability, and ROI requirements
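
As an illustration of how the Hive and Kerberos items above combine in practice, the sketch below queries a Kerberized HiveServer2 endpoint. It assumes the PyHive client library and a valid Kerberos ticket already obtained via kinit; the hostname, database, and table names are hypothetical.

    from pyhive import hive

    # Connect to a Kerberized HiveServer2 instance. The host, database,
    # and table are invented for illustration only.
    conn = hive.connect(
        host="hs2.example.internal",
        port=10000,
        auth="KERBEROS",
        kerberos_service_name="hive",
        database="analytics",
    )

    cursor = conn.cursor()
    cursor.execute(
        "SELECT event_date, COUNT(*) AS events "
        "FROM daily_events GROUP BY event_date ORDER BY event_date"
    )
    for event_date, events in cursor.fetchall():
        print(event_date, events)

    cursor.close()
    conn.close()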

