Sr. Solutions Consultant

Cloudera
  • Remote (Asia Time Zone Permitted)
  • Other
Job Type : Full-Time
Education Requirement : Bachelor's Degree
Skills :
Experience : 3 to 5 years
Work Days : Monday to Friday
Job Detail

Job Description: Cloudera is seeking an experienced Solutions Consultant/Architect to join our team in Singapore with CAT 1 security clearance. This key role has two major responsibilities:

  • First, to work directly with our customers and partners to optimise their plans and objectives for architecting, designing, and deploying environments built on technologies such as Apache Hadoop, Kafka, Spark, and Flink.
  • Second, to assist in building or designing reference configurations that enable our customers and influence our product. The Solutions Consultant/Architect will facilitate the communication flow between Cloudera teams and the customer.

For these strategically important roles, we are seeking outstanding talent to join our team.

Responsibilities:

  • Work directly with the customer's technical resources to devise and recommend solutions based on the understood requirements
  • Analyse complex distributed production deployments, and make recommendations to optimise performance
  • Document and present complex architectures for the customer's technical teams
  • Work closely with Cloudera's teams at all levels to help ensure the success of project consulting engagements with customers
  • Help design and implement Hadoop architectures and configurations for customers
  • Drive projects with customers to successful completion
  • Write and produce technical documentation and knowledge base articles
  • Participate in the pre- and post-sales process, helping both the sales and product teams to interpret customers' requirements
  • Keep current with Hadoop and big data ecosystem technologies
  • Attend speaking engagements when needed
  • Potential travel up to 25% post-COVID

Qualifications:

  • More than two years of Professional Services (customer-facing) experience architecting large-scale storage, data center, and/or globally distributed solutions
  • Experience designing and deploying large-scale production Hadoop solutions
  • Ability to understand and translate customer requirements into technical requirements
  • Experience designing data queries/workloads in a Hadoop environment using tools such as Apache Hive/Impala, Apache Druid, Apache Phoenix or others
  • Experience installing and administering multi-node Hadoop clusters
  • Strong experience implementing software and/or solutions in enterprise Linux or Unix environments
  • Strong understanding of various enterprise security solutions such as LDAP and/or Kerberos
  • Strong understanding of network configuration, devices, protocols, speeds and optimizations
  • Solid background in database administration and design, ideally along with data modeling with star schemas, slowly changing dimensions, and/or data capture
  • Experience in architecting data center solutions, including properly selecting server and storage hardware based on performance, availability, and ROI requirements
  • Demonstrated experience implementing big data use cases and an understanding of standard design patterns commonly used in Hadoop-based deployments
  • Excellent verbal and written communication skills

Preferred, but not required experience:

  • Experience with Java/Scala development, debugging & profiling
  • Experience working with network-based APIs, preferably REST/JSON or XML/SOAP
  • Knowledge of the data management ecosystem, including concepts of data warehousing, ETL, data integration, etc.
  • Hortonworks/Cloudera certification (Admin and/or Developer), or Data Science experience
  • Familiarity with Data Science notebooks such as Cloudera Data Science Workbench, Apache Zeppelin, Jupyter, or IBM DSX
  • Experience implementing machine learning algorithms using tools such as R, PySpark, Scala, or TensorFlow
  • Automation experience with Chef, Puppet, Jenkins, or Ansible
  • Familiarity with scripting tools such as Bash shell scripts or Python
  • Experience with Cloud Platforms & deployment automation
