Hortonworks

Overview

Located in Sunnyvale, California, Hortonworks is dedicated to the development and adoption of Apache Hadoop as a platform for storing, managing, and analyzing data. Hadoop is an open source Apache Software Foundation project designed to handle large amounts of data across a wide variety of applications. Launched in 2011 by Yahoo! and Benchmark Capital, Hortonworks operates under the belief that Apache Hadoop will be used to process half of the world's data within five years. At Yahoo!, Apache Hadoop runs on 42,000 servers, delivering content to 700 million users around the globe.

At DigitalApiCraft, we focus on helping our customers accelerate their path to innovation through breakthrough analytics, big data platform, and data science consulting capabilities.

Our Expertise:

  • We identify improvements, develop platform and infrastructure services, and automate, automate, automate.
  • We help your team build and maintain processes and tools for deployment, monitoring, and operations, and we troubleshoot and resolve issues in your dev, test, and production environments.
  • We monitor and administer large and growing production footprints with diverse application services.
  • We tune and optimize distributed systems, operating systems, networks, and applications.
  • We implement system management strategies for optimization, rapid feedback, and high availability.
  • We develop Spark-based solutions to support near real-time data ingestion, analytics, and reporting (see the sketch after this list).
  • We gather requirements, design solutions, estimate the effort to implement them, and lead teams through implementation.
  • We document solutions for long-term support.
  • We assist client support teams with Spark-related issue resolution.
  • We are highly experienced in managing scalable Hadoop cluster environments, both virtual and physical.
  • We manage backup and disaster recovery for Hadoop data.
  • We optimize and tune Hadoop environments to meet performance requirements.
  • We install and configure monitoring tools for all critical Hadoop systems and services.
  • We work in tandem with big data developers to design scalable, supportable infrastructure for specific use cases.
  • We provide highly responsive support for day-to-day requests from development, support, and business analyst teams.
  • We are experts in performance analysis and debugging of slow-running development and production processes.
  • We have a solid technical understanding of services such as Drill, Hive, Hue, and Oozie.
  • We work with Linux server admin teams to administer server hardware and operating systems.
  • We assist with developing and maintaining system documentation.
  • We create and publish production metrics, including system performance and reliability information, for system owners and management.
  • We perform ongoing capacity management forecasts, including timing and budget considerations.
  • We coordinate root cause analysis (RCA) efforts to minimize future system issues.
  • We mentor, develop, and train other systems operations staff members as needed.
  • We provide off-hours support.
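
As one illustration of the Spark work described above, here is a minimal sketch of a Structured Streaming job that ingests JSON events from Kafka and writes windowed aggregates to HDFS for near real-time reporting. The broker address, topic, field names, and paths are illustrative assumptions, not details from an actual engagement.

```scala
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions._
import org.apache.spark.sql.streaming.Trigger

object NearRealTimeIngest {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("near-real-time-ingest")
      .getOrCreate()
    import spark.implicits._

    // Read a stream of events from Kafka (broker and topic are assumptions).
    val raw = spark.readStream
      .format("kafka")
      .option("kafka.bootstrap.servers", "broker1:9092")
      .option("subscribe", "events")
      .load()

    // Kafka delivers the value as binary; parse it as JSON fields
    // (user_id, amount, ts are hypothetical field names).
    val events = raw
      .selectExpr("CAST(value AS STRING) AS json")
      .select(
        get_json_object($"json", "$.user_id").as("user_id"),
        get_json_object($"json", "$.amount").cast("double").as("amount"),
        get_json_object($"json", "$.ts").cast("timestamp").as("ts")
      )

    // Aggregate per user over 1-minute windows; the watermark bounds
    // how long late events are accepted before a window is finalized.
    val perMinute = events
      .withWatermark("ts", "5 minutes")
      .groupBy(window($"ts", "1 minute"), $"user_id")
      .agg(sum($"amount").as("total_amount"))

    // Write finalized windows to Parquet on HDFS; paths are illustrative.
    val query = perMinute.writeStream
      .outputMode("append")
      .format("parquet")
      .option("path", "hdfs:///data/reports/per_minute")
      .option("checkpointLocation", "hdfs:///checkpoints/per_minute")
      .trigger(Trigger.ProcessingTime("30 seconds"))
      .start()

    query.awaitTermination()
  }
}
```

The checkpoint location lets the job recover exactly where it left off after a restart, and the 30-second trigger trades a little latency for fewer, larger output files, a common tuning choice when the sink is HDFS.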