Senior Big Data Platform Operations Lead

Company: AIG
Location: Fort Worth, TX
Salary: DOE
Posted: February 22, 2018
Industry: Insurance, Securities

Functional Area: IT - Information Technology

Estimated Travel Percentage (%): Up to 25%

Relocation Provided: No

AIG Technologies, Inc. (US)


We are seeking an experienced Hadoop administrator/architect to provide subject matter expertise on all aspects of Hortonworks (required) and Cloudera (preferred) Hadoop deployments.



Responsibilities include:



  • Architecting scalable & highly available Hadoop environments
  • Working with business and application development teams to provide effective technical designs aligning with industry best practices
  • Capturing the as-is operational state and working with team members to define the roadmap and future state
  • Defining secure, highly reliable integration strategies for applications and customer systems
  • Interfacing with other groups such as security, network, compliance, storage, etc.
  • Providing subject matter expertise, training, and direction to team members
  • Recommending and aiding in establishing development, testing and documentation standards
  • Monitoring and ensuring compliance with architectural and development standards
  • Identifying and recommending new technologies, architectures, processes and tools to increase efficiency and productivity
  • Engaging with external vendors to evaluate products and lead POC development
  • Working with multiple products and technologies at all tiers of the architecture to guide the design and implementation of innovative, scalable and robust solutions
  • Installing, configuring, monitoring, and administering large Hadoop clusters
  • Designing and participating in testing of DR, replication, and high-availability solutions
  • Implementing Kerberos, Knox, Ranger, and other security enhancements
  • Arranging and managing maintenance windows to minimize impact of outages to end users
  • Independently managing, leading, and executing complex projects with cross-functional teams


Job Requirements:



  • 4+ years of experience on Hadoop clusters; minimum of 2 years with the Hortonworks distribution
  • Expert level experience in Hadoop infrastructure design and deployment
  • 8+ years of experience in Linux-based systems or database administration
  • Experience with AWS capabilities, environment design, and operational support
  • Experience with analytical tools, languages, or libraries
  • Hands-on experience with production deployments of Hadoop applications
  • Strong understanding of best practices and standards for Hadoop application design and implementation
  • Hadoop administration experience that includes:
    • Installing, configuring, monitoring, and administering large Hadoop clusters
    • Backup & Recovery of HDFS
    • DR, Replication & High Availability of Hadoop infrastructure
    • Securing the cluster using Kerberos, LDAP Integration, and/or Centrify
    • Managing & Scheduling jobs
    • Managing Hadoop queues, access controls, user quotas, etc.
    • Capacity planning, configuration management, monitoring, debugging, and performance tuning
  • Experience monitoring and troubleshooting using a variety of open source and proprietary toolsets
  • Hands-on experience with Big Data use cases & development
  • Experience with Hadoop tools/languages/concepts such as HDP 2.0+, HDFS, Hive, Oozie, Sqoop, Pig, Flume, Spark, Kafka, Solr, HBase, Ranger, Knox, MapReduce, etc.
  • Expert understanding of MapReduce and how the Hadoop Distributed File System works
  • Understanding of enterprise ETL tools (e.g. DataStage, Talend)
  • Experience supporting big data environments in public and hybrid cloud formats
  • Experience with related/complementary open source software platforms and languages (e.g. Java, Linux, Apache, Perl/Python/PHP, Chef, Puppet)
  • Understanding of relational databases (RDBMS), SQL, and NoSQL databases
  • Ability to coordinate and prioritize multiple tasks in a fast-paced environment
  • Strong verbal/written communication and presentation skills
  • Experience with highly scalable or distributed RDBMS (Teradata, Netezza, Greenplum, etc.) is preferred
  • Knowledge of cloud computing infrastructure and considerations for scalable, distributed systems is preferred
  • Experience with Statistical Modeling, Data Mining and/or Machine Learning is preferred
  • Ability to build relationships and work effectively with cross-functional, international teams

Experience Preferred: 8 - 12 Years




Education Preferred: Bachelor's Degree (or equivalent)




Travel Percentage Estimate: None




It has been and will continue to be the policy of American International Group, Inc., its subsidiaries and affiliates to be an Equal Opportunity Employer. We provide equal opportunity to all qualified individuals regardless of race, color, religion, age, gender, gender expression, national origin, veteran status, disability or any other legally protected categories.



At AIG, we believe that diversity and inclusion are critical to our future and our mission - creating a foundation for a creative workplace that leads to innovation, growth, and profitability. Through a wide variety of programs and initiatives, we invest in each employee, seeking to ensure that our people are not only respected as individuals, but also truly valued for their unique perspectives.



