Senior Technology Specialist

October 17, 2020
About Wells Fargo

Wells Fargo & Company (NYSE: WFC) is a leading global financial services company headquartered in San Francisco, United States. Wells Fargo has offices in over 30 countries and territories. Our business outside the U.S. focuses primarily on providing banking services for large corporate, government and financial institution clients. We have worldwide expertise and services to help our customers improve earnings, manage risk, and develop opportunities in the global marketplace. Our global reach offers many opportunities for you to develop a career with Wells Fargo. Join our diverse and inclusive team, where you will feel valued and inspired to contribute your unique skills and experience. We are looking for talented people who will put our customers at the center of everything we do. Help us build a better Wells Fargo. It all begins with outstanding talent. It all begins with you.

Market Job Description

About Wells Fargo India

Wells Fargo India enables global talent capabilities for Wells Fargo Bank, N.A. by supporting business lines and staff functions across Technology, Operations, Risk, Audit, Process Excellence, Automation and Product, Analytics and Modeling. We operate from Hyderabad, Bengaluru and Chennai.

Department Overview

Enterprise Functions Technology is responsible for supporting enterprise technology initiatives related to messaging and collaboration, risk technology, corporate systems data, and various internal systems and tools.

About the role
We require a full-stack big data engineer for platform data and build activities, responsible for architecting, designing and developing data services solutions using big-data technologies for the Risk & Finance core services infrastructure. The role involves delivering near-term data to satisfy regulatory requirements while strategically building out the data services platform.
The person who fills this position will be a vital part of our team and will be responsible for converting technical requirements into outstanding designs.


  • Provide technical solutions for data services that deliver the strategic Enterprise Risk Management data platform.
  • Work with senior management, business analysts and project managers to gather requirements and business rules.
  • Apply background and expertise in architecting and designing big-data solutions and building robust frameworks at scale.
  • Design and implement APIs, abstractions and integration patterns to solve challenging distributed computing problems.
  • Understand and own component security analysis, including code and data flow review. Collaborate with the security team to implement and verify secure coding techniques.
  • Ensure proper metrics instrumentation in software components to facilitate real-time and remote troubleshooting and performance monitoring.
  • Work with the software development team to identify and fix bugs and functional issues in our big-data products.
  • Develop, review and test software components for adherence to design requirements and document test results.
  • Work effectively in a hybrid environment where legacy ETL and data warehouse applications co-exist with new big-data applications.
  • Work with infrastructure engineers and system administrators as appropriate in designing the big-data infrastructure.
  • Support ongoing data management efforts for Development, QA and Production environments.
  • Utilize a thorough understanding of available technology, tools, and existing designs.
  • Leverage knowledge of industry trends to build best-in-class technology that provides competitive advantage.
  • Act as an expert technical resource to programming staff in the program development, testing, and implementation process.

Essential Qualifications

  • 10+ years of application development and implementation experience using Java or Python
  • 5+ years of experience with Apache Spark using Scala, Java or Python, including DataFrames and Resilient Distributed Datasets (RDDs)
  • Experience with Hadoop ecosystem tools for real-time and batch data ingestion, processing and provisioning, such as Apache Flume, Apache Kafka, Apache Sqoop, Apache Flink, Apache Hive or Apache Storm
  • Experience delivering ETL (Talend), data warehouse and data analytics capabilities on big-data architectures such as Hadoop
  • Reporting experience, analytics experience or a combination of both
  • Knowledge and understanding of DevOps principles
  • Experience working with NoSQL databases like MongoDB

Desired Qualifications

  • Experience developing RESTful APIs using Spring Boot
  • Well-versed with code deployment tools such as Artifactory and UDeploy
  • 5+ years of Agile experience

We Value Diversity

At Wells Fargo, we believe in diversity and inclusion in the workplace; accordingly, we welcome applications for employment from all qualified candidates, regardless of race, color, gender, national or ethnic origin, age, disability, religion, sexual orientation, gender identity or any other status protected by applicable law. We comply with all applicable laws in every jurisdiction in which we operate.
