
MUFG Union Bank Software Engineer, Vice President in Tempe, Arizona

Description

Discover your opportunity with Mitsubishi UFJ Financial Group (MUFG), the 5th largest financial group in the world with total assets of over $2.4 trillion (as ranked by SNL Financial, April 2016) and 140,000 colleagues in nearly 50 countries. In the U.S., we’re 13,000 strong, working together to positively impact every customer, organization, and community we serve. We achieve this by delivering on our values, putting people first, fostering long-term relationships built on honesty and mutual understanding, and inspiring the best in each other. This is all part of our inclusive, high-performing culture supported by Total Rewards that include our cash balance pension plan. Join a team that’s working to fulfill its vision to be the world’s most trusted financial group.

Program Summary:

MUFG Americas is embarking on a business and technology transformation to effectively deliver five key business imperatives: Growth, Business Agility, Client Experience, Effective Controls, and Collaboration. To accomplish these imperatives, MUFG has launched a Transformation Program built upon the following foundation pillars:

  1. Core Banking Transformation Program

  2. Data Governance, Infrastructure & Reporting Program

  3. Technology Modernization Program

This position supports the Core Banking Transformation (CBT) Program. CBT is a multi-year effort to modernize our deposits platform with a world-class digitally-led and simplified ecosystem for consumer, small business, commercial and transaction banking to deliver exceptional customer experience and provide the bank a competitive advantage in the market. Our customers will benefit from streamlined and automated processes that simultaneously will provide the bank business process efficiencies and operational cost savings.

Role Summary:

The Core Banking Transformation technology team seeks a talented Data Engineer who is collaborative and passionate about solving complex data engineering problems. This role is responsible for the design, build, implementation, monitoring, and management of the MUFG Core Banking data services gateway that provides the foundation for technology modernization and digital transformation.

As a data platform engineer, you will focus on building the firm’s next-generation data environment. You will be a key player in creating a data services platform that drives real-time decision-making in service of our customers. You will develop, build, and operate the platform using DevSecOps and Site Reliability Engineering (SRE) methods.

Major Responsibilities:

  • Work closely with architecture teams to select, design, develop and implement optimized solutions and practices

  • Create and maintain optimal data pipeline architecture; responsibilities include the design, implementation, and continuous delivery of a sophisticated data pipeline supporting development and operations

  • Gather and process large, complex, raw data sets at scale (including writing data pipelines and scripts, calling APIs, writing SQL queries, etc.) that meet functional and non-functional business requirements

  • Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability, etc.

  • Analyze complex data and data models, researching cross-functional requirements and analyzing source and target data models to develop and support the end-to-end data mapping effort

  • Build processes supporting data transformation, data structures, metadata, dependency and workload management.

  • Build the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources using streaming, pipeline, SQL/NoSQL technologies.

  • Use distributed revision control systems (Git) with branching and tagging; create and maintain release and update processes using open-source build tools

  • Develop and deliver ongoing releases using tiered data pipelines and continuous integration tools like Jenkins

  • Apply environment and deployment automation, infrastructure-as-code, and deployment pipeline specification and development

  • Work with partners including the Business, Infrastructure and Design teams to assist with data-related technical issues and support their data infrastructure needs.

  • Serve as a data expert, striving for greater functionality in our data systems

  • Responsible for production readiness and all operational aspects of the new data services that will support mission critical MUFG applications

  • Partner with the Risk Management and Security teams to identify the standards and required controls, and lead the design, build, and rollout of secure, compliant data services that support MUFG's mission-critical business applications and workloads

  • Partner with application and DBA teams to experiment, design, develop and deliver on-premise as well as cloud native solutions and services, and power the digital transformations across business units

  • Embrace Infrastructure-as-Code, and leverage Continuous Integration / Continuous Delivery Pipelines to manage the full data service lifecycle from release of data service offerings into production through the retirement thereof

  • Participate in software and system performance analysis and tuning, service capacity planning and demand forecasting

  • Write infrastructure, application, and data test cases, and participate in code review sessions

  • Performance analysis and tuning of infrastructure and data processing

  • Provide Level 3 support for troubleshooting and services restoration in Production

Qualifications

  • Bachelor's degree in Computer Science or a related field, or equivalent professional experience

  • 7-10 years of meaningful technical experience, with at least 5 years of experience in the design, development, and delivery of mission-critical data solutions in large, complex IT environments; possesses expert-level skills in 3 or more of the following areas:

    ◦ Data Warehouses, Data Marts, and Data Vaults

    ◦ Data backup/restore, replication, and disaster recovery

    ◦ Data field encryption and tokenization

    ◦ Application design/development/test experience with RDBMS and/or NoSQL

    ◦ Database administration experience with relational and NoSQL databases

    ◦ Metadata management

  • Data services solution design and implementation experience in on-premises or cloud-native environments; possesses expert-level skills in 4 or more of the following areas:

    ◦ Experience with relational SQL and NoSQL databases, including Postgres, DynamoDB, etc.

    ◦ Experience with data pipeline and workflow tools: WhereScape Streaming, WhereScape RED, StreamSets Data Collector, etc.

    ◦ Experience with stream-processing systems: Kafka, AWS Kinesis, Apache Storm, Spark Streaming, etc.

    ◦ A successful history of manipulating, processing, and extracting value from large, disconnected datasets, with ETL and data engineering know-how in SQL, Informatica PowerCenter, or similar

    ◦ Experience with secure cloud services platforms for data management and integration

    ◦ Experience with object-oriented/object-function scripting languages: Python, Java, C#, etc.

  • Awareness of data governance aspects such as metadata, business glossaries, data controls, data protection, canonical models, etc.

  • Experience with container and orchestration technologies such as Docker, Kubernetes, and OpenShift

  • Proven experience with Open Source software including OpenShift, Jenkins, PostgreSQL, etc.

  • Strong scripting experience (bash, Python, Perl, etc.) automating processes and deployments

  • Familiar with the DevOps toolchain (e.g., Bitbucket, JIRA, Jenkins Pipeline, Artifactory or Nexus) and experienced in automating and deploying n-tier application stacks in cloud-native environments

  • Excellent data & system analysis, data mapping, and data profiling skills

  • Demonstrated understanding of modern, cloud-native application models and patterns

  • Excellent collaboration skills and a passion for problem solving, with the ability to work alternative coverage schedules

  • Strong verbal and written communication skills required due to the dynamic nature of collaboration with leadership, customers, and other engineering teams
  • Experience within a high integrity, and/or regulated environment (government, healthcare, financial sectors, etc.)

  • AWS professional-level certification is preferred but not required

The above statements are intended to describe the general nature and level of work being performed. They are not intended to be construed as an exhaustive list of all responsibilities, duties, and skills required of personnel so classified.

We are proud to be an Equal Opportunity / Affirmative Action Employer and committed to leveraging the diverse backgrounds, perspectives, and experience of our workforce to create opportunities for our colleagues and our business. We do not discriminate in employment decisions on the basis of any protected category.

A conviction is not an absolute bar to employment. Factors such as the age of the offense, evidence of rehabilitation, seriousness of violation, and job relatedness are considered in all employment decisions. Additionally, it's the bank's policy to only inquire into a candidate's criminal history after an offer has been made. Federal law prohibits banks from employing individuals who have been convicted of, or received a pretrial diversion for, certain offenses.

Job: Technology

Primary Location: ARIZONA-Tempe

Schedule: Full Time

Shift: Day

Req ID: 10031079-WD