We Hire America Jobs

WeHireAmerica.jobs is a service of HR Policy Foundation and DirectEmployers Association. These two non-profit organizations are providing this free resource to help educators, policy makers and job seekers understand the great employment opportunities available here in the U.S. at some of America's biggest and best companies.

Job Information

Bank of America Feature Lead Technology in Charlotte, North Carolina

Feature Lead Technology

Charlotte, North Carolina; Pennington, New Jersey; Plano, Texas; Newark, Delaware

Job Description:

Position Summary

Responsibilities

• Lead projects end-to-end. Meet with business users to determine requirements, analyze the data lake for relevant datasets, collaborate with other developers to design a technical solution, and see the project through to completion.

• Design and build analytical workflows that take data from the lake and transform, filter, aggregate, and compute a meaningful result or report for use by a consumer application

• Develop a highly scalable and extensible Big Data platform that enables collection, storage, modeling, and analysis of massive data sets from numerous channels

• Continuously evaluate new technologies, innovate, and deliver solutions for business-critical applications
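The transform/filter/aggregate workflow described above can be sketched in miniature. This is an illustrative example only, with hypothetical record fields (`channel`, `amount`, `valid`); in practice a role like this would express the same shape with Spark DataFrame operations rather than plain Python.

```python
from collections import defaultdict

# Hypothetical raw records, standing in for rows pulled from a data lake.
records = [
    {"channel": "web",    "amount": 120.0, "valid": True},
    {"channel": "web",    "amount": 80.0,  "valid": True},
    {"channel": "branch", "amount": 45.0,  "valid": False},
    {"channel": "mobile", "amount": 200.0, "valid": True},
]

def aggregate_by_channel(rows):
    """Filter out invalid rows, then sum amounts per channel."""
    totals = defaultdict(float)
    for row in rows:
        if row["valid"]:                              # filter step
            totals[row["channel"]] += row["amount"]   # aggregate step
    return dict(totals)

print(aggregate_by_channel(records))
# In a Spark job, the equivalent would be a DataFrame filter
# followed by groupBy("channel").agg(sum("amount")).
```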

Required Skills

• Bachelor’s / Master’s degree in Computer Science or equivalent experience

• Minimum 8 years of Software Development experience

• Minimum 5 years of experience with the Hadoop/Cloudera ecosystem, such as Spark, MapReduce, YARN, Hive, Sqoop, Impala, Kafka and Oozie

• Experience with Apache Spark

• Experience with Unix / Linux and shell scripting

• Experience with two or more programming languages (SQL, Java, Python, Scala, R)

• Experience leading software projects, from the design through release phases

• Experience using the data lake to design and produce analytical output through batch and real-time processing

• Strong understanding of capacity planning, software development lifecycle, and enterprise production deployments

• Hands-on experience benchmarking systems, analyzing bottlenecks, and designing performant code
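The benchmarking and bottleneck-analysis skill listed above often starts with micro-benchmarks. A minimal sketch, using only Python's standard `timeit` module (the functions compared here are hypothetical examples, not part of the role's stack):

```python
import timeit

def with_append(n):
    # Build a list of squares via repeated append calls.
    out = []
    for i in range(n):
        out.append(i * i)
    return out

def with_comprehension(n):
    # Same result via a list comprehension.
    return [i * i for i in range(n)]

# Time each variant over repeated runs to compare throughput.
t_append = timeit.timeit(lambda: with_append(10_000), number=50)
t_comp = timeit.timeit(lambda: with_comprehension(10_000), number=50)
print(f"append: {t_append:.4f}s  comprehension: {t_comp:.4f}s")
```

The same compare-alternatives discipline scales up to benchmarking Spark jobs, where the knobs are partitioning, shuffle behavior, and cluster capacity rather than loop style.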

Preferred Skills

• SDLC Methodology - Agile / Scrum / Iterative Development

• Job Scheduling Tools (Autosys)

• Version Control System (Git, Bitbucket)

• Continuous Integration / Continuous Delivery (CI/CD) pipelines (Jenkins)

• Real Time Streaming (Kafka)

• Visual Analytics Tools (Tableau)

• NoSQL Technologies (HBase)

• Data integration and data security on the Hadoop ecosystem (Kerberos)

• Awareness of or experience with data lakes on the Cloudera ecosystem

Job Band:

H5

Shift:

1st shift (United States of America)

Hours Per Week:

40

Weekly Schedule:

Referral Bonus Amount:

0

Learn more about this role

Full time

JR-21044054

Band: H5

Manages People: No

Travel: Yes, 5% of the time

Manager:

Talent Acquisition Contact:

Susan Romine

Referral Bonus:

0
