• Principal Software Engineer

    Job Location US-CA-Pleasanton
    Position Type
  • About Blackhawk Network:

    Blackhawk Network Holdings, Inc. is a global financial technology company and a leader in connecting brands and people through branded value solutions. Blackhawk platforms and solutions enable the management of stored value products, promotions and rewards programs in retail, ecommerce, financial services and mobile wallets. Blackhawk’s Hawk Commerce division offers technology solutions to businesses and direct to consumers. The Hawk Incentives division offers enterprise, SMB and reseller partners an array of platforms and branded value products to incent and reward consumers, employees and sales channels. Headquartered in Pleasanton, Calif., Blackhawk operates in 26 countries. For more information, please visit blackhawknetwork.com, cashstar.com, hawkcommerce.com, hawkincentives.com or our product websites GiftCards.com, giftcardmall.com, GiftCardLab.com and OmniCard.com.


    We are looking to hire an accomplished Principal Engineer to join the Blackhawk Technology Engineering Organization to design, develop, and maintain a Big Data platform on the AWS cloud. The ideal candidate will have an excellent understanding of big data technologies and AWS services.



    • Create and maintain big data pipelines in a cloud environment
    • Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability, etc.
    • Build the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources using AWS database/data technologies
    • Build analytics tools that utilize the data pipeline to provide actionable insights into customer acquisition, operational efficiency and other key business performance metrics
    • Work with stakeholders including the Executive, Product, Data and Design teams to assist with data-related technical issues and support their data infrastructure needs
    • Keep our data separated and secure across national boundaries through multiple data centers and AWS regions
    • Create data tools for analytics and data scientist team members that assist them in building and optimizing the products
    • Work with data and analytics experts to strive for greater functionality in the data systems
    • Write Java applications to process large volumes of data
    • Write analytics applications using Spark/EMR/Scala
    • Implement and manage CI/CD pipelines in AWS using CloudFormation
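
    To give a concrete sense of the data-processing work described above, here is a minimal, hypothetical Java sketch of a batch aggregation step — the same group-and-sum shape a Spark job on EMR would express with groupBy/agg. The record type, field names, and sample data are illustrative assumptions, not Blackhawk code; in a real pipeline the records would be read from a source such as S3 or Kinesis rather than built in memory.

    ```java
    import java.util.Arrays;
    import java.util.List;
    import java.util.Map;
    import java.util.stream.Collectors;

    public class PipelineSketch {
        // Hypothetical record type for illustration; fields are assumptions.
        record Txn(String category, double amount) {}

        public static void main(String[] args) {
            // Sample in-memory data standing in for records read from a
            // real source (e.g., S3 or a Kinesis stream).
            List<Txn> txns = Arrays.asList(
                new Txn("gift-card", 25.0),
                new Txn("gift-card", 50.0),
                new Txn("reward", 10.0));

            // Aggregate the total amount per category.
            Map<String, Double> totals = txns.stream()
                .collect(Collectors.groupingBy(Txn::category,
                         Collectors.summingDouble(Txn::amount)));

            System.out.println(totals);
        }
    }
    ```

    In a Spark/Scala job the equivalent transformation would be a `groupBy("category").sum("amount")` over a DataFrame; the Java streams version above is just the smallest self-contained illustration of the idea.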



    • Master’s or Bachelor’s degree in Computer Science or Engineering
    • 10+ years of software design and development experience with core Java and J2EE
    • Experienced in building resilient, stateless, scalable, & distributed systems
    • Ability to meet tight deadlines and to be productive and effective within a matrix organization
    • Excellent communicator – verbal and written skills required
    • Experience with Spark and Scala
    • Programming/scripting/query Languages: Java, SQL
    • AWS deployment experience 
    • General AWS service knowledge: S3, Redshift, Elasticsearch, DynamoDB, EC2, EMR, Data Pipeline, Glue, Kinesis, SNS, SQS


    Good to have                                                                           

    AWS Solution Architect or other AWS certification






