Principal Software Engineer, Data

ID
2019-11638
Category
International
Position Type
Full-Time
Location
IN-KA-Bengaluru

About Blackhawk Network:

At Blackhawk Network, we shape the future of global branded payments through the prepaid products, technologies, and networks that connect brands and people. Our collaborative innovation and scalable, security-minded solutions help our partners increase reach, loyalty, and revenue. We believe our future holds great things for Blackhawk Network and its partners, and that together, we can shape the future. Our beliefs? Win as one team, be innovative, achieve global excellence, and be inspiring!

So, what are you waiting for? Shape your career and join our global network.

Overview:

Blackhawk Network is building a digital platform and products that bring people and brands together. We facilitate cross-channel payments via cash-in, cash-out, and mobile payments. By leveraging blockchain, smart contracts, serverless technology, and real-time payment systems, we are unlocking the next million users through innovation.

 

Our employees are our biggest asset! Come find out how we engage with the biggest brands in the world. We look for people who collaborate, who inspire, and whose passion makes a difference as they work as a team while striving for global excellence.

 

As a leader in branded payments, we are building a strong, diverse team and expanding in Asia Pacific – we are hiring in Bengaluru, India! This is an amazing opportunity for problem solvers who want to be part of an innovative and creative engineering team that values your contribution to the company. If this role has your name written all over it, please apply now with a resume so that we can explore further and get connected.

 

Do you thrive in processing and analyzing data at scale? Do you believe in data-driven decisions and delivering expert insights from financial data? Have you been seeking to work with world-class, global-scale payment processing and to seamlessly handle the three golden V's of financial data – Variety, Volume, and Velocity? Blackhawk is seeking passionate data engineers at all levels who thrive in the complexities of big data processing.

 

What’s in it for you? 

 

You will tame all facets of big data at global scale and turn the wild west of data into robust and accurate business insights. The data platform you build will enable a culture of data-driven decision making across the entire company, and you will work at the forefront of cutting-edge technologies in data processing, visualization, and business intelligence.

 

Responsibilities:

You will build a data lake at scale that consumes financial data from Blackhawk's global business platforms, processes it, and delivers real-time insights to Blackhawk's internal and external customers. You will build a best-in-class data platform that delivers quality, accuracy, and performance. You will bring your architecture, design, and customer-focus skills to build the future of data at Blackhawk. You will build visualization and data intelligence tools that empower users to draw powerful business insights. You will collaborate with world-class product development and business teams to develop solutions. You will push the envelope of scale and performance and constantly look for opportunities to leverage emerging technologies to deliver business value.

 


  • Build a world-class payments network
  • Deliver strategic data initiatives at scale and on time, with a deep understanding of engineering excellence and best practices for development, automation, and release
  • Bring an expert-level understanding of data, industry know-how, technology, and architecture, with deep hands-on experience
  • Bring charisma that generates organization- and team-wide excitement
  • Develop deep relationships with open-source communities and industry leadership, and bring positive impact to Blackhawk's business
  • Collaborate with internal Blackhawk teams to influence the future product roadmap and direction, and improve our current offerings through technical feedback
  • Be a mentor, coach, and champion – drive a culture of technical innovation, curiosity, and learning

Qualifications:


  • BS or MS in Computer Science, or equivalent work experience
  • 3+ years of technology leadership experience leading modern, large-scale data lake initiatives, preferably with Amazon or Google cloud technologies
  • 12 years of enterprise-level software development experience with a deep focus on data
  • Demonstrated solid engineering fundamentals: a strong algorithms, data structures, and coding background. Experienced practitioner of CI/CD, test-driven development, and agile methodologies to maximize the quality of the software a team writes together
  • Strong computational skills and the ability to code fluently in Java or Python
  • Solid foundation in backend software development; deep understanding of software craftsmanship, solid design principles, best practices, and design patterns
  • A high degree of precision in software development. We deal with real money, in areas where precision and quality are key. You care deeply about external and internal software quality and about how you approach all aspects of the software development lifecycle
  • Strong knowledge of SQL and experience in query performance optimization
  • Strong experience with data processing tools such as Hadoop, Spark, Python, and Pig
  • Deep experience in ETL, data warehousing, data mining, data modelling, and reporting and analytical tools
  • Ability to assimilate and organize large volumes of disparate, minute detail and assemble a big-picture view
  • A DNA of challenging the status quo

     

Preferred:

  • Basic experience working with Amazon EMR and Spark
  • Strong knowledge of and experience with most of the following AWS services: EC2, Lambda, SQS, Kinesis, S3, CloudFormation, CLI, CloudWatch
  • Experience working with column-based databases such as Redshift
  • Experience building real-time ETL processes
  • Experience developing in Scala, Node.js, and Python
  • DevOps experience: Linux, Jenkins, Docker, Kubernetes

     

