Senior Software Engineer - Hadoop, PySpark, Shell Scripting, SQL, Oracle (Mastercard)

Mastercard    Pune, India    2024-02-11

Job posting number: #88862 (Ref: R-210640)

This job posting has expired.

Job Description

Our Purpose

We work to connect and power an inclusive, digital economy that benefits everyone, everywhere by making transactions safe, simple, smart and accessible. Using secure data and networks, partnerships and passion, our innovations and solutions help individuals, financial institutions, governments and businesses realize their greatest potential. Our decency quotient, or DQ, drives our culture and everything we do inside and outside of our company. We cultivate a culture of inclusion for all employees that respects their individual strengths, views, and experiences. We believe that our differences enable us to be a better team – one that makes better decisions, drives innovation and delivers better business results.

Title and Summary

Senior Software Engineer - Hadoop, PySpark, Shell Scripting, SQL, Oracle


What is Data & Services?

The Data & Services Technology Organization is a key differentiator for Mastercard, providing the cutting-edge services that help our customers grow. Focused on thinking big and scaling fast around the globe, this agile team is responsible for end-to-end solutions for a diverse global customer base. Centered on data-driven technologies and innovation, these services include payments-focused consulting, loyalty and marketing programs, business experimentation, and data-driven information and risk management services.

What role do we play in the modern world? Are we an enabler of purchases or a facilitator of transactions? We play a much larger role in the world by enabling those who have no access to financial systems. We have the technology, people, and brand to serve modern society. Today, we are a global tech company that connects everyone to endless possibilities, priceless possibilities.

Job Description Summary

The Technology Foundations, Data and Transformation Solutions (DTS) team provides ETL solutions and data to BI applications, products, and services. We are looking for an experienced Big Data Developer who loves solving complex problems across a full spectrum of technologies. The person in this role will develop and implement data pipelines connecting data sources and downstream systems.

Role

• Quickly learn new technologies and frameworks and apply them to project requirements while adhering to quality standards
• Experience in all phases of the data warehouse development lifecycle, from requirements gathering through testing, implementation, and support
• Adept at analyzing information system needs, evaluating end-user requirements, designing custom solutions, and troubleshooting information systems
• Develop and implement data pipelines that extract, transform, and load data into information products that help the organization reach its strategic goals (a minimal sketch follows this list)
• Apply strong technical, analytical, and problem-solving skills
• Work on ingesting, storing, processing, and analyzing large data sets
• Investigate and analyze alternative approaches to data storage and processing to ensure the most streamlined solutions are implemented
• Monitor daily job failure alerts and resolve the issues identified
• Write algorithms that implement varying business rules
• Apply data warehousing principles and concepts, including modifying existing data warehouse structures
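
A minimal sketch, in PySpark (named in the title), of the kind of extract-transform-load pipeline these duties describe. Every path, table, and column name below is an illustrative assumption, not an actual Mastercard schema:

    # Batch ETL sketch: extract raw records from HDFS, transform them,
    # and load a partitioned Hive table for downstream BI consumers.
    # All paths, tables, and columns are hypothetical examples.
    from pyspark.sql import SparkSession, functions as F

    spark = (
        SparkSession.builder
        .appName("dts-etl-sketch")
        .enableHiveSupport()  # allows writing managed Hive tables
        .getOrCreate()
    )

    # Extract: raw transaction records landed on HDFS.
    raw = spark.read.parquet("hdfs:///data/raw/transactions")

    # Transform: basic cleansing plus a daily per-merchant aggregate.
    daily = (
        raw.filter(F.col("amount") > 0)
           .withColumn("txn_date", F.to_date("txn_ts"))
           .groupBy("txn_date", "merchant_id")
           .agg(F.sum("amount").alias("total_amount"),
                F.count("*").alias("txn_count"))
    )

    # Load: overwrite a Hive table partitioned by day.
    (daily.write
          .mode("overwrite")
          .partitionBy("txn_date")
          .saveAsTable("analytics.daily_merchant_summary"))

In production, scheduling, failure alerting, and incremental loads would wrap this core, which is where the job-monitoring duty above comes in.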

All about you

• Must have experience deploying and working with big data technologies such as Hadoop, Spark, and Sqoop
• Experience with streaming frameworks such as Kafka and Axon
• Experience designing and building ETL pipelines using NiFi
• Highly proficient in OO programming (Python, PySpark, Java, and Scala)
• Experience with the Hadoop ecosystem (HDFS, YARN, MapReduce, Spark, Hive, Impala)
• Proficiency with Linux, the Unix command line, Unix shell scripting, SQL, and at least one scripting language
• Experience designing and implementing large, scalable distributed systems
• Ability to debug production issues using standard command-line tools
• Create design documentation and maintain process documents
• Ability to debug Hadoop/Hive job failures (see the sketch after this list)
• Ability to administer Hadoop using Cloudera tooling
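
As one illustration of debugging a failed Hadoop/Hive job from the command line, the sketch below fetches a finished application's aggregated logs through the standard "yarn logs" CLI and surfaces likely root-cause lines. The application ID is a made-up example:

    # Sketch: scan aggregated YARN logs of a failed job for root-cause lines.
    # "yarn logs -applicationId <id>" is the standard Hadoop CLI for fetching
    # a finished application's logs; the ID below is a hypothetical example.
    import subprocess

    app_id = "application_1700000000000_0042"  # hypothetical failed job

    logs = subprocess.run(
        ["yarn", "logs", "-applicationId", app_id],
        capture_output=True, text=True, check=True,
    ).stdout

    # Error and exception lines usually name the failing stage, query,
    # or container, which narrows where to look next.
    for line in logs.splitlines():
        if "ERROR" in line or "Exception" in line:
            print(line)

From there one would typically drill into the relevant container logs or the Spark UI; the point here is only the command-line entry point.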

Optional:
Experience with cloud technologies such as Databricks, AWS, Azure, and GCP.

Corporate Security Responsibility

All activities involving access to Mastercard assets, information, and networks come with an inherent risk to the organization and, therefore, it is expected that every person working for, or on behalf of, Mastercard is responsible for information security and must:

  • Abide by Mastercard’s security policies and practices;

  • Ensure the confidentiality and integrity of the information being accessed;

  • Report any suspected information security violation or breach; and

  • Complete all periodic mandatory security trainings in accordance with Mastercard’s guidelines.

Employer Info

Application deadline: 2024-03-12
Employer location: Mastercard, Pune, India