DESCRIPTION:
Duties: Perform data manipulation, data structuring, data flow design, and query optimization. Utilize advanced techniques to process large data sets. Develop software in a microservices environment that deploys as a REST API. Design and build telemetry and usage-tracking solutions for business intelligence tools to improve tool governance and monitoring. Develop and automate large-scale, high-performance data processing systems to drive and improve the product experience. Build applications for data transfer. Perform exploratory data analysis within large enterprise databases (terabytes) to extract, clean, transform, and load data. Onboard internal and external clients to receive information using bank-approved transfer gateways, and support clients with data delivery. Execute nonfunctional business requirements to keep the business running without incidents. Escalate and participate in resolving major incidents related to onboarding and data transfer faced by internal and external clients.
QUALIFICATIONS:
Minimum education and experience required: Master's degree in Computer Science, Computer Engineering, or a related field of study plus one (1) year of experience in the job offered or as a Site Reliability Engineer, DevOps Engineer/Manager, or related occupation. The employer will alternatively accept a Bachelor's degree in Computer Science, Computer Engineering, or a related field of study plus three (3) years of experience in the job offered or as a Site Reliability Engineer, DevOps Engineer/Manager, or related occupation.
Skills Required: This position requires experience with the following:
- Designing and developing scalable applications from concept to deployment, ensuring functionality, performance, and security;
- Applying object-oriented programming principles such as encapsulation, inheritance, and polymorphism to design and implement software solutions in object-oriented languages;
- Writing unit tests and integration tests in Python and adhering to test-driven development (TDD) practices to ensure code quality and reliability;
- Performing data manipulation, structuring, design flow, and query optimization using languages such as SQL and Python;
- Developing applications on distributed systems for efficient data processing and communication, and debugging and troubleshooting issues in distributed environments;
- Utilizing event-based architecture to detect changes or anomalies, trigger alerts or automated responses, and integrate monitoring tools for scalable alerting in systems such as Geneos and Dynatrace;
- Utilizing financial frameworks for risk management, trading, and analytics, and developing and maintaining applications on these platforms;
- Utilizing the GraphQL query language for building APIs, designing and implementing GraphQL schemas and resolvers, and using TensorFlow to build and deploy machine learning models based on deep learning concepts and frameworks;
- Using Apache Parquet files for efficient data storage and retrieval;
- Processing data sets using data containers, multithreading, and multiprocessing in PySpark and TensorFlow.
Job Location: 545 Washington Blvd, Jersey City, NJ 07310.
We offer a competitive total rewards package including base salary determined based on the role, experience, skill set, and location. For those in eligible roles, discretionary incentive compensation may be awarded in recognition of individual achievements and contributions. We also offer a range of benefits and programs to meet employee needs, based on eligibility. These benefits include comprehensive health care coverage, on-site health and wellness centers, a retirement savings plan, backup childcare, tuition reimbursement, mental health support, financial coaching, and more. Additional details about total compensation and benefits will be provided during the hiring process. In addition, please visit: https://careers.jpmorgan.com/us/en/about-us.
Full-Time. Salary: $148,000 - $185,000 per year.