Quality Assurance Analyst

Audubon, PA
JPC Partners is looking for a Quality Assurance Analyst who will be responsible for test planning, test monitoring and control, test analysis and design, test implementation and test execution, evaluating exit criteria and reporting, and test closure. The analyst will be directly involved in hands-on technical work, including test data provisioning and test data management. Specific duties include the following:
  • Test planning: define the overall strategic and tactical objectives for testing software changes at different test levels. Work with developers, project managers and business customers to define the strategy to be used, such as risk-based testing. Test levels include component testing, integration testing, system testing and acceptance testing. Test types include functional testing, non-functional testing, structural ("white-box") testing, confirmation testing and regression testing.
  • Test monitoring and control: continuously compare test progress with the plan, adjust the plan and testing activities as necessary, and provide status reports.
  • Test analysis and design: transform testing objectives into test conditions and test cases. The test basis includes documented requirements, system architecture, behavior and structure of the software, existing data and data flows. Using structural ("white-box") test techniques, among others, design tests, or provide input into test design by identifying specific test conditions and high-level test cases.
  • Test implementation and test execution: develop and prioritize test procedures, set up the test environment and test data, and execute tests. Test changes to the database components of the system under test. Testable components include views, procedures and functions, data conversion and migration programs. Support business customers and others in acceptance testing. Includes identifying database model changes in higher environments (Production and Stage) and making those changes in the lower environments (Test and Development).
  • Evaluating exit criteria and reporting, and test closure: assess test execution against the objectives defined in the test plan. Specific tasks and deliverables are defined at the team or project level.
  • Identify the necessary test data to support test conditions and test cases as they are defined. Includes data to force error and boundary conditions. Analyze input data, including electronic files, message traffic and variations of user input.
  • Provide expected test results, and/or repeatable methods of generating expected test results, based on currently existing data. Includes preparing database queries and guides for testers to use.
  • Provide tools and methods to compare expected and actual test results. Includes bi-directional comparisons of database data with electronic files, message streams, and front-end displays.
  • Provide high-quality, realistic, fit-for-purpose and referentially intact test data. Capture end-to-end business processes and the associated data for testing. Subset production data: extract subsets of production data from multiple sources to meet test cases and/or to supply input values for data-driven testing. Create realistic test data sets small enough to support rapid test runs but large enough to accurately reflect the variety of production data.
  • Provide test data management. Script the setup of data in the application database to put it in a state that allows a specific set of test cases to be run against it. Script the creation of data files and message streams which require changing variables (usually date or timestamp related) in order to test applications which process them. Script database cleanups.
  • Be a specialist in the database layer of systems under test. Understand each database object's purpose and place in the technology stack, the business solutions it supports, and the business rules it encapsulates.
  • Assist software development and support teams in software product deployments. Includes deployment/build verification, and identification, analysis and troubleshooting of issues.
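Two of the duties above, preparing database queries as sources of expected results and providing bi-directional comparisons of expected versus actual data, can be sketched in Python, one of the scripting languages named under Preferred Skills. This is only an illustration of the kind of work involved, not part of the role description: the table, columns, and helper functions are hypothetical, and an in-memory SQLite database stands in for the Oracle RDBMS named in the skills list.

```python
import sqlite3

def fetch_rows(conn, query):
    """Run a query and return its rows as a set of tuples,
    so the comparison is order-insensitive."""
    return set(conn.execute(query).fetchall())

def compare_results(expected_rows, actual_rows):
    """Bi-directional comparison: report rows missing from the
    actual results and rows present that were not expected."""
    return {
        "missing_from_actual": expected_rows - actual_rows,
        "unexpected_in_actual": actual_rows - expected_rows,
    }

# Hypothetical system under test: seed a small in-memory database.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER, status TEXT)")
conn.executemany("INSERT INTO orders VALUES (?, ?)",
                 [(1, "OPEN"), (2, "CLOSED"), (3, "OPEN")])

# Expected results the analyst prepared in advance (e.g. from a
# baseline query or a tester guide).
expected = {(1, "OPEN"), (2, "CLOSED"), (3, "CANCELLED")}
actual = fetch_rows(conn, "SELECT id, status FROM orders")

diff = compare_results(expected, actual)
print(diff["missing_from_actual"])   # rows expected but not found
print(diff["unexpected_in_actual"])  # rows found but not expected
```

The same set-difference approach extends to comparing database data against electronic files or message streams: parse each side into a common tuple shape, then diff the two sets in both directions.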
Required Skills
  • 7+ years of experience as a quality assurance analyst or tester
  • Strong customer and business focus
  • Strong communication skills
  • Understanding of Oracle RDBMS and SQL
  • Experience with test-driven or behavior-driven development practices
  • Energetic and passionate about their work, extremely positive and solution-driven
  • Experience working on large teams, on projects with different business owners
  • Experience in defining test strategy/approach, test cases
  • Strong experience in the testing phase with business personnel, developers and management
  • Excellent knowledge of testing methodology (white-box vs. black-box testing, function vs. path testing)
  • Experience developing testing artifacts, such as test plans and test cases
  • Experience working in multiple Software Development Life Cycle (SDLC) methodologies (Waterfall, Agile)
Preferred Skills
  • Experience in a SOA / Web Services environment
  • Experience using iterative development methodologies, specifically Scrum
  • Experience using Xray for Jira or equivalent test management solutions
  • Experience using defect tracking tools – Jira preferred
  • Experience working hands-on with databases – ability to write complex queries and joins
  • Experience using automated testing tools – SoapUI, Cucumber, Selenium
  • Scripting experience using applicable languages and technologies (such as PL/SQL, Python, UNIX shell, JavaScript, Perl, Ruby, HTML, DHTML, SOAP, XML, JCL, CICS)
  • Experience working with and testing logical data models
  • Experience as a software developer or architect
