Senior QA Engineer (Data Platform Testing)
Experian
- Hyderabad, Telangana
- Permanent
- Full-time
- Design, implement, and maintain automated test suites across API, data workflow, and graph components, integrating them into CI/CD pipelines with quality gates (unit, static analysis/security, data quality, contract/compliance, and final integrity checks).
- Own CI/CD quality enforcement, including gated releases, automated capture of test-result evidence, and blocking publication when validation fails (e.g., consent, schema, integrity, scoring).
- Develop automated contract and compliance checks, including schema validation, required fields mapping, TCF/consent validation, and PII-risk checks aligned to governance requirements.
- Build and execute automated end-to-end API tests using tools such as Bruno, Insomnia, or Postman, validating API behaviour through API Gateway, Lambda integrations, and downstream data/graph operations.
- Build data quality monitors (nulls/ranges, volume deltas, distribution checks) and statistical tests suited to identity clusters; operationalize them as part of routine runs.
- Create graph integrity tests in Neo4j, covering node/edge parity, constraints, traversal path validation, merge/split detection, and evidence-path reproducibility checks.
- Implement performance and resilience testing, including query latency targets, batch window completion for rebuilds, and regression suites for every deployment.
- Contribute to operational guardrails by instrumenting monitoring, alerting, and cost-aware test strategies; collaborate with engineering to ensure predictable weekly rebuild cadence and controlled ad-hoc extracts.
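As an illustration only, a data quality gate of the kind described above (null/range checks, volume deltas, distribution checks, blocking on failure) could be sketched roughly as follows; the field names (`identity_id`, `score`) and thresholds are hypothetical, not the platform's actual schema:

```python
import statistics

def run_data_quality_gate(rows, prev_row_count, max_volume_delta=0.10):
    """Illustrative data quality gate: null/range checks plus volume-delta
    and distribution checks against the previous run. Returns a list of
    failure messages; an empty list means the gate passes."""
    failures = []

    # Null check: every record must carry an identity key.
    if any(r.get("identity_id") is None for r in rows):
        failures.append("null identity_id found")

    # Range check: match scores must fall in [0.0, 1.0].
    scores = [r["score"] for r in rows if r.get("score") is not None]
    if scores and not all(0.0 <= s <= 1.0 for s in scores):
        failures.append("score out of [0, 1] range")

    # Volume delta: flag runs whose row count shifts more than the allowed fraction.
    if prev_row_count:
        delta = abs(len(rows) - prev_row_count) / prev_row_count
        if delta > max_volume_delta:
            failures.append(f"volume delta {delta:.1%} exceeds {max_volume_delta:.0%}")

    # Distribution check: alert if the mean score drifts far from a baseline (0.5 here, assumed).
    if scores and abs(statistics.mean(scores) - 0.5) > 0.3:
        failures.append("score distribution shifted from baseline")

    return failures
```

In a CI/CD pipeline, a non-empty return value from such a check would fail the stage and block publication, producing the kind of automated evidence the role calls for.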
- Ability to design and implement end-to-end automated API tests using Bruno, Insomnia, Postman, or similar frameworks.
- Strong understanding of REST API behaviours, request chaining, environment variables, authentication flows, and test assertions.
- Experience testing APIs running on AWS API Gateway with Lambda backends, including understanding of routing, payload transformations, throttling, and logging.
- Hands-on experience with:
- S3 (structured landing/prepared zones and artefact validation).
- IAM (roles/permissions for automated testing and least-privilege CI/CD).
- Airflow/MWAA (end-to-end workflow testing, scheduled rebuilds, and validation tasks).
- Logging/monitoring (instrumentation for quality gates, alerts, and evidence capture).
- Data contract and schema validation; automated data quality checks (null/range, volume deltas, distribution/consistency) for identity datasets.
- Performance testing for query latency and batch-process windows; regression testing for rebuilds and deployments.
- Compliance validation (consent enforcement, PII risk, audit-ready outputs) integrated into automated pipelines.
- Drift detection and score distribution monitoring for identity resolution outcomes; alerting on behavioural/statistical shifts.
- Consent and governance checks that mirror operational policies (e.g., consent-only processing at node/edge level, version-controlled thresholds).
- Prior testing experience with identity resolution / data science / probabilistic systems, including calibration/threshold validation and explainability.
- Familiarity with observability (alerts, dashboards) to uphold guardrails on performance, resilience, and cost visibility.
- Python, data testing, API testing, reporting, automation
- Strong CI/CD automation with gated release controls and the ability to block publication when gates fail (e.g., consent, integrity, scoring). Experience integrating automated suites into Harness CI/CD.
- Data-driven SaaS platforms; regulated data processing environments
- Contract-testing frameworks (e.g., Pact) and property-based testing approaches
- Cloud observability stacks (e.g., CloudWatch/OpenSearch/Grafana) for pipeline and API SLOs
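For illustration, the contract/compliance checks mentioned above (required-field mapping, consent validation) might look roughly like the sketch below; the contract shape and the `consent` field are assumed examples, not an actual governance policy:

```python
def validate_contract(record, contract):
    """Illustrative contract and consent check of the kind that could gate
    publication in CI/CD. Returns a list of violations; empty means pass."""
    errors = []

    # Required fields with basic type validation, per the (hypothetical) contract.
    for field, expected_type in contract["required"].items():
        if field not in record:
            errors.append(f"missing required field: {field}")
        elif not isinstance(record[field], expected_type):
            errors.append(
                f"{field}: expected {expected_type.__name__}, "
                f"got {type(record[field]).__name__}"
            )

    # Consent gate: records without an affirmative consent flag must not be processed.
    if record.get("consent") is not True:
        errors.append("consent not granted: record must be excluded from processing")

    return errors
```

A dedicated contract-testing framework such as Pact (listed above) would replace this hand-rolled check in practice; the sketch only shows the shape of the validation.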