If you don't measure results, you won't be able to show improvement. Measure the fruits of your work and the current state of quality. A QA process's main aim is to help the organization ship a higher-quality product.
First, track the number of bugs reported by, or that affect, customers. This is the most direct metric and the easiest to manage. Log every reported issue by date and, if you can, by product area, developer, and team. Each week, summarize this log, look for patterns, and report back to the team.
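A weekly summary of such a log can be as simple as counting reports per dimension. A minimal sketch, assuming a hypothetical list of log entries keyed by date, product area, and team:

```python
from collections import Counter
from datetime import date

# Hypothetical bug log: each entry records when and where a bug was reported.
bug_log = [
    {"date": date(2024, 3, 4), "area": "checkout", "team": "payments"},
    {"date": date(2024, 3, 5), "area": "checkout", "team": "payments"},
    {"date": date(2024, 3, 6), "area": "search", "team": "discovery"},
]

def summarize_by_area(log):
    """Count reported bugs per product area for a weekly summary."""
    return Counter(entry["area"] for entry in log)

summary = summarize_by_area(bug_log)
print(summary.most_common())  # areas with the most reports first
```

Grouping the same entries by team or by ISO week works the same way; the point is that patterns only emerge once the raw reports are aggregated on a regular cadence.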
You should also track time-to-fix: how long it takes between something breaking and it being fixed. Measuring time-to-fix shows how well a development team can use the output from QA to triage and fix bugs. The simplest way to measure this is the time between a failed build and the next passing build. Be careful to exclude flaky tests from this metric.
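The simple version of that measurement can be sketched directly from a build history. This assumes a hypothetical list of builds, each already labeled as passing, failing, or flaky:

```python
from datetime import datetime

# Hypothetical build history, ordered by time: (timestamp, passed, flaky).
builds = [
    (datetime(2024, 3, 4, 9, 0), True, False),
    (datetime(2024, 3, 4, 11, 0), False, False),  # genuine failure starts the clock
    (datetime(2024, 3, 4, 12, 0), False, True),   # flaky failure: excluded from the metric
    (datetime(2024, 3, 4, 15, 0), True, False),   # next passing build stops the clock
]

def time_to_fix(history):
    """Yield the gap between each genuine failed build and the next passing build."""
    broken_since = None
    for when, passed, flaky in history:
        if passed and broken_since is not None:
            yield when - broken_since
            broken_since = None
        elif not passed and not flaky and broken_since is None:
            broken_since = when

gaps = list(time_to_fix(builds))
print(gaps)  # one gap of four hours for the history above
```

Labeling which failures are flaky is the hard part; the next sections touch on tracking flakiness separately.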
If you wish to go further, split the tracking out by the source of the issue. Examples: external (i.e., reported by a customer), internal (i.e., missed by QA), automatic (e.g., error reporting), or test-case failures.
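Splitting by source only requires tagging each issue when it is logged. A minimal sketch, assuming hypothetical issue records with a `source` field:

```python
from collections import Counter

# Hypothetical issue records, each tagged with where the bug surfaced.
issues = [
    {"id": 1, "source": "external"},   # reported by a customer
    {"id": 2, "source": "internal"},   # missed by QA, caught in-house
    {"id": 3, "source": "automatic"},  # surfaced by error reporting
    {"id": 4, "source": "external"},
]

by_source = Counter(issue["source"] for issue in issues)
print(by_source)  # how many issues came from each source
```

A rising share of external issues relative to internal ones is a signal that bugs are escaping QA.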
Track the number of tests added to your suite versus the number of regressions caught by your suite.
Flakiness. Especially when using automation, you should track tests that pass or fail intermittently. Intermittent results indicate possible poor test quality, a poor choice of system, or poor execution. Execution problems can stem from human QA testers or from test-environment failures. Monitoring when this happens will expose patterns, allowing you to fix the root cause. Avoid being only reactive here.
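A basic flakiness check flags any test that both passed and failed within a recent window. This is a sketch under the assumption that you can export recent run results as (test name, passed) pairs:

```python
from collections import defaultdict

# Hypothetical recent run results: (test name, passed) pairs.
runs = [
    ("test_login", True), ("test_login", False), ("test_login", True),
    ("test_search", True), ("test_search", True),
    ("test_export", False), ("test_export", False),
]

def find_flaky(results):
    """Flag tests that both passed and failed in the window as flaky."""
    history = defaultdict(set)
    for name, passed in results:
        history[name].add(passed)
    return sorted(name for name, outcomes in history.items() if len(outcomes) == 2)

print(find_flaky(runs))  # only the intermittent test is flagged
```

Note that `test_export` fails consistently, so it is a genuine failure rather than a flaky one; only mixed outcomes indicate flakiness.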
NPS is a great end-measurement for your entire product, but it's a trailing indicator. It also conflates many things, such as product improvements and CSM effectiveness. Still, improvements here can serve as indicators of product quality.
Test coverage, while prevalent, is dangerous if misused or misunderstood. It is not a measure of the quality or thoroughness of your tests. Use it only to identify areas that are completely untested.
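Used that way, coverage reduces to a filter over per-file numbers. A minimal sketch, assuming a hypothetical per-file coverage report (the paths and percentages are made up):

```python
# Hypothetical per-file coverage report: percent of lines executed by the suite.
coverage = {
    "billing/invoice.py": 87.5,
    "billing/refund.py": 0.0,   # completely untested: the only actionable signal
    "search/ranker.py": 42.0,
}

untested = sorted(path for path, pct in coverage.items() if pct == 0.0)
print(untested)  # files with zero coverage
```

The 87.5% file tells you little on its own, since those lines may be executed without their behavior being asserted; the 0% file is the unambiguous finding.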
Note: there are no good leading indicators for production quality, only trailing indicators.