In general, you can only really trust scores from official tests. It's true that company tests generally don't use the same scoring algorithm as the real GMAT, but there's a much more important difference: the questions on company tests are not official questions. Company tests will overemphasize certain skills and underemphasize others, simply because of the types of questions they write, so how well you score on a company test depends on how good you are at the things they overemphasize. You might perform similarly, better, or worse on a company test, all depending on your specific skills, and score comparisons between company and official tests will vary from person to person. Some of my students get very similar scores on official tests and on certain company tests; others get wildly different scores on those same tests. Unless you know a company test's biases, you can't tell whether it will overestimate or underestimate your level.
There normally isn't much reason to take a dozen practice tests during prep, though it can depend on the test taker. If you know precisely what you need to work on (CR and RC, say), you'll make the most progress by focusing on official questions, solving and analyzing them outside of a test setting. Take diagnostic tests primarily to work on pacing strategy, and to get a realistic score assessment when you're deciding whether to book a real test; official tests are the only ones you can use for either purpose. Unless you need to take a lot of practice tests to get comfortable with test taking, with working from a computer screen, etc., there should be enough official tests available that you can just use those.
And about analyzing your performance on a test: you broke down where in your test your wrong answers occurred. That kind of information isn't especially useful even when analyzing an official test that uses the real scoring algorithm (what matters is not where your wrong answers happened, but how difficult the questions you got wrong were), so there's no reason to track it on future practice tests. On the real test, getting an easy question wrong hurts you more than you might expect, while getting a very hard question wrong barely matters at all, and that's true no matter where you are in the test.
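To see why difficulty matters and position doesn't, here's a toy sketch using a Rasch-style item response model. This is an illustration under my own simplified assumptions (a one-parameter logistic model with made-up difficulty values), not the GMAT's actual scoring algorithm: the ability estimate drops much more after a wrong answer on an easy question than after a wrong answer on a very hard one, regardless of where in the sequence the miss occurs.

```python
import math

def log_likelihood(theta, items):
    # Rasch model (an assumption, not the real GMAT algorithm):
    # P(correct) = 1 / (1 + exp(-(theta - difficulty)))
    ll = 0.0
    for difficulty, correct in items:
        p = 1.0 / (1.0 + math.exp(-(theta - difficulty)))
        ll += math.log(p if correct else 1.0 - p)
    return ll

def estimate_theta(items):
    # Crude grid search for the maximum-likelihood ability estimate.
    grid = [i / 100.0 for i in range(-400, 401)]
    return max(grid, key=lambda t: log_likelihood(t, items))

# Three questions answered correctly, at difficulties -1, 0, +1.
base = [(-1.0, True), (0.0, True), (1.0, True)]

# Same performance plus one miss: an easy question vs. a very hard one.
miss_easy = base + [(-2.0, False)]
miss_hard = base + [(2.0, False)]

print(estimate_theta(miss_easy))  # noticeably lower estimate
print(estimate_theta(miss_hard))  # only slightly reduced estimate
```

Note that the item list carries no ordering information at all: the likelihood is a sum over questions, so shuffling when the miss happened changes nothing, while the difficulty of the missed question changes the estimate substantially.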
nik98
I think they did this so that people score higher on the actual test, and so no one complains that they scored lower on the actual test.
That's a charitable way to look at things! Companies in general (not just the specific company you mention) have a commercial incentive to produce low scores on their diagnostic tests: if people take a company test and get a 790, they aren't likely to buy a lot of products to help them raise their score. I don't think many companies intentionally produce inaccurate scores for that reason (I have no evidence of any company doing that), but it's at least worth bearing in mind that a company test is unlikely to be designed to produce scores that are too high.