The problem is I need to do this at scale, quickly, and in a white-hat way. I thought about pinging, hiring guys on Fiverr to submit to social networks, or using Scrapebox's rapid indexer add-on, but I think all of these solutions are going to look mighty suspicious to Google and result in penalties.
I agree with some of the recommendations above:
1) Focus on making sure your XML sitemap is accurate, and make sure your `<priority>` weightings are set correctly from parent to child (category > sub-category > product detail).
2) Submit through Google Webmaster Tools.
3) Submit partial sitemaps in increments (again, 50k per file is a good suggestion) and pay close attention to your index rate to see how fast these are picked up.
4) You can brute-force a lot of these crawls by submitting and re-submitting your sitemap(s), as often as every day.
5) For very important URLs that are not getting picked up, a handy trick is to tweet them out from an account that has a positive share of voice (more followers than following).
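To make points 1 and 3 concrete, here's a rough sketch of generating split sitemap files: chunks capped at the 50,000-URL limit from the sitemaps.org protocol, with `<priority>` descending from category pages down to product pages. The URL structure and the depth-based weighting rule are hypothetical examples; adjust both to your own site's hierarchy.

```python
# Sketch: split a big URL list into sitemap chunks of <= 50,000 URLs each
# (the sitemaps.org per-file limit), assigning <priority> by URL depth so
# that parents outweigh children (category > sub-category > product).
# The depth->priority mapping below is a made-up example, not a standard.
from xml.sax.saxutils import escape

MAX_URLS = 50_000  # sitemaps.org limit per sitemap file


def priority(url: str) -> str:
    """Toy parent-to-child weighting based on path depth."""
    depth = url.rstrip("/").count("/") - 2  # path segments after the domain
    return {1: "0.8", 2: "0.6"}.get(depth, "0.4")


def build_sitemaps(urls, prefix="sitemap"):
    """Return a list of (filename, xml_text) pairs, ready to write and submit."""
    chunks = []
    for i in range(0, len(urls), MAX_URLS):
        name = f"{prefix}-{i // MAX_URLS + 1}.xml"
        entries = "\n".join(
            f"  <url><loc>{escape(u)}</loc>"
            f"<priority>{priority(u)}</priority></url>"
            for u in urls[i:i + MAX_URLS]
        )
        xml = (
            '<?xml version="1.0" encoding="UTF-8"?>\n'
            '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
            f"{entries}\n</urlset>\n"
        )
        chunks.append((name, xml))
    return chunks
```

Keeping each chunk in its own file also makes point 3 easier in practice: Webmaster Tools reports indexed counts per submitted sitemap, so smaller files give you a much clearer read on which sections are being picked up.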