The problem is that I need to do this at scale, quickly, and in a white-hat way. I thought about pinging, hiring guys on Fiverr to submit to social networks, or using Scrapebox's rapid indexer add-on - but I think all of these solutions are going to look mighty suspicious to Google and result in penalties.
Putting that many new URLs into the system on a new domain may trigger a manual review. I would probably stage it, submitting 50K at a time over the course of 1-3 months.
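That staged approach lines up neatly with the sitemap protocol, which caps a single sitemap file at 50,000 URLs anyway. Here's a rough sketch of splitting a big URL list into 50K-per-file batches you could submit gradually; the example URLs and any file naming are assumptions, not a prescription:

```python
"""Sketch: split a large URL list into XML sitemaps of at most
50,000 URLs each (the sitemap protocol's per-file limit), so the
batches can be submitted gradually rather than all at once."""
from xml.sax.saxutils import escape


def build_sitemaps(urls, batch_size=50_000):
    """Return one XML sitemap string per batch of URLs."""
    sitemaps = []
    for i in range(0, len(urls), batch_size):
        batch = urls[i:i + batch_size]
        lines = ['<?xml version="1.0" encoding="UTF-8"?>',
                 '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">']
        # Escape each URL so &, <, > don't break the XML.
        lines += [f"  <url><loc>{escape(u)}</loc></url>" for u in batch]
        lines.append("</urlset>")
        sitemaps.append("\n".join(lines))
    return sitemaps
```

You'd write each returned string to its own file and add them to Webmaster Tools one batch at a time on whatever schedule feels safe.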
Also, the best ways to get those deeper URLs indexed are:
1) Set up the XML sitemap in Webmaster Tools.
2) Set up easy-to-crawl HTML sitemaps: web pages with ~100 links per page that are simple to crawl and navigate through pagination.
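The HTML sitemap in step 2 can be sketched like this: chunk the URL list into pages of ~100 links, with previous/next links so a spider can walk the whole set. The `/sitemap/page-N.html` URL pattern is just an illustrative assumption:

```python
"""Sketch: generate paginated HTML sitemap pages, ~100 links per page,
chained together with previous/next links so a crawler can reach every
page. The /sitemap/page-N.html paths are hypothetical."""
from xml.sax.saxutils import escape


def build_pages(urls, per_page=100):
    """Return one HTML string per sitemap page."""
    total = (len(urls) + per_page - 1) // per_page  # ceiling division
    pages = []
    for n in range(total):
        chunk = urls[n * per_page:(n + 1) * per_page]
        links = "\n".join(
            f'  <li><a href="{escape(u)}">{escape(u)}</a></li>' for u in chunk
        )
        # Previous/next navigation (pages are numbered from 1 in the URL).
        nav = []
        if n > 0:
            nav.append(f'<a href="/sitemap/page-{n}.html">Previous</a>')
        if n < total - 1:
            nav.append(f'<a href="/sitemap/page-{n + 2}.html">Next</a>')
        pages.append(
            f"<html><body>\n<ul>\n{links}\n</ul>\n"
            f"<nav>{' | '.join(nav)}</nav>\n</body></html>"
        )
    return pages
```

Each page stays small and every URL is at most a few clicks from the first sitemap page, which is the whole point of this structure.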