I would add to the limit suggestion and advise you to really look at your listing page(s). Rarely will a user need to, or want to, page through thousands, let alone millions, of records. So what we’ve done is put a default limit of 1,000 rows on all our search queries, and we only remove that limit when the user expressly needs more records. All of our listing pages carry a parenthetical note disclosing the 1,000-row limit (that capped set is what the client actually pages through). If the user needs more, they have two choices: download a CSV, in which case we remove the limit and send them the complete data set (returned in real time, or via a scheduled download if the result set is enormous), or simply narrow their filter criteria.
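A minimal sketch of that capped-listing idea, using SQLite for illustration. The table name, `fetch_listing` helper, and `export` flag are my own illustrative names, not anything from the original setup:

```python
import sqlite3

DEFAULT_LIMIT = 1000  # rows shown on listing pages; disclosed in the UI


def fetch_listing(conn, filters_sql, params, export=False):
    """Return listing rows, capped unless the user requested a full export."""
    sql = f"SELECT * FROM orders WHERE {filters_sql}"
    if not export:
        # Cap interactive listing pages; the client pages through these rows.
        sql += f" LIMIT {DEFAULT_LIMIT}"
    return conn.execute(sql, params).fetchall()
```

The point is that the cap lives in one place, so the CSV-export path can bypass it deliberately rather than by accident.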
The other thing I would suggest, from a lot of painful experience, is to avoid running queries automatically on such pages. In other words, design your search listing pages so that users have to press a search button after selecting their filters, rather than having the page kick off a default search with little or no filtering. It sounds basic, but that alone prevents a lot of server load: otherwise every visit costs one default search followed by a second one tailored to the user's needs.
Now reports are another matter. For reports that may return a ton of data, it can be best to send the entire result set and forgo paging altogether. And when the result set is simply too large (and you need to decide what "too large" means for your system), you can still handle it by informing the user that the report has been scheduled and will be emailed to them once it has been generated. I mention this because there is a definite trade-off in performance when you try to support online reporting over very large data. In many cases you are better served sending your user a CSV they can load into Excel and work with there. Hardcore analysts might even thank you for that anyway.
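The inline-versus-scheduled decision can be sketched as a single threshold check. The `ONLINE_LIMIT` value and the `schedule_job` callback (which would queue generation and the email) are hypothetical stand-ins for whatever your system uses:

```python
import csv
import io

ONLINE_LIMIT = 50_000  # illustrative threshold; tune to your environment


def export_report(rows, header, schedule_job):
    """Return CSV text for small result sets; otherwise schedule an emailed report.

    `schedule_job` is a hypothetical callback that queues generation + delivery.
    Returning None signals the caller to tell the user the report was scheduled.
    """
    if len(rows) > ONLINE_LIMIT:
        schedule_job(rows)
        return None
    buf = io.StringIO()
    writer = csv.writer(buf)
    writer.writerow(header)
    writer.writerows(rows)
    return buf.getvalue()
```

Keeping both paths behind one function means the user-facing behavior (immediate download vs. "check your email") stays consistent no matter which report triggered it.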