How I Used Midpage to Make “Scorched-Earth” Legal Research Faster and More Organized
I wanted to see whether Midpage could help me run a broad, structured, and explainable research process. It did exactly that.
Scorched-earth legal research is usually a grind. When a partner says “don’t miss anything,” you know what’s coming: dozens of search-result tabs, a sprawling Excel sheet, multiple rounds of tweaking Boolean strings, and the constant fear that one stray case could undermine an entire argument. Thoroughness is non-negotiable. Efficiency is often nonexistent.
These are exactly the situations where many lawyers hesitate to use AI. They’ve seen sanctions for hallucinated cases. They worry about black-box reasoning. And they know that truly comprehensive research requires breadth across platforms, not blind trust in a single database.
With all of that in mind, I wanted to see whether Midpage could help me run a broad, structured, and explainable research process—not replacing judgment, but improving the workflow around it. Not promising perfect completeness, but making the path to thoroughness faster and easier to manage.
It did exactly that.
1. The Assignment: Cast a Wide Net Across a Busy Circuit
My hypothetical client is a home-exercise-equipment manufacturer whose optimistic sales forecasts proved wrong after post-pandemic demand collapsed. Plaintiff investors claim those projections violated federal securities law, and the partner on the matter asked me to gather a broad universe of potentially relevant Second Circuit cases from the past five years.
That meant canvassing opinions from the Second Circuit Court of Appeals plus all six district courts in the circuit—a jurisdiction that includes New York and its large volume of case law.
2. Running the Search in Midpage
To begin, I gave Midpage the same factual context laid out above. Then I set the search parameters: the full Second Circuit and its subsidiary district courts, a five-year date range, and Boolean terms aimed at securities claims, dismissal outcomes, and sales forecasting language.
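For readers who want a concrete picture, a query along these lines captures those three concepts. This is an illustrative sketch using Westlaw-style root expanders and connectors, not the exact string I ran:

```
(securit! OR "10b-5" OR "Exchange Act") AND (dismiss! OR "12(b)(6)") AND (forecast! OR projection! OR "sales guidance")
```

The idea is one parenthetical group per concept, joined by AND, with truncation to catch variants like "securities," "dismissed," and "projections."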
Midpage returned 186 results. An equivalent search on Westlaw produced 207 results. That parity gave me confidence Midpage was operating within the same general universe of cases, though I still note that broad research often benefits from multiple platforms. Because scorched-earth research values breadth, I may later write about overlap rates and platform-unique results.
Midpage offers a ready remedy if it misses a relevant case identified by Westlaw or Lexis. Adding that case to my Midpage notebook takes seconds: click "Add Cases," paste the citation, and Midpage pulls the full opinion into the workspace. This kept my workflow centralized without locking me into a single research tool.
3. Screening the Results with Structured Columns
With the initial list in hand, I triaged quickly. I added Midpage’s stock columns—Outcome, Compare Facts, and Favorability—to get an organized snapshot of how each case related to our scenario. Instantly, I could scroll down a column of outcomes, skim factual similarities, and see predicted favorability for my client.
Midpage helpfully flagged in the Favorability column that it needed more context because I hadn’t clearly specified that my client was defending. After I clarified that point, the Favorability column recalibrated. Guardrails like this reduce the risk of an AI tool’s hidden assumptions shaping the results.
4. Building the Notebook
Once I confirmed the results were on target, I added cases to my notebook for deeper analysis. For this demo, I saved the top 50 (though real scorched-earth work would involve adding, and reviewing, all 186).
Since Midpage does not yet offer bulk-bookmarking, I manually clicked the bookmark icon next to each case. I’ve suggested a “save all results” button, and the team told me it’s in development.
5. Adding Custom Columns for Targeted Analysis
With the notebook assembled, I moved to structured analysis. I added two custom columns—one asking whether the motion to dismiss was granted, and another asking whether the case referenced a confidential witness or confidential source.
Midpage filled each row with Yes / No / Maybe answers, accompanied by short explanations and supporting text from the opinion. Seeing those answers arranged consistently down the page made the review significantly faster and more reliable than relying on Westlaw’s inconsistent headnotes and canned snippets.
And faster skimming is just one way Midpage made this research less mind-numbing. Traditional scorched-earth work on Westlaw can take hours of ministerial drudgery:
Run the broad search.
Build an Excel tracking sheet by hand.
Search within your results for every possible phrasing of "confidential witness," and still miss cases that use coded labels ("Employee 1," "Individual A," etc.).
Manually enter findings into the Excel sheet.
Skim opinions to mine factual analogies.
Midpage automated or sped up all of those steps.
6. Delivering Nimble Work Product
After verifying the notebook's cells, I exported the data to Excel. Midpage doesn't yet support sorting or filtering directly in notebooks, so Excel becomes the layer I use to deliver my work product.
A modular table enables quick response to follow-up requests. If the partner asks for:
a filtered list of cases that mention a confidential witness,
a set of cases where the motion to dismiss was granted, with analogies specific to our demand-collapse scenario, or
a list of cases where the motion to dismiss was denied, with short distinctions keyed to our facts,
I can produce it in minutes by sorting or filtering the exported sheet, rather than re-running searches or re-reviewing dozens of cases.
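For the spreadsheet-averse, those follow-ups can also be scripted. Here is a minimal sketch in Python with pandas; the column names (`Case`, `MTD Granted`, `Confidential Witness`) are hypothetical stand-ins for whatever headers your actual export uses, and the sample rows stand in for the real exported data:

```python
import pandas as pd

# In practice you would load the export with pd.read_excel("notebook_export.xlsx");
# a few hypothetical rows stand in for the exported notebook here.
cases = pd.DataFrame({
    "Case": ["In re A Corp.", "In re B Corp.", "In re C Corp."],
    "MTD Granted": ["Yes", "No", "Yes"],
    "Confidential Witness": ["No", "Yes", "Yes"],
})

# Follow-up 1: cases that mention a confidential witness
cw_cases = cases[cases["Confidential Witness"] == "Yes"]

# Follow-up 2: cases where the motion to dismiss was granted
granted = cases[cases["MTD Granted"] == "Yes"]

# Follow-up 3: cases where the motion to dismiss was denied
denied = cases[cases["MTD Granted"] == "No"]

print(cw_cases["Case"].tolist())  # ['In re B Corp.', 'In re C Corp.']
```

Each follow-up becomes a one-line boolean filter rather than a fresh round of research, which is the whole point of keeping the work product modular.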
7. Why This Matters
Midpage changed how I approach scorched-earth research. Instead of spending hours assembling, sorting, and summarizing cases, I could shift into strategy sooner. I had more time to identify the strongest factual analogies and think about how they would feed directly into briefs.
Like any AI tool, Midpage is not perfect. It can miss cases. Verification remains essential. But my research workflow became faster, more organized, and far more manageable.
This blog is based on a fictional hypothetical. The Midpage exercises described here were conducted solely for educational purposes; no legal advice was given or received.
Midpage does not provide legal advice and can make mistakes. Always verify important information.