By Susanne Strong, Content Developer, IPRO’s Learning Experience Team
Have you heard about Active Learning? It’s the next generation in Batch Management and document review.
I think of it as “next level.” It’s kind of like when you upgrade your iPhone. It might have taken a little bit of convincing, but yes, I recently upgraded my iPhone 8 to the iPhone 13 and let me just say that I have no idea what took me so long.
Active Learning is exactly like that! I get a warm, fuzzy feeling every time I create a review pass and begin reviewing documents. As a former litigation paralegal who has reviewed petabytes of documents using linear review, effective searching, Boolean searching, and review pass management, I find the whole idea of Active Learning to be totally wicked.
Admittedly, getting used to the idea of having technology assist you in the review process is strange at first. However, once you get the hang of it, I have a feeling you’ll become a huge fan of it too.
What is Active Learning All About?
Active Learning is a method of using technology to make document review decisions. Reviewers conduct an initial review, and the system learns from their input as batches are checked in. The system then classifies the remaining documents as either relevant or non-relevant. Metrics are updated continuously as the review goes on, making it easier for reviewers to decide whether review should continue.
Active Learning helps reviewers by basing document review decisions on extracted text only and by filtering data sets prior to indexing. It also surfaces the documents most likely to be relevant, making document review more manageable.
Active Learning Enabled Review Passes default to batches of 10 documents each, allowing reviewers to get through each batch quickly and easily. Once a batch is checked in, relevancy scores and the Active Learning Status are updated.
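The batch-and-check-in cycle described above can be pictured as a loop: review a small batch, let the system learn from the tags, then re-rank what's left. The sketch below is purely illustrative; the keyword-overlap scoring function is a crude stand-in for the product's real classifier, and all names here are my own, not IPRO's API.

```python
# Toy sketch of an Active Learning review loop (illustrative only).
# score() is a crude stand-in for the real model: it rates a document
# by how many words it shares with documents already tagged positive.

def score(doc_text, positive_terms):
    """Fraction of learned positive terms that appear in this document."""
    words = set(doc_text.lower().split())
    return len(words & positive_terms) / max(len(positive_terms), 1)

def review_loop(documents, label_fn, batch_size=10):
    """documents: {doc_id: extracted_text}; label_fn plays the reviewer,
    returning True (Primary Tag applied) or False for each document."""
    positive_terms = set()   # what the "model" has learned so far
    labeled = {}
    unreviewed = dict(documents)
    while unreviewed:
        # Rank remaining documents by predicted relevance, take the next batch.
        ranked = sorted(unreviewed, key=lambda d: -score(unreviewed[d], positive_terms))
        for doc_id in ranked[:batch_size]:
            is_positive = label_fn(unreviewed[doc_id])  # reviewer's tagging decision
            labeled[doc_id] = is_positive
            if is_positive:
                # "Check-in": the system learns from the newly tagged document.
                positive_terms |= set(unreviewed[doc_id].lower().split())
            del unreviewed[doc_id]
    return labeled
```

The key idea the loop captures is that every check-in refines the ranking, so later batches should contain progressively more relevant documents.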
What’s Important to Know
Active Learning Enabled Review Passes can be based on:
- The Entire Case
- A Search
- A Custom Index
Prior to creating your Active Learning Enabled Review Pass, you must create a Primary Tag. This tag will be identified as your Primary Review Purpose for your Active Learning Enabled Review Pass.
For example, if your Primary Review Purpose is to determine responsiveness, then you would create a primary tag called Responsive. Upon applying the Primary Tag to a document, it will be marked as reviewed, and recorded as positive.
The reverse is also true: if the Primary Tag is not applied, the document will still be marked as reviewed, but it will be recorded as negative. Metrics are then autogenerated.
Let’s look at some initial metrics for a review pass I’ve started to identify responsive documents.
Here are some things to notice about the metrics:
- Positive documents: Since we identified the purpose of this review to be determining responsiveness, documents that I’ve tagged Responsive show up as Positive.
- The blue 1.1K hyperlink shows the list of documents that the AL system believes will be responsive (i.e. positive).
- Negative documents: These are the items that I have marked as Non-responsive.
- The blue 2.9K hyperlink shows the list of documents that the AL system believes will be non-responsive (i.e. negative).
- Documents on hold: In this example, we tagged a document as On Hold in the review status because there was an outstanding question of responsiveness.
- Total number of documents in the review: Notice the 269 omitted documents – these are documents that contained no extracted text (i.e. they are images).
- Predicted responsive documents: In evaluating the review performance, precision indicates the percentage of predicted responsive documents that were actually positive. Recall displays the percentage of actually positive documents that were correctly predicted to be responsive.
- Insights: These are key at this stage in the review. Currently, the Active Learning Status is Poor. This is because only a small document set has been manually tagged. You need to keep reviewing to further train the system to properly identify responsive documents. As the review progresses, the insights will continue to be refined and reflect the Active Learning progress.
You can drill down on any of the hyperlinks to view a subset of the documents (i.e. predicted positive, predicted negative, etc.). For example, you might want to drill down on the Negative documents that are marked as Have Conflicts. You can then create a QC batch to be re-analyzed.
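The precision and recall definitions above come down to simple arithmetic. The counts below are made up for illustration (they are not the actual figures from my review pass):

```python
# Illustrative precision/recall arithmetic with hypothetical counts.
predicted_positive = 200   # documents the system predicted to be responsive
true_positive = 150        # of those, the reviewers confirmed responsive
actually_positive = 180    # all documents reviewers tagged responsive

# Precision: of the documents predicted responsive, how many really were?
precision = true_positive / predicted_positive   # 150 / 200 = 0.75

# Recall: of the truly responsive documents, how many did the system find?
recall = true_positive / actually_positive       # 150 / 180 ≈ 0.83
```

High precision with low recall means the system's predictions are trustworthy but it is missing responsive documents; the reverse means it casts a wide net with many false hits. Watching both numbers together is what tells you whether the training is on track.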
Metrics & Reporting
The metrics associated with Active Learning Enabled Review Passes tell you how many documents are in your case, how many have been tagged, and how many are positive, negative (based on your Primary Tag), or on hold.
Reporting provides even more information, including the estimated number of documents the system predicts to be relevant. All this information, which is updated in real-time as you continue to review documents, helps to make Active Learning even easier to understand.
When it comes to litigation, document review is a necessity. Why not make the job easier with Active Learning Enabled Review Passes? Batch review is based on extracted text, and reviewers can keep at it until they feel comfortable with the metrics being reported.
For more information about Active Learning, check out this course on our online Learning Center: Keeping You in the Know: An Introduction to Active Learning
For information on the Metrics and Reporting associated with Active Learning, take a look at this blog article written by Josh Croye, Product Manager at IPRO.
For even more information, check our Help Center for a deeper dive into Active Learning.
If you’d like to learn more about IPRO’s products or purchase any type of training, head to our online Learning Center: https://learn.ipro.com/
IPRO’s Help Center, where you can find documentation on all IPRO products, can be found here: https://my.ipro.com/help
If you have topics you’d like for us to cover, please feel free to drop us a line and we will incorporate your ideas into future blog posts.