In preparation for Relativity Fest 2020, George Jon’s Tips and Tricks blog series will tackle the most common issues and mistakes linked to the Relativity application, along with the remediation strategies our engineering team has employed to maximize application throughput for eDiscovery and forensics platforms across the globe.
Non-optimized Searches
When leveraging the Relativity application, it is commonplace, and even critical, for end users to create multi-layered, nested searches. Problems arise, however, when end users fail to set the appropriate controls. The result is delayed output and the consumption of valuable resources on the SQL Server, both of which undermine application performance.
The proverbial “smoking gun” for lackluster Relativity performance is the “is like” or “is null” condition applied to extracted text fields. Either one forces a full table scan and ultimately creates a bottleneck, as all other corresponding searches must wait for that result set to return.
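The scan-versus-seek effect can be sketched with SQLite standing in for SQL Server (the engines differ, but the planner behavior is analogous). The table, column, and index names below are hypothetical, not Relativity’s actual schema: a leading-wildcard “is like” on an extracted text field gives the planner nothing to seek on, while narrowing by an indexed field first shrinks the work.

```python
import sqlite3

# Hypothetical schema; SQLite used here as a portable stand-in for SQL Server.
con = sqlite3.connect(":memory:")
con.execute("""CREATE TABLE Document (
    DocumentID   INTEGER PRIMARY KEY,
    CustodianID  INTEGER,
    ExtractedText TEXT)""")
con.execute("CREATE INDEX IX_Custodian ON Document (CustodianID)")

# Unconstrained "is like" on extracted text: no index applies, so the
# engine must read every row (a full table scan).
scan_plan = con.execute(
    "EXPLAIN QUERY PLAN "
    "SELECT DocumentID FROM Document "
    "WHERE ExtractedText LIKE '%privileged%'"
).fetchone()[3]
print(scan_plan)  # reports a SCAN of Document

# Constraining on an indexed field first lets the planner seek, so the
# expensive LIKE only runs against the matching subset.
seek_plan = con.execute(
    "EXPLAIN QUERY PLAN "
    "SELECT DocumentID FROM Document "
    "WHERE CustodianID = 42 AND ExtractedText LIKE '%privileged%'"
).fetchone()[3]
print(seek_plan)  # reports a SEARCH using IX_Custodian
```

The same intuition applies in Relativity: pairing broad extracted-text conditions with narrower indexed conditions keeps one search from monopolizing the SQL Server.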
Query Hints
The utilization of query hints, especially when improperly applied to multi-layered, nested searches, presents additional performance issues for the end user. A query hint overrides SQL Server’s cardinality estimates, which can result in additional delays as the application waits on resources narrowly defined by the hinted plan. By default, SQL Server should be allowed to choose the proper plan for query optimization.
To navigate potential performance issues related to query hints, end users should first consult Relativity’s support team or an experienced Database Administrator (DBA).
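A minimal sketch of the override effect, again using SQLite as a stand-in: SQLite’s `NOT INDEXED` clause plays the role of a SQL Server query hint here (SQL Server’s actual hint syntax differs, e.g. the `OPTION` clause), and the schema is hypothetical. Left alone, the optimizer picks the index; the hint-like clause pins it to the worse plan.

```python
import sqlite3

# Hypothetical schema; NOT INDEXED stands in for a SQL Server query hint.
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE Document (DocumentID INTEGER PRIMARY KEY, CustodianID INTEGER)")
con.execute("CREATE INDEX IX_Custodian ON Document (CustodianID)")

# Default behavior: the optimizer chooses an index seek on its own.
default_plan = con.execute(
    "EXPLAIN QUERY PLAN "
    "SELECT DocumentID FROM Document WHERE CustodianID = 42"
).fetchone()[3]
print(default_plan)  # reports use of IX_Custodian

# Hint-like override: the planner is forced off the index entirely,
# producing a full scan regardless of what its estimates recommend.
hinted_plan = con.execute(
    "EXPLAIN QUERY PLAN "
    "SELECT DocumentID FROM Document NOT INDEXED WHERE CustodianID = 42"
).fetchone()[3]
print(hinted_plan)  # reports a full SCAN of Document
```

This is why the default (optimizer-chosen) plan is usually the safer bet: a hint that helped one query shape can quietly degrade every search layered on top of it.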
dtSearch Index Management
As cases grow in size and complexity, the role of index management becomes increasingly important. First and foremost, end users must temper and/or adjust their expectations when applying dtSearch to all documents, as building an index across an entire document set often takes multiple days.
Moreover, it is imperative that end users and organizations employ a rational, documented search strategy to yield timelier output for cases with a high volume of data. One strategy to expedite the process in an orderly manner is to create multiple dtSearch indexes based on extracted/OCR text size and/or the required search fields.
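The size-based batching idea above can be sketched in a few lines. Everything here is an assumption for illustration: the function name, the `(doc_id, extracted_text_bytes)` input shape, and the 1 MB / 25 MB thresholds are hypothetical, not dtSearch or Relativity settings.

```python
# Hypothetical sketch: split the document population into separate dtSearch
# index builds by extracted-text size, so the small (fast-building) batch
# becomes searchable first instead of waiting on the largest documents.
def partition_for_index_builds(docs, thresholds=(1_000_000, 25_000_000)):
    """docs: iterable of (doc_id, extracted_text_bytes) pairs.

    Returns [small, medium, large] lists of doc_ids; each list can be fed
    to its own index build, smallest batch first.
    """
    small_cap, medium_cap = thresholds
    small, medium, large = [], [], []
    for doc_id, size in docs:
        if size <= small_cap:
            small.append(doc_id)
        elif size <= medium_cap:
            medium.append(doc_id)
        else:
            large.append(doc_id)
    return [small, medium, large]

# Example: documents of 10 KB, 5 MB, and 100 MB land in separate builds.
print(partition_for_index_builds([(1, 10_000), (2, 5_000_000), (3, 100_000_000)]))
# [[1], [2], [3]]
```

The same split could be driven by required search fields instead of size; the point is that several deliberate, documented builds beat one monolithic multi-day build.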
Contact Us to Learn More!
If you found this information helpful and would like to tap into George Jon’s wealth of knowledge and experience, please contact us for a consultation. Our Subject Matter Experts (SMEs) are standing by, and we welcome the opportunity to optimize your eDiscovery environment capabilities and performance.
Epilogue
Since the birth of the eDiscovery market over 15 years ago, George Jon’s sole mission has been to architect, deploy, and manage eDiscovery and forensics solutions, providing the best end-user experience, agnostic of the application, for our portfolio of blue-chip clients (MNCs, top AM 200 law firms, service providers, and “Big Four” advisory firms) worldwide.
Through eDiscovery solution deployments from Toronto to Tokyo (no hyperbole), George Jon’s expert team of infrastructure and application engineers has seen it all with regard to application performance issues. In our experience, two main culprits usually hinder application performance in an eDiscovery environment:
- Human error
- Knowledge gaps around appropriate resource provisioning for applications, and a basic understanding of, and adherence to, application workflows