Recently at my client site (I have a lot of posts that start this way) we have been getting more and more requests from groups that want to bring larger amounts of data into SharePoint. These requests are really pushing the limits of SharePoint's storage thresholds, so I started looking into ways we can get around that. Our thought was that since Microsoft recently announced support for 25TB SharePoint Online site collections, we should easily be able to handle the 4TB ceiling in our on-prem environment.
Update: I wrote another blog post on this topic where I go into greater detail on how to test whether your environment can go beyond the 200GB threshold, along with the results of a test I ran. You can view that information here.
SharePoint Database Size Limits
The limitations of SharePoint's content databases are pretty well documented here: https://technet.microsoft.com/en-CA/library/cc262787.aspx#ContentDB. In a nutshell, you want to keep your content databases below 200GB. The same document actually suggests splitting out your site collections once a content DB reaches more than 100GB, which leaves room for growth within the sites.
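If you want to see how close your content databases are to those thresholds, here is a minimal PowerShell sketch using the standard SharePoint Server cmdlets. The 100GB and 200GB values are just the guidance numbers from the TechNet article above, not anything enforced by the script, so adjust them to whatever your farm's policy is:

```powershell
# Run in the SharePoint Management Shell (or load the snap-in) on a farm server.
Add-PSSnapin Microsoft.SharePoint.PowerShell -ErrorAction SilentlyContinue

# Thresholds from the TechNet guidance referenced above (adjust as needed).
$warnGB  = 100   # consider splitting out site collections past this point
$limitGB = 200   # supported limit for general-usage content databases

Get-SPContentDatabase | ForEach-Object {
    $sizeGB = [math]::Round($_.DiskSizeRequired / 1GB, 2)
    [PSCustomObject]@{
        Database        = $_.Name
        WebApplication  = $_.WebApplication.Url
        SizeGB          = $sizeGB
        SiteCollections = $_.CurrentSiteCount
        Status          = if ($sizeGB -ge $limitGB) { 'Over 200GB' }
                          elseif ($sizeGB -ge $warnGB) { 'Over 100GB - plan a split' }
                          else { 'OK' }
    }
} | Sort-Object SizeGB -Descending | Format-Table -AutoSize
```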
But what if that database holds only a single site collection? In that case you should consider branching the site collection off into multiple site collections. For example, create an archive site collection to house data that is no longer actively updated or used. That will likely cut down your storage usage a great deal. You will have to migrate the data to do it, but that is a necessary evil to save on space.
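Here is a rough sketch of what that archive approach could look like with the built-in cmdlets. The web application URL, database name, owner account, and site URL below are all placeholders, and the export/import lines at the end are just one option for moving the content (larger migrations usually call for third-party tooling):

```powershell
Add-PSSnapin Microsoft.SharePoint.PowerShell -ErrorAction SilentlyContinue

$webApp = "https://sharepoint.contoso.com"   # placeholder web application

# New content database dedicated to the archive site collection.
New-SPContentDatabase -Name "WSS_Content_Archive" -WebApplication $webApp

# Archive site collection created directly in that database.
New-SPSite -Url "$webApp/sites/archive" `
           -OwnerAlias "CONTOSO\spadmin" `
           -ContentDatabase "WSS_Content_Archive" `
           -Template "STS#0" `
           -Name "Archive"

# The content itself still has to be migrated into the archive site.
# For smaller webs, lists, and libraries, export/import can work:
# Export-SPWeb "$webApp/sites/teamsite/oldproject" -Path "C:\Temp\oldproject.cmp"
# Import-SPWeb "$webApp/sites/archive" -Path "C:\Temp\oldproject.cmp"
```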