Tips for handling large data volumes
Ever tried deleting 100 million records? If you have, you will know that your transaction log will likely blow up, you will block all access to the table, and it will generally be painful. In this session we will look at some of the techniques you can use to make this easier. We will look at:
- how deletes and updates work and how indexes affect them
- how to break the job into smaller chunks
- how to run those chunks in parallel
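Breaking the job into smaller chunks typically means deleting a bounded batch in a loop until nothing is left, so that each transaction stays short, locks are released between batches, and the log can be reused rather than growing without limit. Here is a minimal sketch of that pattern using Python and SQLite purely for illustration; the `events` table and the cut-off date are assumptions, and on SQL Server you would express the same loop with `DELETE TOP (n)`:

```python
import sqlite3

def delete_in_chunks(conn, cutoff, chunk_size=10_000):
    """Delete old rows a bounded batch at a time.

    Each iteration is its own short transaction, so locks are held
    only briefly and the journal/log stays small, instead of one huge
    multi-million-row transaction.
    """
    total = 0
    while True:
        cur = conn.execute(
            # SQLite lacks DELETE TOP (n); bound the batch via a rowid subquery.
            "DELETE FROM events WHERE rowid IN ("
            "  SELECT rowid FROM events WHERE created < ? LIMIT ?)",
            (cutoff, chunk_size),
        )
        conn.commit()  # end the transaction, releasing locks between batches
        total += cur.rowcount
        if cur.rowcount < chunk_size:  # a partial batch means nothing is left
            return total

# Demo on an in-memory database with illustrative data.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE events (id INTEGER PRIMARY KEY, created TEXT)")
conn.executemany(
    "INSERT INTO events (created) VALUES (?)",
    [("2023-06-01",)] * 25_000 + [("2024-06-01",)] * 5_000,
)
conn.commit()
deleted = delete_in_chunks(conn, cutoff="2024-01-01", chunk_size=10_000)
print(deleted)  # 25000, removed in three batches of 10000, 10000, 5000
remaining = conn.execute("SELECT COUNT(*) FROM events").fetchone()[0]
print(remaining)  # 5000
```

The key design point is committing inside the loop: each batch is its own transaction, so a failure part-way through loses at most one chunk of work and the table stays available to other sessions between batches.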
After this session you will understand more about why these operations are painful, and what you can do to make them easier and to reduce the impact on your end users.
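Running chunks in parallel usually starts with partitioning the key space into disjoint, contiguous ranges, so that each worker deletes its own slice of the table and the workers do not contend for the same rows. A small sketch of that range-splitting step, assuming (for illustration) a contiguous integer key; the worker body here is a stand-in for a real per-range, batched `DELETE ... WHERE id BETWEEN @lo AND @hi` run on its own connection:

```python
from concurrent.futures import ThreadPoolExecutor

def split_key_range(min_id, max_id, workers):
    """Split [min_id, max_id] into `workers` disjoint, contiguous ranges.

    Disjoint ranges let each worker delete its own slice of the table
    without the workers blocking one another on the same rows.
    """
    span = max_id - min_id + 1
    size = -(-span // workers)  # ceiling division
    return [
        (lo, min(lo + size - 1, max_id))
        for lo in range(min_id, max_id + 1, size)
    ]

def delete_range(lo, hi):
    # Hypothetical worker: in a real system this would run a batched
    # DELETE for ids BETWEEN lo AND hi on its own connection.
    # Here it just reports how many ids its range covers.
    return hi - lo + 1

ranges = split_key_range(1, 100_000, 4)
print(ranges)  # [(1, 25000), (25001, 50000), (50001, 75000), (75001, 100000)]

with ThreadPoolExecutor(max_workers=4) as pool:
    deleted = sum(pool.map(lambda r: delete_range(*r), ranges))
print(deleted)  # 100000: the ranges cover every id exactly once
```

Because the ranges are disjoint and together cover the whole key space, every row is handled exactly once no matter how the workers are scheduled.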
Sorry, there are no downloads available for this session.
Simon is Head of Data at Wonga.com. Prior to Wonga, Simon ran a SQL Server consultancy specialising in performance and scalability.
He has worked with SQL Server since version 6, became an MVP in 2006, and founded SQLBits in 2007. He is also a Microsoft Certified Master for SQL Server.
You can follow him at @simon_sabin or read his blog at http://sqlblogcasts.com.
The video is not available to view online.