Tips for handling large data volumes

Ever tried deleting 100 million records? If you have, you will know that your transaction log is likely to blow up, you will block all access to the table, and it will generally be painful. In this session we will look at some of the techniques you can use to make this easier:
  • how deletes and updates work, and how indexes affect them
  • how to break the job into smaller chunks
  • how to run those chunks in parallel
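The chunking idea above can be sketched in a few lines. This is a minimal illustration using Python and SQLite (the session itself covers SQL Server); the table and column names are hypothetical. The point is that each small batch is its own transaction, so no single delete holds locks or grows the log for long.

```python
import sqlite3

# Set up a throwaway table with some rows flagged for deletion
# (hypothetical schema, purely for illustration).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, archived INTEGER)")
conn.executemany("INSERT INTO orders VALUES (?, ?)",
                 [(i, i % 2) for i in range(100_000)])
conn.commit()

BATCH = 5_000
while True:
    # Delete one small batch per transaction instead of all rows at once,
    # keeping each transaction (and the log it generates) small.
    cur = conn.execute(
        "DELETE FROM orders WHERE id IN "
        "(SELECT id FROM orders WHERE archived = 1 LIMIT ?)", (BATCH,))
    conn.commit()
    if cur.rowcount == 0:
        break  # nothing left to delete

remaining = conn.execute("SELECT COUNT(*) FROM orders").fetchone()[0]
print(remaining)
```

In SQL Server the same pattern is usually written with `DELETE TOP (n)` in a loop, but the principle is identical: commit after every batch so locks and log space are released as you go.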

After this session you will understand more about why these operations are painful, what you can do to make them easier, and how to reduce the impact on your end users.
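One way to run chunks in parallel, as the session outline mentions, is to partition the table's key space so each worker owns a disjoint range and workers never contend for the same rows. A small sketch of that partitioning (the function name and parameters are illustrative, not from the session):

```python
def key_ranges(min_id, max_id, n_workers):
    """Split the inclusive key range [min_id, max_id] into n_workers
    contiguous, non-overlapping ranges, one per parallel worker."""
    span = max_id - min_id + 1
    step = (span + n_workers - 1) // n_workers  # ceiling division
    return [(lo, min(lo + step - 1, max_id))
            for lo in range(min_id, max_id + 1, step)]

ranges = key_ranges(1, 100, 4)
print(ranges)  # → [(1, 25), (26, 50), (51, 75), (76, 100)]
```

Each worker would then run its own batched delete loop with a `WHERE id BETWEEN lo AND hi` predicate, so the ranges map cleanly onto index seeks and the workers do not block one another.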

Presented by Simon Sabin at SQLBits VIII
  • Speaker Bio

    Simon has worked with data throughout his career, enabling companies to make the most of the data they have.

    He works with companies to help them:

    1. Understand and define a cloud data platform strategy.

    2. Optimise their data platform, including performance, scalability, security and certification.

    3. Improve their data development practices including implementation of agile methodologies and continuous integration.

    Education of teams is at the heart of everything Simon does.

    The passion for education led Simon to found SQLBits, the largest SQL conference in Europe. He is also a Microsoft Certified Master for SQL Server and has been a Microsoft MVP since 2005.

    You can follow him at @simon_sabin or read his blog.