These are the sessions submitted so far for SQLBits 16.

This *hands-on* two-day course teaches you the modern, sensible R that so many of us now use. Bring a laptop and learn data manipulation, visualisation, and presentation the way it should be done! Two days of pure R, aimed at anyone looking to use R heavily and effectively.

Throughout the two days, we'll cover (a short taster sketch follows the list):
- ingesting data from various sources
- working with tabular data in-depth
- visualising data in static and interactive methods
- performing basic statistics
- producing reports and other outputs
- writing data to various destinations
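
As a taster of that workflow, here is a minimal sketch of the ingest, manipulate, visualise cycle. The choice of readr, dplyr, and ggplot2 is an assumption on my part, and "sales.csv" and its columns are hypothetical:

```r
library(readr)
library(dplyr)
library(ggplot2)

# Ingest tabular data from a (hypothetical) CSV file
sales <- read_csv("sales.csv")

# Manipulate: total sales per month
monthly <- sales %>%
  group_by(month) %>%
  summarise(total = sum(amount))

# Visualise the summary as a bar chart
ggplot(monthly, aes(month, total)) +
  geom_col()
```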

We can all write hacky R code, but "works on my machine" (especially "works when I do this, this, and this") is OK for play, not for sharing. This *hands-on* day teaches you development best practices for R, including defensive programming techniques, testing, and package development. Laptops and basic R skills required!

Throughout this one day of training, we'll be covering (a small testing taster follows the list):
- function writing
- defensive programming techniques
- package development
- unit testing
- code coverage
- continuous integration
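
As a taster of the function writing and unit testing topics, here is a minimal sketch using the testthat package; the `add_vat` function is a hypothetical example of mine, not course material:

```r
# A function with a defensive input check: fail fast on bad input
add_vat <- function(net, rate = 0.20) {
  stopifnot(is.numeric(net), all(net >= 0))
  net * (1 + rate)
}

# A unit test for it, written with testthat
library(testthat)

test_that("add_vat applies the default rate and rejects bad input", {
  expect_equal(add_vat(100), 120)
  expect_error(add_vat(-1))
})
```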

The data.table package is super-fast and super-powerful, but the learning curve is steep. This one day takes you up that curve, so you come out of it with all your parts intact and the ability to manipulate data with almost as much ease as you use SQL.

We'll be spending a whole day learning how to work with tabular data in R using data.table. From ingestion of millions of rows of data FAST, to being able to do all the things you know and love in SQL FAST, to doing things SQL can't FAST, to writing data FAST, data.table has it all and it's all FAST. (you may not have noticed that I think it's FAST)

So come spend the day learning how to use data.table!
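
To give a flavour of the syntax, here is a minimal sketch of the SQL-ish verbs in data.table form ("orders.csv" and its columns are hypothetical):

```r
library(data.table)

# FAST ingestion of a (hypothetical) CSV file
orders <- fread("orders.csv")

# Roughly: SELECT customer, SUM(amount) AS total
#          FROM orders WHERE status = 'paid' GROUP BY customer
orders[status == "paid", .(total = sum(amount)), by = customer]

# FAST write back out
fwrite(orders, "orders_out.csv")
```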

Agile BI promises to deliver value to end users much more quickly. But how do you keep track of versions and prioritize all the demands users have?
With Visual Studio Online (the cloud version of Team Foundation Server) it is possible to start for free with up to five users, and to get version control, work item management, and much more.
In this session you will get directions for a quick start with Visual Studio Online. You will learn the possibilities of version control and how to implement Scrum work item management with all the available tools.

Your tabular model is done, approved, and ready to be used. Through Excel, users quickly get excited about tabular models, and for a while they happily use Excel as a self-service business intelligence tool. Then, all of a sudden, they start asking how they can extract more, and different, information from the tabular model through Excel. Now it is up to you to familiarize them with all the possibilities of working with the tabular model in Excel.
Given the small amount of documented knowledge about using tabular models from Excel, I will show you how to get the best out of your tabular models by using Excel as a self-service business intelligence tool. Filters, named sets, and calculations in the pivot table: I will explain it all!

Where do you start when your SQL Server is under pressure? If your server is misconfigured or strange things are happening, there are a lot of free tools and scripts available online. Written by renowned SQL Server specialists, these tools will help you decide whether you have a problem you can fix yourself or really need a specialised DBA to solve it. They provide you with insights into what might be wrong on your SQL Server in a quick and easy manner. You don't need extensive knowledge of SQL Server, nor do you need expensive tools, to do your primary analysis of what is going wrong. And in a lot of instances these tools will tell you that you can fix the problem yourself.
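
To give a flavour of the kind of check those scripts automate, here is a minimal sketch that asks a standard dynamic management view where the server has been spending its time (my own illustration, not taken from any particular tool):

```sql
-- Top wait types since the last restart: a quick first hint
-- at where the server is spending its time
SELECT TOP (10)
       wait_type,
       wait_time_ms,
       waiting_tasks_count
FROM   sys.dm_os_wait_stats
WHERE  wait_time_ms > 0
ORDER BY wait_time_ms DESC;
```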

Whether you've dabbled in PowerShell or just wondered what all the fuss is about, make no mistake: PowerShell is something worth learning to make your life as a SQL Server professional easier. Whether you're a DBA, an SSIS developer, or a security professional, in this session you'll see practical, real-world examples of how you can blend SQL Server and PowerShell together, and not just a bunch of regular T-SQL tasks that have been wrapped in PowerShell code.

In this session, you'll first get a brief introduction to the SQL Server PowerShell module. From there, it's nothing but code and examples to show you what's possible and why learning PowerShell is such a fundamentally awesome skill to make part of your portfolio or CV. You don't need to know anything about the language before stopping in, but by the time you leave you'll be excited to learn more!
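
As a small appetiser, here is a minimal sketch using the SQL Server PowerShell module's Invoke-Sqlcmd cmdlet (the instance name is a hypothetical placeholder):

```powershell
# Assumes the SqlServer module is installed: Install-Module SqlServer
Import-Module SqlServer

# Run a T-SQL query and get back objects, not text
$databases = Invoke-Sqlcmd -ServerInstance 'localhost' `
    -Query 'SELECT name, state_desc FROM sys.databases'

# Filter and format with ordinary PowerShell
$databases | Where-Object state_desc -ne 'ONLINE' |
    Format-Table name, state_desc
```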

We will kick off with a short history of how columnstore indexes have evolved since SQL Server 2012, and a look at their internals, because this is what made operational analytics possible. Then we will build up the session with tips and tricks on making operational analytics fit your environment, based on lessons learned in the field. In the end we will show you how to make it fit, using various techniques, even under heavy load.
After this session you will know how to make this fit your environment; there will be no more mystery about what your hot and cold data is, and you will know the different uses of columnstore to make this work.
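
For a flavour of the hot/cold pattern, here is a minimal sketch of a filtered nonclustered columnstore index over the cold rows of an OLTP table (the table, columns, and filter predicate are all assumptions for illustration):

```sql
-- Analytics mostly touch completed orders, so the columnstore
-- index is filtered to the "cold" rows and stays out of the way
-- of hot OLTP inserts and updates
CREATE NONCLUSTERED COLUMNSTORE INDEX ncci_Orders_Analytics
ON dbo.Orders (OrderDate, CustomerId, Amount, OrderStatus)
WHERE OrderStatus = 'Completed';
```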

If your regular SQL Server becomes too slow for running your data warehouse queries, or uploading the new data takes too long, you might benefit from Azure SQL Data Warehouse. Via its "divide and conquer" approach it provides significant performance improvements, yet most client applications can connect to it as if it were a regular SQL Server.

But to benefit from these performance improvements we need to implement our Azure Data Warehouse in the right way. In this session, through a lot of demos, you will learn how to set up your Azure Data Warehouse (ADW), review indexing in the context of ADW, and see that monitoring is done slightly differently from what you're used to.
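
As a taste of the "divide and conquer" idea, here is a minimal sketch of creating a hash-distributed table with CTAS (table and column names are hypothetical):

```sql
-- Spread a large fact table across distributions by a hash key,
-- then load it in one statement with CREATE TABLE AS SELECT
CREATE TABLE dbo.FactSales
WITH
(
    DISTRIBUTION = HASH(CustomerKey),
    CLUSTERED COLUMNSTORE INDEX
)
AS
SELECT CustomerKey, OrderDateKey, Amount
FROM   stg.Sales;
```
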
Digital Transformation is much more than just sticking a few Virtual Machines in the cloud; it is real, transformative, long-term change that benefits and impacts the whole organisation.
Digital Transformation is a hot topic with CEOs and the rest of the C-suite, renewing their interest in data and what it can do to empower the organisation.
With the right metrics and data visualisation, Power BI can help bring clarity and predictability to the CEO, so they can make strategic decisions, understand how their customers behave, and measure what really matters to the organisation. This session is aimed at helping you please your CEO with insightful Power BI dashboards that are relevant to the CxOs in your organisation, or your customers' organisations.
Using data visualisation principles in Power BI, we will demonstrate how you can help the CEO by giving her the metrics she needs to develop a guiding philosophy based on data-driven leadership. Join this session to get practical advice on how you can help drive your organisation's short- and long-term future, using data and Power BI.
As an MBA student and an external consultant who delivers solutions worldwide, Jen has experience advising CEOs and C-level executives on strategic and technical direction.
Join this session to learn how to speak their language in order to meet their needs, and impress your CEO by proving it with Power BI.

We all know about the "Vs" of big data: velocity, volume, and variety. But what about the missing "V": visualization? What does big data look like? It has to be more than beautiful; it must convey information and insights in a way that people understand. Furthermore, people expect to draw actionable insights from their data. How can we make big data friendly to users? This session will look at a mix of technologies for visualizing big data sources and ways of achieving "BigViz" harmony in your big data. You will learn:
- About big data, from machine-scale size down to human-scale understanding
- Technologies for visualizing big data, both Microsoft and open source
- Data visualization principles

So Azure SQL Data Warehouse is now available and starting to be used, but what does that mean to you and why should you care?

Reflecting on a large-scale Azure DW project, this session gathers together learnings, successes, failures, and general opinions into a crash course in using Azure Data Warehouse "properly".

We'll start by quickly putting the technology in context, so you know WHEN to use it, WHERE it’s appropriate and WHY it works the way it does.
  • Introducing the ADW technology
  • Explaining distributions & performance
  • Explaining PolyBase
Then we'll dive into HOW to use it, looking at some real-life design patterns, best practices, and some "tales from the trenches" from a recent large Azure DW project (a short PolyBase and CTAS sketch follows this list).
  • Performance tips & tricks (designing for minimal data movement, managing distribution skew, CTAS, Resource classes and more)
  • ETL Patterns (Surrogate keys & Orchestration)
  • Common Mistakes & Pitfalls
  • Conclusions & Recommendations
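
As promised, a minimal sketch of the PolyBase-plus-CTAS loading pattern; it assumes an external data source and file format already exist, and every object name here is a hypothetical placeholder:

```sql
-- Expose flat files in blob storage as an external table,
-- then land them in a distributed table via CTAS
CREATE EXTERNAL TABLE ext.Sales
(
    CustomerKey INT,
    Amount      DECIMAL(18, 2)
)
WITH
(
    LOCATION    = '/sales/',
    DATA_SOURCE = AzureBlobStore,
    FILE_FORMAT = CsvFormat
);

CREATE TABLE dbo.Sales
WITH (DISTRIBUTION = HASH(CustomerKey), CLUSTERED COLUMNSTORE INDEX)
AS
SELECT CustomerKey, Amount
FROM   ext.Sales;
```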
You’ve probably already seen that R icon in
the Power BI GUI. It shows up when creating sources, transformations and
reports. But the ugly textbox you got when you clicked upon those icons didn’t
encourage you to proceed? In this session you will learn just a few basic
things about R that will greatly extend your Power BI data loading,
transformation and reporting skills in Power BI Desktop and PowerBI.com
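
To show how little R is needed to get going, here is a minimal sketch of an R transformation step in Power BI Desktop: Power BI hands the current query to the script as a data frame named `dataset` and picks up the data frames you create as the output (the `amount` column and the conversion rate are hypothetical):

```r
# `dataset` is provided by Power BI's "Run R Script" step
output <- dataset

# Add a derived column (hypothetical currency conversion)
output$amount_eur <- output$amount * 0.85
```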

In this session, we will look at data visualisation using:

- Apache Spark
- R
- Python
- Power BI

We will look at how we can enhance the suite of offerings to your data consumers by opening your toolbox of Open Source data visualisation tools, as well as Microsoft's Power BI.

Using Microsoft SQL Server and Big Data as sources, we will look at these open source and proprietary tools as an accompaniment to Power BI - not a replacement.

Join us for this demo-heavy session, and be sure to download the code from Jen's blog right before the session so you can try it out for yourself as we proceed. The emphasis is on practical takeaways that you can apply when you go back to your office after SQLBits.

So you have made first contact with Biml and are excited? Good!

You're wondering if Biml can do more than just transfer data from one SQL table to another? Great!

Because Biml does so much more than just simple SSIS packages. We'll explore how to improve your existing packages using BimlScript and LINQ.

Topics covered include, amongst others, derived columns, incremental changes, and how to handle flat files.

You'll leave with sample code and hopefully a couple of ideas on how to bring your Biml development to the next level.
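
For readers new to what BimlScript even looks like, here is a minimal hypothetical sketch: plain Biml XML with a C# code nugget that stamps out one package per table name. The schema URL is the standard Biml namespace; everything else is made up for illustration:

```xml
<#@ template language="C#" #>
<Biml xmlns="http://schemas.varigence.com/biml.xsd">
  <Packages>
    <!-- The C# nugget below generates one package per table,
         instead of hand-writing each one -->
    <# foreach (var table in new[] { "Customers", "Orders" }) { #>
    <Package Name="Load_<#= table #>" ConstraintMode="Linear" />
    <# } #>
  </Packages>
</Biml>
```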

In this demo-heavy session, you will learn about the basic concepts of increasing productivity by creating your SSIS packages using Biml.

We will look into manual Biml code to understand the general idea of Biml, then take it from there: generating a whole staging area from scratch, and ending with a complete, manageable solution for maintaining your staging process using SQL tables.

Have you ever spent hours fixing your SSIS packages due to a schema change on the source? Ever wanted to add a "load timestamp" to 370 tables in your staging area, but refrained because it would have taken you weeks? If so, this is the session for you!

Most of the time pilots spend learning to fly is actually spent learning how to recover from emergency conditions. We as Database Administrators focus on taking backups, but how much time do we actually spend practicing recovering with those backups? This session will focus on the kinds of situations that can dramatically affect a data center, and on how to use checklists to practice recovery processes and assure business continuity.
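
In the spirit of the session, here is a minimal restore drill to practice with: verify that a backup is readable, then restore it under a throwaway name (paths, database name, and logical file names are all hypothetical):

```sql
-- Check that the backup file is readable at all
RESTORE VERIFYONLY
FROM DISK = N'D:\Backups\Sales_Full.bak';

-- Practice the real thing against a disposable copy
RESTORE DATABASE Sales_RestoreTest
FROM DISK = N'D:\Backups\Sales_Full.bak'
WITH MOVE 'Sales'     TO N'D:\Data\Sales_RestoreTest.mdf',
     MOVE 'Sales_log' TO N'D:\Logs\Sales_RestoreTest.ldf',
     STATS = 10;
```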

Every business needs to guess. From core business (will this customer default on their loan?) to IT management (how much disk space will this new database need?), businesses make lots of decisions daily using some sort of 'model': from gut feeling, through averages, up to linear regression. Machine learning offers more advanced techniques to build a model on historical data and apply it to new observations. In this one-day precon we walk through the full process of building these models in AzureML, the cloud-based machine learning framework Microsoft released in the summer of 2014.
We start from scratch, so no machine learning or statistical knowledge is required. First we need to figure out what we are trying to cook: not all business problems can be solved with AzureML. After discussing which problems AzureML can tackle, we start looking for the main ingredients: data. We also investigate how to clean these ingredients: data preparation is a crucial step! Then it's time to do the actual cooking. There are many machine learning algorithms available in AzureML; we'll browse through them and give some intuition about what they do, so that we can configure them properly. Once we're done cooking, it's time to serve the result: AzureML allows us to convert our process into a web service. We'll see how to both create and consume such a web service.
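
To make that last step concrete, here is a minimal sketch of consuming a published AzureML web service from Python. The URL, API key, and input columns are hypothetical placeholders, and the exact payload shape should be copied from the service's own API help page:

```python
import json
import urllib.request

# Hypothetical placeholders: copy the real values from the
# service's API help page in AzureML Studio
url = "https://services.azureml.net/.../execute?api-version=2.0"
api_key = "YOUR_API_KEY"

# Named inputs with column names and rows of values
payload = {
    "Inputs": {
        "input1": {
            "ColumnNames": ["age", "income"],
            "Values": [["34", "54000"]],
        }
    },
    "GlobalParameters": {},
}

request = urllib.request.Request(
    url,
    data=json.dumps(payload).encode("utf-8"),
    headers={
        "Content-Type": "application/json",
        "Authorization": "Bearer " + api_key,
    },
)

# The scored results come back as JSON
with urllib.request.urlopen(request) as response:
    print(json.loads(response.read().decode("utf-8")))
```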

Most of the time you’ll see ETL being done with a tool such as SSIS, but what if you need near-realtime reporting? This session will demonstrate how to keep your data warehouse updated using Service Broker messages from your OLTP database.

Did you know you can use Biml for much more than just SSIS? No? Then this session will probably surprise you, even with some DBA-related topics! We will look into deployment, sample data creation, test cases, and other ideas that will show you how powerful and flexible Biml really is, and that it does way more than you may have thought!

Maintaining a solid set of information about our servers and their performance is critical when issues arise, and it often helps us see a problem before it occurs. Building a baseline of performance metrics allows us to know when something is wrong, and helps us track it down and fix the problem. This session will walk you through a series of PowerShell scripts you can schedule to capture the most important data, and a set of reports that show you how to use that data to keep your server running smoothly.
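
As a small preview of the capture side, here is a minimal sketch that samples a few key counters and appends them to a file (the counter list and output path are hypothetical; a production script might write to a SQL table instead):

```powershell
# Counters worth baselining (hypothetical starter list)
$counters = '\Processor(_Total)\% Processor Time',
            '\PhysicalDisk(_Total)\Avg. Disk sec/Read',
            '\SQLServer:Buffer Manager\Page life expectancy'

# Sample once and append to a CSV; schedule this to build history
Get-Counter -Counter $counters |
    Select-Object -ExpandProperty CounterSamples |
    Select-Object Timestamp, Path, CookedValue |
    Export-Csv -Path 'C:\Baselines\perf.csv' -Append -NoTypeInformation
```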