New BI and reporting features in SQL Server 2008 R2 hit the mark, but other additions leave SQL Server shops wanting more
With Microsoft SQL Server 2008 R2, Microsoft begins to fully realize its vision of SQL Server as an information platform and not "just" a database. Hence the main theme for this release -- at least according to Microsoft -- is self-service BI. The PowerPivot plug-ins for Excel 2010 and SharePoint 2010 are easily going to make the biggest splash of all the new features, not least because they're the most complete. But then, SQL Server 2008 R2 isn't strictly necessary for PowerPivot for Excel, which works with plenty of other data sources.
A number of other features make SQL Server 2008 R2 worth a close look -- and they don't require Office 2010 or SharePoint 2010. Features such as StreamInsight and Master Data Services should gain traction right away, while others such as SQL Server Sysprep and DACPAC need some further baking. Overall, SQL Server 2008 R2 is a solid interim release, even if some key additions aren't quite what they could be.
[ Also on InfoWorld: From powerful productivity enhancers to important security safeguards, the new Microsoft Office has a number of features that businesses will love. See "Top 10 Office 2010 features for business" ]
Apart from new features, another change that may impact your shop is the addition of the new Datacenter edition. Along with introducing the new SKU, Microsoft has downgraded the Enterprise edition to support only 8 CPUs. If you have a SQL Server 2008 Enterprise server, you really need to be aware of this before you upgrade to R2.
Frankly, I think the levels for Enterprise have been set way too low. Shops that have 16-CPU servers aren't going to upgrade to R2 if it's going to cost them roughly $60,000 per processor. A more realistic break point for Datacenter edition would be greater than 64 CPUs.
But assuming you're not caught on the horns of the Enterprise vs. Datacenter dilemma, R2 offers a handful of reasons to upgrade. Two or three of those reasons could even be compelling.
Reason No. 1 to upgrade to SQL Server 2008 R2: Self-service BI
Microsoft's new concept of managed self-service BI mostly refers to combining SQL Server 2008 R2 with the new PowerPivot plug-ins to push hefty row sets to Excel 2010 or to SharePoint 2010. The key to making this work is the use of column-level compression in the data set, which allows PowerPivot to process millions of rows in the same amount of time it would ordinarily take Excel to process only thousands of rows.
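The payoff of column-level compression is easy to see in miniature. Here's a hedged sketch in plain Python -- not PowerPivot's actual engine -- of dictionary-encoding a column: repeated values are stored once, so a low-cardinality column of millions of cells shrinks to a small dictionary plus compact integer codes.

```python
# Illustrative only: columnar engines store each column separately and
# dictionary-encode repeated values, which is why they can chew through
# millions of rows that would choke a cell-by-cell spreadsheet model.
def dictionary_encode(column):
    """Encode a column as (dictionary of distinct values, integer codes)."""
    dictionary = []
    index = {}
    codes = []
    for value in column:
        if value not in index:
            index[value] = len(dictionary)
            dictionary.append(value)
        codes.append(index[value])
    return dictionary, codes

# A low-cardinality column (e.g. a hypothetical "Region" field)
# compresses well: 5,000 strings become 2 strings plus 5,000 small ints.
region = ["East", "West", "East", "East", "West"] * 1000
dictionary, codes = dictionary_encode(region)
print(len(dictionary))  # 2 distinct values instead of 5000 strings
```

The same idea, applied per column across an entire data set, is what lets an in-memory engine hold row counts that would otherwise be hopeless in a spreadsheet.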
It's easy to see that financial professionals will love PowerPivot, but in fact PowerPivot will be useful to all kinds of Excel users. It's difficult to crunch any sizable data set in Excel; even if you're able to get all the rows into memory, processing them can take forever. I've seen this problem countless times when trying to analyze perfmon data, so I know many Excel users will welcome PowerPivot.
Note that PowerPivot for Excel can pull data not only from SQL Server but from almost any database. SQL Server 2008 R2 is required only to feed PowerPivot workbooks shared via SharePoint 2010. The PowerPivot plug-ins for Excel 2010 and SharePoint 2010 are freely downloadable.
Reason No. 2 to upgrade to SQL Server 2008 R2: Report Parts
One of my absolute favorite R2 features is the new Report Parts, found in Report Builder 3.0. Report Parts allow you to publish the different sections of your report to a centralized library, where others can grab them to include in their own reports. Charts, tables, and other segments become components that you can plug into any report you wish.
Let's say you create a chart with complex calculations embedded. Maybe it incorporates business rules or complex layout formulas. If you publish it as a Report Part, everyone else will be able to call upon the same chart in their reports. Best of all -- this makes me giddy inside -- all reports referencing this chart are merely calling an instance of the library object itself. If you change the logic in the library copy of the chart, the change is automatically propagated to all the reports that use it. (Downstream users can disconnect a Report Part from the library if they don't want it to update.)
That's a lot of power at your fingertips. Publishing these Report Parts was so easy I actually thought I did something wrong. I love this feature.
Reason No. 3 to upgrade to SQL Server 2008 R2: StreamInsight
StreamInsight is the name for Microsoft's new complex event processing engine, which is implemented as a set of .Net classes. StreamInsight makes it easy to process events in flight -- rapidly executing queries against a stream of data -- so that you can reach decisions or take actions faster. There are many use cases for this, but one of my favorites is monitoring production servers.
Let's say you're monitoring CPU metrics for a server and you don't want to persist every measure to the database. With StreamInsight, you could capture the CPU events as they happen, aggregate them as you like, and persist only the aggregated metrics to the database. You could also correlate the CPU metrics with other measures to glean real meaning out of them and get more significant alerts.
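The aggregate-then-persist pattern is easy to sketch. The following is an illustrative Python sketch of the windowed-aggregation idea, not the StreamInsight API itself (which is a set of .Net classes); the event shape and window size are assumptions for the example.

```python
# Sketch of in-flight aggregation: raw CPU samples arrive as a stream,
# but only small per-window summaries ever reach the database.
from collections import defaultdict

def aggregate_cpu(events, window_seconds=60):
    """Collapse raw (timestamp, cpu_pct) samples into per-window averages."""
    windows = defaultdict(list)
    for timestamp, cpu_pct in events:
        # Bucket each sample by its time window.
        windows[timestamp // window_seconds].append(cpu_pct)
    # Only these summary rows would be persisted, not every raw sample.
    return {w: sum(v) / len(v) for w, v in sorted(windows.items())}

samples = [(0, 10.0), (30, 30.0), (60, 50.0), (90, 70.0)]
print(aggregate_cpu(samples))  # {0: 20.0, 1: 60.0}
```

A real StreamInsight application would express the same windowing as a LINQ query over an event stream, with adapters feeding events in and results out.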
This is a simple example, but it should give you an idea of the types of actions that can be accomplished. Note too that StreamInsight is not an out-of-the-box feature, but must be written into .Net applications. While StreamInsight will require some spin-up time for coders, they'll find that Microsoft has really smoothed the way.
Reason No. 4 to upgrade to SQL Server 2008 R2: Master Data Services
Master Data Services helps businesses build and maintain an authoritative source for critical data assets like products, customers, locations, accounts, employees, and so on. Master Data Services is a database, a user interface, and a set of services that enable organizations to rapidly build a model to manage the data that feeds dimensions or other systems. It may contain validation rules, notifications, and security roles. It provides versioning and enables you to reverse unwelcome changes to the data. Master Data Services may serve as a system of entry or a system of record. Using standard tools (such as SQL Server Integration Services or BizTalk), the data may be sent to or from Master Data Services as your business process requires.
What does that mean in plain English? In short, the Master Data Services scenario consists of a data model and a database. You pass your enterprise data through the master data engine, which validates the data against your rules before sending it on its way. Currently you have to build all your models manually, but that's not typically prohibitive, considering you're only likely to apply Master Data Services to the few important systems that need the extra checks. Still, it would be nice if Master Data Services could read the model from an existing data source, letting users concentrate on creating the rules.
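The validate-against-rules step can be sketched in a few lines. This is a hypothetical illustration in Python, not the Master Data Services API; the record fields and rules below are invented for the example.

```python
# Sketch of master-data validation: each incoming record is checked
# against a set of named business rules before being passed downstream.
def validate(record, rules):
    """Return the names of all rules the record violates."""
    return [name for name, check in rules.items() if not check(record)]

# Hypothetical rules for a product master record.
rules = {
    "has_sku": lambda r: bool(r.get("sku")),
    "positive_price": lambda r: r.get("price", 0) > 0,
}

good = {"sku": "A-100", "price": 9.99}
bad = {"sku": "", "price": -1}
print(validate(good, rules))  # []
print(validate(bad, rules))   # ['has_sku', 'positive_price']
```

A clean record flows on to its destination; a record with violations can be held for correction, which is the "extra checks" role Master Data Services plays in front of critical systems.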
Reason No. 5 to upgrade to SQL Server 2008 R2: Multiserver monitoring
The SQL Server Utility Control Point, which is the heart of R2's new multiserver management capabilities, allows you to monitor resource health across multiple SQL servers, but it currently doesn't support actions on out-of-policy items. In other words, Control Point is read-only for the measurements it shows you. Another limitation: Only 25 managed instances are supported in the Enterprise edition; you'll have to jump to the Datacenter edition to manage more. Here's another area where the Enterprise SKU is set too low; I've never been a part of an enterprise where 25 is an acceptable cutoff for monitoring multiple nodes.
Which controls are measured? Currently you can see server CPU and instance CPU, along with the more important storage stats. Server CPU versus instance CPU is a very handy distinction; quite often when you see a sustained CPU issue, you want to make sure it's related to the database instance before you continue troubleshooting. Another interesting capability here is that you can see CPU on individual databases; if you have a lot of databases on the server, it's good to know which one is causing the trouble.
On the storage front, you can see volume storage utilization at the drive level as well as space stats at the database, file group, and file (data and log) levels. Other than I/O and queue stats, these are the storage stats that DBAs most care about, so a lot of work is already done for you here. While the Utility Control Point has some way to go to provide actual control, it offers a good level of monitoring functionality and will get users thinking about this tool and what they'd like to see from it.
Reason No. 6 to upgrade to SQL Server 2008 R2: DACPAC
DACPAC, short for data-tier application package, is a much-touted feature that allows your developers to package database changes into a single file in Visual Studio and send it to the DBAs for deployment. This is a significant improvement over how changes are distributed now: either as a series of .SQL files with deployment instructions, or as a set of Team Foundation Server paths for the DBA to fetch along with the same deployment document. Either way, the process leads to deployment problems because there's too much human interaction and opportunity for misinterpretation.
With DACPAC, DBAs get a single deployment file from the developers that contains all the changes; there's nothing to misunderstand or forget. But in this first incarnation DACPAC has some problems. For starters, to make even minor changes in the database, DACPAC completely re-creates the database and all of its objects, then moves all of the data over to the duplicated structures. In the final step, it drops the original database and gives the new one the proper name. You can easily see that creating an entirely new copy of the database for a small code change won't fit into most situations. Further, DACPAC doesn't copy user permissions or work with Service Broker or replication objects.
DACPAC is a great idea, but is currently suitable only for very small databases with limited structures. However, keep an eye on this feature; I suspect many of these limitations will be ironed out in future versions. DACPAC should only get better and better.
Reason No. 7 to upgrade to SQL Server 2008 R2: SQL Server Sysprep
SQL Server Sysprep is a really cool new feature that DBAs will love, but as with DACPAC, some limitations in this initial release make it less useful than I'd hoped. Like Sysprep in Windows, SQL Server Sysprep allows you to install the software and save the final configuration for later. This is most convenient when you use SQL Server Sysprep in combination with Windows Sysprep so that SQL Server 2008 R2 can be installed with the operating system during server provisioning; all the DBAs have to do then is configure it. Unfortunately, this initial release covers only the database engine and Reporting Services; clustering, the tools, and SQL Server Integration Services are unavailable. Most of the time, if you can't Sysprep those, you might as well install SQL Server 2008 R2 from scratch.