Using Dynamic SQL to Solve the Problem of Too Few SQL Server Instances

As consultants, we pride ourselves on both knowing the best way to set up systems AND finding workarounds when client infrastructure is “suboptimal” and unlikely to change. We recently performed such a workaround for a client that had too few SQL server instances—and I’m going to share the experience with you here. (Hopefully, your system is set up more optimally. But even if it is, you can still learn from our example.)

A Workaround for Dynamic SQL and BatchMaster

Our client was running both Dynamics GP and BatchMaster. Both systems share the same database instance, with different database names. So, we started with six different databases. (This wasn’t the exact structure, but it’s close.)

BMPROD

BMTEST

BMDEV

GPPROD

GPTEST

GPDEV

(Dynamics GP folks know that the shared DYNAMICS system database makes things even more complex. But I don’t want to get into that here or we’ll lose sight of our main topic.)

To make things more interesting, we set up our own databases for development. In general, we’re strong believers in having software packages control their own databases. It makes for easier upgrades and eliminates “Your peanut butter is in my chocolate!” kind of arguments.

So, we have three more databases:

RED3PROD

RED3TEST

RED3DEV

Now, we have a problem because all nine databases are on a single instance. And much of our Red Three (RED3) code needs to reference both GP and BatchMaster.

So how do we do it? Well, say we start with a stored procedure in Dev. We write an interface between BM and GP, and we want to reference the item master in both systems for our stored procedure. So, we have to fully qualify the object names. A simple select statement would look like this:

SELECT bmi.ITEM, bmi.DESCRIPTION, gpi.ITEMNMBR, gpi.ITEMDESC
FROM BMDEV.dbo.ITEM bmi
LEFT OUTER JOIN GPDEV.dbo.IV00101 gpi
ON bmi.ITEM = gpi.ITEMNMBR

And that’s fine. But if we want to move this code to production, we have to search and replace as we move, so it looks like this:

SELECT bmi.ITEM, bmi.DESCRIPTION, gpi.ITEMNMBR, gpi.ITEMDESC
FROM BMPROD.dbo.ITEM bmi
LEFT OUTER JOIN GPPROD.dbo.IV00101 gpi
ON bmi.ITEM = gpi.ITEMNMBR

This is bad for several reasons:

  • It leaves room for human error. Every code promotion requires making a change.
  • If you want to compare your code in production and development, whatever tool you use (we use Red Gate SQL Compare) is going to find differences that aren’t really differences.
  • If database names change, you have to go back and change your code.

A More Ideal Solution: Separate Database Instances

A more ideal solution (and not just for coding reasons) is to have separate database instances for Dev, Test, and Prod. Then our database names could stay consistent:

Production Server
GP
BM
RED3

Test Server
GP
BM
RED3

Development Server
GP
BM
RED3

But unfortunately, that wasn’t an option.

An Alternate Solution: Build Simple Views in Our Red3 Database

We decided that our custom code would never, ever reference the GP and BM tables directly. Instead, we decided to build simple views in our Red3 database for each table we need to access.

To demonstrate, let’s use the above example where I’m accessing two different item masters.

First, I create a simple view for each table:

USE RED3TEST
GO
CREATE VIEW dbo.vGP_IV00101 AS SELECT * FROM GPTEST.dbo.IV00101
GO
CREATE VIEW dbo.vBM_ITEMMASTER AS SELECT * FROM BMTEST.dbo.ITEM
GO

Second, my RED3 code references only those views, so it never has to name another database:

SELECT bmi.ITEM, bmi.DESCRIPTION, gpi.ITEMNMBR, gpi.ITEMDESC
FROM dbo.vBM_ITEMMASTER bmi
LEFT OUTER JOIN dbo.vGP_IV00101 gpi
ON bmi.ITEM = gpi.ITEMNMBR

Now that’s not bad. I do have to set up the underlying views, but I can move the code between environments with no changes.

Automating View Creation to Reduce the Odds of User Error

But I want to go one better. As you can predict, as your system grows, you’ll need more and more of these views. And keeping database names straight is prone to user error. So, I wrote a procedure to automatically create a view for each table in a list. (Theoretically, I could create a view for EVERY table in the packaged systems. But that would create thousands of objects I’d never use and make SSMS hard to navigate.)

As a first step, I create a simple table that contains the list of tables I want to turn into views:
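Something along these lines works (the table and column names here are just illustrative, not the exact ones we used; adjust them to your own conventions):

-- List of packaged-system tables we want to wrap in views (illustrative names)
CREATE TABLE dbo.RED3_ViewSourceTables
(
    SourceDatabase sysname NOT NULL,      -- e.g. GPTEST or BMTEST in the test environment
    SourceTable    sysname NOT NULL,      -- e.g. IV00101 or ITEM
    ViewPrefix     varchar(10) NOT NULL,  -- e.g. vGP_ or vBM_
    CONSTRAINT PK_RED3_ViewSourceTables PRIMARY KEY (SourceDatabase, SourceTable)
);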

And then I populate it. (I will also save all these inserts as a stored procedure so I can easily update the procedure and the table when I need to create more views.)
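The inserts are as simple as it gets. A sketch using the illustrative names from above (in the test environment the sources are GPTEST and BMTEST; the production copy of the script names GPPROD and BMPROD instead):

-- Register the item masters from both packaged systems (illustrative values)
INSERT INTO dbo.RED3_ViewSourceTables (SourceDatabase, SourceTable, ViewPrefix)
VALUES ('GPTEST', 'IV00101', 'vGP_'),
       ('BMTEST', 'ITEM',    'vBM_');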

(I repeat this for about 50 tables.)

Now comes the Dynamic SQL. Basically, I read through the table, create a view name using a prefix, and then create the views:
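A minimal sketch of that kind of procedure, assuming the illustrative list table above (the procedure name is also just an illustration): it loops through the list, builds each CREATE VIEW statement as a string, and runs it with sp_executesql, skipping views that already exist.

CREATE PROCEDURE dbo.RED3_CreateSourceViews
AS
BEGIN
    SET NOCOUNT ON;

    DECLARE @SourceDatabase sysname,
            @SourceTable    sysname,
            @ViewPrefix     varchar(10),
            @ViewName       sysname,
            @sql            nvarchar(max);

    DECLARE table_list CURSOR LOCAL FAST_FORWARD FOR
        SELECT SourceDatabase, SourceTable, ViewPrefix
        FROM dbo.RED3_ViewSourceTables;

    OPEN table_list;
    FETCH NEXT FROM table_list INTO @SourceDatabase, @SourceTable, @ViewPrefix;

    WHILE @@FETCH_STATUS = 0
    BEGIN
        -- Build the view name from the prefix and the source table name
        SET @ViewName = @ViewPrefix + @SourceTable;

        -- Only create the view if it doesn't already exist
        IF OBJECT_ID(N'dbo.' + @ViewName, N'V') IS NULL
        BEGIN
            SET @sql = N'CREATE VIEW dbo.' + QUOTENAME(@ViewName)
                     + N' AS SELECT * FROM ' + QUOTENAME(@SourceDatabase)
                     + N'.dbo.' + QUOTENAME(@SourceTable) + N';';
            EXEC sys.sp_executesql @sql;
        END;

        FETCH NEXT FROM table_list INTO @SourceDatabase, @SourceTable, @ViewPrefix;
    END;

    CLOSE table_list;
    DEALLOCATE table_list;
END;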

Once this runs, I have the views I need without a lot of cutting and pasting. (Note: This procedure only creates views that don’t already exist. I have other versions that drop and recreate the view. And I also use synonyms when stored procedures are involved.)
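For the synonym version mentioned above, the pattern is the same, just with CREATE SYNONYM instead of CREATE VIEW (the procedure name below is only a placeholder):

-- A synonym lets RED3 code EXEC a procedure that lives in another database
-- without naming that database in the calling code (placeholder names)
CREATE SYNONYM dbo.synGP_SomeProcedure
    FOR GPTEST.dbo.SomeProcedure;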

Hopefully, most of you have multiple environments and don’t need to resort to this kind of workaround. However, the idea of using Dynamic SQL to build other objects can be useful in other situations as well, for example:

  1. Dynamics GP: Even with multiple environments, you’ll often have multiple company databases. We use this technique to build code to “union” all the company databases in a single view and update that view each time a new company is added (see the sketch after this list).
  2. Infor SunSystems: While GP creates a database for each company, SunSystems creates a different set of tables within a database each time a company is added. We use this technique to join all tables together in one view (and then one data mart).
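As a rough sketch of the first idea, again using the GP item master (the filter on sys.databases and the view name are placeholders; the real version reads the company list from GP’s system database):

-- Build a view that unions IV00101 across every GP company database (sketch)
DECLARE @sql nvarchar(max) = N'';

SELECT @sql = @sql
    + CASE WHEN @sql = N'' THEN N'' ELSE N' UNION ALL ' END
    + N'SELECT ''' + name + N''' AS CompanyDB, ITEMNMBR, ITEMDESC FROM '
    + QUOTENAME(name) + N'.dbo.IV00101'
FROM sys.databases
WHERE name LIKE N'GP%';   -- placeholder filter for the GP company databases

IF @sql <> N''
BEGIN
    SET @sql = N'CREATE VIEW dbo.vGP_IV00101_AllCompanies AS ' + @sql + N';';
    EXEC sys.sp_executesql @sql;
END;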

If you’d like to play around with these solutions, you can download the sample code files.

Do you have these kinds of issues? How do you deal with them?

