Me vs Hungarian Notation

Hungarian Notation is a coding convention that many programmers use to (presumably) make their code more understandable.

It involves prefixing variable names with a kind of code that describes the “type” of variable you’re seeing in the code.

For example, if a variable is a Range object, you would name that variable rngSomeName.

Prefixing the variable name with rng allows the reader of the code to immediately know the variable is a range.
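In VBA, a Hungarian-style declaration block might look like this (the variable names here are made up purely for illustration):

```vba
Sub FormatSalesRange()
    ' Each prefix announces the variable's type at a glance
    Dim wsData As Worksheet     ' ws  = Worksheet
    Dim rngSales As Range       ' rng = Range
    Dim strTitle As String      ' str = String
    Dim lngLastRow As Long      ' lng = Long

    Set wsData = ThisWorkbook.Worksheets("Sales")
    lngLastRow = wsData.Cells(wsData.Rows.Count, 1).End(xlUp).Row
    Set rngSales = wsData.Range("A2:A" & lngLastRow)
    strTitle = "Sales through " & Format(Date, "mmm yyyy")

    rngSales.NumberFormat = "#,##0"
    wsData.Range("A1").Value = strTitle
End Sub
```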

Here is a table of a few commonly used Excel objects and the widely used Hungarian Notation prefix for each.

Variable Type          Hungarian Notation

Workbook               wb
Worksheet              ws
Range                  rng
String                 str
Integer                int
Long                   lng
Double                 dbl
Boolean                bln
Variant                var
Chart                  cht
Pivot Table            pvt

I personally go through periods of really trying to use Hungarian Notation (mainly out of the fear that I’m a hack).

But I inevitably abandon the effort.

Mostly because I just can’t be bothered. I mean let’s face it. I’m not building the next PayPal here. Most of my code is for me to figure out.

Also, I feel as long as my variable declarations are reasonably close to the point where I use them, I can quickly look at the declaration if I have to.


To some extent, it's as one commenter on Stack Overflow put it:

“vUsing adjHungarian nNotation vMakes nReading nCode adjDifficult”


My Fool-Proof System

I thought it would be fun to list my amazing system of naming variables.

Try not to faint from the brilliance.

I’m a tired 45 years of age (and this table really shows it).


What I Typically Use

Pivot Table

Arr (typically an array)

Maybe, one day, I'll come up with a serious naming convention that will change the face of programming. Although that's not likely.

Feel free to share any thoughts on the naming conventions you use.



Create SQL Server Tables from Excel Data – Free Tool

So I’ve got some time to come up for air and share a tool I’ve pulled together to help me in my client engagements.

The work I do often requires me to database existing Excel processes. That means taking existing Excel tables and moving their structure (and sometimes their data values) to SQL Server.

If you’ve had to do that before, you’ll know it’s a pain to do.

Today I’d like to share a tool I created to help me create SQL Server tables from existing Excel data.

First of all, you can download this tool here.

It’s wide open so if you want to look at the code and make it better, feel free.

The tool itself is fairly easy to use.

1. Paste your data into the first sheet.

2. Press the Capture Data button.

This will activate a new sheet showing the captured fields converted to valid SQL naming conventions and valid SQL Server data types. The tool will do its best to predict the right data type, as well as avoid common errors like invalid characters and duplicate field names.

3. In this new sheet, you can edit the suggested field names and data types. You can also select your Primary keys and add any Default values you want.

4. Press the Generate SQL button.


A new dialog box will activate.

5. Enter a name for your new table and then click the Generate SQL button.

6. At this point, you will see the CREATE TABLE SQL statement.


If all you need is the structure of the Excel table, you can simply copy this code and run it in SQL Server to create the table.
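For a sense of what the output looks like, here's a hypothetical example of a generated statement (the table name, field names, and data types are invented for illustration; the tool's actual output will depend on your data):

```sql
CREATE TABLE [SalesData] (
    [InvoiceID] int NOT NULL,                      -- selected as the Primary key
    [CustomerName] varchar(255) NULL,
    [InvoiceDate] datetime NULL,
    [SalesAmount] decimal(18, 2) NULL DEFAULT 0,   -- a Default value was added
    PRIMARY KEY ([InvoiceID])
);
```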

If you’re an advanced user with CREATE rights on the server, you can use the Advanced options.

7. If you want the tool to create the table and send the data as well, enter the server connection string.

8. Select the appropriate options.

9. Click the Execute button.

Here’s a quick YouTube video showing this tool in action.

This tool has certainly saved me tons of time. Hopefully, some of you find this useful too.

Until next time.

Excel Against Humanity

April Fools is coming up, but it falls on a weekend.

So instead of a post about pranking your co-workers, I thought I'd give you guys a fun way to spend your Friday.

I pulled together an Excel version of the game Cards Against Humanity.

This game is basically Mad Libs in card form. All players get a shared Question card, and they have to put down the funniest Answer card.

For example:


I pulled this game into Excel allowing you to play solo.

Simple macros drive it.

You can refresh the Question and Answer cards, as well as take a snapshot if something tickles your funny bone.


It’s surprisingly entertaining. You’ll find yourself spending a good bit of time trying to roll funny combinations.



The real Cards Against Humanity game is dirty – very dirty.

I scrubbed away most of the really egregious filth to make this Excel version relatively safe for work.

I said R E L A T I V E L Y….

Download it Here and have fun.

I’ll leave you to it. Enjoy getting fired.

Best Practices for Organizing VBA Modules

A friend of mine recently asked me:

“What is the best way to organize the Modules in my Excel workbook?”

That is to say, what is the best-practice for organizing VBA code?


I’ve searched the web to find a cohesive list of best practices, but I didn’t really find anything more than a few tidbits. Most of the advice out there simply tells you to organize your modules in a way that makes sense for anyone reading the code.

Ok…I got that. Most of us intuitively organize and partition our modules into loosely related procedures. For example, one of our modules may store all procedures pertaining to PivotTable automation, while another module may store all User Defined Functions.

Outside of that concept, I’ll attempt to relay some of the basic ground-rules I’ve always heard (and adhered to).


For me, organizing modules is not just about making it easier for the next poor soul to take over my bloated application. I typically make an attempt (feeble as it may be) to take into account the impact my large multi-procedural modules may have on memory and performance.

Excel (by default) loads code modules only as they're needed/referenced. This means that when any procedure or variable is referenced in your code, the entire module that contains that procedure or variable is loaded into memory.

This is one of the reasons why you’ll see workbooks where, for instance, all global variables are placed in the same module. You don’t want several modules holding global variables because each time a new global variable is referenced, Excel has to load another module into memory. It’s a far better practice to place all global variables into one module so Excel only has to load that one module into memory.
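As a sketch of that idea (the module and variable names are made up):

```vba
' Module: modGlobals
' One dedicated module for every global variable, so that referencing
' any one of them only ever loads this single module into memory.
Public gstrAppName As String
Public glngRecordCount As Long
Public gwbSource As Workbook
```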

Another thing to remember is that the call tree for a particular procedure will ultimately include any additional procedure it references, plus any subsequent procedures that newly called procedure references. In fact, if your procedures reference numerous nested procedures stored in different modules, you could potentially be loading all of your modules into memory.

Confused? Here’s a simple example.

Let’s say you call a procedure found in module X. Module X is loaded. If that procedure includes code that references procedures in modules Y and Z, those modules will also be loaded. If those subsequent procedures reference procedures in other modules, those will also be loaded; and so on.
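Sketched out with made-up procedure names (each comment header marks a separate code module):

```vba
' Module X
Public Sub Main()
    HelperY   ' referencing HelperY loads all of module Y...
End Sub

' Module Y
Public Sub HelperY()
    HelperZ   ' ...which references HelperZ, loading all of module Z
End Sub

' Module Z
Public Sub HelperZ()
    MsgBox "One call to Main has pulled modules X, Y, and Z into memory."
End Sub
```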

In that light, another best practice is to group your most frequently used procedures together in the same module, so that calling one of them doesn't force yet another module to be loaded and compiled.

For example, you may have a module containing three procedures: UpdateCustomerAddress, UpdateCustomerContacts, and UpdateCustomerSales. Given that updates to Customer Address and Customer Contacts happen infrequently, but updates to Customer Sales happen all the time, it’s a better practice to place the procedure that updates Customer Sales in a module containing other frequently used procedures. This way, there is no need to load another module into memory. Excel will only need to load a separate module when Customer Address or Contacts is updated.

This may very well go against your instinct to organize your modules into “topics”, but it can help improve performance for larger complex Excel applications.


One last note:

Excel has an option called Compile on Demand (in the VBE menu, under Tools->Options->General tab). My understanding of the Compile on Demand option is that it tells Excel to load only the portion of the call tree required by the executed procedure. For example, if you call a procedure in module X, any modules containing procedures directly referenced by that procedure are loaded and compiled. However, Excel won’t load modules that are referenced only by other procedures in module X.

In other words, the Compile on Demand option tells Excel to load modules one level deep from the executed procedure’s immediate call tree, not the module’s entire call tree. This effectively makes code execution faster.

You can uncheck this option to turn off Compile on Demand. This tells Excel to load and compile all modules at one time. I frankly don’t know why you would want to do that, but the option is there for you.


All that being said, it is quite possible I have my internal workings of VBA wrong somehow. Feel free to comment and correct me on any misstatements.

Pulling SSRS Data Directly into Excel with PowerPivot

As Excel Analysts, we’re often asked to pull data from various sources. One of those sources, for many of us, is SSRS (SQL Server Reporting Services).

If you’ve pulled data from SSRS before, you’ll know that it’s a bit of a pain.

You’ve got to go to a URL, use the dropdowns on the SSRS interface to make parameter selections, wait for the report to refresh, export to Excel, then copy and paste the data into your data model.

Here is a screenshot of an SSRS Report.


Well today, I’d like to show you a way to leverage PowerPivot to automatically pull SSRS data directly into Excel.


Step 1: Run the SSRS Report and Get an ATOM File

So the first thing you’ll need to do is go to the SSRS report, select your desired parameters, and run it.

After it runs, instead of selecting Export to Excel, choose the Export Data Feed option.

This option allows you to export an ATOM service file.

An Atom Service file is basically a text file that has XML markup defining how to connect and interact with a data source.


When you save an ATOM file, you’ll notice it has the extension .atomsvc.

The name of the file will typically be the name of the SSRS report you pulled it from. You can change the name if you want.

The file itself contains no data. It simply holds the connection parameters you selected when you ran the report.

This file can be used by PowerPivot as a kind of proxy to connect to SSRS and pull the data defined by the parameters.
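If you open the file in a text editor, you'll see something along these lines. This is only a rough sketch of the AtomPub service document format; the actual URL, report name, and parameters will come from your report server:

```xml
<?xml version="1.0" encoding="utf-8"?>
<service xmlns="http://www.w3.org/2007/app"
         xmlns:atom="http://www.w3.org/2005/Atom">
  <workspace>
    <atom:title>LineItemReport</atom:title>
    <!-- The href holds the report URL plus the parameter values you selected -->
    <collection href="http://myserver/ReportServer?%2fReports%2fLineItemReport&amp;rs%3aFormat=ATOM&amp;Year=2016">
      <atom:title>Table1</atom:title>
    </collection>
  </workspace>
</service>
```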


Step 2: Use PowerPivot to Connect through the ATOM file.

Now that you have an ATOM file, you can open a fresh Excel workbook.

On the PowerPivot tab, click the Manage button.


This opens the PowerPivot window.

Now click on the From Other Sources option to activate the Table Import Wizard.

Select the Other Feeds option.


Browse to the ATOM file you saved, then click NEXT.


You can give the data feed a friendly name. In this example, I called my feed LineItemDataFeed.


After you click finish, PowerPivot will start retrieving the data.

After you get a status of Success, you can close the Table Import Wizard.


You should now see the data in the PowerPivot window.

You’ll notice that the first few columns of the data feed are not real data. They are the parameters for the data.

That is to say, they simply show the parameters you selected when first running the SSRS report.

Keeping these columns may be useful to you, but I typically get rid of them.


Here’s how to remove the parameter columns.

On the Design tab of the PowerPivot window, click the Table Properties command.

This will open the Edit Table Properties dialog box.

Here, you can simply uncheck the columns you don’t want to see in the data feed.

After you’ve completed unchecking the columns, click Save.



Step 3: Link the PowerPivot data to an Excel worksheet.

Now that PowerPivot contains the data from the desired SSRS report, you can show that data on an Excel worksheet.

This step involves nothing more than linking the PowerPivot data to a sheet.

First, you can close the PowerPivot Window, and switch to the Excel window.


Now, go to the Data tab in Excel and click the Existing Connections command.

In the Existing Connections dialog box, choose Connections in this Workbook and double-click your target data feed.


Choose to Import Data as a Table and click the OK button.



Step 4: Test/Refresh.

At this point, you’ll have all the SSRS data on a worksheet.

You can now right-click anywhere in the table and click Refresh to get the latest data from SSRS!



1. If you don’t have PowerPivot available to you, you can use Excel’s built-in Get External Data feature. Simply click Data->From Other Sources->From OData Data Feed, then point to the ATOM file you saved in Step 1. That being said, I personally like using PowerPivot so the data I pull can be incorporated with other data in my data model.

2. You can actually edit the parameter values within the ATOM file to get different data. For example, if your ATOM file contains a parameter for year, changing the value from 2016 to 2017 will get you different data. Soooo….you can save one ATOM file, make a copy, and change the parameter values to get different data sets.

3. If the URL for your SSRS report is moved or changes, you’ll need to edit the ATOM file to point to the new URL, or more simply, save a fresh ATOM file from SSRS.

4. Interestingly, PowerQuery doesn’t allow you to point to ATOM files on your PC to get data, so this trick does not work with PowerQuery. I’m sure there is a way to hit SSRS data from PowerQuery, but I haven’t found it yet.

Why ‘Your Version’ of Excel Only Accepts Semicolons in Formulas instead of Commas

Today, I learned how Region settings in Windows can truly botch up Excel.

A reader of one of my Pulitzer Prize-worthy books sent me an email stating that the example formulas weren’t working.

After a bit of back and forth, he sent me this:


Huh? Since when does Excel use semicolons instead of commas as argument separators?
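For example, here's the very same formula under two different Windows regional settings:

```
United States (comma as the list separator):
=IF(A1>10, "High", "Low")

Germany/France (semicolon, because the comma is the decimal separator):
=IF(A1>10; "High"; "Low")
```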

Well, in researching this anomaly, I was surprised to find Read more

Don’t Miss Out – One More Week to Sign Up for Live Power BI Training with Ken Puls

I’m a HUGE proponent of live training.

Webinars and online training are fine, but there is nothing like physically being in an environment of like-minded people.

Ken Puls is hosting a Power BI Bootcamp LIVE in Vancouver, British Columbia.

Now, before you click away from this crass commercial message, let me preempt all your questions by answering them here.


1. Who’s Ken Puls?

He’s an Excel MVP, author of the wildly popular book on Power Query Read more

Winners of the Excel 5-Minute Challenge

The Excel 5-Minute Challenge has come to an end and the Excel community has chosen the winners!

Participants recorded themselves creating a functional dashboard/report in 5 minutes or less. Their videos were posted to YouTube, where winners were selected based on the number of “Likes” each video received.

And the Winners are…


1st prize:

Balázs Voros came in first place with 186 Likes.

He wins the Xbox One!


2nd prize:

Janis Sturis
takes second place with 118 Likes.

He wins the Fitbit Fitness Wristband!


3rd prize:

Alex Powers gets the Bronze with 89 Likes.

He wins the Amazon Fire HD 8!


3 Honorable Mentions

These fine young men took Honorable Mention with their submissions.

Steve Rider (41 Likes)

MF Wong (35 Likes)

Dinesh Natarajan Mohan (32 Likes)


They get to choose from one of these prizes!


DataPig Add-ins Gift Pack


Mr. Excel Gift Pack

From Bill Jelen


Magic of Pivot Table Course

From Ken Puls


RefTreeAnalyser Formula Auditing Tool

From Jan Karel Pieterse


Excel Dashboard Pro Training Course

From Jordan Goldmeier


Contextures Pivot Power Premium Add-in

From Debra Dalgleish


Peltier Tech Charts for Excel 3.0 – Advanced Edition

From Jon Peltier


Congratulations to all the winners.

And a special Thank You goes out to OZ Du Soleil, Doug Clancy, Senthil Thasma, Frédéric Le Guen, and Aamir Bhatti for submitting their own videos. You guys are all awesome.

I’ll hold another contest like this one in the next few months. I’ve got a few ideas to lower the barriers to entry to make things more interesting.

Stay tuned.