Today, I got word that Microsoft has named me to their Most Valuable Professional (MVP) program. While I’m completely honored, humbled, and flattered, I’m also a little surprised. Yes, I knew I had been nominated, but I really didn’t think I would be named this time around.
Microsoft’s MVP website says this about the program:
For more than two decades, Microsoft has recognized exceptional, independent community leaders who share their passion, technical expertise, and real-world knowledge of Microsoft products with others. It is part of Microsoft’s commitment to supporting and enriching technical communities.
Microsoft Most Valuable Professionals, or MVPs, are exceptional community leaders who actively share their high-quality, real-world deep technical expertise with the community and with Microsoft. They are committed to helping others get the most out of their experience with Microsoft products and technologies.
Technical communities play a vital role in the adoption and advancement of technology—and in helping our customers do great things with our products. The MVP Award provides us with an opportunity to say thank you and to bring the voice of community into our technology roadmap.
When you look at the MVP community, I’m a complete second stringer. This community has some really big names, and these people really are the A-team. I tend to think of people who become MVPs as those who write books, speak at big conferences like the PASS Summit, SQLBits, and TechEd. They have thousands of Twitter followers and a blog presence that makes the New York Times look like a backwater operation. I’ve done none of these things. Sure, I help with a user group, organize a few big SQL Saturday events, present at a ton of little conferences, speak at most of the New England SQL user groups, do some volunteer work for PASS, and even had an article published at sqlservercentral.com. Still, I’m not a big name in the community.
So when I found out that I was named an MVP, I was waiting for the “just kidding” email. Seriously, I’m humbled, and I still blame Brent Ozar for sparking my interest in the community. Thank you to the MVPs who nominated me for this program, and thank you to Microsoft for believing in me.
What’s next? I’ve decided that I really need to step up my game when it comes to blogging. I have a vault of great information that I’ve learned working for a hosting company about managing large SQL Server environments. It’s time to start sharing some of that with the community. I guess it’s appropriate that my first event as an MVP will be SQL Saturday in my hometown of Pittsburgh, where I will be presenting Transaction Log Internals: Virtual Log Files, and Recovery and Backups for Beginners.
Several months ago, I got an email from Karla at PASS HQ, asking me if I’d be interested in doing another SQL Saturday. It seems that the leader of our local BI chapter was going to be traveling on business and would be unable to host the Boston BI event. She made it pretty clear that I was in no way obligated to do this and my participation was completely voluntary. I checked with my team, and they were all up for the challenge.
In a way, we were completely ready. Having done two events at this venue in the past, we knew we had a formula that worked, so all we needed to do was repeat the success of the last event. Or so we thought. Over the next few weeks, we scrambled to get things into motion.
Our immediate challenge was to make sure all of the paperwork was in place. Because it was a different team running this event, we had to sign a new license agreement with PASS as well as get approval from Microsoft to use their venue again. Never mind that I was in the process of setting up my own LLC so that the event funds wouldn’t be funneled through my own personal finances. In very short order, Bay State Data Professionals, LLC was born.
After a mountain of paperwork, we were ready to roll. Or so we thought. It seems that the local BI team hadn’t reserved any space for Friday’s setup, let alone space for a precon. When we use the Microsoft facility in Cambridge, we always get the entire 4th floor for Friday and the entire facility (2nd and 4th floors) for Saturday. As luck would have it, Friday wasn’t available to us, so setting up would become a challenge. This also meant no precon.
Over the next week, Friday’s availability was resolved, and I started out in search of a precon partner and a key sponsor. Finding a precon partner proved to be incredibly difficult. All of the key BI people were already tied up with other engagements. Finding the key platinum sponsor was much easier. I went straight to our friends at Pragmatic Works. BI is their playground, and they were happy to sponsor us. I was happy to give them a great deal on a platinum sponsorship in exchange for something I really needed, a keynote speaker. Adam Jorgensen stepped right up and was happy to do our keynote.
Suddenly, things were falling into place, and it looked like we would have a great event. But it wasn’t all sunshine and roses. More on that tomorrow.
I had an incident come across my desk the other day that was rather odd. We had an application that was running slow, and my Windows team noticed that SQL Server was using all of the available RAM on the server. Before I gave my Windows guru the Brent Ozar lecture on Task Manager being a filthy liar, I wanted to give it a look myself first. My instinct told me that we had a configuration problem because it’s very rare for my boxes to page to disk. We’re pretty conservative when setting MAX RAM. That’s when I found the culprit on the Memory tab of the server’s configuration.
Yes, we had a configuration problem. I immediately set MIN and MAX RAM to 16 GB, which is our best practice for a server with 32 GB running this particular application. Within a few seconds, my Windows guru asked how I fixed it. It really was that quick of a fix. Great. But how do we prevent this from happening in the future? My Grandma Hillwig used to say that an ounce of prevention is worth a pound of cure. She was a pretty smart woman. The first thing I did was have my team check the build documentation to make sure that this gets set during server setup. I also had them check the peer review checklist to make sure it gets verified. Checklists are good and all, but I had this little itch to automate the check. This little script took me less than an hour.
```sql
SET NOCOUNT ON;

DECLARE @v_max_server_memory int;
DECLARE @v_min_server_memory int;

CREATE TABLE #config (
    name         varchar(128),
    minimum      int,
    maximum      int,
    config_value int,
    run_value    int
);

INSERT #config
EXEC sp_configure;

SELECT @v_max_server_memory = config_value
FROM #config
WHERE name = 'max server memory (MB)';

SELECT @v_min_server_memory = config_value
FROM #config
WHERE name = 'min server memory (MB)';

DROP TABLE #config;

IF @v_max_server_memory > 262144
   AND @v_min_server_memory < 512
   AND PATINDEX('%Hypervisor%', @@version) > 0
BEGIN
    DECLARE @v_recipient varchar(128);
    DECLARE @v_subject varchar(128);
    DECLARE @v_body varchar(2000);

    SELECT @v_recipient = 'email@example.com';
    SELECT @v_subject = 'SQL Server Best Practices Alert';
    SELECT @v_body = 'SQL Server instance ' + @@servername
        + ' has failed a best practices check. This server is a VM and has MIN RAM set to '
        + CAST(@v_min_server_memory AS varchar)
        + ' and MAX RAM set to '
        + CAST(@v_max_server_memory AS varchar) + '.'
        + CHAR(10) + CHAR(10)
        + 'Setting MIN RAM too low will cause the VMWare balloon driver to force SQL Server '
        + 'to give up RAM. Setting MAX RAM too high will allow this server to start to page to disk.';

    EXEC msdb.dbo.sp_send_dbmail
        @profile_name = 'dbmail',
        @recipients   = @v_recipient,
        @body         = @v_body,
        @subject      = @v_subject;
END
```
Now, I would argue that one should set MIN and MAX RAM on all instances, not just those in virtual environments. However, it’s absolutely critical in virtual environments. Notice that I’m looking for the word “Hypervisor” in @@version, and I’m looking for servers that have MAX RAM set higher than 256 GB (262144 MB). None of the instances in my environment are anywhere near that big. Your mileage may vary, and you’ll need to adjust that value for your environment.
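For the record, the quick fix itself was nothing fancy. Something like this does it — 16384 MB is our best practice for that particular 32 GB box, so adjust the numbers for yours:

```sql
-- Set MIN and MAX server memory to 16 GB (16384 MB).
-- Both are advanced options, so expose them first. Requires sysadmin.
EXEC sp_configure 'show advanced options', 1;
RECONFIGURE;
EXEC sp_configure 'min server memory (MB)', 16384;
EXEC sp_configure 'max server memory (MB)', 16384;
RECONFIGURE;
```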
Now that I have a script, I just need to deploy it. I tested it by running it against all of the instances in my environment through the CMS (Central Management Server). After that, we’ll deploy it as a SQL Agent MSX job that runs on every instance once a week. The next step is to add our code segment that will send it to our service desk software and have it parse out the right configuration items.
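If you’re curious what the Agent side might look like, here’s a rough sketch of the weekly job. The job name, the schedule, and the idea of wrapping the check in a stored procedure are all just illustrative; on an MSX setup you’d create this on the master server and point it at the member instances with sp_add_jobserver.

```sql
USE msdb;

-- Hypothetical job name; the check script above is assumed to be
-- wrapped in a stored procedure for the job step.
EXEC dbo.sp_add_job
    @job_name = N'Best Practices - Memory Check';

EXEC dbo.sp_add_jobstep
    @job_name  = N'Best Practices - Memory Check',
    @step_name = N'Run memory check',
    @subsystem = N'TSQL',
    @command   = N'EXEC dbo.usp_memory_best_practice_check;';

EXEC dbo.sp_add_jobschedule
    @job_name = N'Best Practices - Memory Check',
    @name     = N'Weekly',
    @freq_type = 8,              -- weekly
    @freq_interval = 1,          -- on Sunday
    @freq_recurrence_factor = 1, -- every week
    @active_start_time = 060000; -- 6:00 AM
```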
I’m incredibly excited right now.
SQL PASS has just announced that SQL Saturday #262 Boston 2014 will be held on March 29 at the Microsoft Conference Center in Kendall Square in Cambridge, MA. Once again, I will be leading the team that organizes this event.
We had such an awesome experience with the 2013 event, and I’m really looking forward to 2014!
You can register for the event at the event website.
SQL Saturday season is starting, and I will be at two upcoming events.
On August 17, I will be presenting Recovery and Backup for Beginners in New York City. On September 14, I will be presenting What the VLF? in Orlando. I’ve also submitted to speak at SQL Saturday 213 in Providence.
In October, I will be attending the SQL PASS Summit. This is the Super Bowl of SQL Server learning events.
At this point, I can’t say too much, but SQL Saturday Boston 2014 is in the works.
I was looking at the syntax of DBCC SHRINKDATABASE today and came across a little gem. Yes, I’m shrinking. I’m also doing it to recover close to a terabyte of storage. But I’m staying in control of when we shrink.
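For what it’s worth, “staying in control” just means running the shrink myself, once, with a sensible free-space target — the database name here is made up:

```sql
-- A one-time, deliberate shrink: reclaim unused space but leave
-- 10 percent free in the database. Not something to schedule.
DBCC SHRINKDATABASE (MyBigDatabase, 10);
```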
Looking at the MSDN page with the syntax, I came across this little nugget of truth.
Let me copy and paste the text so that the search engines pick this up, too.
Unless you have a specific requirement, do not set the AUTO_SHRINK database option to ON.
Countless people have documented this before, myself included. Microsoft is saying it. Please disable AUTO_SHRINK. Now. Please.
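If you’re not sure where AUTO_SHRINK is lurking, here’s a quick way to find the offenders and turn it off — the database name in the ALTER is a placeholder:

```sql
-- Which databases have AUTO_SHRINK enabled?
SELECT name
FROM sys.databases
WHERE is_auto_shrink_on = 1;

-- Turn it off, one database at a time.
ALTER DATABASE MyBigDatabase SET AUTO_SHRINK OFF;
```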
And then after you disable AUTO_SHRINK, please look at your indexes. They’re probably a mess.
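Here’s a quick way to see just how messy, if you haven’t looked in a while — the 30 percent threshold is just a common rule of thumb:

```sql
-- Index fragmentation in the current database, worst first.
SELECT OBJECT_NAME(ips.object_id) AS table_name,
       i.name AS index_name,
       ips.avg_fragmentation_in_percent
FROM sys.dm_db_index_physical_stats(DB_ID(), NULL, NULL, NULL, 'LIMITED') AS ips
JOIN sys.indexes AS i
    ON i.object_id = ips.object_id
   AND i.index_id  = ips.index_id
WHERE ips.avg_fragmentation_in_percent > 30
ORDER BY ips.avg_fragmentation_in_percent DESC;
```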