# Thursday, January 29, 2009

As our production database gets more and more data in it, we noticed that things were slowing down. I ran SQL Profiler trying to figure out if we needed to add more indexes, better arrange the data, or do anything else to improve performance. After about an hour of running queries, creating indexes, profiling, and looking at execution plans, I had barely any performance gains to show for it.

I then decided to take a different tack and wondered if our indexes needed to be rebuilt. A quick Google search later, I came across an article on Tips for Rebuilding Indexes over at SQL Server Performance.

The gist of the article is that your indexes get fragmented over time and need to be rebuilt. I ran the script presented in the article and noticed a substantial improvement in performance. I failed to capture metrics to quantify the improvement, but the users definitely noticed.

Without further ado, here is the script from the article:

If you're using MS SQL Server 2000:

USE DatabaseName --Enter the name of the database you want to reindex

DECLARE @TableName varchar(255)

DECLARE TableCursor CURSOR FOR
SELECT table_name FROM information_schema.tables
WHERE table_type = 'base table'

OPEN TableCursor

FETCH NEXT FROM TableCursor INTO @TableName
WHILE @@FETCH_STATUS = 0
BEGIN
DBCC DBREINDEX(@TableName, ' ', 90)
FETCH NEXT FROM TableCursor INTO @TableName
END

CLOSE TableCursor
DEALLOCATE TableCursor

If you're using MS SQL Server 2005:

USE DatabaseName --Enter the name of the database you want to reindex

DECLARE @TableName varchar(255)

DECLARE TableCursor CURSOR FOR
SELECT table_name FROM information_schema.tables
WHERE table_type = 'base table'

OPEN TableCursor

FETCH NEXT FROM TableCursor INTO @TableName
WHILE @@FETCH_STATUS = 0
BEGIN
EXEC ('ALTER INDEX ALL ON [' + @TableName + '] REBUILD WITH (FILLFACTOR = 90)')
FETCH NEXT FROM TableCursor INTO @TableName
END

CLOSE TableCursor
DEALLOCATE TableCursor
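Not part of the original article, but before rebuilding everything it can be worth checking how fragmented your indexes actually are. On SQL Server 2005 you can query the sys.dm_db_index_physical_stats DMV (on SQL Server 2000, DBCC SHOWCONTIG plays a similar role); this is just a sketch:

```sql
-- Sketch: list indexes in the current database with heavy fragmentation
-- (SQL Server 2005+; 30% is a commonly cited rebuild threshold).
SELECT OBJECT_NAME(ips.object_id) AS table_name,
       i.name AS index_name,
       ips.avg_fragmentation_in_percent
FROM sys.dm_db_index_physical_stats(DB_ID(), NULL, NULL, NULL, 'LIMITED') AS ips
JOIN sys.indexes AS i
  ON i.object_id = ips.object_id
 AND i.index_id = ips.index_id
WHERE ips.avg_fragmentation_in_percent > 30
ORDER BY ips.avg_fragmentation_in_percent DESC;
```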

HowTo | SQL
Thursday, January 29, 2009 10:04:54 AM (Alaskan Standard Time, UTC-09:00)
# Saturday, January 24, 2009


Call me a control freak, but I like to see all the shared volumes on my Mac. I could open Terminal and cd to the Volumes folder, or I could use Finder's Go -> Go to Folder to see everything that OS X has mounted in my Volumes folder. But that's kind of a pain.


After a little bit of digging around, I found out about SetFile. SetFile is a command line utility that allows you to set the file attributes on files in an HFS+ directory. After figuring out the parameters for it, I came up with this little ditty to make the Volumes folder show up under "Macintosh HD." Run this in Terminal:

sudo SetFile -a v /Volumes

With this command, you are setting the visibility attribute on the Volumes folder to visible. To reverse the process, change -a v to -a V. Open up "Macintosh HD" and you should now see all the volumes mounted on your Mac!

Saturday, January 24, 2009 8:59:00 PM (Alaskan Standard Time, UTC-09:00)
# Friday, January 23, 2009


When I originally got my Mac Mini, Mac OS X 10.5.1 was out. I did a quick Google search and found simple steps to follow to get Time Machine to back up to a network volume on my Linux server. Everything worked great!

When Mac OS X 10.5.2 was released a few weeks later, Time Machine would no longer back up to my file server. I did a lot of Googling and found I wasn't the only one with the problem, but couldn't find any solutions. I've given it the old college try a few times since then, as recently as mid-December 2008, but to no avail.

I don't know what possessed me to try and get it working again this time, but I did. And I won! It wasn't an easy battle, nor was it an epic battle. But it was a battle nonetheless.


I found various postings on the net about how to get everything working, including Hupio's OSX Timemachine and Samba/Windows share. But nothing really worked. I kept getting the error message "the backup disk image could not be mounted."

I almost gave up again, but decided to Google the error message. I came across a few more sites, but they didn't have anything of interest. I don't know why, but I tried creating my sparse bundle on the network share itself, instead of creating it on the Mini and moving it to the network share. That got me the error message "hdiutil: create failed - Operation not supported".

Googling that error message led me to Viraj's post about Time machine + AFP + Ubuntu - Samba. Viraj got everything working by installing the AFP service on his Linux (Ubuntu) server. He linked to How to: Install Netatalk (AFP) on Ubuntu with Encrypted Authentication which was perfect because I happen to be running an Ubuntu server.


If you're using a Linux file server like I am, and want to back up your Mac using Time Machine to your file server, follow these steps:

1. Install AFP on your Linux server

2. Figure out where you are going to store the backups on your file server. I stored mine in /media/backup/TimeMachine. You will need to edit your /etc/netatalk/AppleVolumes.default file and point it to the directory:

echo "/media/backup/TimeMachine \"Time Machines\"" | sudo tee -a /etc/netatalk/AppleVolumes.default

3. Restart netatalk

sudo /etc/init.d/netatalk restart

4. Mount your "Time Machines" volume. Finder -> Go -> Connect to Server and enter afp://IPADDRESS/Time Machines

5. Create a sparse bundle. If your OS volume is case-sensitive like mine, run this in Terminal:

hdiutil create -library SPUD -size 50g -fs "Case-sensitive Journaled HFS+" -type SPARSEBUNDLE -volname "TimeMachine for YOURNAME" "YOURMACSNAME_MACADDRESS.sparsebundle"
This will create a 50 GB sparse bundle for Time Machine. If your OS volume is not case-sensitive (the default), use this command:
hdiutil create -library SPUD -size 50g -fs "Journaled HFS+" -type SPARSEBUNDLE -volname "TimeMachine for YOURNAME" "YOURMACSNAME_MACADDRESS.sparsebundle"

I'm not going to go into the details of the command line; the link above goes into greater detail. You will need to read the article so you can plug in the correct values.

6. Move your newly created sparsebundle to your "Time Machines" share:

mv mini_MACADDRESS.sparsebundle /Volumes/Time\ Machines/

7. Configure your Mac to allow backing up to a network share:

defaults write com.apple.systempreferences TMShowUnsupportedNetworkVolumes 1

8. Finally, open Time Machine, click "Change Disk" and point it to your "Time Machines" volume. In 2 minutes, Time Machine will start backing up your data to your Linux network file system!
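One detail that trips people up is the sparse bundle's file name: it needs to be your Mac's computer name, an underscore, and the MAC address of your primary network interface with the colons stripped (that is what YOURMACSNAME_MACADDRESS stands for above). As a hypothetical example, with a made-up MAC address (on the Mac, ifconfig en0 will show yours), the transformation looks like this:

```shell
# Build the sparse bundle name Time Machine looks for:
#   <computer name>_<MAC address without colons>.sparsebundle
# The MAC address here is a made-up example.
MAC_ADDRESS="00:1f:5b:aa:bb:cc"
MAC_FLAT=$(echo "$MAC_ADDRESS" | tr -d ':')
echo "mini_${MAC_FLAT}.sparsebundle"   # → mini_001f5baabbcc.sparsebundle
```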

A quick note about the conventions used above

All commands are meant to be run in Terminal. The commands that start with sudo (steps 2 and 3) are meant to be run on your Linux server.
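For reference, here is what the line appended in step 2 looks like inside /etc/netatalk/AppleVolumes.default once the echo has run (path and share name are the ones used above):

```
/media/backup/TimeMachine "Time Machines"
```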

Friday, January 23, 2009 10:42:58 PM (Alaskan Standard Time, UTC-09:00)

Wednesday, I posted my musings on becoming more grounded as a developer. Yesterday, I tried in earnest to put my reflections into action.

I exercised my code from every angle I could think of, and found a bunch of minor nits. I fixed and refactored the code until I had something I was truly proud of. The problem is, I became so focused on the details that I forgot about a main scenario.

Given the state of my laptop, it would have been a pain, though not impossible, to test this scenario. All in all, I can say that I'm pleased with my performance yesterday, and will try in earnest to keep this up going forward.

Friday, January 23, 2009 7:32:19 AM (Alaskan Standard Time, UTC-09:00)
# Wednesday, January 21, 2009

I'm trying out blog editing software on my Mac Mini. So far, I've used MarsEdit and Ecto. I'm honestly not very impressed with either of them. MarsEdit doesn't support (that I've found) rich text editing from within its editor. It only supports raw HTML editing, but will shell out to other editors to do the rich text editing.

Ecto took me quite a few tries to find a good download link. When I finally got it downloaded, it isn't too terrible. The UI isn't very clean, IMHO. Its interface for adding links is sub-par and NOT discoverable. At least it supports a rich text editor out of the box.

I wish Microsoft would make a version of Windows Live Writer for the Mac :)


Wednesday, January 21, 2009 9:48:12 PM (Alaskan Standard Time, UTC-09:00)

In the current version of the software we are working on, I've become very complacent as a developer. I've written my code, and then been done with it. I've tested what I've written. But after seeing the number of issues that have arisen, I think I can say I haven't really tested my code.

It's a humbling feeling, this self-realization I've come to. But the bigger and better question is: what am I going to do moving forward? I'm going to allow this to make me more humble.

I need to put on my tester hat more. I need to find that glee and excitement I had long ago when I would break other people's code, and apply that to my own work. I need to be more pessimistic regarding what might fail; not optimistic about everything going right.

I also need to take a page out of my friend Aaron's book and realize that it's really a personal failure on my part if a defect in my code makes it out of development. If testing finds a defect in my stuff, then in my opinion, I have failed as a developer. But it doesn't make me a failure as a person, or as a developer. It does mean that I need to redouble my efforts to ensure it doesn't happen again.

Wednesday, January 21, 2009 9:26:46 PM (Alaskan Standard Time, UTC-09:00)
# Tuesday, January 06, 2009

After much time, trial, and error, I was finally able to get Visual Studio's remote debugging features to work. In my travels around the Internet, no one seems to have compiled all the steps to make the process work successfully and seamlessly into one page; this is my attempt.

Setting up

Complete all the steps listed on the How to: Set Up Remote Debugging article on MSDN.


Don't try to fight the cross-domain permissions battle; it's just not worth it. If the machine you're trying to debug is not on a domain, then don't run VS from a machine that's on the domain. Make sure the same user with the same password exists on both machines.


Put the PDB files on the remote machine (target) in the same folder as your app.

Source Code

In our environment, our build server produces an installer, and it also produces the PDBs we used in the above step. If you want Visual Studio to automatically pick up the right source files when you are debugging, you need to store the source code on your host machine in the same location as on the machine that built the PDBs.

For example, in our world, the code lives on our build server at d:\code\projectname\code\ so on your host machine, you would store your source code at d:\code\projectname\code\ as well.


If you have followed all these steps, and if the stars align just right, you should now be able to step through your source code when remote debugging.


Questions, comments, or just can't make it work for some reason? Leave a comment and fill out your actual email address, and I'll try to address it!



Tuesday, January 06, 2009 1:20:47 PM (Alaskan Standard Time, UTC-09:00)