Wednesday, November 28, 2012

Elbow injury - "terrible triad" - Follow Up

This is part 4 of 4 posts I have written about my elbow surgery. You can read them all here:
Part 1: The Emergency Room 

Part 2: Surgery and Recovery
Part 3: The Ketamine Experience

Part 4: Follow Up

It has been 3 1/2 years since I broke my elbow and I thought I should write a quick follow up to let you know how things are going. Some time has passed, so some of these numbers might be off a little bit.

Physical Therapy

One thing that really surprised me was the amount of physical therapy I had to go through. After surgery, I had the partial cast/splint thing on my arm for a couple weeks. At a follow-up appointment with the surgeon, he removed the splint and had me start scheduling physical therapy as part of my recovery. He said he was going to start me off with 28 (I think) appointments -- I thought he was joking! At first I was going twice a week, and then at the end I was going only once a week. At some point I ran out of my prescribed PT appointments and they had to go and prescribe more. I really don't know how many times I went, but I broke my arm at the beginning of summer and I remember still going to PT in the fall, maybe even winter.

I was told by my physical therapist that during surgery the surgeon tested the mobility of my arm to make sure I still had full range of motion. Being in the cast/splint for two weeks caused my muscles to atrophy and the muscles and tendons to shorten. It was absolutely amazing how much range of motion I had lost. After taking the splint off, my arm was permanently bent, and I could only bend or straighten it a couple degrees -- literally just 2 or 3 degrees in either direction. I could only rotate my wrist a couple degrees in either direction. I also could not bend my hand back. Much later, the physical therapist told me that she was actually a bit worried about how much I would be able to recover, especially with the wrist rotation.

Physical therapy sessions were pretty much the same each time. First they would heat up my elbow for several minutes and then stretch my arm to try to lengthen the muscles and tendons. It was a little painful, but not too bad. I had a series of stretches to do at home as well. A couple sessions in, I was fitted for what I call the torture device. It was a plastic contraption that fit on my arm that I could use to help with my stretches at home. You basically strap your upper and lower arm into it and turn a crank to stretch your arm. It works quite well, but it is painful. The thing made me angry when I used it. My girlfriend and I had to make an agreement -- there was no talking to me while I was using the torture device. After several months of using the thing, the teeth of the gear started to break off. I felt joy that I broke it before it broke me.

As a side note, looking back something went wrong with getting the torture device. I think I was supposed to get it right away, but I think the sales rep for the company messed something up. I could tell that my physical therapist was mad that it was taking so long and I sometimes wonder if my recovery would have been better if I had received the device earlier.

I pretty quickly regained rotation of my wrist, and flexing of my hand - I'm not really sure, but I think that only took a couple weeks. The thing that took the longest to recover was the extension and flexion of my elbow joint. It took me months to get to a normal range.


Eventually, my progress in physical therapy came to a stop. Once the progress stopped, so did the physical therapy sessions.

Today

Three and a half years later, my elbow is the same as when I stopped PT. I regained extension to about 170 degrees (these numbers are really rough estimates) compared to 180+ on my good arm. I can flex my arm to about 60 degrees compared to 45 on my good arm. Rotation is pretty good - I might be off by about 10 degrees there, but that seems to be pretty minor. So, I can't flex my arm completely, or extend my arm completely, but I do have what the physical therapist called "full functional use" of my arm. That means there are very few things in your day-to-day activity that require full extension or flexion of your arm. The only thing I have noticed to be difficult is buttoning the top button on a shirt. I really have to stress my arm to get my hand in that position. All the other buttons are easy, so really it only comes into play when I wear a tie, which for me is quite rare.

I was told that in a bad case where you don't regain enough range of motion, they can perform another surgery to help extend your range, but even in that case, the best they would hope for from that second surgery is to get full functional range of motion.

Risks and Complications

One of the risks associated with this type of surgery is that you can grow extra bone. For some reason, elbow injuries in particular have a good chance of extra bone growth. I don't remember the exact numbers but some decent percentage (maybe even 50%) of people with elbow breaks like I had will have extra bone growth. In those with extra bone growth, another decent percentage (maybe 50% again??) will have so much bone growth that they will need another surgery to remove the bone growth.

The surgeon showed me my x-ray and pointed out a cloudy area that he had concerns about before the surgery. I did end up having extra bone growth, but it was not so bad that I required another surgery.

One thing that showed up during my recovery was a little nodule on the ring finger tendon in my palm. It is called Dupuytren's contracture - basically the tissue thickens and contracts, pulling your finger inward. The little nodule is still there, but it doesn't seem to bother me at all -- at first that finger was contracted a little bit, but not anymore. It might be something I have to deal with later in life. I guess there might be some vague correlation between trauma and Dupuytren's.

I have noticed nodules under the incision line. I've been told those are pretty common, related to the suturing, and easy to take care of if they bother me. I'm planning on getting them checked out just to be sure it's not something else.

So, that's about it. Three and a half years later, my elbow is doing well. I have a big scar from surgery. I have "full functional" use of my arm and the only thing I have trouble with is buttoning the very top button on a shirt. I haven't put the clip-in pedals back on my bike - maybe someday.

One last thing. My physical therapist told me about another patient she had dealt with that had the same injury. He was a big burly weight lifter. He broke his arm while jumping on the bed with his kids. I thought that was a cute story.



Friday, February 17, 2012

Get Amazon S3 Resource URL with Powershell

I've been working on automating our build and deployment process recently. We are storing our binaries in S3 and I needed a script to pull the binaries from S3.

I thought this might help other people, so here you go.


# Generates a pre-signed S3 URL using query string authentication.
# Reference: http://docs.amazonwebservices.com/AmazonS3/latest/dev/RESTAuthentication.html#RESTAuthenticationQueryStringAuth
#
# Signature = URL-Encode( Base64( HMAC-SHA1( YourSecretAccessKeyID, UTF-8-Encoding-Of( StringToSign ) ) ) )
#
# StringToSign = HTTP-VERB + "\n" +
#                Content-MD5 + "\n" +    # empty for this request
#                Content-Type + "\n" +   # empty for this request
#                Expires + "\n" +
#                CanonicalizedAmzHeaders +
#                CanonicalizedResource   # e.g. /bucket-name/path/to/file

function get-s3Url ($server, $resourceUrl, $accessKey, $secretKey, $expireDate)
{
    # Convert the expiration date to seconds since the Unix epoch
    $s3BaseTime = [System.DateTime]::Parse("1970-01-01T00:00:00.0000000Z")
    $expires = [Convert]::ToInt64($expireDate.Subtract($s3BaseTime).TotalSeconds).ToString()
    $stringToSign = "GET`n" + "`n" + "`n" + "$expires`n" + "$resourceUrl"

    # Sign the string with HMAC-SHA1 using the secret key
    $sha = New-Object System.Security.Cryptography.HMACSHA1
    $utf8 = New-Object System.Text.UTF8Encoding
    $sha.Key = $utf8.GetBytes($secretKey)
    $seedBytes = $utf8.GetBytes($stringToSign)
    $digest = $sha.ComputeHash($seedBytes)
    $base64Encoded = [Convert]::ToBase64String($digest)

    # URL-encode the signature so it is safe in a query string
    $null = [Reflection.Assembly]::LoadWithPartialName("System.Web")
    $urlEncoded = [System.Web.HttpUtility]::UrlEncode($base64Encoded)

    $fullUrl = $server + $resourceUrl + "?AWSAccessKeyId=" + $accessKey + "&Expires=" + $expires + "&Signature=" + $urlEncoded
    $fullUrl
}

$server = "https://s3.amazonaws.com"
$resourceUrl = "/[your bucket name]/[path to your file in s3]"
$accessKey = "[your access key]"
$secretKey = "[your secret key]"
$expires = [System.DateTime]::Now.AddMinutes(5)

$url = get-s3Url $server $resourceUrl $accessKey $secretKey $expires

Write-Host $url
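As a side note, if you want to sanity-check the signing logic outside PowerShell, here is a rough Python equivalent of the same query-string authentication scheme (what AWS calls Signature Version 2). This is just a sketch -- the bucket, key, and credentials below are made-up placeholders, not part of the script above:

```python
import base64
import hashlib
import hmac
import urllib.parse

def get_s3_url(server, resource_url, access_key, secret_key, expires_epoch):
    """Build a pre-signed S3 GET URL (Signature Version 2 style).

    expires_epoch is a Unix timestamp: seconds since 1970-01-01 UTC.
    """
    # Same StringToSign layout as the PowerShell script:
    # verb, empty Content-MD5, empty Content-Type, expiry, resource
    string_to_sign = "GET\n\n\n{}\n{}".format(expires_epoch, resource_url)

    # HMAC-SHA1 over the string, base64-encoded, then URL-encoded
    digest = hmac.new(secret_key.encode("utf-8"),
                      string_to_sign.encode("utf-8"),
                      hashlib.sha1).digest()
    signature = urllib.parse.quote(
        base64.b64encode(digest).decode("ascii"), safe="")

    return "{}{}?AWSAccessKeyId={}&Expires={}&Signature={}".format(
        server, resource_url, access_key, expires_epoch, signature)

# Placeholder values for illustration only
url = get_s3_url("https://s3.amazonaws.com", "/my-bucket/builds/app.zip",
                 "AKIDEXAMPLE", "secret", 1325376000)
print(url)
```

Given the same secret key, resource path, and expiry, this should produce the same signature as the PowerShell function.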

Tuesday, July 5, 2011

TDI Limited - possible scam

A friend of mine was contacted by someone claiming to be from a company called TDI Limited. The job involves: "Transferring payments straight from customers to our free-lancers by means of international money transfer systems"

It seems like a scam to me. I'm just putting this out there to see if anyone else hits this page looking for similar info.

Please leave a comment if you are getting something similar.

UPDATE: So, this is a scam. It is called a payment processing scam. See here for more details: http://www.delphifaq.com/faq/scams/f1057.shtml

Basically, they will have you set up a bank account, make a fraudulent transfer to your account, then ask you to make transfers out. After a couple days the bank will notice the first fraudulent transfer and go after you for the money. Also, I don't know what the definition of money laundering is, but this seems close.

Furthermore, if you look up the registration information of their website here:
http://www.whois.net/whois/tdi-limited.com
You will see they only registered the website on June 26, 2011. So, this big international company has only had a website for two weeks.

Stay away from this scam.

Saturday, May 7, 2011

How Sovereign Bank lost me as a customer with one letter

I just got the monthly statement for my Sovereign Bank business checking account. And this lovely notice was on the first page.



If you can't read the image, it says: "Beginning in June, non-customers cashing checks drawn on your Sovereign Bank business accounts will be charged a $5.00 fee by Sovereign Bank unless you have agreed to pay some or all of this fee." It then goes on to describe a non-customer as essentially anyone that does not have an account with Sovereign Bank.

$5 to cash a check? That seems very unreasonable. Is depositing a check into another bank account considered cashing a check? I don't think so, but it isn't entirely clear to me. I'm pretty sure this is aimed at people walking into a bank branch and converting a check to cash.

I don't actually write many checks. In fact, I think the only checks I write are to myself. So, I'm not going to get hit by this fee, but I've had enough of the large banks and their unreasonable fees. Sovereign Bank you've just lost a customer. Local banking, here I come!

Sunday, February 27, 2011

WCF/MSMQ/.Net 4.0 stops processing messages–code fix

I recently upgraded a self hosted WCF service to .Net 4.0. Everything seemed to go well, but a few hours after deploying the service, messages started piling up in the queue. I checked the service host process and it was idle. It was still executing, but the service had stopped pulling messages. After a bit of trial and error I have discovered the problem and hopefully a solution.

The problem

It seems that .Net 4 has changed the behavior of WCF/MSMQ services slightly. If a problem occurs with the queue, the service host faults and stops.

The solution (I think)

To address the problem, I wrote a class that creates the ServiceHost and attaches an event handler to the Faulted event. If a fault occurs, I abort the ServiceHost and start a new instance. (This is for a self-hosted WCF service – if you are hosting your service in IIS/WAS, I think you need to create a ServiceHostFactory class.)  Here is my code:

/// <summary>
/// In .Net 4, the WCF/MSMQ behavior seems to have changed slightly. Occasionally, the service will fault and not restart, causing
/// messages to get stuck in the queue. This class attempts to abort and re-start the service host in the event of a fault.
/// </summary>
/// <typeparam name="T">Your service class</typeparam>
public class RestartingServiceHost<T>
{
    protected static readonly log4net.ILog Log = log4net.LogManager.GetLogger(System.Reflection.MethodBase.GetCurrentMethod().DeclaringType);
    ServiceHost _serviceHost;

    // Create and open a new host, watching for faults
    public void Start()
    {
        _serviceHost = new ServiceHost(typeof(T));
        _serviceHost.Faulted += new EventHandler(host_Faulted);
        _serviceHost.Open();
    }

    // Abort the faulted host and bring up a fresh one
    private void Restart()
    {
        _serviceHost.Abort();
        Start();
    }

    private void host_Faulted(object sender, EventArgs e)
    {
        Log.Info("Host faulted. Restarting service.");
        Restart();
    }
}


To use the class, simply create an instance of this class using the type of your service class:



var host = new RestartingServiceHost<MyService>();
host.Start();



That’s it. It seems to be working so far.

Setting up MVC 3 for Azure with a scripted build

 

I just spent a little time setting up an Azure project using ASP.Net MVC 3 and a scripted build. I thought I would share the solution. Be warned, I'm not an expert in this stuff -- let me know if you have any ideas on how to improve this.

Step 1: Set up the directory structure.


I like to follow a relatively simple directory structure for my projects. Of course you can use your own structure; I like this one.

\ProjectName\src     <-- source code, project files, etc.
\ProjectName\lib     <-- 3rd party libraries
\ProjectName\tools   <-- tools (FxCop, 3rd party build tool components, etc)

For this post, I will be using the project name "Gamma".


Step 2: Create the VS2010 solution



Start Visual Studio 2010 and create a new solution. I like to start with a blank solution and add projects as needed. Save as "\Gamma\src\Gamma.sln"



Add a new Windows Azure project. When you are prompted to add roles to the Azure project, skip this step. At the time of this post, the VS templates do not include an Azure MVC 3 project. To get around that, add the empty Azure project to the solution, then add a new MVC 3 project to the solution. Finally, in the Azure project, add a "web role project in solution..." and select the MVC 3 project.



So, now you should have \ProjectName\src\ directory with one solution file, a directory for the Azure project, a directory for the MVC project and possibly a directory for the unit test project if you elected to add one.



Step 3: Create command line build scripts



I like to have a command-line build, especially for automating your build with a tool like Hudson or Jenkins. I found a post a while back about automating .Net builds and I really liked the approach. I will try to search for a link later to give credit. You create 3 files:




  1. a simple MSBuild project file that builds your solution,
  2. a script file to kick off the build from the command line and
  3. a batch file that allows you to double-click on a file in Windows Explorer to produce a build.



Here are the files I use:



build.ps1 -- powershell script for command line builds



if ($args)
{
& $env:systemroot\Microsoft.Net\Framework\v4.0.30319\MSBuild.exe build.proj /t:$args /verbosity:minimal
}
else
{
& $env:systemroot\Microsoft.Net\Framework\v4.0.30319\MSBuild.exe build.proj /verbosity:minimal
}


ClickToBuild.bat -- batch file for kicking off a build from windows explorer



PowerShell -Command ".\build.ps1"
pause





build.proj -- MSBuild file for building your solution.



<?xml version="1.0" encoding="utf-8" ?>
<Project xmlns="http://schemas.microsoft.com/developer/msbuild/2003">

  <PropertyGroup>
    <ProjectName>Gamma</ProjectName>
  </PropertyGroup>

  <Target Name="Package" DependsOnTargets="Clean;DebugBuild">
    <Message Text="Creating Package for $(ProjectName)" Importance="High" />
    <MSBuild Projects="$(ProjectName).sln" Targets="Publish" Properties="Configuration=Debug" />
  </Target>

  <Target Name="DebugBuild">
    <Message Text="Building $(ProjectName)" Importance="High" />
    <MSBuild Projects="$(ProjectName).sln" Targets="Build" Properties="Configuration=Debug" />
  </Target>

  <Target Name="Clean">
    <Message Text="Cleaning $(ProjectName)" Importance="High" />
    <MSBuild Projects="$(ProjectName).sln" Targets="Clean" Properties="Configuration=Debug" />
  </Target>
</Project>





These are relatively simple. I save them in the root of the "src" directory. In Visual Studio, I add a solution folder named "build" and add these 3 files to that solution folder.



The important thing to note is the Targets="Publish" in the package target of the MSBuild file. The "publish" target is what creates the Azure package files that you need for deployment. By default they will be in the \bin\Debug\Publish\ directory of the Azure project.



Note: Azure does not have the MVC binaries by default. You will need to get these to your Azure instance. There are a couple ways to accomplish this. You can either include all the DLLs in your project and set them to "copy local", or you can install MVC3 as part of the Azure startup script. Links to come later.



Let me know if this helped. If I find some time I will add some screen shots.

Tuesday, November 9, 2010

Troubleshooting SQL Mirror errors

I was setting up a new database server and I had some difficulty in setting up the mirroring, so I thought I would post here just on the off chance it helps someone. We are still using SQL 2005, but I think the same ideas apply to SQL 2008.

When I tried to start mirroring I would get an error like this:
The server network address "tcp://[server name]:5022" can not be reached or does not exist. Check the network address name and that the ports for the local and remote endpoints are operational.


First and foremost, the single most important thing I learned was this: if you get an error while trying to start mirroring, check the application event logs on both servers. The error messages you get back from the UI are not helpful at all. The event log messages will lead you very quickly to the error.

In my case, the first error I got was:
Database Mirroring login attempt failed with error: 'Connection handshake failed. There is no compatible encryption algorithm. State 22.'. [CLIENT: ]


In my case, this error indicated that the two database endpoints did not have the same encryption algorithm selected. I was working with an existing server that has mirroring already running and a new server that did not yet have mirroring running. I didn't pay attention in the mirror setup wizard and did not check the box to indicate that the connection should be encrypted. This left the original server requiring encryption and the new server without encryption. You can check this by running the following query on both servers:

SELECT * FROM sys.database_mirroring_endpoints


Compare the values of the encryption_algorithm_desc column from both servers. In my case I had one server with RC4 and one server with None. They have to match. The solution was to drop the endpoint on the new server ('DROP ENDPOINT Mirroring'), re-run the mirror setup wizard, and this time check the box indicating that the mirror server requires encryption.
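If you'd rather fix this with T-SQL instead of re-running the wizard, something like the following should work. This is a sketch, not what I ran -- the endpoint name, port, and algorithm need to match your existing partner server:

```sql
-- On the new server: drop the mismatched endpoint and recreate it
-- with the same encryption settings as the existing partner (RC4 here).
DROP ENDPOINT Mirroring;

CREATE ENDPOINT Mirroring
    STATE = STARTED
    AS TCP (LISTENER_PORT = 5022)
    FOR DATABASE_MIRRORING (
        ROLE = PARTNER,
        ENCRYPTION = REQUIRED ALGORITHM RC4
    );
```

Re-run the sys.database_mirroring_endpoints query afterwards to confirm that encryption_algorithm_desc matches on both servers.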

Once I figured out that error, the second error I got was:
Database Mirroring login attempt by user '' failed with error: 'Connection handshake failed. The login '' does not have CONNECT permission on the endpoint. State 84.'. [CLIENT: ]


Again, it was a configuration problem. In this situation it was because I had forgotten to set the user that the SQL service was running as. I believe SQL has to be running as the same domain user on both servers, or there is some way to use certificates instead. In my case, my existing server was using the domain ID, but on the new server the service was running as 'Local System'. You can check this by examining the credentials used by the 'SQL Server' service in the server administrator tools. Once I set the new server to use the same domain ID, mirroring started up successfully.
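For completeness, if you can't run both services under the same domain account, I believe the usual alternative is to grant the other server's service account CONNECT permission on the endpoint. A sketch -- substitute your own endpoint name and domain account:

```sql
-- Run on each server, using the domain account that the *other*
-- server's SQL Server service runs as.
CREATE LOGIN [DOMAIN\SqlServiceAccount] FROM WINDOWS;
GRANT CONNECT ON ENDPOINT::Mirroring TO [DOMAIN\SqlServiceAccount];
```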

I hope this saves you some time.