Lately, I’ve been working on code that fetches EDI files from an FTP server, using a VM running Windows XP. But yesterday, rather than go through a VM, I decided to use Vista’s FTP server for testing, which turned out to be a bad idea.
I’m new to this code base, so I’ve been stepping through it to see how it works, and immediately ran into problems where the app failed to download files off the server. The code is based on the edtFTPnet source, so I started stepping through it to see what was going on and found it uses FTP’s LIST command as follows: “LIST *.NS”. That all seemed fine, but I never got any results back. After double-checking that the expected files were really there, I jumped to the command line to test things out. In my first test I ran into the fact that LIST isn’t supported from the Windows FTP.EXE command line tool since it requires PASV mode. That’s OK, I’ll just use the DIR command:
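Reconstructed from memory (the host name here is a placeholder and the exact messages varied), the session went something like this:

```text
C:\> ftp vistabox
ftp> quote LIST *.NS        <- raw LIST gets no data connection from ftp.exe
ftp> dir                    <- works: the full directory listing comes back
ftp> dir *.NS               <- fails against Vista's FTP server, works against XP
```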
No dice. While a plain DIR command works, specifying a wildcard of “*.NS” failed! What’s up with that? I fired up my XP VM and it worked just fine under XP. It turns out this is a bug in Windows Vista’s FTP server. Thanks, Noel, for finding that link. Btw, we both got a nice laugh over the suggested workaround:
“To work around this problem, do not use an FTP LIST command together with a file mask argument.”
Yeah, thanks, that’s a big help. On a side note, I just found out the broker our client is exchanging EDI files with is switching FTP server software. I just hope it’s not to Windows Vista.
Just a quick note to say I’ve updated my Facebook.NET starter kit to version 0.2.1.0 of Facebook.NET and changed the application to incorporate some of the new features, so be sure to grab the latest copy.
Back in July, Nikhil Kothari released Facebook.NET, an ASP.NET wrapper of the Facebook API. Facebook.NET takes a somewhat different approach than the Facebook Developer’s Toolkit and provides ASP.NET controls that wrap some of the more useful elements of the FB API. Nikhil and I exchanged a few emails back in July, and he asked if I’d put this together, but I’d just started a new job and was in the midst of remodeling my house. Now that things have settled down, I decided to get it out.
Using this starter kit is very much like the previous one I released. The kit includes a Welcome.htm, displayed in VS.NET when you create a new application, that has all of the details for getting the two flavors of FB application (IFrame and FBML) up and running. One note: I tweaked the FB API version in FBML\Default.aspx since “0.3” is no longer supported.
Let me know if you have any problems. Download
[UPDATED: Jan 14, 2008] I updated the Starter Kit to v0.3 of Facebook.NET. Additionally, the starter kit has been updated to VS.NET 2008.
[UPDATED: May 6, 2008] Here are some additional links:
I’m a big user of console tools/utilities and over the years have written my fair share, nearly all of which were written in Delphi. I’d love to see CodeGear release the Delphi command-line compiler and RTL, like what was done in the past for the Borland C++ command line tools. Of course, I’d want it to include a broad license for strictly console-based applications.
Truth be told, this isn’t a new post; I wrote it months ago while still working at CodeGear. At the time, I decided against posting it, but my feelings haven’t changed, and when I found myself tweaking one of my command line tools it reminded me of this entry, so there you go…
Btw, congrats to my friends at CodeGear for getting Highlander out the door!
Here’s a funny email exchange that occurred the other day when one of our CruiseControl.NET builds failed. At least it gave me a good laugh.
I can see it now: a version of CCTray that has various balloon hints users can trigger when they cause a build failure, like “Not my fault” or “Nice goin’ Ed”.
At Borland/CodeGear, Allen Bauer is famous for this line. There are days when I wish I had his machine.
Yesterday I took my first-ever Microsoft Certified Professional test and, yes, I passed. I took test 70-315, otherwise known as “Developing and Implementing Web Applications with Visual C# and Visual Studio.NET”, a mouthful for sure. Lino asked me to take it in light of the fact that the rest of the Falafel team is certified. I think I spent a little over four hours studying with this book, which I found quite helpful. I focused on internationalization and deployment since I’d pretty much touched on everything else. I found the included practice exam was right on the mark and good preparation for the test.
When I arrived at the testing center, the attendant was on the phone, so she set up a computer and let me sit down. I was given no directions to speak of, and after clicking through the first few screens for NDAs etc., the test started. When I finished what I thought was the first set of questions, I was given a chance to review my answers and continue. At that point, the test ended rather unceremoniously after 45 or so questions and gave me a “Congratulations, you passed!”. I informed the attendant; she came over, printed a piece of paper with the details, said “congrats”, and I left.
It’s funny, I was struggling to recall the last exam I took, and I guess it would probably have been my driver’s test back in 1999 when I moved to Seattle to work for Microsoft. Anyway, I’m over the test-taking anxiety, even though Lino assured me it would be easy. He was right.
Previously, I wrote about setting up IIS logging to allow for better performance analysis using Microsoft Log Parser. Now that we’re collecting the necessary data, it’s time to start querying the logs to see what we can find, as well as prepare stats that can be viewed in a browser. Since I’m fairly new to Log Parser, and considering it has been rather widely covered, I’ve decided to focus on making reports available from the server.
I’m interested in setting up an automated process where log file stats get generated to HTML files and published from the server. Fortunately, Log Parser supports just such a thing using a template mechanism, though it seems to be only lightly documented. Basically, you create a template HTML file with specific tags, which Log Parser uses in a mail-merge-like fashion. Here is an example template and its resulting output/screenshot:
<head><title>most hit urls</title>
<h1>most hit urls</h1>
To use this template with Log Parser the command line looks like this:
"C:\Program Files\Log Parser 2.2\LogParser.exe" "SELECT TOP 10 cs-uri-stem as Url,
COUNT(cs-uri-stem) AS Hits FROM ex0709*.log GROUP BY cs-uri-stem ORDER BY
Hits DESC" -o:tpl -tpl mosthit.tpl > mosthit.htm
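For reference, here’s a sketch of what a complete template file might look like, based on Log Parser 2.2’s TPL output format: the <LPHEADER> section is emitted once, the <LPBODY> section is repeated for each row, and the %Url%/%Hits% placeholders are assumed to match the column aliases from the query (double-check the placeholder naming against Log Parser’s built-in help for the TPL format):

```html
<html>
<head><title>most hit urls</title></head>
<body>
<h1>most hit urls</h1>
<table border="1">
<LPHEADER><tr><th>Url</th><th>Hits</th></tr></LPHEADER>
<LPBODY><tr><td>%Url%</td><td>%Hits%</td></tr></LPBODY>
</table>
</body>
</html>
```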
Note, this particular query is from Jeff Atwood’s excellent blog post on Log Parser, which you should definitely check out since he’s posted a number of other useful queries.
You can see I’ve narrowed the range of files to the month of September (ex0709*.log), and I need to come up with a scheduled task to produce rolling 30-day reports. Btw, the report could look substantially better, and given Log Parser’s support for many different output formats, including a fun but rather useless “Matrix” style, there are lots of possibilities. If you’ve created some useful/interesting templates, I’d love to hear about them.
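One way to get that rolling 30-day window is to build the FROM clause from explicit file names instead of a single-month wildcard. Here’s a small Python sketch, assuming IIS’s default daily log naming of exYYMMDD.log; `rolling_log_names` is a hypothetical helper, not part of Log Parser:

```python
from datetime import date, timedelta

def rolling_log_names(today, days=30):
    """Return the IIS daily log file names (exYYMMDD.log) covering the
    last `days` days, oldest first. `today` is a datetime.date."""
    return ["ex%s.log" % (today - timedelta(days=n)).strftime("%y%m%d")
            for n in range(days - 1, -1, -1)]

# Join the names with commas to feed them into the Log Parser query's
# FROM clause in place of a single-month wildcard like ex0709*.log.
names = rolling_log_names(date(2007, 10, 15))
print(names[0], names[-1])   # ex070916.log ex071015.log
```

A scheduled task could regenerate the query (and the report) nightly with the current date.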
Anyway, with Log Parser in my toolbox I can start watching trends over time to learn more about the usage and performance of the application. On to more immediate methods of analyzing perf!
One task I’ve been working on at Falafel is performance tuning a large ASP.NET 2.0 application running on Windows Server 2003 under IIS 6.0, load balanced across three machines with a SQL Server 2005 back end. I’ll post a series of entries related to the work I’ve been doing in my investigation.
Configuring IIS Logging for Performance Analysis
First, I wanted to dig into the IIS log files to get a good understanding of what parts of the application were being used the most and to start collecting performance metrics. Unfortunately, the default IIS configuration doesn’t include some of the more useful fields for performance analysis, so you’ll need to add them manually. You can do that by:
- Launch Internet Information Services Manager from the Administrative Tools menu
- Expand the treeview to Default Web Site, right-click and select Properties
- On the Properties dialog, click the Web Site tab and check Enable Logging
- Click the Properties button in the Enable Logging groupbox and make sure to check the additional fields indicated below (all of the ones I’m interested in are in this screen shot):
Note the names following each option in parentheses, as they’ll be useful when we analyze the data.
Of course, after making these changes we need to wait for the log files to start accumulating useful data before we can go much further. In the meantime, be sure to download Microsoft Log Parser to help analyze the results. Log Parser is an incredibly powerful tool that brings the power of SQL to the task of analyzing plain text files, as well as other data stores like the Windows Event Log. Here is a link from my Google OPML search engine with lots of great Log Parser related posts. I tried Google’s recently added search support, but it didn’t yield anything even remotely useful, which left me scratching my head wondering what it’s really useful for.
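To make the SQL-over-text idea concrete, here’s a rough Python equivalent of a “most hit URLs” query over W3C extended log lines. This is just a toy sketch, not a Log Parser replacement; it assumes the standard #Fields: directive names the columns, and `most_hit_urls` is a name I made up for illustration:

```python
from collections import Counter

def most_hit_urls(log_lines, top=10):
    """Count cs-uri-stem occurrences in W3C extended log lines and
    return the top N (url, hits) pairs, mimicking a GROUP BY/ORDER BY."""
    fields = []
    hits = Counter()
    for line in log_lines:
        if line.startswith("#Fields:"):
            fields = line.split()[1:]          # field names follow the directive
        elif not line.startswith("#") and fields:
            row = dict(zip(fields, line.split()))
            if "cs-uri-stem" in row:
                hits[row["cs-uri-stem"]] += 1
    return hits.most_common(top)

sample = [
    "#Fields: date time cs-method cs-uri-stem sc-status time-taken",
    "2007-09-01 10:00:00 GET /default.aspx 200 120",
    "2007-09-01 10:00:01 GET /login.aspx 200 80",
    "2007-09-01 10:00:02 GET /default.aspx 200 95",
]
print(most_hit_urls(sample))   # [('/default.aspx', 2), ('/login.aspx', 1)]
```

Of course, the whole point of Log Parser is that you don’t have to write this kind of thing by hand.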
Next, I’ll take a look at putting some of this new logging information to good use.