So here it is, in all of its un-glory, my Technorati rank. Jeez, it's pathetic; maybe I should stay home and watch re-runs of the "Love Boat" and "Gomer Pyle."
OK, screw that. But if you link to me, I won't complain. Maybe I should make a business plan for my blog, set some objectives for my Technorati ranking. I can probably post pictures of Paris Hilton or something to get some links. Yeah, that's it.
( Mar 31 2004, 05:07:24 PM PST )
Permalink
Here's my list:
In the end, I think getting all of these parts working together will render a whole greater than the sum of its parts. Getting a collaborative environment functioning isn't easy, but a lot of these tools are readily available off the shelf to help it along.
( Mar 30 2004, 03:04:12 PM PST )
Permalink
The first one was a little bit about tricks with server side includes ("SSI's") and some of the crude things you can accomplish with them that, while falling short of a "real" programming language, are surprisingly powerful. The browser detection stuff is very old and yet still very pertinent today when folks want to determine on the server side which CSS style sheet to send the browser. The other HOWTO is some notes I'd had about messing with HTTP cookies.
So I performed some minor updates on these things and reposted them:
The SSI Mini-Tutorial
The HTTP Cookie Mini-Tutorial
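As a taste of the kind of trick the SSI tutorial covers, here's a minimal sketch of server-side browser detection with Apache's mod_include; the stylesheet names are made up for illustration:

```html
<!--#if expr="$HTTP_USER_AGENT = /MSIE/" -->
  <link rel="stylesheet" type="text/css" href="/css/msie.css">
<!--#else -->
  <link rel="stylesheet" type="text/css" href="/css/standard.css">
<!--#endif -->
```

For this to be processed, the server has to have includes enabled (e.g. `Options +Includes`) and the file has to be handled by mod_include (typically a `.shtml` extension).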
I went to Golden Gate Perk where they have free WiFi for paying customers. Great! I ordered a chicken pie but sat down to realize that my wireless card (I have an old Orinoco "Silver" card, which only supports 64-bit WEP) wouldn't work; they only support 128-bit WEP. Well, the chicken pie was good.
If you go to Google's search results for WiFi downtown, there's no way to refine the search to show
Some of these robots are very persistent (or just plain dumb). They try to access pages that haven't been around for a long time, getting 404s or redirects elsewhere. I'm presuming that after some number N occurrences of non-success responses, these robots will get a clue and just crawl the links that are there... for some reason, new links that've shown up aren't crawled. Here's a list of crawler URLs:
Between the robots and viruses, I probably have as many software entities hitting my site as I do human readers.
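Out of curiosity about exactly this kind of traffic, here's a rough sketch of tallying 4xx responses per user-agent from an Apache combined-format access log (the regex is simplified and the whole thing is illustrative, not what actually runs on my server):

```perl
#!/usr/bin/perl
# Tally client-error (4xx) responses per user-agent from an Apache
# combined-format access log read on stdin or as a file argument.
use strict;
use warnings;

my %misses;
while ( my $line = <> ) {
    # combined format ends: "REQUEST" STATUS BYTES "REFERER" "USER-AGENT"
    if ( $line =~ /" (\d{3}) \S+ "[^"]*" "([^"]*)"/ ) {
        my ( $status, $agent ) = ( $1, $2 );
        $misses{$agent}++ if $status =~ /^4/;
    }
}

# Print the most persistent (or dumbest) robots first.
for my $agent ( sort { $misses{$b} <=> $misses{$a} } keys %misses ) {
    printf "%6d  %s\n", $misses{$agent}, $agent;
}
```

Run against a real log, the top of the output tends to be exactly the crawlers fetching long-dead URLs.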
( Mar 24 2004, 11:01:06 PM PST )
Permalink
Refactoring: making things better in little increments I had a nice conversation with some folks who weren't looking for help building The Next Great Thing (that was yesterday's nice conversation). They just had a mishmash of code, all Perl, old and new (but mostly old), that needed to be revamped. They didn't say that they needed a refactorer, but that's about what it boils down to: their code needs some serious refactoring.
The funny thing was, the code in question sounded like the kind of stuff that you'd typically think of as refactoring candidates. A lot of it is apparently old CGIs (run as plain old CGIs or Apache::Registry scripts) that atrophied into a spaghetti of gobbledygook. So the problem there isn't one of taking software objects and restructuring them for better design; the first refactoring, if you will, is to take a bunch of procedural code and re-compose it as an object system.
Frankly, this could sound really boring. But on the other hand, there can be something very satisfying in something like this. Have you ever set aside an afternoon or a day to clean out the garage? You sort through a bunch of junk, shelve things that were set down on the floor, throw out a bunch of garbage, fire up the shop-vac and vacuum out all of the cobwebs and organize the toolchest. In the end, you haven't produced anything tangible with any value. But you've created something that allows for creating value. I would never go into the garage and build a planter box if it's all cluttered and dusty. It's just too grotty.
Code gets the same way. It gets cluttered and dusty. Refactoring is explicitly not about creating tangible value (i.e. you don't add new features in a refactoring), it's about creating an environment within which tangible value can be created. Fowler sez that refactoring is more than just "cleaning up code" - it's a more methodical, tested and controlled fashion of cleaning up. That's all good. Unfortunately, it seemed as though I drew a blank stare from these folks when I talked about testing. I mean, I understand that test harnesses in Perl such as plain old Test, Test::Unit and Test::Harness are hardly standardized or well entrenched in Perl culture, but it was kinda disheartening to see only vague recognition of the value of writing tests. When you're done cleaning the garage, there's that satisfying feeling knowing that you've brought a little order to a chaotic corner of the world. I get that same feeling when I'm working on code, breaking it and fixing it, and when it's all said and done, running the test harness; seeing "100% passed" is another way of bringing order to the chaos.
Anyway, so the basic starting points for refactoring were absent: a system of objects and a test harness. But if the goal of refactoring is to "make the software easier to understand and modify" (quoting Fowler, again), then perhaps just getting things to that starting point is the first important refactoring to undertake. Somehow, I don't think the connection between the "fire fighting" mode that these folks were perpetually in and the absence of tests was immediately clear to them. They know that "things" need to be fixed (which is a good start) but IMO those fixes need to come in little increments of refactoring.
Now, I've never unit tested a garage cleaning effort but today's conversation made me consider how that'd be done.
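To make that concrete: before moving a single line of legacy CGI code, I'd pin down its current behavior with a test. A minimal sketch with Test::More follows; the parse_query() routine is a made-up stand-in for whatever actually gets extracted out of the old scripts:

```perl
#!/usr/bin/perl
# Hypothetical first test for a routine being lifted out of a legacy CGI.
# parse_query() is illustrative only, not from the code discussed above.
use strict;
use warnings;
use Test::More tests => 2;

# Imagine this was cut-and-pasted out of an old CGI into a module,
# exactly as-is, so the tests capture its behavior before refactoring.
sub parse_query {
    my ($qs) = @_;
    my %params;
    for my $pair ( split /[&;]/, $qs ) {
        my ( $k, $v ) = split /=/, $pair, 2;
        $params{$k} = $v;
    }
    return \%params;
}

is( parse_query("a=1&b=2")->{a}, 1, "ampersand-separated param parsed" );
is( parse_query("a=1;b=2")->{b}, 2, "semicolon separator also works" );
```

With a harness like that in place, each little increment of restructuring can be checked against "100% passed" before moving on.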
( Mar 24 2004, 10:43:06 PM PST )
Permalink
While I'm not especially enamored with CVS, it's like an old shoe. It's kinda stinky but still comfortable; you know how it fits and what its limitations are. Arch looks like a whole new beast, with funny naming conventions and this concept of categories being central to its repository model. I suppose the motivation is in part to replace BitKeeper as the Linux source repository (inferred from all of the references to it being "suitable for free software development", but perhaps I drew the wrong inference). Now I like Larry McVoy, he's really a good guy. It'd be weird for me to use a product that is intended to undermine his business. On the other hand, that's the wrong reason not to pursue what may be a better technology.
From time to time, it's good to just go out and try on some new shoes; I guess I'll look more closely at Arch.
( Mar 23 2004, 07:19:42 PM PST )
Permalink
Martin Fowler recently wrote an article about this. In discussing the use of offshore programmers with his company, one of the things he mentions is using Cruise Control to aid continuous integration among collaborators in multiple locations. This sounds good. I've seen lots of problems with Cruise Control grinding down with OutOfMemory errors, but I'm more suspicious of running in-container tests like Cactus than I am of Cruise Control per se.
An article I read in the paper the other week mentioned that the help-line calls for the welfare and food stamp programs in California are routed to offshore call centers in Mexico (for Spanish speakers) and India (for English speakers). What would be grimly humorous is if the State of California's Employment Development Department started offshoring its labor.
( Mar 22 2004, 09:35:16 PM PST )
Permalink
On the one hand the favicon.ico icons have always seemed somewhat silly to me. But Mozilla does something nice: it puts the icon in the tab, not just in the location bar -- ah, that is useful. So I set out to make my own arachna.com icon. I had to install the kdegraphics rpm for redhat9 (kdegraphics-3.1-4.i386.rpm) to get kiconedit but once that was all said and done, I set to work making a little 16x16 icon.
Working with a canvas that small is hard; my spider looks more like an ant! I'll have to assign a problem report to myself in bugzilla.
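For anyone else making one: browsers will probe for /favicon.ico at the site root on their own, but you can also point at the icon explicitly from the document head (the markup below is the conventional form, not anything special to my setup):

```html
<!-- in the <head>; browsers also probe /favicon.ico by default -->
<link rel="shortcut icon" href="/favicon.ico" type="image/x-icon">
```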
( Mar 19 2004, 11:41:03 PM PST )
Permalink
It's definitely a case of the something old/something new/something borrowed/something blue... there's a lot of stuff in Perl6 to like, and yet it seems like some of the opportunity to unclutter the language has passed us by.
Oh, darn.
Anyway, my thoughts on the matter are captured here.
( Mar 18 2004, 08:41:25 PM PST )
Permalink
MS-Word to your mutha At first nobody I spoke to about work options cared about what format my resume was presented in; the plaintext I could generate via XSLT or the web-based resume itself was sufficient. Recently though, HR/recruiting types want to see it in MS-Word.
It never ceases to amaze me how a product such as Word can be so easy for the straightforward things but get so messed up as soon as things get a little complicated. I went through what I was planning on being a quick exercise in pasting the plain text into Word's "Elegant Resume" template, but I found that formatting defaults carried in by the Windows clipboard were not desirable and, in fact, the template itself had some things in it that required adjustment. Next thing you know, Word is crashing every other minute. At first I threw up my hands and tried getting what I needed done in OpenOffice, but its word processing application also blows as soon as you need to do anything slightly complicated. I went back to Word and finally got it to behave by walking through all of the table cells, selecting the text blocks and removing all of the pasted-in formatting attributes; letting Word just use the default font attributes from the template seemed to make it much happier.
Well, it's done for now and posted online. Given how deficient the OpenOffice word processor is, MS-Word continues to be a necessary evil. The only OpenOffice application that is remotely useful is the presentation (PowerPoint replacement) application.
( Mar 18 2004, 08:40:33 AM PST )
Permalink
Here's the deal:
The tutorial covered SOAP interoperability between implementations in Java (using Apache Axis) and Perl (using SOAP::Lite). Some components on the Perl side required writing a few little modules. Nothing CPAN-worthy but still I wanted to package the modules nicely so users of the code samples can get the Perl side running without a lot of hand-work.
I read the man page for h2xs and the perldoc for ExtUtils::MakeMaker, and I grabbed ExtUtils::ModuleMaker off of the CPAN and checked out its docs... the latter seems like an improvement, but my needs were, IMO, real simple and I just wanted an appropriately simple set of steps. In the end, I followed a procedure similar to the one described here, basically bootstrapping the package structure with h2xs and manually tweaking the file system layout into something nice. Makes me wish Perl had a well defined, self-contained packaging structure similar to Java's jars.
Of course, then I'd want ant implemented in Perl to handle the packaging! So it goes. At the end of the day, the tutorial is at last online.
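For the record, the bootstrap amounts to something like the transcript below. The module name is made up for illustration, and depending on your h2xs vintage the generated directory layout may differ (older versions nest My/SOAP/Helper/ rather than My-SOAP-Helper/), which is exactly the hand-tweaking I mentioned:

```
# generate a pure-Perl module skeleton: -X omits the XS stub,
# -A omits the AutoLoader boilerplate, -n names the module
h2xs -X -A -n My::SOAP::Helper
cd My-SOAP-Helper

# after tweaking the layout, the usual build dance still works:
perl Makefile.PL
make
make test
make dist    # rolls up a My-SOAP-Helper-0.01.tar.gz
```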
( Mar 16 2004, 09:14:49 AM PST )
Permalink
There are a lot of different flavors of agile development. Some of them work best, ideally anyway, if all of the practices in that particular flavor's agile "recipe" are adhered to. It's been my experience that most situations are less than ideal. People come to work with a lot of preconceptions about the right way to do things, or with their own ego issues that get in the way of buying into an agile methodology lock, stock and barrel. However, I found the book Agile Software Development Ecosystems (by Jim Highsmith) very thought-provoking; it led me to conclude that, depending on the context of the workgroup, there may be no perfect fit with any particular methodology, but that's no reason to despair. If you can get people to agree upon principles and values about what they want to accomplish, values that are open to new ideas and not doctrinaire about waterfall processes, it is possible (and in fact likely) that you can create an agile environment borrowing what is needed from one or many of the established agile development flavors. A properly established environment should be generative of agile practices, and that's what I'm really looking for in my workplace: instead of rigid rules, agree upon principles and communication norms, and the appropriate practices will become clear as everyone's work styles come into play.
At least, that's my current thinking on the matter; I'm open to new ideas.
( Mar 12 2004, 10:53:22 AM PST )
Permalink
After the next build was released, a new bug was filed: the browser "Back" button didn't work anymore. MSIE would display an error message about the content being expired (indeed, I'd instrumented the filter to send an HTTP Expires: header) whereas Mozilla (my browser) wasn't quite so fascist with the shrill error messages.
The resolution was that the Expires header was ditched. Something else came up with screwy caching behavior and one of my colleagues turned off the filter, declaring that using META HTTP-EQUIV tags in the JSP was the "right" way to fix it. I never got a good explanation as to why that was "right", but as long as my old bug wasn't reopened, I didn't lose any sleep over it.
I believe they say, "c'est la vie."
Now OnJava has a pretty good discussion of these issues; here it is.
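For reference, the two mechanisms at issue look like this; the header values shown are typical anti-caching boilerplate, not necessarily what our filter actually sent:

```
HTTP/1.1 200 OK
Expires: Thu, 01 Jan 1970 00:00:00 GMT
Cache-Control: no-cache, no-store
Pragma: no-cache
```

The META HTTP-EQUIV approach embeds the equivalent in the page itself, e.g. `<meta http-equiv="Expires" content="0">` in the JSP. One thing worth noting is that META tags are only seen by agents that parse the HTML (i.e. browsers), while real response headers are also visible to intermediary caches, which is one reason headers are generally the preferred mechanism.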
( Mar 11 2004, 11:57:27 PM PST )
Permalink
The content management stuff I used to do at Salon.com has flourished, going from being an ill-conceived spin-off business to a successful open source project (Bricolage) [warning: I like Java as much as anyone but I also like Perl; if you're a Perl-phobe, don't click that link!]. The web server management stuff I did at Covalent was also admittedly ill-conceived, but in the end it was an interesting exercise in identifying the customer's pain points, pinning down their concrete scenarios and appropriately scoping a product to fulfill those needs. These products had little in the way of end-user customization. The display that the user saw was the display that was coded into the UI logic. As a reference point, that's not necessarily a bad thing. But as one sees the different roles that end-users fulfill using an application, allowance for end-user customization of the display seems increasingly important. Thus the value of decomposing the display elements into portlets and allowing them to be aggregated by a portal framework.
The next product I worked on at Covalent was an application management web interface. The first version had a nice dashboard-oriented entryway portal but had hardly any end-user customization, as it displayed system objects. The architecture that was decided upon for the next generation ("two oh") product specified that just about everything be presented as a portal, ostensibly allowing end-users in their varying roles to see what was important to them at all times.
The framework used for the portal was a homebrew built on Struts and Tiles; for two oh we had a homebrewed JSR-168-ish framework underway. I'm certainly glad we didn't paint ourselves into a costly, proprietary corner with one of the Big Commercial Portal Frameworks. I'm actually wondering how the traditional big boys in the web infrastructure market can sustain six-figure price tags for frameworks like that when there has been so much activity around JSR-168 and open source portals.
( Mar 09 2004, 08:20:45 AM PST )
Permalink