Other things you may like to know


Find out about why REXX and ARexx are being used

Find out about possible further developments

The development of this site, and of the associated software used to create and maintain it, started on Saturday 12th-Oct-96. I had the idea in the morning, sketched out some preliminary plans, ran a few ARexx test scripts on an Amiga A4000 in the afternoon, and by early evening a small test site was up and running on CompuServe's OurWorld server. The following day I extended the ideas to allow additional files to be pulled automatically into the Web pages as they are created, and I wrote the date stamp extraction code that lets the generator script determine when pages have been updated. One day after that, on the Monday, a skeleton framework containing over fifty pages was up and running. Needless to say it's going to take a while to get the site up to scratch content-wise, but already it's clear (at least as far as I am concerned) that Webbing is far more enjoyable if you can eliminate the trivial (but still time-consuming) tasks of building menu links, inserting page titles, checking that menu items and page titles correspond, and so on.
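The menu-link and page-title generation described above can be sketched in a few lines. This is an illustrative Python sketch of the general idea only, not the site's actual ARexx code; the page table, file names, and function names are all invented for the example:

```python
# Sketch of the idea (not the site's ARexx script): keep one table of
# pages, and generate both the menu links and each page's own heading
# from it, so menu text and page titles can never disagree.

PAGES = {                                  # hypothetical page table
    "rexx.html":   "Why REXX and ARexx are being used",
    "future.html": "Possible further developments",
}

def menu_html(pages):
    """Build the menu: one anchor per page, text taken from the table."""
    return "\n".join(
        '<a href="%s">%s</a>' % (fname, title)
        for fname, title in sorted(pages.items())
    )

def page_heading(fname, pages):
    """Build a page's own title line from the same table entry."""
    return "<h1>%s</h1>" % pages[fname]

print(menu_html(PAGES))
print(page_heading("rexx.html", PAGES))
```

Because both functions read the same table, editing one title updates the menu entry and the page heading together.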

During mid-November the relatively easy job of converting the ARexx scripts into Personal REXX form for use on the PC was started. This took just a couple of days, so I now have both ARexx and REXX versions of the software running in tandem. The reason for the conversion to Quercus Personal REXX, incidentally, is that I am reviewing the package and wanted to test and use it with real-life applications rather than experimenting only with trivial test scripts.

As far as this particular Web site project is concerned, I'm already convinced that such automation methods are the only sensible way to go. I am also aware, incidentally, that there is now increasing general awareness of the usefulness of automated site tools. Microsoft, for example, have recently added site development facilities to their latest Visual SourceSafe system. I've not yet seen it, but apparently it can check for broken page links and so on. Doubtless the software is good, although my immediate reaction is that perhaps there is a flaw in the basic 'link checking' argument: surely, if you let the computer create the links in the first place, you would never be in a position to have to fix them anyway! Having said that, we ought to remember that 'good old Bill' didn't get where he is today without being right about a few things - so chances are it's me that's out of step here!
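The "links can't break if the computer creates them" argument amounts to links by construction: a link is only ever emitted for a page the generator knows about, so a mistyped page name fails at generation time instead of surfacing later as a broken link for a checker to find. A hypothetical Python sketch of the principle (the page set and function name are invented, not taken from the site's scripts):

```python
# Hypothetical sketch of "links by construction": the generator will
# only emit an anchor for a page that exists in the site table, so a
# dangling reference is an error at build time, never a broken link.

PAGES = {"index.html", "rexx.html", "future.html"}   # invented page set

def link_to(fname, text, pages=PAGES):
    """Emit an anchor, but only to a page the site actually contains."""
    if fname not in pages:
        raise ValueError("no such page: %s" % fname)  # caught while generating
    return '<a href="%s">%s</a>' % (fname, text)
```

With this in place a call like `link_to("rex.html", "REXX")` stops the build immediately, which is why a separate after-the-fact link checker has nothing left to do.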

I'm just following my own pathway at this site: firstly for fun... and secondly for the hell of it. So far it's cost me nothing but a few hours of my time - and the REXX/ARexx software used for all site management (originally around 4K) is, with various indexing and date stamping enhancements, still incredibly small (the main script is about 8K). OK, so it is growing slightly as I make minor enhancements, but we're still talking small programs. What's more, the basic coding principles are not even hard to understand - so why on earth there should, in general, be this ever-increasing drive to have megabytes of ready-made software around to solve what are essentially trivial structuring problems, God only knows! Such thoughts, surely, have got to count for something in the great scheme of things - and I'm pretty sure that there are plenty of other Webbers who feel similarly about these issues!


Return to page at previous level

Return to main index page

Page last updated 21-Nov-96