No, my hard drive hasn't gotten any better. It's definitely heat-related: I was able to make it run for extended periods by putting a fan several inches above it and blowing cool air on it - but the placement of the Amiga, the hard drive, and the power supply, plus the length of the IDE cable, means the computer itself is inconvenient to use like that.
So someone I work with decided to sell off one of his hard drives. I jumped. I broke $125 out of my emergency stash - and this morning went home with a 1.2GB Conner. Yes, I know - a Conner. I already have a collection of broken Conners. At any rate, this was made right about the time Seagate bought them - so maybe it won't have the usual Conner trickery.
This is great, I'm thinking: I'll upload the current hard drive a few megs at a time to an offsite private FTP server, physically swap drives, and download it back. I'll have an extra 400MB to play with too - and I'll probably partition that off into a "play" space, where I can swap Linux or other OSes in and out safely. And of course, while I'm setting up the drive, I'll have to make a really neat screenshot showing the Commodore Installer putting AmigaOS on hd0: while hd1: and hd2: are formatting.
It didn't happen that way.
Here's the situation: HDTools can find and identify the drive. I can partition it. But the Amiga will NOT recognize the drive on bootup - the early start menu shows NOTHING but two floppies and a nonexistent PCMCIA card.
It gets stranger. The drive's spinning up in plenty of time for the OS to acknowledge it - it just ISN'T. So it boots to floppy, brings up Workbench, and shows no hard disk icons.
So I run HDTools again - and during the "checking for devices" stage, SOMETHING it says to the drive makes it acknowledge, and suddenly all the partitions appear on Workbench.
I can install Workbench on the hard drive. I've made everything automountable, FFS, and the first partition is bootable. Yet the machine will NOT automatically recognize the partitions unless I run HDTools. Even AFTER running HDTools, if I reboot the Amiga - and as with the Seagate, this drive doesn't spin down during a reboot - the partitions disappear again and I must run HDTools to make them visible.
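For what it's worth, if the drive merely needs to be spoken to before it'll answer - which seems to be what the HDTools device scan accomplishes - a tiny program run from the boot floppy's startup-sequence might do the same job. The following is a sketch, not a tested fix: it assumes the Conner is unit 0 on scsi.device (the A1200's internal IDE port) and that ANY command at all - here, a raw read of block 0 - is enough to wake it up.

/* wakedrive.c - poke an unresponsive IDE drive with one raw read.
 * Sketch only: assumes scsi.device unit 0, and assumes that any
 * command at all wakes the drive the way HDTools' device scan does.
 */
#include <exec/types.h>
#include <exec/io.h>
#include <exec/memory.h>
#include <proto/exec.h>
#include <stdio.h>

#define BLOCKSIZE 512

int main(void)
{
    struct MsgPort *port;
    struct IOStdReq *req;
    UBYTE *buf;

    if ((port = CreateMsgPort()) == NULL)
        return 20;
    if ((req = (struct IOStdReq *)CreateIORequest(port,
                                      sizeof(struct IOStdReq))) == NULL) {
        DeleteMsgPort(port);
        return 20;
    }
    if ((buf = AllocMem(BLOCKSIZE, MEMF_PUBLIC | MEMF_CLEAR)) == NULL) {
        DeleteIORequest((struct IORequest *)req);
        DeleteMsgPort(port);
        return 20;
    }

    /* Unit 0 on the A1200's internal IDE port. */
    if (OpenDevice("scsi.device", 0, (struct IORequest *)req, 0) == 0) {
        req->io_Command = CMD_READ;   /* read one block, just to say hello */
        req->io_Data    = buf;
        req->io_Length  = BLOCKSIZE;
        req->io_Offset  = 0;
        DoIO((struct IORequest *)req);
        printf("Poked the drive; io_Error = %d\n", req->io_Error);
        CloseDevice((struct IORequest *)req);
    } else {
        printf("Couldn't open scsi.device unit 0.\n");
    }

    FreeMem(buf, BLOCKSIZE);
    DeleteIORequest((struct IORequest *)req);
    DeleteMsgPort(port);
    return 0;
}

Whether a bare CMD_READ gets the same reaction out of the drive as whatever HDTools sends, I can't say without trying - but it would at least tell you whether the drive answers to software at all.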
Is this a known problem with Conner hard drives?
Anyway. I can still operate the Amiga, if I assemble the cooling apparatus over it. It's just a pain in the butt. And I don't trust the data on the drive anymore - whatever's causing it to do this, it DIDN'T always exhibit this problem, so it's a failing component somewhere. (It appears the A1200 is not the problem - that Conner can stay operating for hours and not spin down, and the problems I described above don't appear to be related to those shown by the Seagate.) The drive seems to "overheat" at a temperature that's warm to the touch but not boiling; it used to get a LOT hotter on top than this, and it never had this kind of problem. At any rate, keeping a fan on it keeps it running - but since I don't know what's failing, I don't know how long it'll last.
The war continues. For a week or so, I was on the Industry Council/Open Amiga mailing list - and had to leave because it wasn't available in digest form and I don't have time to wade through 80 e-mails a day. Besides, after a while things settled down and it got "boring" - they actually started getting down to business. I hate when that happens. :-)
A topic that appeared briefly and disappeared was the question of what Joe User wants, and how to provide that. There were some good answers - but they all seemed to get too deep too quick. I saw "Joe User wants to run industry-standard off-the-shelf software" and similar answers. Uh... let's try this again.
What does Joe User want?
Answer: to be happy.
Sounds simplistic? It is. This is basic economic theory - the concept of satisfaction. Humans and planaria and single-celled lifeforms alike all move in the general direction of that which they think will make them happy. Before you point to Trent Reznor or whatever and say "that's not always true," consider that if you enjoy being depressed and angry, it counts - if being sad makes you happy, you'll be willing to expend energy in the pursuit of sadness. But that's out of our scope here - with the possible exception of those who buy lousy OSes because they WANT to be frustrated, and I doubt that counts for much market share.
Computers do not make people happy. Well, I suppose computers DO generate happiness - if you're a nerd like me. Unfortunately, Joe User isn't going to derive much entertainment from shuffling files around on his hard drive just for the sake of it. Neither will Joe User derive entertainment value from rearranging his Word toolbars - it is marketing alone that has convinced him he may find happiness in Word.
Microsoft has fallen into this trap repeatedly, and their advertisements show it - the notion that if you just get Windows 95 and Office 97, you will be eternally happy. Never do you see anyone DOING anything with them - how appropriate that Bill's book, "The Road Ahead," shows a road going off into nowhere in the middle of a desert with no scenery or civilization in sight. Apple ads (when they appear - like the Loch Ness Monster, they're seen but rarely photographed) tend to showcase professionals making brilliant use of the technology. IBM ads used to list off something you do every day and point out how OS/2 made it possible. Compaq ads used to show people with eccentric lifestyles and show how their laptops could accommodate them. But Microsoft ads show... not a whole hell of a lot of anything, except a pixelated jumble of icons. But I digress. Microsoft is clearly trying to sell to people who don't know what they'll do with a computer once they own one - and so seems to be approaching it from the attitude that the computer, or the OS, or some monolithic applications, or just plain Microsoft, is the end in itself.
Computers do not make people happy. It's what you DO with a computer that generates the happiness. It's not the case. It's not the keyboard. It's not the CPU. It's not the OS. And it's not even the software. It's what the user DOES with the software. The same is true of any technology - coiled wire and ferrite and polypropylene don't elicit smiles, even when they're assembled into speakers, unless there's actually something worthwhile playing THROUGH those speakers.
The Internet, for that matter, is not an end in itself - nor is the Web or Usenet or e-mail. But if Joe User defines happiness as "pictures of Jewel's nipples showing through her dress at the awards show" then yes, happiness will be found on the Internet. The Internet is unusual among geek-toys: it's more social than technological. End users neither know nor care whether they're on fiber or copper, whether it's TCP/IP or TCP/IP-over-ATM, whether they have static or dynamic IP addresses, or what OS their ISP runs - if they can get on-line and read today's news the minute it happens, and chat with people all over the planet, and do it for less money per week than AOL wants per hour, then it's the thing. The Internet has transcended its technology - in some respects it's also outgrown it, but that's another issue, and the Internet and its social structures and human cross-section will still be there no matter what pipes carry the data.
This is the goal we should work toward in technology. You can spot primitive technology by looking at the quality of the things made with it - an inflexible or backwards system will inevitably impose itself onto whatever you're trying to do with it. For proof of this, take a look at the Internet and notice how Web pages made with Microsoft FrontPage all look alike. You probably recognize a similar phenomenon - Greg Gorby of Nova Design, author of Aladdin, has mentioned on occasion how early Amiga 3-D packages were easily identifiable by the program that made them. Even today, you can spot the difference between an Imagine render, a Lightwave render, and a Real 3D render by sight. Hell, I'm to the point where I can spot Lightwave renders in TV commercials.
Why? Because computer software is still written by programmers, and in most cases, for programmers. Microsoft writes entire operating systems this way. Apple has started to go that route as well. As a result, computers have all evolved together into a sort of "meta-OS" where things look and work alike no matter what you're sitting at. The only real differences between various UIs these days are in the details: scrollbars, color schemes, fonts, which corner the close button's in, window style, button look, whether the menu's at the top of the window or top of the screen or is just "pop-up", etc. Underneath the hood, modern OSes are a mess as well - even the beloved Amiga is saddled with a decade's worth of compatibility kludges. We won't even talk about the Mac - the main reason it lacks preemptive multitasking is that most of the Toolbox and the 68000 emulation are not reentrant. Of course, the BeOS has the most modern underhood design of all these - but it also has a boringly conventional GUI. The BeOS should have been invented five years ago - it would just about have been state of the art, in the sense of "an OS that has all the features a modern OS should have," which nobody else is building. Most of what's in the BeOS existed in 1992 - the BeOS would have fit in just fine. And by finally fixing all the underhood problems of modern OSes, the BeOS or OSes like it would have paved the way for us to be using the next big thing by now.
That hasn't happened.
It's 1997 - and the GUI you're using is older than I am. The whole windows-icons-pull-down-menus metaphor was invented in the 1970s at Xerox PARC - it was commercialized with the Macintosh and has remained frozen in time ever since. The basic user elements haven't changed at all. Resizable, movable, overlapping windows where one has focus. Scrollbars that indicate which part of a larger document a window shows you. Drop-down and pop-up menus. Clickable buttons. Collapsible trees - and other collapsible onscreen objects. Folders inside folders. Drag and drop.
Of course the GUI is an excellent thing. Just keep in mind two things:
1. The GUI is a means of getting at your data - not an end in itself.
2. Onscreen gadgetry only helps if the user can tell, at a glance, what each gadget is and what it will do.
On the second item, in particular, Microsoft has bitten its own genitalia repeatedly: in Windows 95, a user cannot tell a button from a menu from a window. They look alike - they don't always do the same thing or work the same way - and are linked together in such a way that the user never knows from context which they're seeing. The Start button is a blatant example of this - a button that brings up a menu. Menus should bring up menus - and everywhere else in Windows 95 they do. It also doesn't help that clicking on different parts of the Start Bar does different things, even though there's no visual indicator of what you need to click on or what it will do when you click on it. Microsoft has long had a habit of making icons that are unidentifiable - or otherwise unclear - and it has extended that graphical ambiguity to an entire OS. Fifty percent of the GUI widgetry in Win95 reacts as you'd expect it to visually - buttons sink in, etc. - and the rest, if it reacts at all, reacts nonintuitively. The whole idea behind onscreen gadgetry is to make the computer screen look like the console of a piece of consumer electronics - with buttons and sliders that work and whose functions you can identify. I would hate to try to use a Microsoft stereo - the volume dial and the power button would look exactly alike and not be labelled.
On the first item, yes, the Macintosh and GUIs like it have made it easier to get at our file systems. It's easier to show someone visually, using a Macintosh, how subdirectories work than to try to explain it from a DOS prompt. But why do we care about subdirectories? They're a place to store our stuff, of course, but modern OSes also think the OS componentry can be safely scattered throughout the hard drive. (To a newbie, there is no fundamental difference between an OS and an application - when they click on the icon, it brings up the thing where they can type their letter to Mom; that's all they know or care about.) The original Mac had this reasonably well managed - your files and the applications they went with weren't all lying loose in the same directory. Modern OSes don't even bother trying. I believe there is NO REASON WHATSOEVER that the executable of an application should show up in a file requester when you're going to load or save a document in that application. But this leads directly into the next issue: application centricity.
Look at Windows 3.1 for an example of how not to do things. Where's your term paper? Uh... here, lemme go into File Manager and look for it. Or better yet, I'll go to Word and load it - that's what I wrote it in. What, you can't just go double-click on the term paper itself? It's not in a program group? No, not unless I go into File Manager and drag it out manually. Windows 3.1 makes a deliberate attempt to separate users from their data - by forcing you to pick an application first. Since File Manager wasn't the default shell, going into it was like going into an application - and it was too technical for most people to use - so it was easier to just go straight to Word or Excel and load documents from the program that created them. This is utter bull - the user just wants to write a letter, and doesn't really care what program they have to use to do it. Microsoft sees it the other way: they want you to run Microsoft Word, and don't care whether you actually write anything in it or not. It's pandering in the extreme - your "personal computer" ends up containing nothing but billboards for software companies: Word, CorelDraw, Photoshop, etc., while your documents, the things you actually BOUGHT the computer for, are nowhere in sight.
Windows 95 sorta fixes this - in that you can actually SEE your files. Note however that it does NOT add your documents to the Start menu by default - much like Windows 3.1, the most visible things in the system are the applications, not the documents.
Lest we get too haughty here, poorly written Amiga apps are just as bad - and the Amiga has no desktop. I've "created" one for mine using ToolManager - pieces of stories I'm writing are on a dock in plain sight - but it's something of a pain to add new chapters. ToolManager is powerful but not easy to use - it could use some improvements in how you configure it, and 3.0 is worse than 2.1. We won't even mention Imagine and how it copes with projects as subdirectories. The Mac has acquired the bad habit of trying to put your files in the folder with the application - but makes it relatively easy to put your files elsewhere, or in a "Documents" folder. The Mac also distinguishes between applications and documents, and knows which application owns a file, so in a file dialog you will usually only see folders and the documents your application will load, unless you specify otherwise.
The modern buzzword of "component architecture" has so far yielded nothing but fruitless attempts at an applicationless, modeless electronic workspace. The Newton comes closest - in that you don't really have to shift gears between applications and there is no file system to argue with - but it's not there yet. OpenDoc tried - but it wasn't it either.
Consider for a moment a piece of paper as an operating system. It's more versatile than we in the digital age realize - we look at a piece of paper and consider that it isn't 3-D, it has no scrollbars, and doesn't animate. But it's interactive. It has infinite resolution. It is portable. It will do animation if you have more than one sheet. It has an easy-to-use interface - provided that you are capable of operating a pencil or pen. It's resource-efficient - truly a "thin client" - and can be premade to any size or thickness. It also supports 3-D applications through a folding process - but I concede that's its major weakness. It also supports erasure only with certain types of data - pencil data can be erased, but pen data usually cannot. Its file system is undefined - usually it involves more paper. It is limited to the edges of the paper - though you can use the back. And its font handling isn't all that great - no matter what you're writing, it always tends to look like handwriting.
But it has a richness of data types, and integration between those data types, that's unmatched. Pencil and pen can be intermixed and have text and graphics overlay each other - or you can get a piece of paper that already has something on it, and add pencil and pen to it. It's network-ready - provided you have an envelope and a stamp. It can be bound to other pieces of paper into giant meta-documents known as "books" - and depending on the content of those pieces of paper, the book can either be a sequential document, or an animated flipbook, or some combination of the two. It supports limited hypertext facilities - if you've ever read a "Choose Your Own Adventure" book you know what I mean - and indexing in paper books is, well, workable. Best of all, your data can extend to the edges of the paper if need be - no title bars to get in your way. CPU speed is not even an issue due to the design of the system, and it requires a minimum of hardware.
Some of this is semi-joking. But think for a moment about the notion of an applicationless OS. Instead of monolithic Words and Excels, you manipulate data types in an arbitrary "page". OLE and OpenDoc try to do this - but for whatever reason, your data types end up being rectangular and dependent upon the host application to appropriately arrange them in a document. In other words, an Excel spreadsheet OLE'd into a Word document will behave like any other graphic inserted into Word, and be subject to the usual Word placement rules.
I've recently learned the Xerox STAR had exactly such an applicationless structure. In fact it had applications, but they merely served as intermediaries between you and the page: a word processor would kick in when you wanted to type, a draw program would kick in when you wanted to draw, and it was all seamless in the end. I'm amazed that after all this time, and all the things the industry has stolen from Xerox, one of its most beautiful ideas has been ignored.
I'd been kicking around "paper OS" ideas for years - wondering why no one had tried it - before learning that Palo Alto tried it and was ignored. I think it's a mechanism whose time has come, don't you?
The trick is, who's gonna make it?
Apple tried to incorporate such a mechanism into the Mac OS - as OpenDoc, it was just a way for an application to contain other applications. Windows OLE takes a similar approach - it's still Windows, and you still have to go through Windows, Word, and Excel to do something simple like typing a one-page memo and adding a chart of this week's sales. Simple things should be simple to do, sez Carl. Componentware thus far has not been simple. In fact the ONLY pervasive and thin object architecture I've seen is the World Wide Web - and only because I'm a Web developer do I find it easy to embed objects of my choosing side by side.
Be and Microsoft are both investigating "active desktop" approaches - Be with "replicators" and Microsoft with Internet Explorer 4.0 - where application components can live in an active layer on the desktop, live and interactive. This is a step in the right direction - but unfortunately it still requires a round-trip to the Dock or the Start Menu to get the application you'll need to create the content in the first place. In Microsoft's case, the general approach is that the user won't even WANT to create their own desktop content - so Microsoft handily provides lots of ActiveX stuff with Explorer 4.0 that puts CNet headlines on your desktop. Nice trick, too - you can drag the headlines object BEHIND icons on your desktop - but hell, ZedWB can accomplish similar visual effects on an Amiga, and all you get are CNet headlines and whatever anyone else has written in the way of "headlines on your desktop" components. It presumes you're always on the Net - no good for dialup users. You can put Web pages on your desktop, but that's not the same as a component architecture, because it doesn't cater at all to documents you have on your hard drive. It amounts to little more than advertising - a newspaper embedded under the surface of your kitchen table.
Screw this approach. You turn the computer on and you get a blank white screen with a floating tool palette where every tool is clearly labelled in text. Draw with the mouse and lines appear - and a menu pops up at the side of the screen with options like Line, Circle, and Eraser. Type, and the menu is replaced by text-editing options. Want more power than your text tool gives you? Install optional components - like "draw text in a spiral" or "synaptic AI grammar checker". Software pieces could appear anywhere in the toolbar scheme - always clearly labelled - and you could arrange toolbars however you like, or hide them altogether and let the machine assume that if you type, you must mean this component, and if you scribble, you must mean that component.
Things can extend up from the page - 3-D objects can be contained in the page, or can contain pages. Multiple pages can make up a page - or a book within a page - or an animation within a page - or an animation within a book within a page on one facet of a 3-D object. Imagine a suitably equipped 3-D "workspace" with VR goggles and gloves, where you're working on several documents and you have them "posted" on the surfaces of a big column that you can spin. The component architecture treats everything as objects, and everything is nestable - it's up to applications to do the object rendering and editing.
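To make that concrete - and this is me sketching on a napkin, not writing a spec - the heart of such a system is nothing more than a tree of typed objects, where every node carries its nested children and points back at whichever component knows how to render and edit it. (That's also where the "typing means text, scribbling means ink" routing would live.) Every name below is invented for illustration:

/* paperos.c - sketch of a nestable "paper OS" object tree.
 * All names (PaperClass, PaperNode, etc.) are hypothetical.
 */
#include <stdio.h>

struct PaperNode;

typedef struct PaperClass {
    const char *name;             /* "page", "ink", "anim", "book"... */
    void (*render)(const struct PaperNode *n, int depth);
} PaperClass;

typedef struct PaperNode {
    const PaperClass *cls;           /* the component that owns this object */
    const char *content;             /* stand-in for the component's data   */
    const struct PaperNode *child;   /* first nested object, if any         */
    const struct PaperNode *next;    /* next sibling in the same container  */
} PaperNode;

/* One generic renderer; a real system would have one per component. */
static void render(const struct PaperNode *n, int depth)
{
    printf("%*s[%s] %s\n", depth * 2, "", n->cls->name, n->content);
}

static void render_tree(const PaperNode *n, int depth)
{
    for (; n != NULL; n = n->next) {
        n->cls->render(n, depth);          /* dispatch to the component... */
        render_tree(n->child, depth + 1);  /* ...then whatever's nested    */
    }
}

static const PaperClass page = { "page", render };
static const PaperClass ink  = { "ink",  render };
static const PaperClass anim = { "anim", render };
static const PaperClass book = { "book", render };

int main(void)
{
    /* An animation inside a book inside a page, next to a scribble. */
    static const PaperNode flip  = { &anim, "flipbook of a bouncing ball", NULL, NULL };
    static const PaperNode leaf  = { &page, "page 1 of the book", &flip, NULL };
    static const PaperNode tome  = { &book, "a book posted on the page", &leaf, NULL };
    static const PaperNode draw  = { &ink,  "pencil scribble", NULL, &tome };
    static const PaperNode sheet = { &page, "the page you see at power-on", &draw, NULL };

    render_tree(&sheet, 0);
    return 0;
}

The point of the exercise: nesting comes for free once everything is the same kind of object, and no node needs to know or care what its children are made of.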
Clearly such an OS would never fly commercially - where would a company stick its brand name?
At any rate, this is all a wonderful pipe dream but has a fundamental flaw: the business at hand is to try to rescue the Amiga from the pit of obsolescence. And invariably when I hear people talk about future Amigas, it always looks like a modern (aka 1992) Amiga except with a RISC processor, a bugfixed OS with a nicer look and crash protection, built-in 24-bit graphics, and the usual wish list of "better pen mapping, datatypes that don't use Chip, a 'new directory' button on ASL requesters, an 'Are you sure?' when you hit RomWack from the Debug menu by mistake, etc." Oh, and it has to run all Windows and Mac software for market-share reasons, not to mention all Amiga software from Boing onward. Nobody ever even THINKS about fundamentally rethinking how we do things. Obviously current Amiga applications wouldn't port too well to a "paper OS" - but then, Wordworth runs just fine on the Amiga you have now. Why upgrade if you're happy with what you have? And that's a good question - why the rush to the future when there's no guarantee it'll make it any easier to do what you already do?
Or to bring us back on topic, will a 200MHz processor bring you happiness? Only if you're a 3-D modeler on a 68020 Mac running System 7.5. If you're a writer, you've probably found computers have gotten SLOWER over the last 10 years - applications get bigger and more complex, and OSes get bigger and more complex, so that where in 1987 you could turn on your CoCo and load VIP Writer in a minute, today the trip from power-on to Microsoft Word takes three or four minutes, with no guarantee that Word will do things VIP Writer couldn't. (Stack Telewriter-64 up next to Microsoft Word on a Pentium 75 - this is progress?) (What? You don't remember Telewriter?)
So why upgrade at all? Two reasons:
1. To do things your current machine is simply incapable of doing.
2. To do the things you already do better - or all at once.
This is why I went from the Color Computer to the Amiga in the first place - in four years I have not found an Amiga word processor I loved as much as I loved Colorware's Max-10, but the Amiga would allow me to do things the Color Computer was simply incapable of even attempting - raytracing, for example. Today I can surf the Net at 33600 baud, print gorgeous pages on an HP inkjet, play MPEGs and QuickTimes, model and raytrace, and play MODs in four-channel stereo - any one of which would send that poor little 2MHz 6809 in the CoCo screaming all the way to the bathroom - and I can do them simultaneously. I gave up some functionality - a ROM-based uncrashable programming language, for example - but it was worth it for what I've gained.
For any new computing platform to be worth buying, it must offer exactly that choice to us: keep using Word, or learn something so much more powerful and flexible you'll never miss it. The issue ceases to be software compatibility and becomes HUMAN compatibility. And quite frankly, the majority of humans I've met in my life are incompatible with Windows 95. I think these people would be better off with a paper OS, mainly because there'd be a much smaller learning curve - while you could learn a Macintosh by watching someone else for five minutes, you could learn the fundamentals of a paper OS by tinkering for five minutes. There's a learning curve, but there's no reason it should be an exponential one. There's a market waiting to explode - selling computers to people who don't like computers - and Microsoft isn't doing it. Apple isn't doing it. Be isn't doing it.
Which leaves... Gateway.
Friends, we are all now Gateway users. Gateway knows we're out here, a community full of the most loyal computer users the world has ever seen. We stuck by Commodore for a decade longer than they deserved - and have stuck by the Amiga five years after the last new model was introduced. New software on Aminet for an "obsolete" and "dead" platform is of a higher quality than what usually appears on Simtel and Info-Mac - and much of it's freeware. You will NEVER find something on the level of ToolManager or Ace on Simtel - well, Pov-Ray is free on every platform, but it began on the Amiga as a little thing called DKBTrace and was free even then. Amiga owners are varied - and (usually) highly intelligent, if a bit opinionated and dare I say stubborn. (You HAVE to be to put up with what we put up with! Buying spare proprietary parts for a discontinued obscure peripheral for a discontinued obscure computer is not a picnic!) But most importantly, Gateway, by purchasing the Amiga and by the statements it has made, has told the world it's not satisfied with the current state of the industry.
Everything we're trying to tell Gateway now is short-term. Fix the Amiga OS - yeah, it might take a while, but we should consider that short-term. It will take a year to bring the Amiga OS to the level it should have been at in 1995 - same story with the hardware. Some of the more advanced Amiga ideas I've heard push the projected release date well into 1999 - which doesn't sit well with me. In five years, when we look back on the time since Gateway bought the Amiga and the Industry Council was formed, do we want to be able to say "We created the future of computing," or "We brought the Amiga OS up to date"?
The spirit of the Amiga was the notion of moving up a level. Sure, you could do all the things you expected a computer to do - even run Microsoft Basic if you wanted - but there was SO MUCH MORE. The Amiga revolutionized the world of computer graphics twice - once in 1985 by proving photorealistic graphics could be had for $1300, and again in 1990 with the Video Toaster proving that computers and video weren't all that incompatible after all. I have to wonder now whether the Amiga has exhausted its list of industry firsts - whether it's done all the revolutionizing it intends to do.
The Amiga's underlying OS is the one best suited for a paper OS - more so than any other modern OS. It's FULLY multitasking - how many other OSes can you name that can simultaneously format two partitions on the same hard drive PLUS a floppy disk? - and while the existing OS doesn't support memory protection or crash protection or multithreading, Gateway has the source code and can add them far more easily than Phase 5 or ProDad can. Intuition already provides a lot of layering support, movable objects, and varying kinds of transparency; it treats graphical data as objects. All that's needed is a reworking of the outermost levels of the GUI - the addition of some component-support libraries, new ways to open applications that don't involve creating a window, new file-handling mechanisms so the underlying file system is as abstract as possible, etc. The core OS - which NO ONE ELSE HAS - is already done, and it works better than anything else on the planet. Recompile Exec. Rewrite parts of Graphics and Intuition. Toss Workbench and put the Paper Operating System in its place. The resulting OS could actually run on 68K hardware! And plug into a TV!
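And if that sounds hand-wavy, consider how little would have to be invented at the bottom: components talking through Exec message ports is bread-and-butter Amiga programming TODAY. Here's a sketch of one side of a hypothetical component protocol - the PaperMsg structure and the port name are made up, but CreatePort(), WaitPort(), GetMsg(), and ReplyMsg() are the real, existing API:

/* component.c - sketch: a "paper OS" content component as an
 * ordinary Exec message port. The protocol is invented; the
 * plumbing is stock AmigaOS.
 */
#include <exec/types.h>
#include <exec/ports.h>
#include <proto/exec.h>
#include <clib/alib_protos.h>   /* CreatePort()/DeletePort() from amiga.lib */
#include <stdio.h>

#define PM_RENDER 1
#define PM_QUIT   2

/* Hypothetical message a host page sends to a content component. */
struct PaperMsg {
    struct Message pm_Msg;   /* standard Exec message header       */
    ULONG pm_Action;         /* PM_RENDER, PM_QUIT, ...            */
    APTR  pm_Object;         /* the nested object being acted upon */
};

int main(void)
{
    struct MsgPort *port;
    struct PaperMsg *pm;
    BOOL done = FALSE;

    /* A public port any other component could find with FindPort(). */
    if ((port = CreatePort("paper.text", 0)) == NULL)
        return 20;

    while (!done) {
        WaitPort(port);                     /* sleep until someone calls */
        while ((pm = (struct PaperMsg *)GetMsg(port)) != NULL) {
            switch (pm->pm_Action) {
            case PM_RENDER:
                printf("rendering object at 0x%lx\n", (ULONG)pm->pm_Object);
                break;
            case PM_QUIT:
                done = TRUE;
                break;
            }
            ReplyMsg((struct Message *)pm); /* hand the message back */
        }
    }

    DeletePort(port);
    return 0;
}

No new kernel, no new scheduler, no new IPC - Exec already does the hard part. The work is all in the libraries and the GUI on top of it.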
But we must give up compatibility.
And we must give up Workbench.
I don't know all the answers here. Clearly I don't have all the elements of the Paper OS thought through. But these are surmountable problems - and all it'll take is someone willing to expend the effort and hire the necessary people to surmount them. Microsoft won't - they've already tried with OLE, and trying again the way I've outlined it here would mean giving up Windows. (Windows CE is proof they'll never do that.) Apple can't - they've invested $400 million in an insanely great company with insanely great ideas, but are now in the middle of the just-good-old-fashioned-insane project of porting the Mac OS to UNIX. (The Blue Box they're so proud of? It's basically Shapeshifter - a complete Macintosh emulator running inside a Mac-like NeXTSTEP.) They're spinning Newton off into its own company - and while Newton's the direction in which we're trying to go, it's still a bit too quaint, a bit too 1989, and a bit too Apple for most people's tastes - and you need several hundred bucks and a good pair of eyes to get in the door. Be might - but they're too busy right now trying to bring the underlying OS up to snuff to even begin to think about reinventing the user interface.
What I DO know is, the computer industry has stagnated. Exactly two fundamentally new desktop computers have been released in the last 10 years - both made by former Apple execs. Every OS available today looks like the Macintosh in some form or other. No one has a clean componentware solution. And while everyone's talking about $500 computers, no one I know has $500 to spare for a "stripped down" edition of a desktop. The whole world has formulated its idea of what a computer is and does - and thus it will never grow. Microsoft has as much as admitted it in recent ads - it's almost hilarious how they've realized that the notion of a computer that isn't $1500 and doesn't have an Intel processor is actually a threat to them, and how they're trying to counteract it with ads proclaiming the wonders of "personal computers." (If it's so personal, why does it take a Ph.D. to make it say something other than My Computer?) $500 computers cost $500 because we think they must be a subset of a $1500 one. Wrong - a $100 computer could be made that implements the "paper OS" I described, plus a modem and enough innards to decode Web pages. CPU? 68030. OS kernel? Amiga Exec. Display? Conventional television. Or add another $75 for an LCD panel - backlit monochromes aren't as expensive as you'd think - make it touch-sensitive and ship it with a pen. Don't tie it too closely to the pen-driven architecture or it'll just be another Linus or Momenta - but don't exclude it either. Best of all, ship a $500 "cool toy" setup using the Paper OS in a 3-D incarnation with a set of 3-D goggles and gloves - and of course, a position-sensitive stylus that writes in the air.
Suddenly technology is exciting again.
Meanwhile, if you're using Netscape, I have just downloaded a file from your hard drive called "betchacan'tfindme.text" and it contains the word "zeeble" 5000 times. Send a representative with a bag of money to Denmark and I'll let you have the file back. :-)
No, I don't know what to think of that. Netscape's arrogant as all get-out - even Steve Jobs looks at Netscape and says "sheesh" - but at the same time, this group in Denmark is being REAL hush-hush about the technical details of the security hole, and it's entirely possible it doesn't exist and Netscape won't find out until they hand over a bunch of money. I don't even know who to root for - sorta like Microsoft versus Ticketmaster.
(For those with no clue what I'm talking about, a team in Denmark found a hole in several versions of Netscape Navigator that allows a Web site to extract data from a user's hard drive - if they know the specific filename. They're holding the information for ransom - the $1000 "reward" Netscape offers for finding security bugs wasn't enough - they want a Netscape representative to show up in Denmark with lots of money and then they'll explain what they found. The hole is obscure and most users aren't at risk - if Netscape doesn't even know how to reproduce it, it's doubtful anyone else will either.)
And speaking of Ticketmaster, I bought tickets for Lilith Fair when it reaches Indianapolis - and have decided Ticketmaster must be destroyed. It has outlived its usefulness. Sections A, B, and C at Deer Creek Amphitheater were completely sold out inside two minutes - and I find it hard to believe they all went to fans. I'm ready to declare war - just as soon as I assemble an army and battle plan. Who's game?
So on that note, I'm gonna leave the Amiga off for most of the weekend. It's useless to me without a hard drive - I won't go back to floppies, the data I need is on the drive - and I don't wanna put any more miles on that Seagate than I absolutely must. If anyone has any ideas on that weird Conner behavior, lemme know.