I feel grateful for the efforts of those who designed the excellent programs that I use.
My text processor is DeskMate Text, from 1990 and 1992. In my opinion, word processors have gotten worse rather than better and now involve much more fussing than they are worth. With DeskMate, I can use my own personal macros, a calculator, and the other features of a GUI. I use DeskMate in the DOS operating environment; that is, either MS-DOS 3.3 on my Tandy FD 1100 or PC-DOS 2000 on my Libretto. DeskMate is no longer offered for sale. I really enjoyed using an OS/2 text editor that allowed me to use REXX macros, but OS/2 is no longer fully supported.
System Commander allows me to keep multiple operating systems on one computer.
After I write my page, I modify it with a REXX batch file which I wrote, called DOC2HTM. The batch file turns the text file into an html file, and adds such features as paragraphs, blockquotes, red, oversized first letters, line breaks, and titles. The REXX processor is a part of PC-DOS, so I use it within the DOS environment.
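I won't reproduce the REXX code here, but the general idea can be sketched in a few lines of Python (a hypothetical illustration of what such a converter does, not my actual DOC2HTM; the real batch file also handles blockquotes, line breaks, and more):

```python
def doc2htm(text, title="Untitled"):
    """Convert plain text into a simple html page: each blank-line-
    separated block becomes a paragraph, and the first letter of the
    first paragraph becomes a red, oversized drop cap."""
    paragraphs = [p.strip() for p in text.split("\n\n") if p.strip()]
    body = []
    for i, para in enumerate(paragraphs):
        if i == 0 and para:
            # Red, oversized first letter, as described above
            para = ('<span style="color:red;font-size:200%">'
                    + para[0] + "</span>" + para[1:])
        body.append("<p>" + para + "</p>")
    return ("<html><head><title>" + title + "</title></head>\n"
            "<body>\n" + "\n".join(body) + "\n</body></html>")
```

The real program works as a batch file over whole files, of course, rather than as a function over strings.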
To create my graphics, I am now using a different graphics program and different hardware. I was using PhotoDeluxe, but I am now using Adobe Photoshop Elements to convert my graphics for web pages. Although I found this program to be difficult to use when trying to make icons (a process that was abandoned), it quickly converted small bitmapped graphics made with MS-Paint (an awkward and buggy program furnished free with every version of Windows) into first-rate icons. When converting scanned graphics for web pages, I found it very intuitive; it practically taught me what to do, and I am able to crunch an 800 x 600 pixel graphic down to 70 kilobytes, more or less, very quickly (my only delay has been tracking down and removing dust spots, a time-consuming process with old negatives and slides).
My hardware for my graphics task is a Mustek Plug-N-Scan scanner, a Tamarack ArtiScan film scanner, and a Dell Inspiron 3800 laptop. The laptop has been the one poor choice. Although it has removable everything and a CD-RW drive, it has proven itself to be a heavy power user, drawing four times as much power as my Libretto and requiring me to double my solar power expenditures. In addition, it conked out, and it took three weeks on the phone to convince Dell to repair it. When I got it back, one of the keys promptly fell off, although the repairs had included replacing the motherboard and keyboard. I would not recommend that anyone buy a Dell, unless you wish to purchase mine.
The next step is to create a web page. After leaving DOS and getting into that big and slow operating system, I paste my html file into a template, and I use Arachnophilia to make additional changes. Arachnophilia allows me to add tags by using macros or buttons or by typing them in, and its features can be edited. It's a free program, or rather a care-ware program. The author wants the users to do something to benefit others. This website shares his purpose.
The web browser I use to view the text while working on it is the free browser by Netscape; I have versions from 2.x to 6.2. Netscape 6.2 has solved the problems found in 6.1 and 6.0, or if you prefer, you can download the Mozilla version. I use Netscape Messenger for all my messages, and I am pleased I don't have the problems with viruses that people report with that other mail program. I also sometimes use the browser that came with the computer to check my pages after they are written.
To manage my files, I use Take Command and 4DOS. For example, "copy d:\kenkifer e:\kenkifer /su" updates all the files for my website without overwriting any newer files, which is very powerful. These programs are shareware; that is, not free. However, JP Software allows me (one user) to use them on more than one machine.
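For readers without 4DOS, here is a rough Python sketch of what the /su (update) switch accomplishes; the function is my own invention, not JP Software's code:

```python
import os
import shutil

def update_tree(src, dst):
    """Copy a directory tree, skipping any destination file that is
    already at least as new as the source -- roughly what "copy /su"
    does for my website files."""
    copied = []
    for root, _dirs, files in os.walk(src):
        rel = os.path.relpath(root, src)
        target = os.path.join(dst, rel)
        os.makedirs(target, exist_ok=True)
        for name in files:
            s = os.path.join(root, name)
            d = os.path.join(target, name)
            # Skip when the destination exists and is not older
            if os.path.exists(d) and os.path.getmtime(d) >= os.path.getmtime(s):
                continue
            shutil.copy2(s, d)  # copy2 preserves the file date
            copied.append(d)
    return copied
```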
From time to time, I need to make global changes in the site. Using a for-in-do with Take Command, I load all the files from a single directory into Editpad and then have that program make changes that apply to all loaded pages at once. Editpad is an excellent notepad replacement, and the classic version can be purchased with a postcard. I got my copy from Tucows, a great source for freeware and shareware.
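The same chore could also be scripted. Here is a hypothetical Python sketch of a directory-wide replacement (Editpad itself works interactively, of course, and lets me inspect each change):

```python
import os

def global_replace(directory, old, new, suffix=".htm"):
    """Apply one textual change to every matching file in a directory,
    the same job I do by loading all the pages into Editpad at once."""
    changed = 0
    for name in os.listdir(directory):
        if not name.endswith(suffix):
            continue
        path = os.path.join(directory, name)
        with open(path) as f:
            text = f.read()
        if old in text:
            with open(path, "w") as f:
                f.write(text.replace(old, new))
            changed += 1
    return changed  # number of files that were altered
```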
I was using WS_FTP to upload my pages to this website. Recently, Take Command was upgraded to allow internet use, so I now use it to update my website from the command prompt. Besides the commands being more powerful, I don't like drag and drop at all. I consider many of these enhancements to be no better than dirty tricks. For example, I recently intended to hit enter to open a directory and hit the "\" key by accident; the entire directory disappeared!
In using Take Command, I discovered that the /su switch doesn't work dependably over the internet, so I started using the date switch instead of the update switch: "copy /s d:\kenkifer 'ftp:' /[d-20]" uploads to my website all the pages and graphics that have been created or updated within the last 20 days. However, even then, there was a delay caused by the "copy /s" switch trying to copy all the subdirectories.
Thinking about the matter, I decided I wanted to create a batch file that would automatically recognize which files needed to be updated and copy them, and only them, to my site. This is more complicated than it sounds. I thought I would show the file here:
@echo off
iff not exist c:\tc32300\webfresh.dat then & input /c /d /l2 How many days since the last refresh? %%num
else & set num=%@EVAL[%@date[%_date] - %@date[%@filedate[c:\tc32300\webfresh.dat]]]
endiff
dir d:\kenkifer /sb /[!*.log] /[d-%num] > c:\tc32300\webfresh.dat
if %@lines[c:\tc32300\webfresh.dat]==-1 echo All the files are up to date!
for %file in (@c:\tc32300\webfresh.dat) copy %file "ftp:%@instr[11,,%@replace[\,/,%file]]"
Here's what the lines do:
Line #1 makes the actions of the batch file invisible. Line #2 checks for a file named webfresh.dat and, if it doesn't exist, asks me how many days since I last refreshed, looking for a number no more than two digits long. Line #3, if the file does exist, subtracts the date of the file from today's date to find out how many days have passed. Line #4 ends the "if" condition. Line #5 does a directory search of all the files for my website, excluding the log files, finding only the ones that have been updated within the specified number of days, and copies their addresses to webfresh.dat, overwriting whatever information was there. Line #6 simply tells me if there are no files to update. And line #7 copies the files individually to the internet. This last line is a bit more complicated, and the line that I was not sure would work. Each line of webfresh.dat is used twice, first to tell the copy command which file to copy and second -- after getting edited to remove the computer location and to substitute the internet location ("ftp:") along with replacing all the "\" marks in the url with "/" marks -- to tell where the file must be copied to. The web location is quoted to avoid conflicts with DOS.
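For anyone who doesn't run 4DOS, the same logic can be sketched in Python (a hypothetical reconstruction, not a replacement I actually use; the eleven-character offset assumes, as in the batch file, that the site lives under d:\kenkifer):

```python
import os
import time

def remote_name(local_path):
    """Mirror of %@instr[11,,%@replace[\,/,%file]]: swap the slashes,
    then drop the first eleven characters ("d:/kenkifer") and prefix
    the internet location."""
    return "ftp:" + local_path.replace("\\", "/")[11:]

def files_changed_within(root, days):
    """Like `dir /sb /[!*.log] /[d-%num]`: full names of the files
    modified within the given number of days, skipping log files."""
    cutoff = time.time() - days * 86400
    hits = []
    for dirpath, _dirs, names in os.walk(root):
        for name in names:
            if name.lower().endswith(".log"):
                continue
            path = os.path.join(dirpath, name)
            if os.path.getmtime(path) >= cutoff:
                hits.append(path)
    return hits
```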
The sophisticated operations made possible by Take Command and 4DOS (for files) and REXX (for text) certainly allow me to perform difficult tasks easily.
Getting Rid of Spam
I have been getting so much spam and so many viruses recently that I was about to throw in the towel and no longer make my email address public. Many of the messages were addressed directly to me and had no characteristics that filters could stop. To delete them in Messenger, I had to click on them, and many of these spam messages were HTML pages designed to take over my computer, some causing my browser to crash repeatedly. Deleting them before downloading them by using webmail access was slow, and occasionally a new message came in after I pushed delete, causing the wrong items to be deleted. Inspecting new messages to view their content was slow, and some caused problems when viewed as webmail.
Now, I have found a program that adequately addresses these problems, MailWasher. Although the author of this program would like to receive $20, he does not insist. I open this program to check my mail before downloading any mail with Messenger. The program retrieves header information from my mail service, allowing me to mark the messages which are spam, based on the headers and file size. Should I need more information, it will also quickly display the complete header information or a section of the text. Once I have marked a message as spam, all other messages from the same source will be deleted before I see them. The program then deletes all spam and virus messages from my mail server before I download them, and even sends ("bounces") a message back to the spam source, saying that my mail box no longer exists.
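I don't know how MailWasher is implemented inside, but the blacklisting step can be sketched in Python; the function and the blacklist here are hypothetical illustrations of deciding, from the headers alone, whether a message should be deleted on the server before it is ever downloaded:

```python
def should_delete(header_lines, blacklist):
    """Return True when the From: header matches a source already
    marked as spam (blacklist is a set of lowercase addresses)."""
    for line in header_lines:
        if line.lower().startswith("from:"):
            sender = line.split(":", 1)[1].strip().lower()
            # Keep only the address when the header has a display name
            if "<" in sender and ">" in sender:
                sender = sender[sender.index("<") + 1:sender.index(">")]
            return sender in blacklist
    return False  # no From: header found; leave the message alone
```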
I don't know if the bouncing does any good, but I do know that I can clean out fifty spam messages within five to ten minutes, while it used to take much longer.