"FeedStation allows you to download enclosures that appear in your NewsGator Online or FeedDemon feeds. " That's nice. Sounds like I can automate downloads for any kind of file...
Thinking beyond podcasting: I've got a couple of applications on the web (like this weblog) for which I can do automated (local) backups (via a shell script run from a cron job), and currently I copy them over to my Windows laptop manually - from time to time, whenever I remember to.
How about setting up an ultra-simple PHP script producing an Atom (or RSS) feed with enclosures pointing to the files in that backup directory?
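The post imagines a PHP script; here's the same idea sketched in Python, just to show how little a feed with enclosures needs. The directory path and base URL are made-up assumptions - point them at your own backup directory and the URL it's served under.

```python
#!/usr/bin/env python3
"""Sketch: build an RSS 2.0 feed with one <enclosure> per backup file.

Assumptions (hypothetical): backups live in BACKUP_DIR and that
directory is served over HTTP under BASE_URL.
"""
import os
from email.utils import formatdate
from xml.sax.saxutils import escape

BACKUP_DIR = "/var/backups/weblog"        # assumed local backup directory
BASE_URL = "http://example.org/backups"   # assumed public URL for it

def build_feed(backup_dir=BACKUP_DIR, base_url=BASE_URL):
    items = []
    for name in sorted(os.listdir(backup_dir)):
        path = os.path.join(backup_dir, name)
        if not os.path.isfile(path):
            continue
        stat = os.stat(path)
        url = escape(base_url + "/" + name)
        items.append(
            "<item>"
            f"<title>{escape(name)}</title>"
            f"<guid>{url}</guid>"
            f"<pubDate>{formatdate(stat.st_mtime)}</pubDate>"
            # the enclosure is what FeedStation & co. actually download
            f'<enclosure url="{url}" length="{stat.st_size}" '
            'type="application/octet-stream"/>'
            "</item>"
        )
    return (
        '<?xml version="1.0"?><rss version="2.0"><channel>'
        "<title>Backup feed</title>"
        f"<link>{escape(base_url)}</link>"
        "<description>Latest backup files as enclosures</description>"
        + "".join(items)
        + "</channel></rss>"
    )

if __name__ == "__main__":
    if os.path.isdir(BACKUP_DIR):
        print(build_feed())
```

Run it from the same cron job that creates the backups and write the output next to them; any enclosure-aware aggregator can then pull the files down automatically.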
When I think about this, shouldn't automatically getting the latest copy of my favourite software be as simple as subscribing to its "download"-feed (with enclosures)? (And a naive question, since I haven't really thought about this - what's the difference between "photocasting" and automatic enclosure downloads from image feeds into my picture directory?)
Fri, 21 Apr 2006 16:43:40 +0200
While working on our new web application DC5, a total rewrite of its predecessor DC4, it occurred to me that there are five ways of looking at a web application - five interfaces that matter to its users.
Getting any of these right, making it beautiful, is an art in itself. Getting all of these right seems to be the ultimate goal for a web application, and should result in a wonderfully versatile and transparent piece of software:
- The web interface: Naturally, the GUI (graphical user interface) will get a lot of attention while you're developing your web application. Functionality, design, usability, security, and browser compatibility are the important things to consider. Often there are two web interfaces, one for administrators and one for end users. Expect to see lots of web applications exposing a beautiful HTML user interface while neglecting some of the following interfaces -
- Web services: Today, people start to expect that your web application will provide network access to its data in the form of XML over HTTP (be it SOAP, XML-RPC or REST). This is a good thing - integrating data from multiple sources and locations is becoming so much easier once your data has URLs and a well-defined XML format. Seems to me that choosing protocols, URLs and formats takes some time...
- Command line interface (CLI): Probably the most neglected interface of web applications. Imagine you're an administrator and have to import lots of data into the web application. You won't want to do this through a web page, nor through a web service. Or you want to do a batch update, and the modifications you'd like to apply would be easy to express in Perl, or in XSLT: a good web application will provide you with command line executables or scripts, allowing you to process its data with standard Unix tools. You have the most power at your fingertips when you're at your server's command line!
- Application Programming Interface (API): Even if you weren't planning for users to interact with your web application from within a programming language, you would still need to develop and maintain a consistent API for yourselves (the application developers) - you use it to implement the three interfaces above. Once you have such an API, you may as well document it and advertise its use to others. Encouraging users to program add-ons using your API may, at best, result in a community contributing valuable code to your application. And at the very least, users who know your application's programming language will be less likely to be driven away by missing features.
- Data structures: What will happen if you throw away all your code, keeping just the data of your web application? Will there be a meaningful directory or database layout, easily accessible and comprehensible by outsiders, or by yourselves two years later? It will be even better if your users know which data they may modify directly, without using your application; this would bring the power of SQL (or find/grep/sed) to them.
Thu, 16 Dec 2004 16:30:30 +0100
Wed, 13 Aug 2003 13:39:00 +0200
I'm using Instant Messaging for (second level) tech support more and more, and it's great for this purpose:
Interaction is just so much faster than with e-mail exchange, and it's not dragging my attention as far away from my current work as phone calls do. I can send text, URLs, files...
Two things we're used to having with plain old telephony still seem to be missing, though:
How can I set up an IM account that points to a group of people and connects me to the first one willing to accept a message? (Which would be just the way help desk phone numbers work.)
Secondly, how can I put someone through to another IM account when I find out - during the conversation - that there's someone better able to answer the question? I can do this easily with the phone. Well, there is an IM workaround: but inviting someone to a group chat and then leaving the chat is just not as intuitive (or am I just stuck in a telephone perspective?)...
Sat, 19 Jul 2003 14:04:55 +0200
... are a lot more fascinating to me than virtual communities are.
There are so many that I'm already a part of: my family, my church, friends, my neighbourhood, etc.
I'd really like (closed) websites for all of them, with chat, forum, photos, family trees!
Wed, 28 May 2003 08:02:31 +0200
This should be possible even without patching Samba:
Let's scan a directory (including subdirectories) for changes. The user must put files (not symbolic links) into this directory.
Once a new file comes in, it is moved into another, invisible directory (make sure you don't move it away while it's still being written). RCS versioning is applied to the file, and a symbolic link is created in the original location. The user can now continue reading the file.
When the symbolic link is written to through Samba, the link is replaced by a file. We can detect this, check in a new RCS version, and recreate the link. If the linked file is changed directly (through local access), we can detect this as well.
Deleted files can also be detected, and we can move the RCS file into a trash directory.
Moved files can be identified using a file size/MD5 checksum index.
An automatically created subdirectory ("Versions"?) contains .url files (or static HTML files?) for version management: Comments can be added to files and versions, you can restore or download a previous version (or even do a diff). And you can peek into the trash directory here to restore or erase some or all of the files it contains.
All of this would work without a database. Probably only suited for single or a few users, and for not-too-frequent changes.
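The scan-and-symlink step above can be sketched in a few lines of Python. This is only an illustration of the scheme, not a finished watcher: the version-control hook is passed in as a callback (in the post's scheme it would run RCS `ci -l` on the shadow copy), and all paths are assumptions.

```python
"""Sketch of the scan step: find new regular files in a watched tree,
move each into a hidden shadow directory, hand it to a version-control
hook, and leave a symbolic link behind in the original location."""
import os
import shutil

def scan_once(watch_dir, shadow_dir, checkin):
    """One pass: version every new regular file, replace it by a symlink.

    `checkin` is the hook that versions the shadow copy - e.g. a wrapper
    around RCS `ci -l`. Pass absolute paths for both directories.
    """
    os.makedirs(shadow_dir, exist_ok=True)
    versioned = []
    for root, _dirs, files in os.walk(watch_dir):
        if os.path.commonpath([root, shadow_dir]) == shadow_dir:
            continue  # never descend into the shadow tree itself
        for name in files:
            path = os.path.join(root, name)
            if os.path.islink(path):
                continue  # already versioned on an earlier pass: skip
            target = os.path.join(shadow_dir, os.path.relpath(path, watch_dir))
            os.makedirs(os.path.dirname(target), exist_ok=True)
            # caution: a real watcher must first check the file is no
            # longer being written (e.g. stable size/mtime across scans)
            shutil.move(path, target)
            checkin(target)            # version the shadow copy
            os.symlink(target, path)   # user keeps reading the old name
            versioned.append(path)
    return versioned
```

Detecting a Samba write is then the inverse check: a path in the watched tree that is a regular file again instead of a symlink means the client replaced the link, so the new content gets checked in and the link recreated.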
Wed, 29 Jan 2003 09:24:12 +0100
I really love that joelonsoftware.com and useit.com regularly publish free articles on specific subjects.
How about writing articles myself? On the common denominators of information science / knowledge management / information architecture and IT / software...
And on new approaches to information presentation in software...
Wed, 29 Jan 2003 09:01:02 +0100
Installed courier-imap-126.96.36.19921124 for testing "fake e-mail accounts".
Installation instructions: http://www.inter7.com/courierimap/INSTALL.html#userdb
Now I can simply put RFC822 text files into maildir/new, and they appear in the IMAP inbox (or in the POP3 inbox)!
Use this for push profiles (SDI), e-mail archiving (the user can simply drag&drop mails into this folder using his mail client), ...
User accounts are defined in /etc/userdb, need programs in /usr/lib/courier-imap/sbin (userdb, userdbpw, makeuserdb) for setting them up.
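Dropping a message into maildir/new is simple enough to script. A minimal sketch in Python, assuming a maildir path of your choosing - the standard library's `mailbox.Maildir` handles the tmp/new rename dance that the maildir format requires:

```python
"""Sketch: deliver one RFC 822 message into a Maildir so it appears in
the IMAP (or POP3) inbox. The maildir path and addresses are made-up
assumptions for illustration."""
import mailbox
from email.message import EmailMessage

def deliver(maildir_path, subject, body, sender="archive@localhost"):
    """Write one message into the maildir (creating it if needed)."""
    msg = EmailMessage()
    msg["From"] = sender
    msg["To"] = "user@localhost"     # assumed recipient
    msg["Subject"] = subject
    msg.set_content(body)
    md = mailbox.Maildir(maildir_path, create=True)
    return md.add(msg)  # returns the key (file name) of the stored message
```

Anything that can produce text - a push profile, an archiving script, a cron job - can use a call like this to "send" itself into the fake account.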
Thu, 28 Nov 2002 15:41:07 +0100
Somebody on the German PHP mailing list recommended using fetchmail/procmail for fetching mails automatically from a POP3 server and putting their contents into a database.
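The same pipeline can be sketched in Python with the standard library instead of fetchmail/procmail: poll the POP3 server, parse each message, and file it into SQLite. Host, credentials, and the one-table layout are assumptions for illustration.

```python
"""Sketch: fetch mails from a POP3 server and store them in a database,
mimicking a fetchmail/procmail pipeline. All connection details and the
table layout are hypothetical."""
import email
import poplib
import sqlite3
from email.policy import default

def store_message(conn, raw_bytes):
    """Parse one raw RFC 822 message and insert sender/subject/body."""
    msg = email.message_from_bytes(raw_bytes, policy=default)
    body = msg.get_body(preferencelist=("plain",))
    conn.execute(
        "CREATE TABLE IF NOT EXISTS mail (sender TEXT, subject TEXT, body TEXT)"
    )
    conn.execute(
        "INSERT INTO mail VALUES (?, ?, ?)",
        (msg["From"], msg["Subject"], body.get_content() if body else ""),
    )
    conn.commit()

def fetch_all(host, user, password, conn):
    """What fetchmail does: retrieve and delete every message on the server."""
    pop = poplib.POP3(host)
    pop.user(user)
    pop.pass_(password)
    count, _size = pop.stat()
    for i in range(1, count + 1):
        _resp, lines, _octets = pop.retr(i)
        store_message(conn, b"\r\n".join(lines))
        pop.dele(i)   # remove from the server once stored
    pop.quit()
```

Run `fetch_all` from cron against a dedicated mailbox and you get exactly the BCC: archiving idea below - every mail blind-copied to that account ends up queryable in the database.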
E-mail archiving through BCC:?
Thu, 12 Sep 2002 14:57:28 +0200