Thursday, November 15, 2007

3000th download of FreeMiCal


On November 12th, the 3000th user downloaded FreeMiCal. I hope you folks are happy with the tool.

Wednesday, October 31, 2007

I'm not reading mags

I stopped reading magazines years ago for several reasons:
  1. they distract from the issue
  2. their net ratio of information transferred is bad
  3. cost per page of information is beyond acceptable ranges
Let me elaborate on this.

Magazines usually cover just one or two subjects that interest you or help with your current situation. The rest is either unsolicited input or, worse still, advertisement. I don't need advertisement. Who wants to learn about the n-th version of a charting package that can be embedded into your source code, or the umpteenth version management system with pink color coding for source files ending with the letter x? Take away cover, index and imprint, and the information-per-page ratio lies somewhere around 3%.

Even if an article covers an issue right in your focus, the issue is usually large enough to be broken into several articles. Publishers do not want articles to exceed two pages, and they want to sell the next issue as well. So the article consists of 30% intro and repetition, 60% information and 10% links to supporting web pages, half of which are not online (otherwise you would have found them in previous web searches). So the information-per-space ratio lies around 60%.

Your average magazine costs about € 10,- for around 60 pages. That makes 17 cents per page.

Compared to a book with 1,000 pages (e.g. the O'Reilly JavaScript Reference at € 46,99, or about 4.7 cents per page), the per-page price of a magazine is about 3.5 times as high.
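
For the skeptics, here is the arithmetic as a minimal Python sketch (prices as quoted above):

    magazine = 10.00 / 60      # about 0.17 EUR per page
    book = 46.99 / 1000        # about 0.047 EUR per page
    print(round(magazine / book, 1))   # 3.5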

All that said, I found a magazine and an aspiring magazine that are worth reading:

dotnetpro is a magazine that concentrates on .NET software development. Much to my surprise, it also covers Mono (in those much-hated, never-ending multi-issue article threads). On average, I can read 4 to 8 articles in it that interest me, and the quality of the code is more than acceptable.

pythonmagazine is an effort to get a Python developer magazine up and running. Issue 1 can be downloaded as PDF and offers an interesting blend of Python issues.

I am not going to change my mind: I read books, not magazines.

But I will keep an eye on the two.

Monday, May 14, 2007

First steps in Python

In my blog post Python demystified I reflected on computer programming in general and this language in particular.

Ever since then I have been aware of and interested in this language. Remember, I said:
Without any decent IDE (like Netbeans for Java) that allows for graphical programming and UI-design, I strongly doubt that Python will ever gain momentum.
Well, I found a decent IDE: ActiveState Komodo IDE.

Except for a GUI designer, it has everything a decent IDE needs: syntax highlighting, code completion and code folding, integration with version control (Subversion), debugging and profiling.

So, just as I was about to change my attitude and general opinion on Python, the snake's ugly head rose from the depths of my notebook's core:

(Briefly:) Python allows for object-oriented programming. Objects can be created and instantiated. When they go out of scope, they are subjected to the garbage collector for destruction. So far so good.

I tried an example program from a popular Python book - it worked.

I tried to extend the program (for better understanding) - it crashed.

Well, it terminated with an exception.

Further investigation revealed:
Python stores class definitions and object instances in a globally accessible list. When program flow exits the current scope, all objects within that scope are subjected to garbage collection according to this global list (and in the exact order of their appearance within it).

So an instance "wol" may happen to be destructed before the class definition "Person", and everything works. Renaming the object to "wolf" moves it behind the "Person" identifier in the globals list: now there is an object still in memory and valid while its class definition is already destroyed. The destructor of that instance can no longer be called. The code is simply not there any more.
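
A minimal sketch of the kind of program that triggers it, using the names from above (Python 2-era shutdown behavior; the exact clearing order varies between runs and versions):

    class Person(object):
        count = 0

        def __init__(self, name):
            self.name = name
            Person.count += 1

        def __del__(self):
            # Looks up the global name "Person". During interpreter shutdown
            # that global may already have been set to None, so the lookup
            # fails and Python can only print "Exception ... ignored".
            Person.count -= 1
            print("%s deleted, %d left" % (self.name, Person.count))

    wol = Person("Wolf")
    # No explicit "del wol": the instance survives until interpreter
    # shutdown, where the relative clearing order of "Person" and "wol"
    # decides whether __del__ runs cleanly or blows up.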

Is it just me, or how do I manage to find these things on my first day with the language?

Other than that I am fascinated by this elegant and slim language. Worth a try.

Genuine Advantage, my eyeball!

I got a note today that Microsoft is tightening its grip on Windows Vista users. Not only will an invalid key prevent activation; a presumably invalid key will render Windows Vista practically useless.
The new Software Protection Platform, built into Windows Vista, makes the user experience noticeably different between a genuine version and non-genuine version of Windows Vista. When it detects a non-genuine version of Windows Vista installed on a PC, the Software Protection Platform will disable key features of Windows Vista, including the desktop, Start menu, and task bar. Windows Vista functionality will be restricted to the default Web browser for one-hour periods.
The practical implication is that your software will not run. Your browser will be available for one hour, then the PC will shut down.

Well, you might say, serves you right for using pirated keys?

It turns out that Microsoft themselves don't know which keys are valid and which are not. It also turns out that if you use a notebook and stay offline for a certain period of time, the "Genuine Advantage" will strike and leave you with your one-hour browser window.

I could understand Microsoft not updating pirated versions of Windows. I could hardly understand that my product keys were not accepted online while telephone activation was accepted (I purchased 10 copies of Windows, Office and some Server CALs).

But I decline to see my advantage when MY PC shuts down on a business trip because Microsoft thinks my product keys are pirated.

Thursday, April 26, 2007

The war of the open file formats

ODF vs. OpenXML

In his blog [1], Brian Jones, Office Program Manager at Microsoft, claims that the war of the open file formats is over. The reason, he says, is Novell's release of a special version of OpenOffice that can read and write data in Microsoft's new file format OpenXML. In fact, it is merely a stand-alone converter [2] that transforms OpenOffice Writer documents into Word 2007 documents. Is the war of the open file formats really over?

Until now, application software saved its data to disk as an image of main memory. This process is fast and efficient; lengthy translations of the data structures are avoided. But it is also error-prone. Memory images contain cross-references and chains. A single corrupted value during data transfer renders the whole result unusable.

Over the years, the internal data structures became ever more complicated. Initially, documents consisted only of character strings. Later, text formatting, fonts, embedded images and tables were added. With growing complexity, the space required on storage media grew, as did the time the software needed for loading and saving. Each new version introduced incompatibilities with the data of its predecessors. Data migration from other vendors' products was hampered by missing or insufficient documentation of the file formats. Market leader Microsoft in particular defended its position vehemently, both by continually changing its file formats and by prohibiting reverse engineering. Alternative vendors therefore developed import and export filters that allow rudimentary document exchange. Restoring the original appearance, however, requires laborious manual rework.

Why new file formats?

In the long run, problems arose with access to documents and with their readability, usability and comparability. Public administrations in particular need unrestricted access to old and historical documents. For centuries, paper served successfully as an information store. It can only be replaced by electronic data processing if sustainability, confidentiality, security and data integrity can be guaranteed.

Conventional file formats cannot meet such requirements. In parallel with the growing demand from the public sector, cost pressure grew in companies, caused by file format incompatibilities and the resulting inefficiency of workflows.

Long-term viability demanded

Within the OASIS consortium [3], leading software companies developed an open document standard intended to solve the most serious problems: longevity, readability and error resistance. The Open Document Format (ODF) [4] promised an end to the previous incompatibilities between file formats, across both versions and vendors. At over 700 pages, the standard is comprehensive and open to future extensions.

Public administrations and institutions recognized the potential of the new standard. Besides several US states, countries in South America and Europe as well as some Asian countries defined ODF as the mandatory document standard for dealings with the public and for internal processes. Microsoft reacted to these developments at the latest when the Department of Defense (DoD) announced in 2003 that it would make greater use of open source software and open document standards.

Not just a means of data storage

Microsoft holds a large market share in base operating systems, standard application software and document formats. The document formats in particular help Microsoft to control the update cycles of its standard software and operating systems. Incompatibilities between versions mean that, in the medium term, companies are forced into updates through their external contacts if they do not want to lose their electronic connection to their partners. Automated system updates make it easy for Microsoft to tighten this pressure at will. A lasting reorientation of large customers such as the DoD threatens Microsoft's market position at its very foundations.

This defection could only be prevented by providing open standards for file formats. For reasons of market politics, ODF as a file format was out of the question for Microsoft. So in 2005 a new standard was drawn up within ECMA: ECMA-376, or Office OpenXML [5,6], was pushed through on December 7, 2006 despite numerous technical objections [7]. The standard currently comprises more than 6,500 pages plus numerous XML schema specifications and covers the Office applications Word, Excel, PowerPoint and Access [8,9].

(A side note of irony: in his proposal presentation to ECMA [10], Brian Jones refers to an EU document on open document standards and the benefits of their use. The original document [11], however, describes these benefits with explicit reference to ODF.)

What the new formats can do ...

ODF and OpenXML are technically very similar. Both store the various parts of a document as XML files inside a ZIP archive. ZIP is a recognized and widely used compression algorithm. XML is an extensible description language for hierarchically structured data. XML files are larger than binary files of the same content, but thanks to their high information redundancy they compress better. The compressed archives can be written to and read from storage media faster. File contents are not copied directly into main memory but are first parsed and converted into a machine-usable form. The file formats are more error-resistant than their predecessors. Thanks to their open structure, both formats can be post-processed automatically.
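
As a quick illustration of the container structure, a minimal Python sketch (the file names are placeholders):

    import zipfile

    # Both ODF and OpenXML documents are ordinary ZIP archives of XML parts.
    for name in ("example.odt", "example.docx"):
        archive = zipfile.ZipFile(name)
        print(name)
        for member in archive.namelist():
            print("  " + member)
        # The main content lives in "content.xml" (ODF)
        # or "word/document.xml" (OpenXML).
        archive.close()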

ODF and OpenXML allow the embedding of binary fragments as well as script and macro languages. This creates new security risks. Neither format offers sufficient built-in protection against unwanted changes to data contents. Both offer plenty of attack surface for injecting malicious code; preventing this is left to the application programs. Microsoft Office uses the file name extension to decide whether macros may be executed. This protection is easy to circumvent and therefore suggests a security that does not exist.

Both formats guarantee long-term access to data. Their visual rendering, however, depends, like that of their traditional predecessors, on numerous external factors. An identical visual appearance cannot be guaranteed with the current standards.

... and where do they differ?

ODF files are smaller than their OpenXML counterparts. ODF stores content together with its formatting. OpenXML strictly separates text from formatting. That is technically cleaner and, surprisingly, does not lead to longer loading times. Microsoft reserves the option of embedding larger files in a proprietary format as well, to work around potential performance bottlenecks. Incompatibilities are inevitable here.

OpenXML does not build on existing standards. Among other things, new standards were defined for graphics, text, tables, mathematical formulas, country codes and color codes. This bloated the specification. In implementations, it increases the error-proneness of application software and format converters. ODF relies throughout on established standards such as SVG, XML, MathML and standardized ISO codes.

How free is "free"?

ODF and OpenXML are licensable standards whose use is free of charge. Within the terms of its licenses, ODF can be extended by third parties. The Open Document white paper points to nine reference implementations [12]. Source code is available for some of them. Microsoft, by contrast, points to a single reference implementation, Office 2007, which is not open source.

A legal opinion confirms that ODF is unobjectionable with respect to patent law. Sun, the main contributor to the ODF standard, has issued a supplementary covenant not to assert claims. The withdrawal of individual members from the OASIS consortium is clearly regulated, so future claims from former members are not to be expected. Claims from patents that may surface in the future cannot be ruled out completely. OASIS acknowledges this marginal residual risk but sees no solution for it.

Microsoft allows anyone to use OpenXML. Microsoft points to its patents within the OpenXML standard, but also publishes a written covenant not to sue on its website. This covenant is limited to the technologies and patents touched by the standard itself. Patents touched by a proper implementation of the OpenXML specification in application software, as well as possible third-party patents, are not covered by this waiver. Particularly for integration into application software, legal experts see a considerable legal risk that Microsoft has not dispelled.

Both ODF and OpenXML are currently freely accessible, freely usable, free of charge and free of patent restrictions. ODF is more approachable for software developers: the manageable size of the specification allows one to work into the subject economically. OpenXML, at more than 6,500 pages plus numerous schemas, does not exactly invite the interested reader.

Attractive for the public sector

Public administrations and governments demand the use of open file formats for two main reasons:
1. Long-term usability must be guaranteed, even across the most diverse IT systems. This demand comes mainly from offices and public agencies.
2. The strong dependency on software vendors should be reduced and, where possible, local know-how should be used. This demand comes primarily from governments.

Here ODF has a clear head start. The format has existed for quite some time, has been implemented several times and is available in open source form. The standard is supported both by several large software vendors and by a sizable developer community. ODF still has some limitations that hamper its use in the public sector, and its long-term extensibility has not yet been proven. OpenXML cannot point to any substantial advantages that would make its use imperative. However, Microsoft has a broad installed base over which it can exert strong influence through its automated updates.

Until the open problems are finally solved, public bodies can take the first steps towards automation. Necessary additions can follow in later phases of the rollout.

Attractive for businesses?

Businesses operate in shorter innovation cycles. Only a small portion of business-critical information has long-term relevance (contracts, financial records) and must be archived and maintained accordingly. The rest of the working documents have a short half-life.

A fundamental decision on a file format is only possible when a decision on alternative application software is also on the table. Creating ODF files with Microsoft Office (Word, Excel, PowerPoint, Access) is not possible today, and there is no sign it ever will be. Conversely, creating OpenXML files from alternative application software is currently possible only to a limited extent. Since Microsoft's applications are particularly widespread in businesses, the adoption of OpenXML there is at least foreseeable.

Well-known migration projects, such as those of the city of Munich or Lufthansa, are driven largely by strategic, political or idealistic motives. The economic advantages, looking at total cost, are marginal or nonexistent. Who will prevail in the end (public administrations and governments favoring ODF, or businesses using Microsoft with OpenXML) cannot be predicted at present.

The war of the open file formats is therefore not over yet. At best, we are witnessing a short-lived ceasefire.


Bibliography:
[1] Brian Jones's blog
[2] Novell OpenXML Translator
[3] OASIS founding members
[4] Open Document for Office Applications
[5] ECMA-376 - Office OpenXML
[6] Office OpenXML Fact Sheet
[7] Objections to JTC-1 Fast-Track Processing of the Ecma 376 Specification v. 0.1, 27.1.2007
[8] ECMA-376 OpenXML White Paper
[9] Introducing the Office (2007) Open XML File Formats
[10] Start of TC45: Presentation to GA
[11] TAC approval on conclusions and recommendations on open document formats
[12] Open by Design, ODF White Paper

Monday, April 23, 2007

Ubuntu vacation feelings

After long and on-and-off experience with Linux, and Ubuntu in particular, I switched to Ubuntu over Easter.

All the good reviews

If you are interested in praise-only reports, look somewhere else. I ran into so many errors, bugs and strange behaviours that I cannot join the choir of Ubuntu enthusiasts.

Ready for prime time - for some

If you use your computer just for E-Mail, web browsing and the occasional word processing, you will love Ubuntu.

If you watch a video every now and then, you will be excited to see that it can be painless.

but not for all

If, on the other hand, you want to use Ubuntu in a mixed environment with Windows clients and servers, prepare for some surprising incidents.

Gnome provides an interesting approach to file access: gnome-vfs (the Gnome virtual file system). It's an easy-to-use API that allows applications to access remote and heterogeneous file systems. Just mount a volume, drive or directory and access it with any application. That's what it says on the box.

Reality quickly catches up: only a few applications are aware of gnome-vfs and the mounted drives. Nautilus (the Gnome counterpart to Explorer) can access files, and OpenOffice supports gnome-vfs as well. Others don't. And they are not just any applications: Thunderbird and Firefox are among them.

Notebook misery

I run Ubuntu on several notebooks. The basic system will always work. But if you want to use notebook-specific features like touchpads, sleep mode or wireless LAN, prepare for nightly sessions of debugging and error discovery. If your notebook is equipped with exotic peripherals (anything other than a keyboard, a screen and an external mouse will do), you will likely find them not working.

On an HP nx8220 the smart card reader is not recognized, SD cards cannot be mounted, and the machine wakes up from sleep mode with sound amiss.

My HP 510 has a built-in Synaptics touchpad. It is recognized on my nx8220, but not on the HP 510. As a consolation, sound works after wakeup.


Tiny little annoyances

As a professional developer, I am not prepared to ship things that do not work. And it seems pretty clear that some things don't work. So they should not have been shipped.

The proprietary ATI graphics drivers don't go well with video playback. OK, they are turned off by default. Compiz is also turned off by default, and that is just as well, since it conflicts with video playback too.

There is no centralized tool to adjust regional settings. This has to be done in configuration files, logon scripts, Gnome tools and sometimes within the application itself. Thunderbird, for example, does not honour the system-wide font setting. It also ignores regional time formats; you have to set these using environment variables.

Why do I use it?

So, if I am not happy, why did I bother migrating?

Well, I did not say I was unhappy. There are things that really work well. Automatic updates, upgrading to a new version, installation and deinstallation of software: all are more stable and trustworthy than the monopolist's counterpart.

There is no IE installing malware behind my back, no Office update that deletes some of my .NET framework DLLs and most of all, no DRM to tell me what I am allowed to do, see and view.

I have no need to defrag my hard disk or registry, no thrills using low-level maintenance tools. I do my work and that's OK.

If there are issues or missing functionality, I can look under the hood and identify the problem myself. I can contribute to the evolution of the system, and that contribution is valued (as opposed to Microsoft, where reporting a bug will cost you money).

But most of all, I feel like a person who has left his privileged life behind. All the high-tech gadgets, the nitty-gritties, items and toys that seemed so important mean nothing. I stand here with my bare feet in the sand and watch the sunrise (or sunset, whichever you prefer). I feel like I don't need all the chaos, hustle and stress.

I feel like I'm on vacation.

Monday, April 02, 2007

FreeMiCal 0.2 beta released

I just uploaded FreeMiCal 0.2.0.0 on SourceForge.net

It now supports both Outlook 2003 and 2007. Due to adjustments in the artwork, it has a smaller footprint. I also added a local copy of the Outlook Interop Services DLL, so FreeMiCal now starts faster than before.

FreeMiCal has a new banner theme. I think it looks different from (and nicer than) the uniform grey-blue that Windows XP provides.

Import into your favorite target calendar application might not be as fast as the export. This is due to integrity checks during the import.

Have fun exporting.

Wolf Rogner

Tuesday, March 27, 2007

Hands off Eclipse

We have this ongoing argument about there not being enough Java/Oracle programmers available. I took it as an occasion to brush up on my Java know-how.

A friend suggested Eclipse as an IDE. So I listened and tried it.

Let me put it this way: I have not used such a bad piece of software in a while.

OK, it's free of charge but that's about all that speaks for it.
  • It's slow. Dead slow. A simple "Hello world" took 5 minutes to set up, 2 minutes to debug and more than a minute to launch from the IDE
  • It's complicated: to set up a project you need to go from window to window to define projects, packages, classes, hierarchies and outlines
  • It's slow (did I mention that already?): code completion takes ages and even blocks text entry (System - hang - .out. - hang - println - hang ...)
  • It's confusing: try to debug, and nothing happens. You have to select which type of project it should be (can a Java class be debugged as C/C++?)
  • It's broken: I tried to update the IDE (I used 3.2). There were some updated modules. The update mechanism asked over and over which download center I wanted to use (the automatic selection ended in disaster). And after another 15 minutes of downloading, the IDE told me I did not have enough rights to install the update. Thanks for telling me so soon
  • It's cumbersome: creating an SWT app is not straightforward. I gave up on this one
  • It's slow (déjà vu): you can add plug-ins easily (if you have root privileges, that is). With each plug-in, the IDE becomes slower and less usable
Conclusion: If I have to write programs in Java, I use Netbeans (free of charge) or IntelliJ ($ 599,- incl. TeamCity).

Did I say, I love .NET?

I just read a book: Programming C#, 3rd Edition. What for? I ran into problems with FreeMiCal.

FreeMiCal exports calendar items nicely - if you run Outlook 2007, of course. Those with Outlook 2003 or earlier installed are out of luck.

You'd think it was as simple as adding a reference to the Office Primary Interop API. Think again. Microsoft has changed the way this service is offered.

Well, according to the book, using .NET is supposed to eliminate DLL hell. Only it doesn't. To be precise and fair: it eliminates DLL hell and substitutes a manifest and signature hell.

While you used to be able to trick Windows into using the right DLL via the search path, you cannot trick the CLR into using a different version of an assembly so easily.

I hate the thought of shipping two versions of FreeMiCal (one for Outlook 2003 and one for 2007) but if I don't find a simple solution soon, I will do exactly that.

Friday, March 23, 2007

Where is IT security heading?

I was asked to participate in a survey covering IT security. The survey was carried out by a university task group. The goal was to identify areas of security that companies were aware or unaware of.

I tried very hard to give this group meaningful data and information. However, I could not get past question number 13 of the questionnaire. The questions were irrelevant, ill-formulated, misleading and mostly of archaeological value.

Some examples?

x. How much has your company spent on IT security in the last year?
(without ever asking for the size, branch, turnover or revenue to relate this number to)

y. What IT risks are you aware of:
- Viruses
- Trojans
- Worms
- Adware
- Dialers
- Hardware errors
- User failure
- Theft

z. What IT risks do you prevent:
- (above list)

It did not start out that bad. The introduction claimed that, due to the increased penetration of IT into different businesses, an increase in IT risks was to be expected. The survey aimed to help identify areas of IT risk and security issues in these new fields of application.

Is this where university education is heading? Triviality!

But the issue in itself is interesting enough to dig into.

I offered the group some thoughts about IT security. First I tried to identify the areas of risk that require assessment:
  • Systemic risk
  • Implementation and operational risk
  • Environmental risk
  • Human factors
Let me get into this a little more.

Systemic risk

Admittedly, viruses, trojans and dialers are issues not to be ignored, but they are widely overemphasized.
Another, more severe, issue is rootkits. Lacking the medical and historical verbal associations of viruses and trojans, rootkits are perceived as something remote and arcane, and hardly a threat to oneself.

Rootkits may be introduced into a computer system using such seemingly harmless methods as playing a CD or watching a video. They lie dormant until activated by their creators. Using stealth technology they are invisible to ordinary administrative activities. They are universal in their functionality, offering a privileged environment inside the infected computer.

Virtualisation will add one layer of complexity and uncertainty to this scenario.

Extending the system boundaries, the next vulnerable technology is naming services. DNS is a highly fragile system, relying on approximately 13 root servers. Adding a root server to the DNS net, or taking one away, has an immediate impact on roughly one thirteenth (about 7.7%) of the root server load.

DNS offers some resilience through load balancing, database replication and local name caches; still, a targeted combination of denial-of-service and man-in-the-middle attacks could wreak havoc in our IP-based world. And this is just the top of the tree. At the bottom, vulnerabilities exist as well.

Another area pertaining to risk assessment is the routing of information flows. At the base, data is forwarded from sender to recipient using a combination of interacting protocols. From ARP, UDP and TCP to RIP, OSPF and BGP (to name but a few), data is packed and transferred, destinations are looked up, recipients verified and sequences honored. At a higher level, even more protocols come into play when analyzing mail traffic, viewed web pages and the exchange of authentication credentials. Attacking one of those protocols renders the whole network useless.

And there are several attack points both known and widely unknown out in the wild.

Finally (and this is not really an exhaustive enumeration), identification and identity management will become increasingly important in the future. As systems become externalized and users access these services in ever more mobile and volatile ways, identity, authentication, authorization and non-repudiation will move into the focus of future security assessments.

Another area of concern is

Implementation and operational issues

A quick query on CERT or SecurityFocus reveals that most current issues deal with programming problems. Buffer overflows, unrecognized error conditions and weak, template-based programming are the root source of these issues. If time to market is what drives engineering efforts, no wonder we have such poor and unstable software.

Next in this chain is the implementation of systems (combined hardware and software) by following standard how-tos or simply clicking through a few defaults. For a system to function in a variety of different setups, it has to run with low security settings. While this allows a quick start, it opens a vast array of vulnerabilities in a running environment.

Add data and function exploiting techniques (like SQL injection), improper access to underlying data (reading native file systems) and identity spoofing, to name but a few: the number of imaginable attack vectors is uncountable.
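
A minimal sketch of the SQL injection case, using Python's built-in sqlite3 module (table and data made up for illustration):

    import sqlite3

    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE users (name TEXT, is_admin INTEGER)")
    conn.execute("INSERT INTO users VALUES ('wolf', 0)")

    user_input = "' OR '1'='1"

    # Vulnerable: attacker-controlled input is pasted into the statement,
    # so the WHERE clause matches every row.
    query = "SELECT * FROM users WHERE name = '%s'" % user_input
    print(conn.execute(query).fetchall())    # returns all rows

    # Safe: a parameterized query treats the input as a value, not as SQL.
    print(conn.execute("SELECT * FROM users WHERE name = ?",
                       (user_input,)).fetchall())    # returns no rows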

A highly rated issue is data backup. Millions are spent on backups; hardly anything is invested in (bare metal) recovery. Setting up a working backup strategy takes an initial effort (and is thus a one-time cost).

Testing the quality and feasibility of data recovery takes permanent effort (and is therefore a lasting and substantial cost factor). And while a backup covers only a selected number of threat scenarios, testing recovery has to deal with all possible cases, from full recoveries down to single-file recovery. A change in an underlying system component may render the whole process useless.

Dealing with large data quantities and centralized storage (using NAS or SAN) increases the sensitivity of the subject.

Availability of computing resources is another issue. While virtualization allows computing hardware to be used efficiently, resilience to hardware and software failure becomes a critical factor, as the hosting components are single points of failure that can affect quality of service and system integrity alike.


Environmental risk

Threat scenarios are becoming broader, ranging from fire and water to temperature. While the first two were always on the agenda of even the smallest operation, temperature is becoming an issue in the workplace environment. As CPUs and graphics cards produce more and more heat and computer systems become ubiquitous, heat emission will keep increasing.

I cannot go into every aspect of environmental risk here. Suffice it to say that the company's line of business, the typical usage of its IT systems, even legal issues and changing regulations all influence environmental risk.

An arbitrary example might be a sales representative of a company doing business in Asia. Due to restrictive information policies in some Asian countries (e.g. China, Burma), content stored on a computer hard disk might be seen as an offense by the local authorities (carry a PDF covering the October revolution into China for a little adrenaline peak). To this day, possession of computers in Burma is strictly prohibited, with offenders facing the death penalty.

Another widely ignored factor is the dependency on one single monopolist. Currently, one major vendor dictates the formats for data storage and communication, even software update cycles. This monopolist leverages each of its systems to gain more control and to suffocate technological innovation that is incompatible with (and therefore unwelcome to) its overall strategy.

The human factor

What can I say. Undereducated, ill-trained, so-called computer experts; time-to-market driven decisions about system releases; cost-based human resource policies. All of these are security issues.

Hiring a novice programmer might reduce labour cost in the programming department. It will surely increase labour cost at the help desk and in the call center. Does it make systems more reliable, more secure? No.

Does it make the CFO happy? Yes.

He can always point to the low wages.

Outsourcing? Let's ship our development to India. Let's move our help desk to India. Let's move the accounting to India. Labour cost for these activities drops (for how long, may I ask?). But the understanding of risk, the correct assessment of security issues, still lies with the company.

What's more, increased communication and repair efforts will drive cost up, effectively shifting expenses from production to communication and repair.

Conclusion

These are the security issues I really see and anticipate for the future. In a few years there will still be some viruses, some adware and some hardware failures. But they will hardly be covered by the press.

We will see more incidents hitting one or another of the areas I described above. And the impact will hit more than a single computer or company. It will hit communities, regions and industry segments.

Sunday, March 11, 2007

FreeMiCal on SourceForge

I registered a project with SourceForge: FreeMiCal.

The purpose of FreeMiCal is to allow users of Microsoft Outlook (currently only Outlook 2007 is supported) to export all calendar items in one swoop to iCal-formatted files.

Outlook does not support bulk export of calendar items in anything but its own .pst format or comma-separated text files. And if an item has extended comments in its body section, import into any application (including Outlook) will fail because the export violates the import format.
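
The body text is the tricky part: the iCalendar format (RFC 2445) requires special characters in a DESCRIPTION to be escaped. A minimal sketch of the escaping, in Python for brevity (FreeMiCal itself is a .NET tool):

    # Raw newlines, commas and semicolons in the body break importers;
    # RFC 2445 wants them escaped as \n \, \; inside a TEXT value.
    def escape_ical(text):
        return (text.replace("\\", "\\\\")
                    .replace("\n", "\\n")
                    .replace(",", "\\,")
                    .replace(";", "\\;"))

    body = "Budget meeting\nBring the Q1 figures; coffee, please."
    print("DESCRIPTION:" + escape_ical(body))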

Please check it out, test it and report any problems. This will help me make the tool better and eventually error-free.

Wednesday, March 07, 2007

Customer Service

I read this excellent article on customer service. While I try to enforce good customer support at my company, I never had my quality standards laid out in such an easy-to-follow list of steps.

Thanks to Joel Spolsky, here it is. Read, enjoy and act on it.

Sunday, March 04, 2007

The future of video

Currently, we are migrating our workstations from Windows to Linux. While the basic stuff works perfectly fine, we run into trouble when dealing with music and video.

Especially video.

Our video collection does not work under Linux.

While playing videos under Windows is not an easy feat, playing them under Linux provides insight into a lot of things. Unfortunately, videos are not among them.

Let's summarize:
Videos are stored in container files. To play them, codecs and decoders are required. Some formats, like MPEG2 and MPEG4, are straightforward; others, like DivX, XviD and H.264, come embedded in AVI files.

Under Windows, you install codecs and the installer hooks them into the search path of every installed media player. OK, some honor these paths, some don't. Sometimes helper applications like Explorer get confused and crash. But lo and behold, it works pretty well.

Under Linux, you need multimedia libraries, which in turn have their own plug-ins. If you have ever tried to make ends meet with gstreamer, you know what I am talking about.

My suggestion

Here is a proposal for how video streams could be encoded to eliminate the codec problem and thus make video handling user-friendly.

All video is encoded using a video and an audio encoder. Mixtures are possible and exist.

A new file format could accommodate two parts: one, the decoder codec, and two, the film itself. The codec would be extracted by a generic decoding engine and used to decrypt the film stream. The player would be a generic codec interpreter that allows platform-independent decoder plug-ins to be hooked in.

None of the technologies mentioned above is new. Platform-independent plug-ins are a reality in Mozilla-driven XUL tools. Piggyback codecs can be attached to the video stream. Interpretation engines exist for both Python and JavaScript.
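
A toy sketch of the idea in Python: a ZIP container that carries its own decoder next to the encoded stream (all names made up; a real design would have to sandbox the decoder):

    import zipfile

    def pack(container_path, decoder_source, stream_bytes):
        z = zipfile.ZipFile(container_path, "w")
        z.writestr("decoder.py", decoder_source)   # the codec travels with the film
        z.writestr("stream.bin", stream_bytes)
        z.close()

    def play(container_path):
        z = zipfile.ZipFile(container_path)
        scope = {}
        exec(z.read("decoder.py"), scope)          # load the bundled decoder
        return scope["decode"](z.read("stream.bin"))

    # The "codec" here just reverses the bytes, standing in for a real one.
    pack("film.pack", "def decode(data):\n    return data[::-1]\n",
         b"semarf eivom")
    print(play("film.pack"))                       # b'movie frames'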

This mechanism would even allow for commercial content, as the codecs may well be connected to payment systems.

I haven't given this much deeper thought yet. Maybe my tweaking with Linux will lead me to deeper insights and eventually make me mad enough to write a prototype.

Tuesday, January 16, 2007

Python demystified?

Four years ago I was asked to troubleshoot a project under a tight schedule and budget. The team had committed itself to finishing an ASP application within 2 years. When I was called in, there were only 4 months left to the deadline.

I agreed to manage the team under the condition that 2 out of 3 parts be substituted by standard off-the-shelf components (who needs to develop their own database or browser?) and that efforts be concentrated on the core functionality.

My proposal was rejected. The project team was confident that using the Python programming language would give them the advantage needed to finish on time and budget.

Later I learned that the project failed and the company went out of business.

My argument then was that Python could not compare to C# or Java in efficiency. Moreover, no real libraries existed.

I was right about the potential of Python as a tool to finish in time.

However, I was wrong about the true reason why Python has not gained momentum, and probably never will, the way Java and C# have.

Recent occupation with the subject has led me to a different opinion.

1. C# is easy to use and convenient for writing software. However, C#'s strong typing requires programmers to define in advance what they want to do. Even with refactoring tools and support for generics, changes in structure and data definitions can cause headaches.

Python (like JavaScript) offers dynamic typing. Objects can be of any type; programmers don't have to worry in advance about what they will have to handle. Python, as opposed to JavaScript, also has a tighter syntax for declarations (var i: I know it's a var, so why do I have to state it?).
There are many more advantages to the Python language. Some of them are:
  • everything is an object,
  • strong string operations,
  • plethora of external modules,
  • integration into host operating systems,
  • integration with other programming languages,
  • etc. ...
These make Python a powerful programming tool.
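
A minimal sketch of some of these points (everything is an object, strong string operations, no advance type declarations):

    def describe(thing):
        # No type declarations: whatever is passed just has to support len().
        return "%s of length %d" % (type(thing).__name__, len(thing))

    print(describe("hello"))        # str of length 5
    print(describe([1, 2, 3]))      # list of length 3
    print(describe({"a": 1}))       # dict of length 1

    # Functions are objects too, and strings come with batteries included.
    print(describe.__name__)                           # describe
    print("-".join("bare feet in the sand".split()))   # bare-feet-in-the-sand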

Going from here, I would prefer Python to any language any day.

2. In the early days of Java, everyone wrote their own IDE. This was possible because Java offered AWT, a platform-independent graphics subsystem. That allowed many programmers to adopt Java. C# is hosted in the Visual Studio IDE (with a free Express edition available). Even Mono has two prominent IDEs: MonoDevelop and SharpDevelop.

Python offers an outdated IDLE (basically a specialised shell with no charm), ERIC (a bloated Qt-based environment bundling an older version of the Python interpreter), PyDev (a sluggish Eclipse plugin) and many more alpha and pre-alpha editors.

Without any decent IDE (like Netbeans for Java) that allows for graphical programming and UI-design, I strongly doubt that Python will ever gain momentum.

3. There are plenty of introductory books about Java, C# or VB. There are some books around introducing Python, but none of the books I reviewed showed how to set up a working Python development environment. Maybe it's my selection, maybe I am used to skipping the getting-started chapters in other languages. With Python, I wasted hours installing, testing and uninstalling IDEs and development tools.

Without a cross-platform native or Python-based IDE, I am pretty sure that Python will end up something like Modula-2 in the 70s, Ada in the 80s and LiveScript in the 90s.

Looking back, finishing the project on time was absolutely impossible. Not because the language lacked power, but because it lacked a powerful IDE.