Ruminations on the Digital Realm


First impressions: Sabayon Linux Four Oh!

January 4th, 2009 by janstedehouder | 7 Comments »

The Continued Relevance of Sabayon Linux in the Tech Community

Two years ago, I had my first encounter with Sabayon Linux when version 3.2 was on the verge of release. After testing it on my laptop, I wrote about my experience on my Dutch website. Surprisingly, that article continues to attract a significant number of readers, indicating a sustained and growing interest in this particular Linux distribution.


Read more…


The Cost of Free

December 7th, 2008 by janstedehouder | 6 Comments »

Furious! This was the reaction of a portion of the free and open source aficionados upon hearing the idea that OpenOffice.org might get advertisements as part of the binary package. Jonathan Schwartz (Sun Microsystems), who launched the idea on his weblog (and has already retracted it), must have known the idea could cause a furor. Strangely enough, the anger seemed limited to the Netherlands.

The newsfeed archives and the ironclad memory of internet search engines reveal that the most vociferous opposition against the idea was heard in the Netherlands. The rest of the digital realm hardly paid any attention to it. Why not? Well, perhaps the rest of the world has a better understanding that free and open can and should not be confused with ‘gratis’ (i.e. free as in ‘free beer’). In the Netherlands, a country where being cheap is considered a thing of pride, ‘free beer’ instead of ‘free speech’ seems to be more important. Of course, there is nothing wrong with this, but please refrain from making ludicrous statements like: “Sun (and others) don’t understand the GPL license (the LGPL actually) and advertisements and commercialization are not allowed under the license”. Funny, because if this is true, the GNU.org organization doesn’t understand it either, considering the article.

It’s the attitude behind ‘hmmm nice, free beer’ that is flawed. Users of free and open source software, both corporate and private, need to consider the cost of free. Yes, developing, supporting and promoting the software is done by scores of volunteers. But developing the Linux kernel and bringing solid and reliable Linux distributions to the marketplace also involves major corporations with commercial interests and needs. Development of the web browser Firefox floats on the millions made by an agreement between the Mozilla Foundation and Google.

Using free and open source software doesn’t come with freedom alone, but also with responsibilities, including the responsibility to contribute financially to its development. If you don’t want that and simply voice your ‘right’ to make ‘gratis’ use of the software, well, you’d better stick to your illegally downloaded proprietary software.


Open letter: independent conformance testing needed for ODF and OOXML implementations

November 17th, 2008 by janstedehouder | No Comments »

Tineke Egyedi, senior researcher of standardization at Delft University of Technology, The Netherlands, sent an open letter (PDF) to software vendors with the title Who pays for interoperability in public IT procurement. In her letter she calls upon vendors to submit their implementations of the OpenDocument standard and the Office Open XML standard in software products for independent conformance testing, and to verify their interoperability. She feels this is needed to make sure that governments and their citizens do not head into a new vendor lock-in, and to ensure vendors do not alter the open standards along the way.

The letter is as follows:

Who pays for interoperability in public IT procurement?
A public letter to the IT industry about document format standards

Delft, 16 November 2008

L.S.,
It is not uncommon for governments to voluntarily head for vendor lock-in. As a citizen, however, I have a direct stake in my government basing its public procurement of IT on open standards. This stake may be most evident for ‘civil ICT standards’ (Andy Updegrove), i.e., for standards that support access to government information and exchanges with government such as document formats (e.g., sustainable digital data). However, I also have a standards-related stake in IT procured for government-internal processes because, first, in practice government-internal and -external IT processes cannot be separated. Second, because of the increasing costs that accompany vendor lock-in. Third, because government procurement is good for 16% of the European IT market and is therefore a means towards a more competitive and sustainable IT market.
A main reason for voluntary vendor lock-in is the fear of lack of interoperability of IT products in a multi-vendor environment. Experience shows that standard-compliant products from different vendors need not necessarily interoperate. As is known, a dominant vendor may design in incompatibility to break the integrity of a standard (e.g. Java platform). But usually incompatible standard implementations are the unhappy outcome of good intentions.

Problem of document format standards
In the field of document formats there is an additional complexity. For the external reader: ISO has ratified two competing XML-oriented standards for document formats. The first one, the Open Document Format (ODF, ISO/IEC 26300), was ratified in 2006 and stems from OASIS, a standards consortium. The second one, Office Open XML (OOXML, ISO/IEC 29500), originally stems from Ecma International, another standards consortium. Although ISO’s OOXML process has been widely contested, which caused a delay in its final approval, according to the ISO website the standard is to be published shortly.
ISO’s approval of a second, overlapping standard will not have lessened government fears about interoperability in a multi-vendor environment. The market has become less rather than more transparent by means of this standards effort. To re-create some transparency about the interoperability of applications and reduce the fear of post hoc expenses in public procurement, conformance and interoperability testing is needed. Plug-test events are needed to test the factual interoperability of standards-based products from different vendors. To be credible to all concerned, a neutral, independent testing centre such as ETSI may need to be involved to e.g. develop test-suites and coordinate plug test events.

Interoperability between multi-vendor OOXML applications
Current discussions on open standards highlight that multiple implementations are an important sign that standards are really open (see presentations by Rishab Ghosh and by Thiru Balasubramaniam, The Power of Procurement). Regarding ISO’s OOXML, the contention is that no company has yet implemented the full standard, not even its primary sponsor Microsoft; and that the six thousand page specification is too complex and too inconsistent to implement. Are these contentions true? If not, governments will want more than verbal claims to the contrary. Moreover, they can easily be countered with third party conformance and interoperability tests, including a plug-test event with multiple OOXML-compliant IT vendors.

Interoperability between ODF applications
All major vendors, Microsoft included, have agreed to support ODF ISO/IEC 26300, or are already doing so. That is, the availability of multiple implementations is not a problem here. Moreover, interestingly, two weeks ago OASIS initiated a technical committee to organize conformance and interoperability tests. Given its scope, this committee will provide transparency to governments about the degree of conformance of applications to ODF and the interoperability of ODF-documents. Less clear is whether the committee also intends to address interoperability between standards versions, or more general: what policy it has on standards change. To my knowledge, such policies have not yet been defined by any standards consortium or standards body. They would befit the area of civil ICT standards.
The OASIS committee explicitly does not address “identifying or commenting on particular implementations” or any certification activities. Government procurement officers will ultimately need testing at this level and want to involve an independent third party testing centre for this purpose. Moreover, OASIS, too, might at a later stage want to involve an independent third party in order to avoid credibility problems.

Having two overlapping standards brings about its own problems, as testifies a review of current ad hoc solutions - converters, translators, plug-ins - to re-create compatibility between ODF-products and Microsoft’s partial implementation of the OOXML standard. Those who develop a low quality and overlapping standard, qualifications which also OOXML supporters use, are not the ones who pay for the consequences. Regrettably, citizens will be paying the price for lack of interoperability.
Although there is no formal accountability to fall back upon in standardization, those who initiated the duplicating effort may feel a - corporate social - responsibility for what happened. Their help is needed to shift interoperability costs from governments and citizens (post hoc) back to IT vendors (ex ante), the source of the interoperability problem. As a start, will they fully cooperate and support OASIS’ initiative of conformance and interoperability testing? Are they prepared to shoulder the costs of independent, third party conformance and interoperability tests, tests that are needed to assure governments that no unexpected problems will arise ex post?

Kind regards,

Tineke Egyedi
Delft University of Technology
(T.M.Egyedi at TUDelft.nl)

The letter was sent to HP, Microsoft, Sun Microsystems, IBM, ECMA and OASIS.

I completely agree with Tineke on this. I do believe it is time we see the end of vendor-sponsored ICT research on various issues. One can hardly expect independent verification of performance claims or -in this case- conformance claims by sponsored researchers. Tineke is correct to point out that the problem isn’t simply open source or open standards, but also the implementation of open standards in applications. Recent research already showed that issues that open source developers have with implementing open standards don’t necessarily reach the proper agencies that could remedy them.
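The conformance testing the letter calls for goes much deeper than this, of course, but even the outermost layer of an ODF package can be checked mechanically: an ODF document (ISO/IEC 26300) is a ZIP archive whose first entry must be an uncompressed file named mimetype holding the document’s MIME type. Here is a minimal sketch of such a container check in Python; the helper names are my own, and a real test suite would go on to validate the XML streams against the schema.

```python
# Minimal sketch of an automated ODF *container* conformance check.
# An ODF text package is a ZIP archive; its first entry must be a file
# named "mimetype", stored uncompressed, containing the MIME type.
import io
import zipfile

ODT_MIME = b"application/vnd.oasis.opendocument.text"

def build_minimal_odt() -> bytes:
    """Create a tiny, structurally valid ODF text package in memory."""
    buf = io.BytesIO()
    with zipfile.ZipFile(buf, "w") as zf:
        # The mimetype entry must come first and must not be compressed.
        zf.writestr(zipfile.ZipInfo("mimetype"), ODT_MIME,
                    compress_type=zipfile.ZIP_STORED)
        zf.writestr("content.xml",
                    "<?xml version='1.0' encoding='UTF-8'?>"
                    "<office:document-content "
                    "xmlns:office='urn:oasis:names:tc:opendocument"
                    ":xmlns:office:1.0' office:version='1.2'/>")
    return buf.getvalue()

def check_odf_container(data: bytes) -> list[str]:
    """Return a list of container-level conformance problems found."""
    problems = []
    with zipfile.ZipFile(io.BytesIO(data)) as zf:
        names = zf.namelist()
        if not names or names[0] != "mimetype":
            problems.append("first ZIP entry is not 'mimetype'")
        elif zf.getinfo("mimetype").compress_type != zipfile.ZIP_STORED:
            problems.append("'mimetype' entry is compressed")
        elif zf.read("mimetype") != ODT_MIME:
            problems.append("unexpected MIME type")
        if "content.xml" not in names:
            problems.append("missing content.xml")
    return problems

print(check_odf_container(build_minimal_odt()))  # → []
```

Checks like this are cheap to automate, which is precisely why an independent third party could run them at scale across vendors’ products; the hard part, as the letter notes, is interoperability between implementations, which requires plug-test events rather than static validation.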

Thus, feel free to spread the news about this open letter and forward it to whomever you think needs to hear it.


Planete Beranger observes Linux distro hating week

October 6th, 2008 by janstedehouder | No Comments »
Even for Planete Beranger, which never shies away from a strong opinion on the weaknesses of Linux distributions, a whole week of distro hating is quite a lot. However, the list of items that need fixing does give a lot to think about:

large enough repositories to satisfy both desktop and server users;

both GNOME and KDE3 should be offered as main options, alongside with whatever else is the main focus of the distro (smaller DEs/WMs or KDE4);

security updates and major bug updates that don’t break the system, that are provided in a timely manner and in the proper place (e.g. not in Debian’s “volatile” for tzdata; not in “testing” for VL; not ignoring FF 3.0.3 even by RHEL);

not to include functional regressions from a release to the next one (in the kernel or in the major applications);

not to force the users to upgrade because a release is supported for too short;

not to lack major applications in such a manner that the user should either build from source, or get them from several third-party repositories, thus compromising the intended advantages of using a certain distro with a certain quality and consistency of the repos and of the updates;

not to freeze each and every application to a fixed version for the whole supported lifetime of a release, ignoring the fact that building newer versions of the applications is possible when they don’t require newer versions of the system libraries;

not to ignore bug reports for years, especially when the fix would be easy, or especially when it’s about an enterprise distro, whose modest number of packages is small precisely because it’s supposed to be much better maintained than a community-maintained distro;

not to break the package manager every now and then, and not to change the default package manager from a release to the next one;

to provide the full sources in free download, not partial sources, nor just build scripts that would attempt to download the sources from upstream;

to be usable in X with only 256 MB of RAM — failing to do so is a clear sign of bloatedness, regardless of the fact that very few users have such low-end systems;

to have a GUI version of the package manager, and this version to be usable in terms of speed on low-end systems too;

and of course: not to require hours of post-install configuration and customization by the end-user!

clipped from Planete Beranger:

Linux Distro Hating Week, Oct. 6-12

Observed by Planete Beranger, as a protest against the low quality of the distributions built around the Linux kernel

Linux is under shameful circumstances. Notwithstanding the decrease in quality suffered by the Linux kernel since the beginning of the 2.6 series, good quality distributions can still be built around it. Unfortunately, all of the mainstream GNU/Linux distributions fail to provide acceptable quality, usability, trustworthiness and proper support, be it paid or not. Specifically, there is no GNU/Linux distribution in the known universe that does not fail at least one of the requirements listed above.
For the current week, Planete Beranger will ignore whatever is related to any known Linux distro.

The three worst Linux distributions?

October 1st, 2008 by janstedehouder | No Comments »

Well, it seems there are some more writers out there that (a) love Linux and (b) debunk distributions that fall way short of their own goals. On the Internetling blog, Gregor lists the three worst distributions he knows.

gOS is at number three:

I checked out the last version of gOS, and again it’s a meaningless pile of installed packages already available for every other major Linux distro out there.

Zebuntu/ZevenOS is at number two:

I reviewed this distro a while ago and I thought it was cool that someone is aiming to create a distro in the spirit of BeOS. Looks like the developers didn’t hear the last part. It said ‘philosophy’, not ‘theme’.

And Linux XP is topping the list:

I’m still wondering whether this distro is violating the GPL. For Pete’s sake, they have a 30-day TRIAL. Linux XP is a Fedora re-spin with a Vista skin, Wine and some other front-ends. It is being sold, and you also need to obtain a serial number.

Well, the fact is that the GPL allows commercial reselling, and there are commercial distributions that won’t allow updates if you don’t pay. Anyway, I like the list and I like what Gregor is doing.