Studies/Papers


Watts Water Technologies needed to replace 1,000 old shop-floor terminals with more flexible desktops. They ended up choosing SUSE Linux Enterprise Desktop on Neoware thin-client hardware, along with ZENworks to help manage the environment. You can also check out the Open PR blog entry for more info.  From the customer success story…

After evaluating several desktop and thin-client solutions, Watts Water Technologies selected SUSE Linux Enterprise Desktop for use in a thin-client deployment, as well as Novell ZENworks to manage more than 1,000 desktops.

“Linux really shines and Novell has a great Linux strategy,” said Ty Muscat, Data Center Manager for Watts Water Technologies. “We have almost every platform imaginable and are moving more and more to SUSE Linux Enterprise desktops and servers. We like having an open platform with a lot of flexibility.”

The results:

“Without Novell, we would have had to invest far more to get anything similar to what we have with SUSE Linux Enterprise Desktop,” said Muscat. “The ongoing management and maintenance costs of other options would have been overwhelming for us.”

MacGyver knew his stuff when it came to building a flame thrower out of popsicle sticks, chewing gum, dental floss and a styrofoam cup — plus he always had that cool Swiss Army knife. But I bet even he wouldn’t have been able to use eight PlayStation 3s, Linux and some technical hacker know-how to do some scientific supercomputing. But someone’s done it!

This interesting blog article from ZDNet describes how a researcher from the University of Massachusetts built a very low-cost “supercomputer” capable of about 200 GFLOPS, all running on PS3s. The Linux distro used wasn’t SUSE Linux Enterprise (it was Yellow Dog Linux), and there are several other considerations that keep the PS3 from being the scientific computing platform of choice, but it’s definitely another fine example of how flexible Linux can be compared to other OSes.

So, if you’re looking for an excuse to get approval for a purchase order of equipment for your gaming– er, “supercomputing lab”… look no further.

From the article:

The emergence of global standards for measuring the energy efficiency of datacentres moved a step closer yesterday with the launch of a raft of new research papers from green IT industry consortium The Green Grid.

The consortium has released an updated version of its Datacentre Energy Efficiency Metrics whitepaper that incorporates infrastructure efficiency into the original metrics.

It also said that it expects its Power Usage Effectiveness (PUE) and Datacentre efficiency metric for assessing the proportion of power going into a datacentre that is used to power the IT kit to be adopted by the industry and used by all datacentres to report their efficiency.

More here.
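For the arithmetically inclined, the PUE and DCiE metrics described in the quoted piece boil down to simple ratios: total facility power divided by the power that actually reaches the IT equipment, and its reciprocal. Here’s a quick sketch with made-up numbers (the function names and the sample wattages are just for illustration):

```python
def pue(total_facility_kw, it_equipment_kw):
    """Power Usage Effectiveness: total facility power / IT equipment power.
    1.0 would mean every watt entering the building reaches the IT kit."""
    return total_facility_kw / it_equipment_kw

def dcie(total_facility_kw, it_equipment_kw):
    """Datacentre infrastructure Efficiency: the reciprocal of PUE,
    expressed as a percentage of facility power doing useful IT work."""
    return 100.0 * it_equipment_kw / total_facility_kw

# Hypothetical facility: 1,800 kW at the meter, 1,000 kW reaching IT kit
print(pue(1800, 1000))   # 1.8
print(dcie(1800, 1000))  # roughly 55.6
```

The closer PUE gets to 1.0, the less power is lost to cooling, power distribution and other overhead before it reaches the servers.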

Sick of hearing about “Green” yet? Better learn to deal with it: the “Green” drumbeat is really just beginning, and it’s not just a fad. It fits a real condition we have in IT, and it’s a way for managers to get more money and headcount, so listen up.

What is “Green” computing? Here’s as good a definition as I could find; click through for more from TechTarget.

Green computing is the environmentally responsible use of computers and related resources. Such practices include the implementation of energy-efficient central processing units (CPUs), servers and peripherals as well as reduced resource consumption and proper disposal of electronic waste (e-waste).

One of the earliest initiatives toward green computing in the United States was the voluntary labeling program known as Energy Star. It was conceived by the Environmental Protection Agency (EPA) in 1992 to promote energy efficiency in hardware of all kinds. The Energy Star label became a common sight, especially in notebook computers and displays. Similar programs have been adopted in Europe and Asia.

How “Green” is your office environment? Take the Greening the Cube Farm quiz and see!

Last but not least, is buying “Green” storage for business continuity, disaster recovery and archival enough? Not nearly enough, according to the marketing director of Overland Storage.

RossB

Will Virtualization Doom Server Sales?

From the article:

The promise behind virtualization has long been that one well-equipped server could do the work of several. So what happens once customers begin following that idea — and buying fewer servers?

That scenario is cause for concern, according to industry analyst Infiniti Research. This week, the firm published a study indicating that server sales will trail off in coming years, and even decline, as virtualization reduces the need for physical hardware.

The company’s TechNavio online research unit released the findings to coincide with the upcoming Storage Expo conference in London next week.

The study suggests that sales will slow to two percent in 2008 — representing a marked decline from the 5.9 percent annual growth rates that fellow market researcher IDC saw in 2006, and the 8.9 percent from a recent Gartner study.

Read the rest of the article.

A new Aberdeen Group study reports that as virtualization keeps expanding, both in its role in the datacenter and as a tool for consolidation of services/storage and cost savings, it’s becoming even more vital as a way to provide Business Continuity, High Availability and Disaster Recovery.

For us, virtualization is a given. Our system utilization was low and if there was a peak, it only happened for an hour.

The rest of the time our systems are idle. Our application servers are just not using enough of the physical resources.

— Manager of Portal Operations for a Consumer and Applications Portal Company

The report includes a number of case studies and significant findings, such as:

  • 54% of firms use virtualization to support DR plans
  • 48% use virtualization to support HA strategies
  • 50% use virtualization to support BC implementation

For the typical organization that suffers from excess capacity and the associated costs, virtualization is a must. Along with that move to enterprise-level virtualization comes the need for enterprise-level business continuity planning.
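The excess-capacity arithmetic is easy to sketch. Assuming, hypothetically, a rack of application servers idling the way the portal manager quoted above describes, a back-of-the-envelope consolidation estimate might look like this (the function name and the 65% target utilization are illustrative, not a sizing methodology):

```python
import math

def hosts_needed(avg_utilizations, target_utilization=0.65):
    """Estimate how many virtualization hosts could absorb a set of
    lightly loaded physical servers: sum their average CPU utilizations
    and divide by a target utilization per host. Real capacity planning
    must also account for peaks, memory and I/O, not just average CPU."""
    total_load = sum(avg_utilizations)
    return math.ceil(total_load / target_utilization)

# Twelve physical servers, each idling at ~8% average CPU
servers = [0.08] * 12
print(hosts_needed(servers))  # 2
```

Two well-loaded hosts in place of twelve idle boxes is exactly the kind of ratio that makes the BC/HA/DR planning around those hosts so much more important: each one now carries far more of the business.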

Since the use of virtualization for BC, HA and DR purposes is still emerging, it is imperative that companies implement it with careful planning and testing of systems. This also helps ensure there are no unnecessary redundancies and that data recovery management processes are efficient. This latter issue, which is just starting to take hold in the physical world, is certainly going to be the next big issue as more companies use virtualization to support BC, HA and DR processes.

Recovering data generated from virtualized systems will become a crucial discussion in the coming months.

Register for a free copy of the report here.

Enjoy,

RossB

AMD’s Virtual Experience is a pretty cool marketing/virtual tradeshow where you can view videos and short presentations on a variety of technologies related to AMD — such as SUSE Linux Enterprise in the Novell booth.

Of course, you could visit the other vendors at the virtual trade show, but why not start by checking out the Novell booth and learn how SLE takes advantage of the technologies in the latest generation of AMD Quad-core Opteron processors…

WALTHAM, Mass.— 02 Oct 2007— The Instituto del Fondo Nacional de la Vivienda (INFONAVIT), Mexico’s largest mortgage lender, has tapped SUSE® Linux Enterprise from Novell® as its platform for customer transactions. Providing services to more than 12 million people and 830,000 employers, INFONAVIT has deployed SUSE Linux Enterprise as the basis for its new payment collection system.

and furthermore,

INFONAVIT estimates it has saved approximately $3 million in hardware costs and $1.5 million in software licensing, and the stability and reliability of SUSE Linux Enterprise Server will also help keep the long-term cost of ownership low.

The full press release is here.

Let us know what you think about YaST!  Whether you’re using YaST in openSUSE, SUSE Linux Enterprise Server or Desktop, take this short survey and tell us how you’re using YaST today and how we can improve it.

The survey will be available until mid-November, and the results will be published on opensuse.org.

According to an SGI press release and article on CNN.com, they’ve had several recent customer wins. Here’s a list of the organizations they specifically mentioned who are using SUSE Linux Enterprise Server 10 on their recently purchased SGI servers:

  • IFREMER, the French Research Institute for Exploitation of the Sea
    • SGI Altix ICE 8200 blade cluster powered by 256 Intel Xeon cores and a 16TB SGI InfiniteStorage 4500 system
  • The Center for Parallel Computing (NACAD) at the Federal University of Rio de Janeiro (Universidade Federal do Rio de Janeiro: UFRJ)
    • 152 Intel Xeon cores and 304GB of memory. UFRJ also upgraded its SGI Altix 450 system to a configuration with 32 Intel Itanium 2 processor cores and 64GB of memory. The SGI systems, which run SUSE Linux Enterprise Server 10 from Novell, are backed by an expanded 32TB SGI InfiniteStorage 4050 network attached storage solution.
  • University of Minnesota Supercomputing Institute for Digital Simulation and Advanced Computation
    • 2,048-core SGI Altix XE 1300 cluster outfitted with more than 4TB of memory across 256 compute nodes. Minnesota’s system is also linked via a high-bandwidth InfiniBand connection and runs SUSE Linux Enterprise Server 10 from Novell
  • Dedic, the contact center company from Portugal Telecom Group
    • Powered by a total of 56 Intel Xeon cores and 72GB of memory
  • Universidade Estadual de Campinas (UNICAMP), School of Chemical Engineering
    • 136-core, 272GB SGI Altix XE1300 cluster supported by an 8TB network-attached SGI InfiniteStorage 350 solution
  • Universidade Estadual de Campinas (UNICAMP), CENAPAD, a National Center for High-Performance Computing
    • 176-core SGI Altix 450 compute solution and a combination of new and upgraded SGI InfiniteStorage systems that added more than 56TB of capacity
  • The Translational Genomics Research Institute (TGen) in Phoenix AZ
    • SGI Altix 4700 system with 576GB memory and 48 Intel Itanium 2 cores running Novell SUSE Linux Enterprise Server 10
  • University of West Florida’s School of Science and Engineering
    • SGI Altix 450 system with 32 dual-core Intel Itanium 2 processors (64 cores total) and 248GB memory, with SUSE Linux Enterprise Server 10
  • European Molecular Biology Laboratory (EMBL)
    • 16-core SGI Altix 450 server with 256GB of memory, all of which can be made available to a single data mining problem

I don’t know about you, but I’m always impressed by the compute power that organizations are implementing these days… especially when talking about SGI.

Looking for more examples of who’s using SUSE Linux Enterprise? Stop by Novell’s customer success home page or the list of Novell press releases and see who else has a documented success story for SUSE Linux Enterprise.

Are you a Novell NetWare pro but new to Linux?  There are several resources (look here, here and here for starters) available to bring you up to speed.  How many of them have you taken the time to check out?  Free training and resources are GOOD!!  These are good quality resources, BTW — not shabby training.

Here’s yet another resource for you to consider… a Cool Solutions article which gives you the “Reader’s Digest” version of the key points.  Will you let this one pass you by too?

Overview

Thanks to Forrester Research’s recent surveys, we know that over 50% of IT organizations currently use virtualization or are running pilot programs with it. What we should know more about is both the security benefits of virtualization and the best practices for securing those virtual servers.

Note: In this article, a Virtualization Server (VS) is the machine that Virtual Machines (VMs) are virtualized on. A VM can be anything that runs in a virtual container: a desktop, server, appliance, etc.

Security Benefits of Virtualization

The security benefits of running VSes are many, including:

  • Isolation – Running an OS in a VM helps secure it from other apps. You can give each application its own OS container, which keeps bad things that happen to an individual VM from spreading to the others
  • Rollback – Experienced sysadmins know how important it is to be able to roll back changes that don’t work; getting the system back to a previous stable state is paramount for production machines, and VMs, being software only, are much easier to roll back
  • Abstraction – VMs have limited access to the physical hardware, so drivers are easier to manage and there is less chance of physical issues with a VM than with an OS that runs directly on the hardware
  • Portability – The ease with which you can take a running VM and either migrate it to a new VS or bring it up on another server can make the difference for disaster recovery. With the OS and data virtualized, it’s much easier to swap out to replacement machines, which makes patch testing and upgrading much easier too
  • Deployment – Deploying instances of individual servers is far easier with VM technologies; physical machine deployments are much more dependent on the physical hardware. Individual machine and OS security settings on the VS matter, and the ability to surround the VMs with appropriate security from the VS (such as using AppArmor to wrap a VM, allowing only a set number of functions) is also important to the security of each VM instance

Security Drawbacks of Virtualization

The chief security drawback of virtualization is anything that could affect the functioning of the VS itself: any applications, services or activities that might negatively affect the VS’s ability to provide services to, and properly host, its VMs. You would not believe the things we have seen running on VS hardware, everything from BitTorrent to MP3 Shoutcast radio stations to very intensive file and print sharing.

It’s important to pare down the VS’s processes to the bare minimum: remove or disable all unnecessary daemons using chkconfig or the YaST Runlevel Editor. The typical VS might have up to 100 daemons running in runlevels 3 and 5, most of which are not necessary. Running the VS in runlevel 3 (no X started by default) will save a number of MB of RAM and decrease the load on the CPU from graphical tasks.
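As a quick illustration of the pruning step, here’s a sketch that flags services enabled in runlevels 3 or 5, given chkconfig --list style output. The sample listing and service names below are made up for illustration:

```python
# Made-up sample of chkconfig --list output for illustration
SAMPLE = """\
cups            0:off  1:off  2:on   3:on   4:off  5:on   6:off
xend            0:off  1:off  2:off  3:on   4:off  5:on   6:off
bluetooth       0:off  1:off  2:on   3:on   4:off  5:on   6:off
nfsserver       0:off  1:off  2:off  3:off  4:off  5:off  6:off
sshd            0:off  1:off  2:on   3:on   4:off  5:on   6:off
"""

def enabled_in(listing, runlevels=("3", "5")):
    """Return service names enabled in any of the given runlevels,
    parsed from chkconfig --list style output."""
    services = []
    for line in listing.splitlines():
        fields = line.split()
        name, states = fields[0], fields[1:]
        # Each state field looks like "3:on"; turn them into a dict
        flags = dict(s.split(":") for s in states)
        if any(flags.get(rl) == "on" for rl in runlevels):
            services.append(name)
    return services

print(enabled_in(SAMPLE))  # ['cups', 'xend', 'bluetooth', 'sshd']
```

On a real system you’d feed it actual chkconfig --list output, then decide which of the flagged services a VS genuinely needs (xend and sshd, say) and turn the rest off with chkconfig or the YaST Runlevel Editor.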

Wrapup

SearchServerVirtualization has a set of articles (some of which “inspired” this article) by Anil Desai that are excellent and right to the point in helping you secure your VSes and VMs. In particular, his tip articles “Virtualization Security Benefits” and “Improving VM Security” are both good overviews and contain valuable drill-down explanations to help you secure your VS/VM environments.

Enjoy,

RossB

When does it make sense to use a Virtual Server as opposed to a Physical Server? That’s a question that a lot of people are currently discussing amongst themselves and with us on the technology side.

Before an organization can decide between physical and virtual servers, it needs to be aware of virtualization as a whole.

Forrester Research recently reported that the number of IT organizations implementing or piloting virtualization reached and exceeded 50% in 2006 with the split being roughly 40% implementations and 11% piloting. The growth year on year was about 11% for implementations and flat for piloting. The number of respondents that were aware of virtualization stands at 92%, with only 8% professing to know nothing about the technology.

I’d say that awareness of virtualization is pretty high among our readers, but we all know someone who is just happy in their distributed, single-instance server world; either they haven’t had any situations where virtualization was the answer, or, more likely, they can’t quite grok the concept of what is going on.

Rackspace (a hosting provider) recently polled their own customers (and therefore already a savvy bunch) and found that 57% of their hosting customers had virtualized infrastructure and over 70% of those surveyed said they would host mission-critical apps on virtualized platforms.

Not surprisingly, of those surveyed who would host such applications on a virtual platform, over 70% said it would be preferable to do so with a hosting provider’s help (like, say, Rackspace?).  Also not a shocker: more than 60% of those who didn’t currently use virtualization said they would try it with the help of a hosting provider. Again, Rackspace.

The reasons customers used virtualization turned out to be primarily Development and Testing (37%), followed by Web Applications (22%) and lastly Application Servers (12%).  No mention of virtualized firewalls or storage.

Last but not least was the mix of virtualization platforms and vendors: VMware (60%), Microsoft’s Virtual Server (14%) and Xen (11%).  Of course, our own SLES 10 SP1 uses the Xen hypervisor to great advantage; you can get an evaluation version (no timeout, but support limited to 60 days) for any of the platforms you support.

RossB

A Vancouver, B.C. law firm has overruled Microsoft Windows’ objection to being replaced with SUSE Linux Enterprise Desktop (SLED) 10. The firm’s IT manager, Richard Giroux, says that the level of downtime he’s seen at other firms is:

“Simply unacceptable.”

After testing a number of competing desktop Linux distributions, Giroux chose SLED 10, citing its speed and included applications as deciding factors in SUSE’s favor. To handle a number of problematic or non-cross-platform applications, including the firm’s dictation and audio functions, the Microsoft Office suite and other applications that primarily run on Microsoft Windows, the firm uses Citrix clients running on SLED.

“Having an open environment with Linux gives us the opportunity to select from thousands of high-quality open source programs,”

Another feature of SLED that Giroux likes is the subscription model: it’s categorized as an operational expense rather than a capital outlay. The flat subscription costs are much more predictable for budgeting, and the inclusion of many standard applications in SLED is an added plus.

“By nature, open source software has to integrate well with other applications, so we can implement them easily and cost-effectively. One application for transcription playback has already saved us thousands of dollars.”

As a final shot across Microsoft’s bows, Giroux cites his ability to do the entire office upgrade in a single weekend and the (conservative estimate) 20% maintenance savings effective immediately.

Read more about Whitelaw-Twining’s summary judgement in favor of Open Platform Solutions in the Novell Customer Showcase.

Enjoy,

RossB

WALTHAM, Mass.— 14 Aug 2007— Novell today announced that global electronics giant Casio Computer (Casio) is using SUSE Linux Enterprise Server from Novell with integrated Xen virtualization software to reduce the cost of consolidating servers while improving flexibility compared with alternative, proprietary virtualization software. Casio is using SUSE Linux Enterprise Server and the paravirtualized drivers in the SUSE Linux Enterprise Virtual Machine Driver Pack to consolidate Windows and Linux servers in order to improve productivity by standardizing communication tools and to promote better IT control and cost efficiency.

Here’s the official press release.

Another major win for SUSE Linux Enterprise – this time in India! From the press release:

The Electronics Corporation of Tamil Nadu (ELCOT) in India is rolling out SUSE Linux Enterprise across 30,000 desktops and 1,880 servers in Tamil Nadu’s schools, after ELCOT itself has migrated its entire IT infrastructure from Microsoft Windows to SUSE Linux Enterprise Server and SUSE Linux Enterprise Desktop from Novell.

They are so excited about the benefits they’ve seen from their adoption of open source, they even created their own success story video and are sharing it on YouTube. The video highlights their use and experience with SUSE Linux Enterprise, common questions, and even talks about the opportunities that this strategy is opening for them — including being able to bring an ATM to market for 1/5 the cost of competitors! Very cool! Here’s where you can see the official Novell success story and PDF.

More Novell announcements from Linux World.

It’s certainly one of the hot topics of discussion in OSS circles: should video drivers be open source or proprietary? Ideally, two of the top three major video chipset manufacturers (namely ATI and Nvidia) could be convinced to fully open source their video drivers for the benefit of all. ATI info is here and here.  Nvidia info is here.  Intel already releases open source drivers for at least some of their chipsets, if not all. The general reply to those requests has always been that opening the drivers would expose their secret sauce to competitors and cost them their performance edge. I don’t know enough about the techie details to comment on the validity of those claims, but one thing is certain – end users don’t really care. Typical end users just want their video cards to work. That’s why you’ll find SLED 10 offering support for both proprietary and open source drivers out of the box.

Will the proprietary and open source drivers offer the same features and performance? In a word – No. An interesting review (a bit dated now, but still interesting) from Phoronix directly compared the open source driver to the proprietary driver for an ATI video card. Check out the results here to see what they found out.

Clearly, pressure is mounting as Dell and Google have been ramping up the requests/pressure on ATI to get more open drivers for these chipsets. Hey, the more the merrier! This will be an interesting space to watch as things develop…

From the press release:

SUNNYVALE, Calif., July 23 /PRNewswire-FirstCall/ — SGI and NASA today announced that the agency has selected a record-setting SGI(R) Altix(R) supercomputer in its evaluation of next-generation technology to meet future high-performance computing (HPC) requirements. The system was acquired as part of NAS Technology Refresh (NTR), a four-phase procurement process that eventually will replace the Columbia supercomputer system, powered by SGI Altix.

NASA’s new SGI Altix system is expected to be installed in August at the NASA Advanced Supercomputing (NAS) facility at the Ames Research Center at Moffett Field, Calif. The new system will be the first supercomputer to operate 2,048 processor cores and 4TB of memory under a single copy of Linux(R) — creating the largest Linux single system image (SSI) in the world. A larger SSI can accelerate scientific research by making all of the system’s processors and memory available to solve a single problem, or several problems at once.

Driven by 1,024 Dual-Core Intel(R) Itanium(R) 2 processors, the new system will generate 13.1 TFLOPs of compute power. The system’s dual-core processors allow more computing power per square foot, enabling NASA to pack more computing power into its supercomputing center. NASA also acquired an ultra-dense 240TB SGI(R) InfiniteStorage 10000 system to efficiently handle the massive data storage requirements.

Read more. 

Now that’s a lot of processors in one spot; guess they’ll be putting in a couple of new Carrier AC units to keep that one cool!

RossB

Wow, sometimes you think you’re really tied into a product and who’s using it and where, but even I was surprised at the number of people and institutions deploying, using and enhancing OpenOffice.org. The vast majority are kind enough to take the time to document their experiences so others can take advantage of their findings.

I can’t tell you how many discussions, presentations and other encounters I have had lately where people have been kind enough to share their objections and misgivings about OpenOffice vs. Microsoft Office. Partly for that reason, I wanted to make this roundup a set of references that can easily and quickly be investigated by those who are OO-curious and want to go and see who is doing what, and how, with this great office suite.

Blogs and Articles

A very interesting recent blog posting by one Nate Grondin on the subject of OO in schools started off this research project, and as I write this, there are over 50 Firefox tabs with relevant info waiting to be included somehow. This is an incredibly rich area of progress on the Open Source front.

Another interesting site and resource is the OpenOffice Training, Tips and Ideas blog, with a lot of good articles plus links to free and commercial training, books and more. Subscribe to the RSS feed and keep an eye on this one.

Projects and Coordination Sites

The Education Project, hosted on the OpenOffice.org site, is a great starting place for those who want to help educators and students, in classes or individually, to help develop OO.o. The goal of the project is “to help teachers as well as students or anybody involved in education to enter the OpenOffice.org project and find a place where to contribute or to find informations.” With its tools and development categories, its user and development lists and other documentation, this is a great resource for the education community.

The openSUSE.org site has a great education page, run by our friend James Tremblay from Newmarket NH. The goals of the project are to:

  • Catalog and collect all educational software built or converted to run on Linux
  • Separate all cataloged software into server and desktop categories
  • Subdivide all collected software into its curricular discipline and age groups
  • Build the “Edu-cd” (an add-on CD/DVD specifically for education tools/programs)

Additionally, on this page you can find links to the Education News, how to make an account and get involved, the IRC channel info, and links to the Education Application Index, including the currently collected lists of Desktop and Server Education Programs, HowTos and a Wishlist. This is an excellent project to get involved in; it’s very easy to contribute and you’ll get a lot out of it.

Another resource on the OpenOffice.org website is the Major OpenOffice.org Deployments Wiki, where a complete worldwide overview of OpenOffice.org deployments is listed by category, including Governments, Schools and Universities, Private Sector and other areas. There are 14+ major deployments listed in North America in the Schools and Education category; there must be more, so get your deployment listed and join in.

Other major resources you’ll find as part of the OpenOffice.org site are the OpenOffice Marketing Project, the Why OpenOffice Wiki and the very comprehensive Case for Switching (to OpenOffice.org) Wiki pages.

The most authoritative location for OO.o documentation is the official OpenOffice Documentation Project site. With its plethora of information, links and resources, broken up into Users and Developers sections, this is the main location for OO.o documentation and its creation. If you want to contribute to the project, go here and get started.

The OSDI project wants to distribute OO.o CDs to people in disadvantaged environments. They’re raising money and almost have the amount they need for the first big push, so give ’em a few bucks, help get OO.o into people’s hands and onto their computers, and free ’em from the Microsoft Tax.

Focusing on the K-12 market is a great site called K12OS, whose goal is to provide news, links, resources and discussion about the use of Open Source in the K-12 market. With its discussion forums, listservs and loads of links, this should be on every educator’s daily rotation.

Training and Tutorials

Linked off the official OpenOffice Documentation Project site and the OpenOffice.org Wiki, but not obvious, is an incredible resource called the OOo Help Outline. With its long list of FAQs, HowTos and per-application help documentation, it’s a must for everyone.

Then visit the somewhat simplistic but very useful Tutorials for OpenOffice howto site. Check out the various categories, organized by the application in OO.o, and contribute if you feel so moved.

Next up is the ByteBot site with its OpenOffice.org training materials. You can download and use the materials; just please ask them first if you use the materials commercially. They also have a rudimentary Linux training course you can get the same way. ByteBot also hosts an archive of the mysteriously missing OpenOffice Unofficial FAQ, which should be located at OOAuthors but seems not to be linked properly.

A very nicely done site and great resource is the iTrainOnline site, which offers Open Source documentation and courseware, including OO.o Writer and Impress mini-courses.

Statistics, Other Roundups and Misc.

The OpenOffice.org Market Share Analysis site is excellent for those who want to see how OO.o is making inroads in their market, or to show others that a grassroots change is happening, or just to keep an eye on the numbers and see what is happening in each area for OO.o deployments.

Last, and surely winning the award for the most links in a single HTML page, is the Why Open Source in Schools article. I have yet to fully investigate the articles, lists and other resources listed there, but I’ll add the most useful ones to the Education page of our blog.

Hopefully this roundup is helpful for everyone; we’re very serious about helping Open Source get implemented in education. If you have resources, think we missed something, or have a cool project you want some free publicity for (subject to review and approval), leave a comment or email me.

RossB

Novell recently held an internal event in which developers were encouraged to work on projects of interest and passion, rather than the ones they normally work on. This great article from Ars Technica interviews some of the developers involved and gives a glimpse of the projects they have been working on. Be sure to hit the subtle “Next Page” link at the bottom of that page so you can see the details of some of the cool projects.

Of the ones listed in the article, my favorite has to be Joe Shaw’s web interface for the Banshee music player. Think of it as a “Slingbox for Banshee”. Very cool! You can even check out his live demo site (but please be nice to his test server).

Another very interesting project actually was awarded the best-overall-project honors…

Stephan Kulow and Richard Guenther, who extended the SUSE Build Service platform so that it can automatically recompile Debian source packages and turn them into RPMs with dependencies properly mapped to other SUSE packages.

You can check out the dozens of other projects that were being experimented with during the week at the openSUSE Idea Pool.

(Updated 8/30/07) There’s a nice related audio podcast discussion and review of their favorites from Erin and friends at Novell Open Audio.
