Published as »What Does “Personal Computer” Mean? Communicating a New Paradigm«. © 2010 UXPA. Reprinted from User Experience Magazine, Volume 9, Issue 4, www.usabilityprofessionals.org/upa_publications/user_experience/past_issues/2010-4.html
This article is about desktop virtualization and its implications for the PC as we have known it for the better part of 30 years. By “desktop” we mean the category of graphical user interfaces that has its origin at Xerox PARC: windows, icons, menus, and a pointing device are essential to desktop GUI systems like Mac OS, Windows, and GNOME. The desktop is a metaphor that provides a familiar environment to the user. To that extent, the desktop is virtual already, and we have to pay attention to the words we use to describe the shift towards the cloud(!) and towards desktop virtualization, where the physical hardware on your desk(!) or lap matters less and less. Early usability issues have been addressed, making VDI solutions a reasonable alternative for enterprises with hundreds or even thousands of seats, and for companies that want to provide Desktops as a Service.
What’s a PC anyway? It is easy to point at a computer box and assume that this is it. In fact, it’s not.
Things have changed significantly since the advent of the personal computing paradigm. The idea of reserving an entire machine for one single user was absurd in the late 1960s. Then the future was invented at Xerox PARC with the Xerox Alto computer (also known as the interim Dynabook) and its graphical user interface, which introduced early components of the desktop metaphor, Ethernet access, and laser printing. Since the mid-1980s, PCs have become standard equipment for almost every office and knowledge worker. Today, the family tree of PCs extends from desktop PCs – placed on top of or underneath the desk – to laptops, notebooks, sub-notebooks, netbooks, and the like. However, all these terms refer only to the form factor of the hardware case. They miss the point of the personal relationship between users and their digital desktops.
The user is key in this equation; therefore it is the P in PC that needs the care and attention of user experience experts. It is the data, the tools, and the preferences that make up the user’s personal working environment. Damage to or loss of any of these components can have a severe impact on the usefulness and perceived robustness of the system. This is most obvious for data loss, but tools that are not backward-compatible have the same effect. Furthermore, an unexpected change of user preferences – caused by an operating system update, for instance – reduces efficiency, because the user’s familiarity with the system suffers when the system no longer behaves as expected.
The scope of the issue is even broader today than it was 15 years ago, because network-based services and social software now draw the user’s attention to the World Wide Web. Formerly local and private documents are transformed into social objects by uploading them to photo, video, and slide sharing sites, and to collaboration and community sites.
What’s a PC anyway? It is the personal relationship between users and their digital work environment – their documents, applications, customized settings, and online data and connections – that matters. The hardware is only a means of lighting the pixels that make up a magical window into the digital world. The PC continues to offer a familiar local desktop environment, but with regard to the online environment, it is a mere access device that can be exchanged for any other computer with a web browser and Internet access.
Things are changing once again in a way that will make the computer box as we know it obsolete – or rather, replace it with a virtual box.
The requirements for PCs in large and medium-sized enterprises are ease of administration, low energy costs, and, last but not least, the flexibility for employees to move to other workplaces, to work from home, or to work on the move at any other location. Physical PCs do not sufficiently address these requirements, because the personal working environment is confined to the hardware on which it is installed and running. An alternative is the virtualized PC, which runs the operating system on emulated PC hardware. This is called desktop virtualization.
It is quite easy to confuse the “desktop” used in “desktop computer” with the one used in the “desktop metaphor.” The latter is, in a sense, a virtual desktop already. Now, in addition, desktop virtualization turns the computer itself into software by introducing a new layer between the hardware and the operating system. The virtualization layer consists of a hypervisor that emulates PC hardware on top of a host computer in order to run standard operating systems such as Windows or Linux.
Running thousands of desktop instances in data centers is quite energy- and cost-efficient compared to the same number of physical PCs. And defining pool policies – for sizing, cloning, and recycling of desktops – as well as group assignments between user directories and desktop pools saves a lot of work for the administrator. He or she decides whether the assignment between employee and virtual machine should be personal, as it used to be with a PC under the desk in the office, or flexible, to grant temporary access to PCs in call centers, classrooms, or Internet cafés.
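As a minimal sketch of what such pool policies and group assignments might look like – with all names and fields invented for illustration, not taken from any particular VDI product – consider:

    # Sketch of a desktop pool policy; all names are hypothetical
    # and do not correspond to any particular VDI product's API.
    from dataclasses import dataclass

    @dataclass
    class PoolPolicy:
        template: str            # master image the desktops are cloned from
        min_available: int       # pre-cloned desktops kept ready for login
        max_desktops: int        # upper bound on the pool size
        assignment: str          # "personal" (sticky) or "flexible" (temporary)
        recycle_on_logout: bool  # discard the clone after a flexible session

    call_center = PoolPolicy(
        template="win-base",
        min_available=25,
        max_desktops=400,
        assignment="flexible",      # any agent gets any free desktop
        recycle_on_logout=True,     # wiped and re-cloned after each shift
    )

    engineering = PoolPolicy(
        template="linux-dev",
        min_available=5,
        max_desktops=50,
        assignment="personal",      # one desktop per user, like a PC under the desk
        recycle_on_logout=False,
    )

    # Group assignment: map user-directory groups to desktop pools.
    assignments = {
        "cn=agents,ou=groups": call_center,
        "cn=developers,ou=groups": engineering,
    }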
What’s left of the PC on the user’s end? There is still a mouse, keyboard, and monitor. But since all computation takes place in the data center, neither a powerful CPU, nor memory, nor a hard drive is required on the client side. The fan becomes obsolete as well, which leads to an access device with no moving parts and no noise; lower power consumption, longevity of the client, and improved ergonomics at the workplace are further advantages of the virtual desktop. Maintenance costs are also low or non-existent; the Sun Ray thin clients, for instance, do not even have a local operating system anymore. Mobile access to the desktop session is possible with any RDP (remote desktop protocol) client; the Oracle Virtual Desktop Infrastructure, as another example, uses a Java-enabled web browser.
In order to deliver a competitive end-user experience, certain areas need to be considered. CPU, memory, and storage do not usually pose a problem on the virtualization side, because the requirements for a specified number of machines can be estimated in advance. The usage of other, shared resources is more difficult to predict, such as access to the network and to the storage system by hundreds of simultaneously running virtual machines. And when it comes to motion graphics, the bandwidth between data center and thin client, as well as the client’s display performance, becomes an issue.
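A back-of-the-envelope estimate shows why the shared resources are the hard part. The per-machine figures below are illustrative assumptions, not measurements from a real deployment:

    # Rough load estimate for shared resources in a VDI deployment.
    # All per-VM figures are illustrative assumptions, not measurements.
    vms = 500                    # simultaneously running virtual machines
    iops_per_vm = 10             # average storage operations per second per VM
    boot_iops_per_vm = 100       # peak during a "boot storm", e.g. 9 a.m. logins
    display_kbps_per_vm = 300    # typical office-work display traffic

    steady_iops = vms * iops_per_vm                  # 5,000 IOPS
    boot_storm_iops = vms * boot_iops_per_vm         # 50,000 IOPS
    network_mbps = vms * display_kbps_per_vm / 1000  # 150 Mbit/s

    print(f"steady state: {steady_iops} IOPS, boot storm: {boot_storm_iops} IOPS")
    print(f"aggregate display bandwidth: {network_mbps:.0f} Mbit/s")

The averages are easy to provision for; it is the correlated peaks, when hundreds of machines boot or log in at once, that dominate the sizing of storage and network.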
Response times are typically discussed in three orders of magnitude, from 0.1 to 10 seconds. A system response that takes longer than 0.05–0.2 seconds is no longer perceived as instantaneous. For example, a skilled typist produces 300 characters per minute, which equates to pressing a key every 0.2 seconds on average. If the response time is about the same, then the output is at least one letter late! Between 0.2 and 2 seconds is the range where the user still feels in control of the system: the delay is noticeable, but the loop of command and result feels like a smooth dialog with the system. If an operation takes longer than two seconds, a progress indicator should be displayed. But even a progress bar cannot hold the user’s attention for longer than 10 seconds. After that, the user has to reassess the system state and plan the next interaction steps to accomplish the task. These measurements for the perceptual, dialog, and cognitive levels have been well known for decades. Thus, web developers should remember that rich Internet applications must also respond in time, or provide appropriate feedback to preserve usability when network or computation latency is too high.
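These bands translate directly into a feedback policy. The following sketch encodes the rule of thumb in Python, using the threshold values discussed above:

    def feedback_for(latency_s: float) -> str:
        """Map an expected response time to appropriate UI feedback,
        following the perceptual / dialog / cognitive bands above."""
        if latency_s <= 0.2:
            return "none needed: response feels instantaneous"
        if latency_s <= 2.0:
            return "keep the dialog flowing: just show the result"
        if latency_s <= 10.0:
            return "show a progress indicator"
        return "progress indicator plus a way to cancel or switch tasks"

    # A skilled typist at 300 characters per minute presses a key
    # every 60 / 300 = 0.2 seconds on average:
    print(60 / 300)              # 0.2
    print(feedback_for(0.15))    # feels instantaneous
    print(feedback_for(4.0))     # progress indicator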
In order to perceive smooth animations, a fourth time range, one more order of magnitude below 0.1 seconds, has to be considered. Movie cameras use a standard rate of 24 fps (frames per second). However, the human eye is able to detect frequencies up to 60 Hz, which is also the typical frame rate for HD TV. And when it comes to games on large screens with fast animations, rates of up to 100 fps are necessary to maintain the illusion of motion.
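Expressed as a time budget per frame, these rates make the challenge for a remote display concrete:

    # Per-frame time budget at the frame rates mentioned above.
    for fps in (24, 60, 100):
        budget_ms = 1000 / fps
        print(f"{fps:>3} fps -> {budget_ms:5.1f} ms to render and deliver each frame")
    # 24 fps -> 41.7 ms, 60 fps -> 16.7 ms, 100 fps -> 10.0 ms

At 100 fps, every frame has to be rendered in the data center, compressed, transmitted, and displayed on the client in 10 milliseconds.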
Today, games and other applications with high-frequency full-screen updates – video editing or CAD, for instance – are out of scope for desktop virtualization. But fast refresh rates are necessary anyway for cursor movements and for direct-manipulation tasks such as dragging windows or scrolling through long documents. Otherwise, the user might perceive hiccups in the flow and become confused or even irritated while interacting with the system. Adobe Flash movies and other video content should run smoothly as well. Here, it is better to sacrifice a little image quality than to let image and sound run out of sync. This trade-off is managed by the protocol between the thin client and the virtualization server.
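The following sketch illustrates that trade-off in simplified form; the quality floor and the bandwidth figures are invented for illustration and do not describe any actual display protocol:

    # Simplified sketch of the quality/sync trade-off in a display protocol.
    # Quality levels and thresholds are invented for illustration.
    def choose_quality(available_kbps: float, required_kbps: float) -> float:
        """Scale image quality down (never below a floor) so that frames
        still arrive in time to stay in sync with the audio stream."""
        if available_kbps >= required_kbps:
            return 1.0                  # full quality fits the link
        ratio = available_kbps / required_kbps
        return max(0.3, ratio)          # degrade the image, keep the sync

    print(choose_quality(available_kbps=2000, required_kbps=1500))  # 1.0
    print(choose_quality(available_kbps=600, required_kbps=1500))   # 0.4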
Desktop virtualization is ready for prime time because virtual desktop infrastructure (VDI) systems provide a level of quality and service for enterprise customers that is comparable to the classic gray PC boxes. Flexibility, mobility, and total cost of ownership considerations convince hospitals, universities, telcos, banks, and other companies with hundreds or thousands of employees to deploy virtualized desktops. For example, at JavaOne 2009 in the Moscone Center in San Francisco, every conference attendee could access three personally assigned virtual machines throughout the week, running Windows 7 RC, Ubuntu 8.10, and OpenSolaris 2009.06. Altogether, 12,000 desktops were created with four Sun VDI hosts, five hypervisor hosts, and three storage servers.
The PC is no longer the center of the digital universe. At one end of the spectrum, the user’s data and attention move to social web services; at the other end, the personal computer itself moves into the data center and becomes part of the “cloud.” New kinds of business models are being developed right now. Sooner rather than later, service providers will offer desktops as a service to companies of all sizes.