Simplifying Android Migration: Using Mobile Virtualization to Reduce Time, Risk and Cost
By leveraging mobile virtualization, developers can migrate to Android with minimal porting cost for legacy code.
The Android mobile device platform from Google and the Open Handset Alliance has ignited the imagination of mobile original equipment manufacturers (OEMs), developers and end users. Since its introduction, Android has enjoyed a rapidly growing market presence and bullish prospects for new deployments. Handset manufacturer HTC reports that it alone shipped more than one million Android-based handsets in 2008, and Google has announced there will be more than two dozen handsets from other manufacturers by the end of 2009. Moreover, Android's success as an open-source environment gives it additional momentum and rapid acceptance, and drives a fast-growing ecosystem of application developers.
However, the underlying standard software components and an active developer community have not necessarily made it easier for OEMs to bring Android-based devices to market. Smartphone developers must still cope with basic board support issues, from accommodating ARM-based chipsets to building device drivers, as well as refactoring and tuning legacy embedded and desktop code to run well on the new platform.
The first issue is simply the size of the task. Android is not just an operating system (OS), but a complete handset platform, combining a mobile OS kernel, a Java run-time (Dalvik), a telephony interface, and other middleware, plus a browser and application environment. Integration of such a comprehensive package in a handset design would appear to demand an all-or-nothing approach. As tempting as developers might find a blank slate, existing investments, innovations, and expertise built on current platforms cannot realistically be abandoned in favor of an all-encompassing new technology. This would be problem enough, but the time-to-market pressures in this rapidly evolving market make the OEM's life even more difficult.
Additionally, although platform standardization has its benefits, it presents a corresponding problem – how can developers create a competitive edge for their product? Indeed, much of the differentiation and branding of existing devices is locked up in the mobile OEMs' and mobile network operators' (MNOs') proprietary legacy code. The migration path to the new Android environment for legacy code can be long and winding.
Legacy software migration may also require retargeting for new processor architectures. The more capable, and thereby more demanding, Android platform likely overwhelms many legacy hardware designs. In fact, most current Android designs build on dual-core hardware, with a high-powered, dedicated ARM11 application processor alongside the baseband processor. Beyond the additional effort required to port to a new CPU architecture, the resulting higher costs can significantly restrict mobile OEMs from addressing the needs and volume of the middle and lower end of the market.
Streamlining Migration with Virtualization
Figure 1: OKL4 microvisor with Secure HyperCell Technology
Despite these daunting challenges, it is possible for developers to migrate to Android quickly and efficiently while preserving legacy code and the competitive edge it provides. By leveraging mobile virtualization, they can make the move with minimal porting cost for that legacy code.
Mobile virtualization allows mobile developers to reuse existing code together with the OS that supports it in one virtual machine (VM) alongside a second VM hosting the Android environment. The availability of a separate VM enables reuse of existing software in its original form as part of an integrated solution that also incorporates Android and new Android applications. Mobile virtualization thus helps OEMs and other developers turn a potentially lengthy migration process into a modular integration task. Developers and integrators can preserve legacy code intact, running on the legacy embedded OS (often an RTOS) it was developed for, inside a VM (an OKL4 secure cell). Android resides and executes in its own VM, and communicates with other VM-hosted code via inter-VM communication services provided by the hypervisor. After the initial repartitioning into separate VMs, developers can selectively migrate functionality from one environment to the other.
Moreover, mobile virtualization can enable savings when there is a requirement to support multiple software platforms for different handset products. For example, Android applications running in one VM can leverage tested and working Symbian drivers resident in another VM.
Figure 2: Fully virtualized Android architecture
A key advantage of virtualization lies in abstracting the particulars of the underlying hardware as presented to guest OSes and applications. This largely unchanging abstraction limits hardware dependencies and eases porting of Android and its drivers across system designs. Moreover, once a design is virtualized, subsequent ports to new hardware designs are greatly simplified.
Mobile virtualization also abstracts the available physical processors as virtual CPUs. This abstraction lets software previously running on multiple processors execute on a single CPU core when extra performance is not required. Because the same multi-VM software architecture can be deployed on one or more physical cores, developers can reuse it across mobile handsets with different amounts of processor horsepower. The same flexibility applies within a given multicore handset: the workload can be consolidated onto a subset of the available cores to reduce power consumption when peak processing capacity is not required.
Open Kernel Labs
As intelligent devices are updated over time, development teams may need to migrate to a newer version of their chosen embedded OS. In some cases the OS may change completely from one generation of the OEM product to the next. For example, the adoption of Linux for use in embedded systems has resulted in many products migrating from a proprietary embedded OS to Linux with a new version of the end product.
These scenarios highlight the need to reuse software that has already been developed, debugged, and validated with legacy platforms. The traditional approach is to port legacy software line-by-line to the new OS for each new project. However, the effort required to port and retest this software on a new OS significantly reduces the benefit of reuse.
Mobile virtualization, by contrast, lets legacy software of all kinds be reused in its original OS environment, while new software is developed for new platforms like Android. By providing a VM for each required operating system or version, the virtualized environment reduces the effort required to integrate legacy software with new software, allowing creation of a new product release in less time and at lower cost.
Virtualization also gives developers the freedom to use any or all Android components in combination with other non-Android software and/or operating environments. For example, OEMs or integrators could combine a Windows Mobile 7 or Symbian environment and its applications with Android. Without mobile virtualization, such an integration would be extremely challenging if not impossible.
Mobile virtualization technology streamlines migration to Android by allowing developers to maintain existing legacy investments and reduce development overhead. Using mobile virtualization provides the developer with the greatest range of hardware and software choices, supports product differentiation, speeds time to market, and greatly mitigates migration risks.
Mobile virtualization eases choices of, and changes in, hardware architecture. It also provides transparent support for single- and multicore CPU designs, giving OEMs greater latitude in bill-of-materials cost and end-product price points.
Mobile virtualization opens the door to a new class of Android devices for the mass market, supporting deployment on lower-cost SoCs with a single ARM processor running both application and baseband software. Together, virtualization and Android can meet emerging demands for smartphone functionality at a feature-phone price.
Postscript: The Need for Speed
There is an understandable concern that the many benefits of mobile virtualization may come at a price, particularly performance overhead. Perhaps surprisingly, in many applications – especially resource-constrained single-processor designs – there is little or no loss of performance compared to native implementations. The table below shows some numbers from a real application using both native and virtualized environments.
The table shows performance figures (in microseconds) for GtkPerf benchmarks executing on an ARM926EJ-S with 126MB of RAM, running at 240MHz. The environment was Linux 2.6.24 and OKL4 3.0.1.