In the IT business solutions world, we often take for granted that everyone has an in-depth understanding of all the terms, functions, and outcomes of IT infrastructure. So, every once in a while, we need to ask IT professionals to step back and discuss IT as it relates to the rest of the world.
In this particular case, the word “virtualization” gets thrown around a lot. And though many of us understand the term, its meaning, and many of its possible outcomes, virtualization itself has a far greater future ahead, perhaps far beyond what many realize.
Historically, virtualization has been used for a plethora of day-to-day business processes, such as backing up entire operating systems and servers, testing new software, and installing upgrades or new configurations: the commonplace things that IT needs to accomplish every day.
However, as technology continues to evolve, there is also the need to run older applications. Virtualization is key in this regard, as many companies don’t move as fast as the outside world of innovation and new releases. For instance, imagine a large utility company needing to change applications every few years just to stay current with software releases.
Plainly put: it’s never going to happen! Consistency in business processes and application infrastructure is key to stability. Pair this with the change management involved in launching a new application across a large organization, and the appetite for change slows down immensely.
These examples are still commonplace and even mundane to those in the know. But, like everything else in our fast-paced lives, needs change as a result of an ever-evolving external business environment, which is often out of our control. These changes are where virtualization is lending a much-needed helping hand, particularly with data: how it’s shared, accessed, and used.
One of the newest forays into the virtualization world is not so much about the concept of IT security as about the practice of simply being safe: surfing the web, accessing data, and sharing across a multitude of platforms.
Of course, virtualization has already made inroads into such safe practices; for instance, it is used for safe web browsing, creating secure build environments, and more. But now it seems that virtualization is spreading a little further, this time into the realm of mobile devices.
Recently, the National Security Agency (NSA) has been strongly suggesting that many, if not all, government departments and businesses start implementing a smartphone platform secured through the use of virtualization, a technology that until now has applied only to laptops and tablets. This new “suggestion” comes in conjunction with the addition of the first virtualization-based smartphone security system to the US Commercial Solutions for Classified (CSfC) list: an HTC A9 smartphone security-hardened by Cog Systems. The CSfC is a program developed by the NSA to help US government agencies, and the businesses that serve them, quickly build layered secure systems from approved components.
Now, even though this device isn’t fully approved, it offers an interesting look into the future of virtualization technology and its potential impact on consumer markets: the HTC A9 smartphone is set to compete with non-secured devices from Samsung, BlackBerry, LG, and others.
And though I don’t expect this new market to rear its head in the near future, in the not-so-distant future this technology may still make its way into the enterprise market, bringing an approach to endpoint security that will eventually steal market share from the largest mobility management vendors.
So, knowing that this future is coming, understanding the breadth and depth of virtualization may be one of the most important moves an IT professional can make. Because one day soon, calling your virtualization partner from your current smartphone may not be an option.