
Virtualization

Another popular, yet outdated, buzzword in the computer realm is “virtualization.” The term originally referred to the use of a virtual machine, in which host software creates a simulated computer environment that draws its resources from another physical system and can be accessed remotely.
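
To make the idea concrete, here is a minimal sketch of starting a guest virtual machine with the QEMU hypervisor from Python. It is only an illustration of the concept described above: the disk image path, memory size, and VNC port are assumptions for the example, not details from this article.

import subprocess

# Minimal sketch: launch a guest VM with the QEMU hypervisor.
# Assumes QEMU is installed and a guest disk image exists at ./guest.img
# (a hypothetical path used only for this example).
def start_guest_vm(disk_image="guest.img", memory_mb=1024, cpus=2):
    cmd = [
        "qemu-system-x86_64",    # the hypervisor binary
        "-m", str(memory_mb),    # RAM carved out of the host for the guest
        "-smp", str(cpus),       # number of virtual CPUs shared with the host
        "-hda", disk_image,      # the guest's virtual hard disk
        "-display", "none",      # run headless; reach the guest remotely instead
        "-vnc", ":1",            # expose the guest's console over VNC (port 5901)
    ]
    return subprocess.Popen(cmd)

if __name__ == "__main__":
    vm = start_guest_vm()
    print("Guest VM started with PID", vm.pid)

The host machine keeps running as normal; the guest simply borrows a slice of its memory, CPUs, and disk, which is the resource-sharing idea the rest of this piece builds on.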

It was first used by IBM to take advantage of the large mainframe systems the company built back in the 1960s. By splitting a machine's resources into individual logical virtual machines, IBM could run multiple applications at the same time and maximize efficiency. This efficient use of resources eventually led to the “client/server” model that was prevalent in many applications in the 1990s, in which a centralized system held important data in an organized and secure environment. It became common practice for businesses to integrate this technology into their infrastructures.

A more modern example of virtualization would be the Remote Assistance feature in Windows XP, which let someone take control of your PC from another location to help troubleshoot it. However, there was a significant performance loss on the remote side, and security flaws opened the door to hostile PC takeovers by viruses and other malware.

Virtualization still has a place in the computer world, but cloud computing takes the key benefits of virtualization and delivers them in an Internet-based model that is more robust and supports far more users.
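
As a rough illustration of that delivery model, the sketch below requests a virtual server over HTTP instead of launching one locally. The endpoint, token, and request fields are hypothetical placeholders, not any specific provider's API.

import json
import urllib.request

# Minimal sketch of the cloud delivery model: instead of running a hypervisor
# yourself, you ask a provider's web API for a machine. The URL, token, and
# field names below are hypothetical placeholders, not a real provider's API.
API_URL = "https://cloud.example.com/v1/servers"
API_TOKEN = "YOUR-API-TOKEN"

def request_server(name, memory_mb=1024, cpus=2):
    payload = json.dumps({"name": name, "memory_mb": memory_mb, "cpus": cpus}).encode()
    req = urllib.request.Request(
        API_URL,
        data=payload,
        headers={
            "Authorization": "Bearer " + API_TOKEN,
            "Content-Type": "application/json",
        },
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)  # e.g. the new server's ID and address

if __name__ == "__main__":
    server = request_server("demo-server")
    print("Provisioned:", server)

The underlying virtual machines still exist; the difference is that the provider runs them at scale and hands them out over the Internet, which is what lets the model serve far more users than a single in-house setup.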

Rajiv Kothari is an industry insider who moonlights as a computer enthusiast, providing a different perspective on, and insight into, new technologies.