Welcome to the 2009 edition of Predictions for the technology sector.

This is the eighth year in which the Deloitte Touche Tohmatsu Global TMT Industry Group has published its predictions for the year ahead. The volatility of the global economy in 2008 and the anticipated challenges ahead in 2009 have made this set of predictions particularly challenging, but also particularly important, to compose.

Some have questioned whether predictions are feasible amid such turbulence. Colleagues have asked how accurate they can be, given the uncertain outlook and many of the unprecedented conditions being experienced today.

Anticipating the course of the next 12 months is likely to be hard. But, in my view, that makes having a considered perspective more crucial than ever.

Predictions, by their nature, are not facts. But properly developed predictions should encompass a diverse array of views and inputs, which can kindle debate, inform possible directions and even identify necessary actions.

Every year, the methodology for Predictions is revisited, to assess how the approach could be made more robust. This year, our standard methodology has been bolstered through a program of in-depth interviews with 50 CXOs at some of the world's largest TMT companies. I am most grateful to all the respondents who offered up their insights and experience, at a time when their attention was particularly in demand.

2009 is likely to challenge all of us. The technology sector is expected to be buffeted by grueling macroeconomic conditions in the year to come. But we should not forget that the need for the technology sector to deliver cost efficiencies, drive productivity improvements and provide the foundation for new products and services remains as vital as ever.

In short, while global growth may be cyclical, the need for technology is, and will remain, constant.

I wish you all the best for 2009.

Igal Brightman
Global Managing Partner
Technology, Media & Telecommunications
Industry Group

Making Every Electron Count: The Rise Of The SmartGrid

In 2009, over 16 percent of all energy used is expected to be in the form of electricity, up from 9 percent in 19731. Currently, the average efficiency of the world's legacy electricity grids is around only 33 percent. This contrasts with 60 percent efficiency for grids based on the latest technology2.

Just at the transmission and distribution levels, energy losses are around 7 percent3. Further, the cost of power outages and power quality disturbances is estimated at $180 billion annually in the United States alone4.

But there may be a solution: SmartGrid technologies. These have the potential to cut electricity consumption by up to 30 percent5 and dramatically reduce the need to construct new power plants or to operate environmentally harmful sources of generation.

Broadly speaking, SmartGrid companies add computer intelligence and networking to what is otherwise a 'dumb' electrical network.

For example, some SmartGrid technologies assist with load leveling of the electrical grid. This allows a power-generating company to run cleaner power sources, such as nuclear or hydroelectric, at full output, 24 hours a day, while reducing the need to bring additional carbon-emitting gas, coal or oil plants online to meet demand surges (usually lasting only a couple of hours per day). Further, by reducing variability in demand, fewer new power plants need to be constructed.
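The capacity arithmetic behind load leveling can be sketched with made-up numbers (the demand figures below are illustrative, not utility data): shifting even a modest slice of peak demand into an overnight trough lowers the peak capacity that must be built, while total energy delivered is unchanged.

```python
# Illustrative only: hypothetical demand over 12 periods of a day, in GW.
baseline = [30, 28, 27, 29, 35, 45, 60, 70, 72, 68, 55, 40]

# Suppose smart-grid load shifting moves 10 GW of the evening peak
# into the overnight trough.
shifted = baseline.copy()
shifted[8] -= 10   # trim the 72 GW peak
shifted[2] += 10   # serve the same load overnight instead

print(max(baseline))  # 72 GW of generating capacity needed before
print(max(shifted))   # 70 GW after - required peak capacity falls
assert sum(baseline) == sum(shifted)  # total energy delivered is unchanged
```

The peak, not the total, determines how many plants must exist, which is why flattening the curve defers new construction.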

Other examples of SmartGrid activities include making the process of traditional electricity generation more efficient, connecting sustainable energy sources to the existing grid, and deploying smart meters.

In 2009, SmartGrid companies may generate $25 billion in revenues, and represent the biggest and fastest-growing sector in the GreenTech market, and possibly in the entire technology market6. In late 2008, SmartGrid solutions providers were enjoying 50 percent revenue growth and an 80 percent increase in bookings7.

The continued growth of smart energy in 2009 may catalyze the creation of a smart energy stock index over the year. The transition of SmartGrid solutions providers, from addressing early adopters to undertaking large-scale implementation, may encourage a spurt of mergers and acquisitions (M&A) activity in 20098.

The SmartGrid is likely to reach the consumer. Smart metering technologies are expected to enable consumers to 'time shift' their power usage to take advantage of off-peak rates, saving 20 percent on their bills9. In fact, although historically most electrical equipment was purchased by utilities, in 2009 more than half of all electrical equipment, both SmartGrid and older technologies, is expected to be purchased by consumers and enterprises10.
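The 'time shifting' saving can be illustrated with a rough, hypothetical bill calculation (all tariff and usage figures below are invented for the example, not actual utility rates): moving half of peak-time consumption onto an off-peak rate trims the bill by roughly a fifth.

```python
# Hypothetical tariffs: flat rate versus a two-tier time-of-use tariff.
flat_rate = 0.15                     # $/kWh at all hours
peak_rate, offpeak_rate = 0.20, 0.08 # $/kWh

monthly_usage_kwh = 500
peak_share = 0.6                     # fraction of usage currently at peak times

flat_bill = monthly_usage_kwh * flat_rate

# With a smart meter, suppose half of peak-time usage is shifted off-peak.
peak_kwh = monthly_usage_kwh * peak_share * 0.5
offpeak_kwh = monthly_usage_kwh - peak_kwh
tou_bill = peak_kwh * peak_rate + offpeak_kwh * offpeak_rate

saving = 1 - tou_bill / flat_bill
print(f"{saving:.0%}")  # → "23%" with these invented figures
```

The exact saving depends entirely on the tariff spread and how much load the household can actually shift; the figures here simply show how a saving in the region of 20 percent can arise.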

Development of a SmartGrid not only allows for more efficient use of the existing infrastructure; it also makes the grid more resilient, adaptable to changing population and usage patterns, and able to accept sustainable but fluctuating sources of alternative energy.

Bottom line

Major manufacturers and utilities should explore partnerships with, and consider acquisitions of, smart energy companies. Companies should not be distracted by falling oil prices. Supply remains volatile, and demand uncertain. And while the price of oil has dropped over 50 percent from its 2008 peak, energy costs remain well above their long-term trends11.

Governments around the world should weigh the cost-effectiveness of subsidizing sustainable energy against that of committing to upgrade the existing grid. The global downturn may make significant government support for SmartGrid spending unlikely, although some administrations are likely to adopt a policy of stimulative infrastructure spending on their electrical grids, some of which will go to SmartGrid equipment. But profit-oriented utilities and enterprises should continue to explore and deploy SmartGrid technologies that offer high returns on investment, even without government support.

Governments unable to finance SmartGrid investment could instead promote the technology via information campaigns and stimulate adoption through tax incentives. And as governments increasingly focus on energy security, investing in the SmartGrid could be used to reduce dependence on non-domestic energy sources. It could also make the grid more resistant to military or terrorist attacks, by physical or digital means12.

Venture capitalists (VC) should devote increasing resources to understanding smart energy technologies. VC investment in the sector remains strong, even during the current economic crisis, with SmartGrid companies receiving the second largest slice of the GreenTech pie, behind only solar energy13.

Gadgets for free!* (*subject to contract)

Plummeting consumer confidence in much of the developed world has made the public wary of purchasing big-ticket goods, including devices such as televisions and PCs14. This situation is likely to endure through 2009 and possibly beyond15.

Given the illiquid environment that may remain through 2009, companies are expected to become more focused on generating steady flows of income.

The combination of these two trends sets the scene for a significant expansion of the device subsidy model in 2009. This approach has long been practiced for mobile phones, set-top boxes, and broadband routers.

It has also been tried, to a more limited extent, with GPS navigation devices bundled with mapping data and with low-end laptop computers bundled with broadband subscriptions16.

In 2009, this approach is likely to be pervasive. It is likely to be extended to a widening array of devices, including televisions (bundled with subscriptions), music equipment (bundled with music) and high-end computers (bundled with everything from technical support to remote back-up services).

Bottom line

Bundling products and services together may prove essential in 2009 to stimulate an otherwise nervous, stalled market.

All companies in the value chain should develop a view on where they would want to be positioned in any bundle. For example, should they lead the offer, or just supply elements of it? They should offer a broad range of bundles, appealing to all market segments, from low end to high end, and from early adopters to more conservative users.

They should understand how their positioning, as leaders or suppliers, affects their customer support obligations17. If they do lead, self-service and premium-rate support services could lower costs and raise overall revenues. Companies should consider whether any factors could impede service delivery in the medium term: for example, could netbooks bundled with mobile broadband suffer service quality issues18?

Companies should consider how consumers' perception of their products may change as a result of being part of a bundle. A device offered nominally at no cost may be perceived as worth less than its true value. This could have a positive effect if the low nominal price encourages demand and if the subsidy and risk are being taken by another party in the bundle. However, it could also erode brand equity. Companies offering bundles should also manage the risk of defaults on payments. With mobile phones and set-top boxes, there are strong incentives for maintaining payments: without service, the devices are useless. Televisions and home music-centers, however, are not subject to such constraints.

Companies should also consider what might cause bundles to be regarded adversely. Rising unemployment, for example, could make potential customers nervous of taking up the long-term contracts that are an integral element of some bundles. Some customers may deliberately avoid bundled deals in the belief that they represent poorer value for money than disaggregated products. In some markets, a growing proportion of mobile phone users are opting for SIM-only contracts, which offer lower tariffs in lieu of a new phone19.

Finally, companies should note that bundles may not suit every customer's needs. Some customers may prefer a variant on the bundle whereby a single product or service is acquired via monthly installments. This approach is similar to the hire-purchase or hire-to-buy model that was popular in some markets in the 1970s. The need to pay a final balance to acquire the product outright could help maintain value. At the conclusion of the hire-purchase contract, a customer could be encouraged to use the residual value in the product to contribute to a higher specification replacement device.

Disrupting the PC: the rise of the netbook

The netbook, also known as the mini-notebook, is likely to be the fastest growing PC segment in 200920. It may represent in excess of 15 percent of all portable PC sales, or roughly 25 million units21.

As of the start of 2009, the established definition of a netbook was a notebook computer with a low-powered x86-compatible processor (able to run standard PC software), a small screen (no larger than 10 inches), a small keyboard, wireless connectivity, light weight (under three pounds) and no optical disk drive. Netbooks are typically low cost relative to other notebooks.

The appeal of netbooks has been categorized as making "great second computers for normal people, third computers for techies and first computers for children"22. Netbooks have become a favorite of travelers, who like their small screens and keyboards, especially on planes23.

In 2007, hardly any netbooks were sold. At the beginning of 2008, a few manufacturers offered netbooks; by year-end, most manufacturers offered them or planned to, and sales were forecast at 11 million, or 7 percent of the market24. At the time of writing, 8 of the top 10, and 14 of the top 20 selling mobile PCs on Amazon's US site were netbooks25.

Netbooks are expected to affect computer industry revenues materially and adversely, due to their low average selling price, which during 2009 could fall below $250. In 2009, despite probable growth in unit volumes, global PC sales measured in US dollars may fall for the first time26.

At first, netbooks do not seem to be fundamentally changing the way PCs are used. But over time the idea of inexpensive, portable PC-equivalents is likely to create new applications and uses for the PC. The netbook's architecture will also be used in non-mobile PCs, known as nettops. This may further deflate the value of the global PC market in 2009. However, by year-end 2008, there were some early indications that nettops may not emulate the success of netbooks27.

Netbooks are likely to feature a wide variety of operating systems (OS) in 2009. The first netbooks were predominantly Linux machines, but as of year-end 2008, only about 30 percent still were28. However, Linux's share of netbooks could fall further in 2009: its rate of returns was markedly higher than that of competing OSs in late 2008, a trend that may have prompted some manufacturers to consider ceasing sales of Linux netbooks29.

The netbook's low price point and portability are likely to lead wireless carriers to view netbooks as equivalent to large smartphones that merit subsidization to lock in wireless data subscribers. Subsidies for netbooks are likely to become available in North America in 2009, having already proved popular in Europe in 200830: half of Europe's netbook sales in 2008 were made through telecommunications operators. A quarter of netbook sales in North America could be via carriers in 2009.

In 2009 the momentum behind netbooks should grow. First generation netbooks with suboptimal processors and insufficient storage are likely to be replaced by improved models with better processors and adequate hard drives.

Bottom line

PC manufacturers should pursue the netbook opportunity, but with care, since this approach could threaten already thin margins31. They should consider the market for premium netbooks, whose appeal may be esthetic rather than technical32.

Netbook manufacturers and distributors should make it clear to consumers what buying a Linux machine entails – and then be willing to offer support for those buyers. Initially high sales of Linux netbooks suggest that there is a market there, but the high returns indicate that more education of non-technical purchasers may be required.

OS manufacturers have already disclosed that the success of netbooks – which use non-premium versions of the OS – can have an adverse impact on margins. Their response should be to develop OSs designed specifically for the netbook market33.

Other technology companies should be poised to take advantage of the proliferation of a new generation of inexpensive low-power CPUs. These are becoming much cheaper with better performance, thanks to the popularity of netbooks. Over time these chips may 'leap the fence' and proliferate in the embedded, consumer electronics and smart phone markets – with unit sales measured in the hundreds of millions and revenues in the tens of billions of dollars.

Manufacturers of home-media systems, DVRs and games consoles should take advantage of the new CPUs to reduce their bill of materials. On the other hand they need to make sure they take steps to prevent their proprietary devices from being supplanted by a general purpose device.

IT departments could deploy netbooks instead of conventional PCs for office workers. Netbooks could replace field force workers' clipboards or PDAs.

Carriers should consider incorporating netbook subsidies into their current cash-flow estimates. They should also analyze the impact wireless data usage driven by netbooks could have on the network34.

Moore's Law and risk

Gordon Moore's observation on falling prices for processing power has held for over 40 years35.

A corollary has been falling prices for digital storage and a rise in the types and speeds of communications networks36. However, these trends may have caused a corresponding growth in the risk associated with information leakage and data theft.

A memory stick costing a few dollars can hold tens of millions of items of data37. A terabyte drive costs under $20038. Media players with hundreds of gigabytes of storage are available for several hundred dollars, but more importantly, unlike terabyte drives, can be taken into the workplace without arousing suspicion39. Software designed to facilitate the transfer of files onto MP3 players is easily available from the Internet40.

Yet at the same time, individual items of confidential data, such as a social security number, occupy the same paltry number of bytes as 10 years ago. It is therefore becoming ever easier for massive volumes of sensitive data to be lost or stolen.
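The scale involved is easy to check with back-of-envelope arithmetic (the record size and stick capacity below are assumptions for illustration, not figures from the text's sources):

```python
# Back-of-envelope: how many sensitive records fit on a cheap memory stick.
record_bytes = 100          # a name, social security number and address, roughly
stick_bytes = 8 * 10**9     # an 8 GB stick, costing a few dollars

records = stick_bytes // record_bytes
print(records)  # 80000000 - tens of millions of records on one pocketable device
```

A device small enough to lose down the back of a chair can hold the personal records of an entire country's population, which is why the per-incident scale of data losses keeps growing.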

The basic principle of Moore's Law has also applied to bandwidth: speeds, over fixed and wireless networks, have become ever faster. As speeds have risen, the modems and routers used to connect to networks have required replacement. Obsolete communications equipment that has been discarded or sold on may still have passwords saved on it, which could allow the new owner to access confidential networks. If this hardware were to fall into the wrong hands, it could be used for repeated access to an organization's data41.

In 2009, over a billion items of personal data may be lost or stolen, and thousands of companies' data losses may be made public42. And in many other cases, companies may never realize that their data has gone missing, or that intruders were regularly accessing their networks.

Bottom line

Risk needs to be mitigated by responsibility.

Employees at all levels need to be trained, ideally via in-person training, in how to minimize data risk. In some cases, it may be appropriate for the IT environment to be made secure by default. In other words, all stored files should be encrypted.

The growth in practices such as working while traveling, or working at home, can improve productivity, as well as address work-life balance. But any such innovations in working practices should be accompanied by a thorough appraisal of how they change the risk profile. In some instances, if highly sensitive data is involved, workers may have to be prohibited temporarily from working while in transit or in any potentially insecure location. Employees should be encouraged not to keep back-ups of files on personal storage devices, no matter how good their intentions may be.

Companies should develop policies not just for the deletion of data, but also for the secure disposal of any equipment that has held sensitive data, whether customer records or passwords providing access to internal networks.

IT departments should also consider alternatives to standard passwords, which may simply not be sufficiently secure. Passwords were designed by engineers, for the use of engineers. They were not originally designed for mass market use. IT departments may need to create new, easy to use, more secure alternatives to passwords, such as biometric data43.

There may also need to be firmer restrictions on the use of corporate IT by members of an employee's immediate family. For example, letting children use a laptop that also holds any sensitive data, in commercial or personal contexts, may be asking for trouble. For regular home workers dealing with confidential data, a secure, locked room may become a prerequisite to working outside of the office.

Companies should also remember that data loss is never likely to be confined solely to digital environments; compromising records on paper are still occasionally found in dumpsters. Loss of analog data, and the need to secure analog copies, should not be overlooked.

The common sense of green and lean IT

In January 2008, the price of oil hit $100 a barrel for the first time. By May, the first forecast of oil at $200 a barrel had been made44. By year-end 2008, companies that had appeared to be exemplars of caution by hedging oil at over $100 seemed short-sighted instead.

The sudden change in outlook, within just five months, underlines the volatility that remains in the price of energy45. Furthermore, volatility in supply continues, particularly as alternative sources of energy, from biofuels to solar, remain uncertain and geopolitical directions appear unpredictable46.

The imperative for companies to take control of their power consumption, for technology and more generally, therefore remains acute. In 2009, one of the most powerful power-management tools available may still be simple, plain common sense. Some of the biggest drains on power this year are likely to remain ill-conceived planning and poor co-ordination and execution47.

In 2009, the aggregate volume of the world's data centers is likely to continue to grow, albeit possibly at a slower pace than in previous years. The efficiency of data centers is, however, likely to vary considerably. The latest, purpose-built data centers should attain a power usage effectiveness (PUE) rating of 1.2 or better48. A typical enterprise data center is likely to achieve a PUE of 2.0 or worse49.
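PUE is simply the ratio of total facility power to the power actually delivered to IT equipment, so the gap between 1.2 and 2.0 is easy to make concrete (a minimal sketch; the kilowatt figures are illustrative):

```python
def pue(total_facility_kw: float, it_equipment_kw: float) -> float:
    """Power usage effectiveness: total facility power divided by the
    power reaching IT equipment. 1.0 is the ideal; lower is better."""
    return total_facility_kw / it_equipment_kw

# A purpose-built data center: 120 kW drawn, 100 kW reaching the IT gear.
print(pue(120, 100))   # 1.2 - only 20% overhead for cooling, power conversion, etc.

# A typical enterprise data center: as much power spent on cooling and
# conversion losses as on the computing itself.
print(pue(200, 100))   # 2.0
```

Put another way, at PUE 2.0 every watt of useful computing costs a second watt of overhead, which is why the facility-level measures discussed below can pay back quickly.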

There are still many data centers located in buildings that were originally designed to house people. Here, assumptions such as the rate at which air needs to be refreshed were made for people (typically three times an hour), not machines (typically once an hour), wasting energy in buildings that now house only machines. In some data centers in winter 2009, heating may still be programmed to temperatures suitable for humans, only for the dedicated air-conditioning units deployed for the data center to cool the space down again.

Another trait common to some data centers is cooling all equipment to a uniform temperature, although not all devices require the same degree of cooling.

In 2009, there are still likely to be offices with banks of desktop computers that are left on at all times, despite the significant cost savings that could be available, if they were to be turned off during non-office hours.

The good news is that there may well be plenty of straightforward options still available that can deliver quick reductions in power usage, rapid returns on investment and may not require significant new spending.

Bottom line

Companies should not get complacent about the price and availability of oil50. The outlook for energy, in terms of supply and price, remains uncertain.

Organizations should consider as many options as possible for reducing energy consumption.

Ready savings may still be available at data centers, which have become a significant IT facility at most enterprises. A first step would be to consider a data center as a collection of facilities, each of which has different working tolerances. A common sense approach would be to compartmentalize the data center, based on temperature requirements. A difference of just one degree could have a significant impact on the cost of cooling51. The installation of simple, flexible partitions, of the type used to separate cool stores in supermarkets, may provide sufficient insulation. A company could also replace existing equipment with more heat-tolerant substitutes.

Other simple tweaks and changes could lead to significant improvements. Analyzing any forms of power loss, for example by assessing the efficacy of a building's uninterruptible power supply, could reveal significant but easily remediable problems. An underperforming power supply could leak over 10 percent of the power passing through it52.

Undertaking thermal scans of air-cooling units could reveal inefficiencies caused by poorly positioned vents, by air being directed at the wrong end of equipment, or by cool air being extracted before it reaches its target.

Companies should also evaluate whether outsourcing data centers makes better sense. Outsourced data centers are likely to improve their efficiency steadily. Given this, it may make more sense to tap into the latest technology available in the most modern, most efficient outsourced data centers. The optimal approach, however, may well be to change the underlying ethos of data centers, capping their size rather than assuming their inexorable expansion.

Energy consumption for IT should be linked to a company's overall approach to energy. At present, IT budgets are typically not linked to facilities-management bills, so the IT budget is not directly affected by IT's power consumption53.

All departments can have a role to play in making technology more efficient. Human resources could provide a structure to provide incentives for staff to use technology more efficiently. At a basic level this would include requiring staff to turn off technology at the end of the day. Workers could also be encouraged to provide ideas on how technology energy consumption could be lowered, with the best ideas being rewarded by a share of the savings.

Downsizing the digital attic

No matter how big the attic, most people accumulate possessions at such a rate that they exhaust all the available space.

The same principle has held for digital storage. However, the usual response of people who have exhausted their disk space has been, in recent years, simply to add more. The steadily falling price of digital storage has, so far, enabled enterprises to keep digital capacity one step ahead of digital possessions.54

The danger of this approach is that users start to assume that space is infinite. As a consequence, their approach to file management becomes reckless. If storage space costs next to nothing, then next to nothing need be discarded.

One outcome is global digital storage centers whose dimensions appear to be ballooning inexorably.

While digital storage has become cheaper, the associated costs, from raw power to maintenance and from metadata to search engines, have continued to grow with the proliferation of data. Although the power required to maintain a unit of data has fallen, the cost of the facilities that house digital storage has tended to rise55. The cost of labor has also generally risen over time.

In 2009 there is therefore likely to be a changed approach to data. The imperative for companies to cut costs is likely to include attempts to control the escalation of storage costs. This could in turn cause a fundamental change in the way employees perceive the total cost of ownership of storage.

In 2009, companies may halve what they spend on physical storage. Spending on new servers may increase by just 4 percent56 to $58 billion57. Even so, the cost of keeping servers powered-up and cooled is expected to increase by over 15 percent, to $35 billion58.

Enterprises are still expected to create several exabytes of additional data per month59, highlighting the fact that current approaches to data storage appear neither lean nor green.

Bottom line

Companies should assess whether their total cost of storage is growing faster than revenues, and if so, whether this is beneficial to them.

Enterprises should review all aspects of digital data use and management. This includes behavior, such as training users on how to manage their 'data footprint'. Just as in real life it is prudent for collectors to discard old memorabilia from time to time, periodic assessment and pruning of digital files may also be useful.

Companies should assess technological approaches, such as de-duplication tools that could free space on existing servers by reducing the quantity of duplicate copies of the same file.
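A minimal sketch of the idea behind such tools, assuming whole-file de-duplication by content hash (commercial de-duplication products typically work at the block or chunk level and are far more sophisticated; this only illustrates the principle):

```python
import hashlib
from pathlib import Path

def find_duplicates(root: str) -> dict[str, list[Path]]:
    """Group files under `root` by a hash of their contents; any group
    with more than one path is a set of byte-identical duplicates."""
    groups: dict[str, list[Path]] = {}
    for path in Path(root).rglob("*"):
        if path.is_file():
            digest = hashlib.sha256(path.read_bytes()).hexdigest()
            groups.setdefault(digest, []).append(path)
    # Keep only hashes that map to two or more files.
    return {h: ps for h, ps in groups.items() if len(ps) > 1}
```

Every file in a returned group is byte-identical, so all but one copy could in principle be replaced by a reference, freeing the corresponding storage.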

There should also be an assessment of individual applications to identify any particularly profligate tools: email is estimated to take up 25 percent of enterprise storage capacity60.

One response to lack of space in physical and digital worlds is offsite storage. In the digital context, this could mean using third-party providers to store overspill data. Companies using this approach should monitor costs and regulatory implications. External storage could be anywhere in the world, with 20,000 different regulations worldwide controlling the data61.

Companies should also evaluate third-party storage specialists' vulnerability to attack by hackers. These organizations may be more likely to be targeted by hackers, as penetrating storage specialists' security may provide access to many companies' data, rather than just one. However, such companies may also be far more secure than a typical enterprise. If so, there may be a case for transferring a greater volume of files to storage specialists.

Hardware makers should reorient their sales approach in a manner similar to energy companies. Rather than selling just hardware, they should also integrate software and additional services aimed at minimizing consumption.


Footnotes

1 Key World Energy Statistics, International Energy Agency, 2008. See: http://www.iea.org/textbase/nppdf/free/2008/key_stats_2008.pdf

2 Ibid.

3 Electricity distribution loss among 'highest' in Europe, The Times of Malta, 2 November 2008. See: www.timesofmalta.com/articles/view/20081102/local/electricity-distribution-loss-among-highest-in-europe

4 Overview of the Electric Grid, US Department of Energy. See: http://www.energetics.com/gridworks/grid.html

5 The evangelist of smart energy, Business Week, 5 August 2005. See: http://www.businessweek.com/magazine/content/05_31/b3945083_mz009.htm

6 See: Global electrical infrastructure: generation, transmission & distribution equipment & services: a market and competitive analysis 2007-2017, Goulden Reports, July 2007.

7 For example, see: RuggedCom reports 204% increase in profitability on revenue growth of 54% for Q2 fiscal 2009, CNW Group, 14 November 2008.

8 Acquisitions in SmartGrid: Get Used to It, Greentech Media, 14 November 2008.

9 Free electricity meters will help us save money and the planet, Scotland on Sunday, 6 January 2008.

10 See: http://www.smartgridnews.com/artman/publish/grid_modernization_initiatives/How_Private_Investment_Is_Pushing_Utilities_to_the_Edge

11 Oil down 50% from July high, The Daily Telegraph, 16 October 2008.

12 America's vulnerable energy grid, Council on Foreign Relations, 24 April 2007.

13 US venture capital investment in Cleantech companies reaches record $1.6 billion in Q3 2008 with a surge in later stage financings, Market Watch, 30 October 2008.

14 Consumer confidence, concerns, spending and attitudes to recession, Nielsen, 20 June 2008.

15 Global chip sales forecast to drop 5.6 percent in 2009, Reuters, 19 November 2008. See: http://www.reuters.com/article/rbssTechMediaTelecomNews/idUSN1935931520081119

16 Laptop subsidies: how aggressive will operators be? Fierce Wireless, 3 November 2008.

17 Ibid.

18 Mobile broadband: has 3's time come?, Ovum Euronews, 2008. See: http://www.ovum.com/news/euronews.asp?id=6857

19 SIM-only mobile services take off, vnunet, 22 May 2008.

20 Gartner: Mini-laptops keep PC market from sinking, Computer World, 15 October 2008.

21 Acer: We'll take half the netbook market in 2009, Slash Gear, 3 November 2008.

22 Not interested in a Netbook computer? Consider the Honda Fit, CNET News, 9 November 2008.

23 Computer lite: Netbooks are a viable alternative to laptops, WalletPop, 14 November 2008.

24 Worldwide PC microprocessor market hits record levels of unit shipments again in 3Q08, but outlook for market has grown murky, IDC, 3 November 2008.

25 See: http://www.amazon.com/gp/bestsellers/pc/565108/ref=pd_ts_zbw_pc_565108_more?&pf_rd_p=364098901&pf_rd_s=right-4&pf_rd_t=101&pf_rd_i=565108&pf_rd_m=ATVPDKIKX0DER&pf_rd_r=0P2GF5FHAQDGY9VTW3QP

26 PCs: will netbooks currency mean big '09 revenue decline?, Barron's, 30 October 2008.

27 Intel optimistic about future nettop performance, but PC vendors not so sure, TG Daily, 26 June 2008.

28 Microsoft missing netbook growth as Linux wins sales (Update2), Bloomberg, 6 November 2008.

29 Netbook return rates much higher for Linux than Windows, Slashdot, 5 October 2008; Apricot drops Linux for its netbooks, Tech Radar, 23 October 2008.

30 Does Windows have a future on netbooks? Microsoft Watch, 3 November 2008.

31 Smaller PCs cause worry for industry, The New York Times, 21 July 2008.

32 Eee PC S101: A premium netbook, CNET News, 6 October 2008.

33 Windows 7 is 'netbook-friendly', Windows for Devices, 31 October 2008.

34 Dongles gobbling network capacity, Top 10 Broadband, 18 September 2008.

35 Moore's Law states that the number of transistors on a chip doubles approximately every two years. For more information, see: http://www.intel.com/technology/mooreslaw/index.htm
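For illustration only, the doubling rule described in this endnote can be expressed as a simple projection; the figures below are hypothetical examples, not data from the report:

```python
def projected_transistors(initial_count, years, doubling_period=2):
    """Project a transistor count forward under Moore's Law:
    the count doubles once every `doubling_period` years."""
    return initial_count * 2 ** (years / doubling_period)

# Ten years at one doubling every two years is five doublings,
# i.e. a 32x increase over the starting count (illustrative figure).
print(projected_transistors(1_000_000_000, 10))
```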

36 For further background on the falling price of storage, see: Wired 11.05, View, Wired, May 2003. See: http://www.wired.com/wired/archive/11.05/view.html?pg=5. For more background on the falling price of connectivity, see: Wholesale Internet Bandwidth Prices Keep Falling, GigaOM, 7 October 2008. See: http://gigaom.com/2008/10/07/wholesale-internet-bandwidth-prices-keep-falling/

37 Firms ignore MP3 and memory stick security risk, vnunet, 14 July 2008.

38 Prices obtained from Amazon websites in various countries in November 2008.

39 Firms ignore MP3 and memory stick security risk, vnunet, 14 July 2008.

40 Pod slurping: the latest data threat, Search Security, 15 February 2007.

41 Second-hand VPN kit sold on eBay for under a pound automatically opens a connection to Kirklees council's private networks, ITPRO, 29 September 2008. See: http://www.itpro.co.uk/606618/second-hand-vpn-leads-to-security-breach-at-council

42 Legislation requiring companies to disclose data losses, rather than only to comment if the loss were to become public knowledge, is likely to become more widespread. Legislation may also be enacted that requires companies to undertake deeper levels of data protection. For example see: http://www.mass.gov/?pageID=ocaterminal&L=3&L0=Home&L1=Consumer&L2=Identity+Theft&sid=Eoca&b=terminalcontent&f=idtheft_201cmr17&csid=Eoca

43 Five biometric technologies business could use, ITPRO, 13 November 2008. See: http://www.itpro.co.uk/608235/five-biometric-technologies-businesses-could-use

44 Oil price 'may hit $200 a barrel', BBC, 7 May 2008; Crude falling despite news of OPEC meeting, MSNBC, 14 November 2008.

45 See: http://www.opec.org/home/basket.aspx

46 Investors suffer as ethanol boom dries up, Financial Times, 21 October 2008; also: BP: 'We should see volatility increase', Euractiv, 1 October 2008.

47 Cassatt Survey finds massive data center energy waste, Mission Critical, 1 May 2008.

48 Google: The world's most efficient data centers, Data Center Knowledge, 1 October 2008. Note: PUE is an emerging standard promoted by The Green Grid and others in the data center industry to provide a consistent way to measure the ratio of total power used by a facility to the power delivered to its IT equipment. PUE allows data center managers to calculate how much power is driving the actual IT equipment versus non-IT elements such as cooling and lighting. A PUE rating of 1.0 is considered optimal; most companies' data centers have a PUE closer to 2.0.
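The PUE ratio described in this endnote is straightforward arithmetic; a minimal sketch, with purely illustrative power figures:

```python
def pue(total_facility_power_kw, it_equipment_power_kw):
    """Power Usage Effectiveness: total facility power divided by
    power delivered to IT equipment. 1.0 is the theoretical optimum;
    the endnote cites roughly 2.0 as typical."""
    return total_facility_power_kw / it_equipment_power_kw

# A facility drawing 2,000 kW in total to run 1,000 kW of IT load
# has a PUE of 2.0: half the power goes to cooling, lighting, etc.
print(pue(2000, 1000))  # 2.0
```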

49 Ibid.

50 Needed: $26 trillion. Cash required to avert energy supply shock, Calgary Herald, 13 November 2008.

51 The effect of data center temperature on energy efficiency, Thermal and Thermomechanical Phenomena in Electronic Systems, May 2008.

52 Putting your data center on an energy diet, Computerworld, 21 August 2007.

53 IT departments waste energy, survey finds, Green IT News, 16 June 2008.

54 2007: Biometrics Grows, Storage Shrinks, "Free" disappears, TechNewsWorld, 9 January 2007. See: http://www.technewsworld.com/story/55063.html

55 Analysis: the growing cost of data and information storage, Public Technology, 21 August 2006.

56 Storage industry braces for 2009 slowdown in storage spending, Search Storage, 16 October 2008.

57 The Diverse and Exploding Digital Universe, IDC, March 2008.

58 Ibid.

59 Based on: Storage in the sky, ARN, 2 July 2008; The diverse and exploding digital universe, IDC, March 2008; Exabyte era emerging, Fat Pipe, 15 September 2007; Keeping pace with digital storage, LSI, September 2008.

60 The future of enterprise storage, CRN, 5 August 2008.

61 IBM estimates that there are over 20,000 regulations worldwide that affect data storage, accessibility, and retention requirements. For further information, see: http://www.redbooks.ibm.com/redpapers/pdfs/redp4284.pdf
