Foreword

Welcome to the 2008 edition of the DTT Global TMT Industry Group's Technology Predictions.

As predicted in last year's Technology report, the environment has become increasingly important to all of the activities of technology companies, from the greenness of products and services through to the sustainability of production processes. The user interface has also taken center stage in all parts of the technology sector. Storage, and its cost, became a sensitive issue, with research suggesting that the volume of data created during the year would approach the total amount of storage available[1]. Biometric technologies continued to make progress, and were included in a growing range of products, from laptops to smart phones. Free offers in the technology sector continued to represent a double-edged sword, and in some markets free services such as VoIP began to struggle. Carousel fraud continued to frustrate law enforcement agencies and technology companies alike, as gray markets continued to thrive. New combinations of existing technologies made a strong showing, as suggested. Social networking continued to provide strong growth, and technology companies played their part in trying to monetize such services. Parasitic power systems made steady progress, though mostly on the margins of the sector. And the bionic human marched – and ran – steadily forward.

The outlook for the technology sector in 2008 is similarly varied. In this year's Predictions: virtualization of the enterprise working environment is likely to make steady gains, although companies are likely to question its panacea status. A greater skills shortage may emerge, sometimes as a result of erstwhile legacy technologies coming to be viewed as the future. LED technology may start to supersede the incandescent light bulb. The image of nanotechnology may be enhanced through a growing awareness of its ability to better the environment. The ability to be anonymous on the Internet may decline as users, traders and regulators call for more widespread authentication of users' identities. Earning revenues from PCs may become less about selling equipment and more about selling services, particularly for data protection. Privacy may become a key selling point for many online businesses. XBRL, a new reporting language for corporations, may well become increasingly widespread. A digital divide may deepen, separating even advanced digital users from their own data, due to incompatible standards. And finally, all the while, man's most precious resource – water – is likely to become ever more scarce.

I am often asked how the DTT Global TMT Industry Group's Predictions differ from the many similarly titled reports produced by other organizations. I believe Predictions has a unique combination of objectives and methodology.

The Predictions series has been designed to provide a diverse selection of views and thoughts that challenge, inform and engage industry leaders and executives. It neither aims nor claims to be a comprehensive forecast of every anticipated event. Its aim is to provide a point of view; but, by the very nature of predictions, outcomes may differ from what was originally expected.

The inherent unpredictability of the global technology sector can be mitigated by a robust methodology that synthesizes multiple sources of information and a wide body of opinion, and subjects the results to thorough peer review. The 2008 series of Predictions has drawn on internal and external inputs: conversations with member firm clients, contributions from DTT member firms' 6,000 partners and managers specializing in TMT, and discussions with industry analysts. As last year, the Predictions for the technology sector draw on insights gleaned from a series of 20 interviews with leading executives from around the world on the key industry theme of digitization. These interviews have been published in a book, Digital Dilemmas, available online (www.deloitte.com/tmt).

I hope the result of our endeavors provides you with plenty of food for thought for the year ahead. On behalf of DTT's Global TMT Industry Group, may I take this opportunity to wish you all the best for an enjoyable 2008.

Igal Brightman
Global Managing Partner
Technology, Media & Telecommunications

Executive summary

Virtualization was one of the most talked about technologies of 2007. In 2008, enterprises may ask more probing questions about the limits, as well as the potential, of virtualization, possibly leading to a more measured deployment of the technology. This should benefit companies and suppliers alike. The principal questions that IT departments might ask concern timing, the robustness of security and the cost of ownership. While virtualization is still likely to be the future for many parts of a company's IT environment, a cautious, longer-term view may replace the haste of the recent past. Companies may also ask themselves: how will it change the way we work; how will we manage it; which business objectives would virtualization address; and which problems could it solve? Full cost-benefit analyses should even consider factors such as the potential impact of virtualization on tax planning.

Just 10 years ago, the consensus was that the mainframe was doomed to extinction. Yet over the past few years mainframes have been going from strength to strength, and in 2008 this trend seems set to continue. The installed base of mainframes is rising at about 5 percent a year. Every day in 2008, mainframes are expected to process more than 30 billion transactions, representing 70 percent of the world's business data. While the mainframe has proven to be resilient, the workforce appears not to be: even as the importance of mainframes rises, the number of staff skilled in their maintenance and development may fall. What used to be regarded as legacy may then, as has happened with mainframe computers, come to be viewed as the way of the future. Enterprises, as well as the industry that supplies them, should review their long-term talent requirements regularly. There are several potential approaches for dealing with talent scarcity. One would be to migrate applications away from a platform with a dwindling skills pool, but this is not always feasible. A second approach would be to train staff. A third could be to make the underlying technology easier to use.

The conventional, highly inefficient incandescent light bulb may finally start to be superseded by the white LED in 2008. Conventional light bulbs can represent as much as 25 percent of a typical household's yearly power bill. The luminous efficiency (the ratio of light output to power input) of the traditional light bulb is, at best, 2.6 percent. Current LEDs boast a luminous efficiency of up to 10 percent; next-generation models offer up to 22 percent. In 2008, the LED should become commercially viable, its earlier challenges with intensity and color having been resolved. Given that the LED is a semiconductor technology, it should benefit from Moore's Law, with manufacturing costs expected to decline by 50 percent every 18 months. The semiconductor industry should improve its LED technology, especially since it is one of the few high-growth areas in the sector at present. Governments should consider subsidizing LED purchases with rebates or tax deductions.

The public image of nanotechnology has recently become tainted, despite mass-market use of nanotechnology-enabled products. People are concerned about the possible malign consequences of the release, accidental or otherwise, of engineered nanoparticles into the environment. Alongside the growing distrust of nanotechnology there has been a steady rise in concern over the environment, and nanotechnology could have an important role to play in healing, rather than harming, the planet. In 2008, therefore, the public's demonization of nanotechnology could be reversed, and a green-tinged halo could replace its horns. Nanotechnology is already being used to address several environmental issues: generating clean power, reducing existing power consumption, providing drinkable water, cleaning contaminated land, reducing harmful emissions and enabling long-life portable power. While the industry should explain where nanotechnology has made a positive difference, it should be wary of succumbing to hype or overpromising. The sector should always bear in mind that, as with any emerging technology, adoption is always going to be an economic decision, whether for private consumers or for businesses.

It is often argued that one of the great benefits of the Web is anonymity. Contributions to the Internet, whether blogs, comments on products, video uploads, dialog within chat rooms or contributions to online encyclopedias, can all be made under aliases. In some respects this could be viewed as an extension of freedom of speech. But anonymity can give rise to abuses. In 2008, there is likely to be an increasing clamor from regulators, users and online traders for the Internet to require people to provide an authenticated identity whenever they make a transaction. A move to online authentication, while initially regarded by some as an affront to liberty, could ultimately be good for business and for users. For example, bolstering consumer confidence in e-commerce, online auctions, Internet chatrooms and other transactional websites should help sustain growth by allaying fears about the growing volume of online fraud and other malign behavior. The industry should, however, bear in mind that full authenticity is unlikely; as in the real world, those who really want to remain anonymous are likely to be able to do so.

Since the launch of the first PC, the volume and, more critically, the value of data stored on computers has grown exponentially. Credit card information, address books, clients' business plans and paid-for MP3 tracks may all be kept on the same computer. One way of measuring the value of data is its replacement cost. But the price paid, in terms of inconvenience, time and loss of credibility in dealing with the aftermath of lost data, may be even greater. As a result, some consumers may end up spending more on protecting their data than on the underlying device. This trend may extend beyond the PC to other devices, from MP3 players to mobile phones, from DVRs to external hard drives. Computer manufacturers, which face falling margins as a result of declining average selling prices, may need to launch a set of complementary services, including data protection.

In 2008, resistance may grow to the volume and depth of information that websites capture about consumers' online behavior. While most use of behavioral data is likely to be innocuous, public perception does not always tie in with that fact. A few negative headlines about the abuse of individuals' data may be sufficient for Web users to demand that tracking be moderated or stopped altogether. If the depth of companies' knowledge about consumers comes to be regarded any more negatively, some websites may differentiate themselves on the basis of privacy. Getting the balance right may prove difficult during 2008, not least because government regulation in some territories may force some websites to move in the opposite direction. Online companies should educate users in online privacy, in a clear manner that neither trivializes nor exaggerates the way in which data is used. Surfers should then be given the autonomy to select the level of privacy that they would prefer. However, users should also be aware that one of the benefits of surrendering some anonymity is low-cost, or even free, online services. Laws concerning online privacy should be updated, and ideally standardized across regions, if not globally. The online world should also be careful not to over-react to demands for privacy: companies should distinguish between the clamor of a vocal minority and the genuine concerns of a more passive majority.

The technology and financial services sectors have already bestowed a generous collection of acronyms on the world. But in 2008, a further acronym, XBRL, may become as well known as HTML or GAAP. XBRL, or eXtensible Business Reporting Language, provides a standardized approach to tagging the financial information contained in company reports. XBRL makes the analysis of financial data far easier, as the process is more readily automated. Governments and their financial authorities are likely to push for XBRL's adoption because of the potential benefits in terms of productivity and efficiency. For companies, the biggest benefit may come from creating a more accessible pool of financial data that can be analyzed using standard business intelligence techniques. For investors, the major benefit of having machine-readable accounts would be quicker analysis. Governments considering or planning to implement XBRL reporting should think about the changes this might imply for their standard processes.
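
To make tagging concrete, the sketch below emits a single XBRL-style revenue fact in Python. The company namespace, element name and figures are invented for illustration; a real instance document would also reference a taxonomy schema and declare the iso4217 currency namespace.

```python
import xml.etree.ElementTree as ET

# xbrli is the real XBRL instance namespace; "acme" is a hypothetical
# company taxonomy invented for this illustration.
XBRLI = "http://www.xbrl.org/2003/instance"
ACME = "http://example.com/acme"
ET.register_namespace("xbrli", XBRLI)
ET.register_namespace("acme", ACME)

root = ET.Element(f"{{{XBRLI}}}xbrl")

# A context says who is reporting and for what period.
context = ET.SubElement(root, f"{{{XBRLI}}}context", id="FY2008")
entity = ET.SubElement(context, f"{{{XBRLI}}}entity")
ET.SubElement(entity, f"{{{XBRLI}}}identifier",
              scheme="http://example.com/tickers").text = "ACME"
period = ET.SubElement(context, f"{{{XBRLI}}}period")
ET.SubElement(period, f"{{{XBRLI}}}startDate").text = "2008-01-01"
ET.SubElement(period, f"{{{XBRLI}}}endDate").text = "2008-12-31"

# A unit says what the number is measured in.
unit = ET.SubElement(root, f"{{{XBRLI}}}unit", id="USD")
ET.SubElement(unit, f"{{{XBRLI}}}measure").text = "iso4217:USD"

# The fact itself: machine-readable revenue, tied to its context and unit.
fact = ET.SubElement(root, f"{{{ACME}}}Revenues",
                     contextRef="FY2008", unitRef="USD", decimals="0")
fact.text = "1000000"

print(ET.tostring(root, encoding="unicode"))
```

Because every fact carries an explicit context, unit and precision, a consuming application can aggregate thousands of filings without human re-keying, which is the productivity argument made above.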

One of the fundamental benefits of digitization is the conversion of data into zeros and ones. But not all zeros and ones, it would seem, are equal. We may not even be able to access all the data that we own: some data storage formats used just 10 years ago are now effectively obsolete. A further legacy issue relates to operating systems and computing platforms. This digital divide is most vexing when the existence of multiple standards for a particular type of file limits the utility of current computing systems. X-ray images, for example, are now commonly captured, stored and distributed digitally, yet the viewing software has not been standardized. The technology sector may have to take a more pragmatic approach to establishing, agreeing and maintaining long-term data storage formats. While the owner of a de facto standard is likely to gain economically, other players may want to weigh the downside of not owning the standard against the benefits of having common standards across the board.

The human race seems to have a peculiar talent for making previously abundant resources scarce. This is especially the case with water. In 2008, it is estimated that more than one billion people will lack access to clean water; more than double that number lack access to sanitation. The scarcity of the world's most important liquid is therefore a fundamental issue, and one that the technology sector can play a major role in addressing, in 2008 and beyond. The potential value of the opportunity appears to be as significant as the problem: the investment shortfall for the global water industry may be more than $1 trillion over the next 20 years. In 2008, technology companies should look at how their products and solutions can add to existing supply, as well as reduce current usage. While this should make good business sense, it is also of profound social importance. Technology could also be used to improve management of the existing supply in myriad ways. One of the biggest challenges – and opportunities – is reducing water leakage. Another powerful application of technology could be to model the impact of subsidies for industries such as farming, or of the draining of wetlands. Technology could also be part of the solution in developing less wasteful approaches to hydrating crops.

Getting value from virtualization

Virtualization, a form of software first used in the 1960s, was one of the most talked about technologies of 2007. It has been lauded as a technology that offers a compelling combination of benefits: it is claimed to deliver cost savings, better security[2], more efficient use of resources[3], better disaster recovery[4] and lower power consumption[5].

By the beginning of 2008, every Fortune 100 company and 80 percent of the Fortune 1000 companies had already deployed virtualization in some parts of their businesses[6].

In 2008, enterprises considering deploying or extending virtualization may ask more probing questions about its limits, as well as its potential. This could lead to a more measured deployment of the technology, which should benefit companies and suppliers alike. The outlook for virtualization looks positive, with one industry analyst forecasting that 50 percent of all servers would be virtual by 2010[7].

A principal question that IT departments might ask concerns timing. While 2007 was characterized by a rush to evaluate or deploy virtualization, in 2008 companies may be more cautious, investing more effort in determining which aspects of their IT environment should be virtualized, and when. Some companies may find that virtualization is not suitable for all applications: machines deployed to run Internet services, which require built-in redundancy to handle peak loads, may not work well within a virtualized environment[8].

Corporate databases, particularly those not based on mainframes, may also be unsuitable, as they are already using up most of their available capacity. In addition, a spate of new product announcements expected in 2008 may encourage IT departments to wait and see how upcoming products compare with current ranges before committing to a deployment.

A key area that enterprises may scrutinize this year is the robustness of virtualization's security. Questions may be asked about the software's intrinsic security: a continuation into 2008 of the issuing of patches to address vulnerabilities could cause concern. There may also be worries that conventional approaches, such as IP-based security tools, may not work, because virtual machine communications within a server may never enter the physical network[9]. Security specialists may identify even more vulnerabilities; one company has claimed that malware could escape from a virtual machine onto the host operating system[10].

Enterprises may take a closer look at the true cost of ownership of their virtualization environment. While the technology can deliver savings in some areas, it may cause other costs to rise, such as those for software licenses and new infrastructure management software. Furthermore, wherever virtualization is deployed in an ad hoc manner[11], a company may find it challenging to keep an accurate record of exactly how many servers it has. Attempts to audit and secure virtual server estates can be complicated by the many-to-one ratio of virtual to physical servers[12].

Thus in 2008, while virtualization is still likely to be the future for many parts of a company's IT environment, a cautious, longer-term view may replace the haste of the recent past.

Bottom line

Virtualization is most likely to remain a significant technology, delivering a range of benefits to enterprises in 2008. But its impact, positive and negative, is likely to vary by company.

The bottom line is that each company has to ask itself key questions: how will it affect costs; how will it change the way we work; how will we manage it; which business objectives would virtualization address; and which problems would it solve? Companies should avoid the trap of deploying the technology simply to follow the crowd.

The assessment of costs should include an evaluation of the total cost of ownership, with particular attention paid to issues such as software licenses. While virtualization is generally expected to lower costs, a lack of diligence in understanding how current software providers' licenses would work in a virtualized context could lead to a nasty shock in the shape of an unexpectedly high bill. IT departments also need to consider just how large deployments should be: a deployment in a live environment needs to be large enough to balance out the resulting costs, such as support and the development and implementation of new processes.
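
As a minimal sketch of why license terms matter, with entirely hypothetical prices, counts and license models rather than figures from any vendor:

```python
# Hypothetical figures for illustration only; real license terms vary
# widely by vendor (per socket, per core, per VM, per user, and so on).
physical_servers_before = 100
consolidation_ratio = 10            # VMs per physical host after virtualization
hosts_after = physical_servers_before // consolidation_ratio

server_cost = 5_000                 # per physical server (assumed)
per_vm_license = 800                # software licensed per virtual machine
per_host_license = 3_000            # same software licensed per physical host

hardware_saving = (physical_servers_before - hosts_after) * server_cost

# The same 100 workloads, licensed two different ways:
licenses_per_vm = physical_servers_before * per_vm_license    # 100 VMs
licenses_per_host = hosts_after * per_host_license            # 10 hosts

print(f"Hardware saving:     ${hardware_saving:,}")
print(f"Licenses (per VM):   ${licenses_per_vm:,}")
print(f"Licenses (per host): ${licenses_per_host:,}")
# Per-VM licensing can erode much of the hardware saving, which is why
# the license model deserves scrutiny before any large deployment.
```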

Companies should also consider how virtualization could improve internal processes. For example, the technology may enable internal cross-charging by processor usage, RAM utilization and disk space, and may make cost management easier and clearer.
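
A cross-charging scheme of this kind could start as simply as the sketch below; the rates and usage records are invented for illustration.

```python
# Hypothetical internal rates per month; a real scheme would be tuned
# to the organization's actual infrastructure costs.
RATES = {"vcpu": 20.0, "ram_gb": 5.0, "disk_gb": 0.10}

# Resource allocations per department's virtual machines (invented data).
usage = {
    "finance":   {"vcpu": 8, "ram_gb": 32, "disk_gb": 500},
    "marketing": {"vcpu": 4, "ram_gb": 16, "disk_gb": 200},
}

for dept, resources in usage.items():
    charge = sum(RATES[kind] * amount for kind, amount in resources.items())
    print(f"{dept}: ${charge:,.2f}/month")
```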

Enterprises should ensure that all systems and processes to manage virtual machine implementations are in place before rolling out the technology. Processes should include a policy for decommissioning virtual servers: just because virtual servers cannot be seen does not mean that they do not exist or, indeed, that they do not run up costs.

Companies should look at all possible benefits from the technology, even in areas such as tax planning. One possibility would be the ability to move server-based intellectual property between tax jurisdictions more quickly. While relocating digital intellectual property between countries or states to obtain lower tax rates is possible without virtualization, the technology could save a company time because it would not have to commission new servers in the new location.

Finally, companies should take a long-term view. While virtualization is a decades-old technology, it still lacks maturity in some environments. In 2008, the industry is likely to see further consolidation as key players purchase companies with complementary technologies. This activity, and the continuing maturation of the technologies, should lead to significant advances in virtualization for the foreseeable future. An organization's decisions regarding the how, what and when of implementing virtualization could be critical.

How to manage talent when legacy becomes the future

Just 10 years ago, the consensus was that the mainframe was doomed to extinction, rendered obsolete by the growing popularity of client-server and distributed computing models[13]. Massive contractions in the size of deployments at companies around the world suggested that the age of 'big iron' would soon be over.

Yet, over the past few years mainframes have been going from strength to strength, and in 2008 this trend seems set to continue. The installed base of mainframes is rising at about 5 percent a year[14]. Furthermore, mainframes' power consumption, measured in watts per transaction, is lower than that of any other type of server, making them both environmentally friendly and cost efficient. By comparison, the power consumption of servers, per $1,000 of acquisition cost, rose from 8 watts to 109 watts between 2000 and 2006, and is forecast to rise to 417 watts by 2009[15].

Every day in 2008, mainframes are expected to process over 30 billion transactions, representing 70 percent of the world's business data[16]. Mainframe capacity for the largest mainframe vendor, IBM, has grown steadily from under 3.5 million MIPS in 2000 to 11.1 million MIPS in early 2007.
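
A quick back-of-the-envelope check of what those two capacity figures imply, assuming a span of roughly seven years between the data points:

```python
# Implied compound annual growth of IBM mainframe capacity, from about
# 3.5 million MIPS (2000) to 11.1 million MIPS (early 2007).
mips_2000, mips_2007, years = 3.5e6, 11.1e6, 7
cagr = (mips_2007 / mips_2000) ** (1 / years) - 1
print(f"Implied capacity growth: {cagr:.1%} per year")  # roughly 18 percent
```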

Traditional industries are not alone in consolidating their mainframe assets: a Brazilian new media company is using a mainframe as the basis of a massively multiplayer online game[17]. One of the attractions of the mainframe appears to be its robustness. According to vendors, the average mainframe breaks down once every 38 years, or 456 months[18]; this compares with just 18 months for competing technologies[19].

While the mainframe has proven to be remarkably resilient, the workforce, it appears, has not. In 2008, while the importance of mainframes may rise, the number of staff skilled in their maintenance and development may fall.

A key mainframe-related skill is the ability to work with COBOL, a language invented in the 1950s, when some of today's COBOL programmers began their training. Over the past few years, the number of staff capable of maintaining mainframes appears to have contracted, and younger programmers entering the workforce are more likely to be trained in Java and .NET. In 2008, over half of all IT workers with mainframe experience are expected to be over 50[20].

Even as the pool of COBOL programmers has shrunk, the volume of COBOL code has risen. A survey of IT managers found that almost three-fifths of respondents were developing new, strategic COBOL-based applications[21], with a focus on vital back-end financial systems[22].

Bottom line

It can be hard to judge the lifetime of a particular technology. What used to be regarded as legacy can, as has happened with mainframe computers, become viewed as the way of the future. Enterprises, as well as the industry that supplies them, should review their long-term talent requirements regularly.

There are several potential approaches for dealing with a scarcity of talent. One would be to migrate applications away from a platform with a dwindling skills pool. But this is not always feasible: in the case of mainframes, migration could be time-consuming and expensive[23]. Such a project would probably require the recruitment of a highly paid team of programmers with knowledge of COBOL as well as current programming languages; using lower-cost, inexperienced staff would likely be a false economy. And the full benefits of such a migration may only be realized in the medium term, making the business case for such a move challenging to argue.

A second approach would be to train staff. This is happening with COBOL. Some companies are pairing new recruits with experienced COBOL programmers to accelerate the transfer of knowledge. Companies could even recruit recent retirees on part-time, short-term contracts to assist with training. One supplier, IBM, is in the middle of a five-year initiative aimed at training more than 20,000 people in mainframe administration[24]. Companies should note that it may not be enough just to train staff in a new language; there also needs to be instruction in the discipline of programming in a robust, error-free way.

A third approach could be to make the underlying technology easier to use. One vendor is investing $100 million to make mainframe management easier[25].

Let there be light emitting diodes

How many years does it take to change a light bulb? About 130 years and counting, in the case of the incandescent light bulb, a technology long recognized as an imperfect approach to shedding light. But in 2008 the conventional light bulb may finally start to be superseded by a viable replacement: the white LED.

Incandescent bulbs provide light by passing current through a fragile filament that ruptures at the slightest shock – sometimes even the jolt of being switched on is enough to break it. This filament is housed within an equally fragile glass tube. And while the aim of the bulb is to illuminate, its chief output – more than 90 percent – is actually heat. The bulb is so effective at generating heat that a single bulb has been the sole heat source for more than 16 million toy ovens.

The luminous efficiency (the ratio of light output to power input) of the incandescent light bulb is, at best, 2.6 percent[26]. The halogen bulb, a derivative of the incandescent bulb with a sealed gas enclosure for the filament, is only slightly more efficient. The only common form of lighting less efficient than the standard incandescent bulb is the candle, whose luminous efficiency is a meager 0.04 percent.
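
For reference, luminous efficiency can be derived from a source's luminous efficacy, relative to the theoretical maximum of 683 lumens per watt; the efficacy values below are indicative assumptions chosen to reproduce the percentages cited here, not measurements from the sources.

```python
# Luminous efficiency = efficacy / 683 lm/W, the theoretical maximum
# (monochromatic green light at 555 nm). Efficacies are indicative.
MAX_EFFICACY = 683.0  # lumens per watt

sources = {"candle": 0.3, "incandescent": 17.8, "halogen": 20.0}
for name, lm_per_w in sources.items():
    print(f"{name}: {lm_per_w / MAX_EFFICACY:.2%}")
# candle: 0.04%, incandescent: 2.61%, halogen: 2.93%
```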

Given the inefficiency of incandescent bulbs, it is not surprising that they represent a major chunk – as much as 25 percent – of a typical household's yearly power bill[27]. The annual lighting bill in the developed world adds up to $138 billion.

Since the first incandescent light bulb was introduced, some competing technologies have emerged. But none has managed to offer a sufficient combination of low cost, low power consumption, compact size and fast turn-on time to challenge the incandescent bulb in a significant way.

In recent years, CFL bulbs have grown in popularity. Initially, CFL bulbs were a niche market: the technology was relatively expensive, the bulbs were bulky, and they could take several seconds to reach full brightness. CFL technology can be over three times more efficient than incandescent bulbs, offering luminous efficiency of up to 8.8 percent[28]. Demand for CFL bulbs may grow further in some regions over the next few years if the incandescent light bulb is banned[29].

Another competing technology that has been developing in the background is the LED, whose advantages are multiple. It is far more efficient: current LEDs boast a luminous efficiency of up to 10 percent[30], and next-generation models offer up to 22 percent[31][32][33][34] – roughly eight times better than incandescent bulbs and more than double CFL. LED bulbs also offer superior longevity of up to 50,000 hours, equivalent to 17 years at an average of eight hours of light per day. This is 50 times better than incandescent and five times better than CFL.

For years, however, LEDs have been uncompetitive on other lighting metrics. They were not bright enough, they didn't emit the right colors, and they cost too much. LEDs had limited successes: some traffic lights, a few high-end vehicles and consumer devices.

But in 2008, the LED should become commercially viable. LEDs' issues with intensity and color appear to have been resolved[35], and a recent breakthrough offers an entirely new approach to making white-light LEDs[36]. While the upfront cost of an LED is high, its long life makes the total cost of ownership more than competitive[37]: an LED bulb and holder may cost up to $50, but energy and replacement cost savings, per bulb, over a 20-year period add up to about $265[38].
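
The cited saving can be roughly reproduced with a simple cost-of-ownership model. Every parameter below – energy price, wattages, bulb prices, lifetimes – is an illustrative assumption rather than a figure from the sources:

```python
# Rough 20-year cost-of-ownership comparison for one light point used
# 8 hours/day. All parameters are illustrative assumptions.
HOURS = 8 * 365 * 20                      # 58,400 hours over 20 years
KWH_PRICE = 0.10                          # $ per kWh (assumed)

def total_cost(watts, bulb_price, bulb_life_hours):
    bulbs = -(-HOURS // bulb_life_hours)  # ceiling division: bulbs consumed
    energy = watts / 1000 * HOURS * KWH_PRICE
    return bulbs * bulb_price + energy

incandescent = total_cost(watts=60, bulb_price=0.50, bulb_life_hours=1_000)
led = total_cost(watts=8, bulb_price=50.0, bulb_life_hours=50_000)
print(f"Incandescent: ${incandescent:.0f}, LED: ${led:.0f}, "
      f"saving: ${incandescent - led:.0f}")
# Prints a saving of roughly $230 – the same order as the ~$265 cited above.
```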

Longer life does not just reduce overall lifecycle costs: many light sources are in locations that are difficult or dangerous to reach, such as high ceilings, where infrequent replacement is a benefit in itself.

While the initial price of an LED bulb may still frighten off the mass market in 2008, manufacturing costs are continuing to decline. As the LED is a semiconductor technology, it should benefit from Moore's Law, with manufacturing costs expected to decline by 50 percent every 18 months[39]. By contrast, little has changed in the way incandescent bulbs have been manufactured over the last 50 years, their cost has certainly not fallen, and the energy costs associated with running them are rising.
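
Under the stated assumption of a 50 percent decline every 18 months, the cost curve is straightforward to write down; the $10 starting cost is an arbitrary placeholder:

```python
def led_cost(years, start_cost=10.0):
    """Manufacturing cost under an assumed 50% decline every 18 months."""
    return start_cost * 0.5 ** (years / 1.5)

for y in (0, 1.5, 3, 4.5, 6):
    print(f"after {y:>3} years: ${led_cost(y):.2f}")
# $10.00, $5.00, $2.50, $1.25, $0.63
```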

The first widespread use of LEDs was in digital watches. Owners of these timepieces probably associate LEDs with the color red. In the medium term, it is likely that the most common application for LEDs will be to give off white light, but their true hue may really be green.

Bottom line

LED lighting manufacturers should ensure that they communicate their positive environmental credentials. As well as promoting low power consumption, the sector should promote its other characteristics, such as clean manufacture and disposal. The average CFL contains about 20 milligrams of mercury, which is toxic; the United States' annual consumption of fluorescent bulbs would produce enough mercury to contaminate 20 million acres of water if the bulbs were not recycled or otherwise properly disposed of[40]. Making these points could strengthen the case for subsidies of the kind currently provided to some renewable energies, such as solar and wind.

The semiconductor industry should improve its LED technology, especially since it is one of the few high-growth areas in the sector at present.

Governments should consider subsidizing LED purchases with rebates or tax deductions. Adoption of LED lighting and its resulting reduction in power consumption could lessen the need to build further power plants.

Finally, architects could let their imaginations run riot in creating lighting in inaccessible locations. With a 17-year life span per bulb, the challenge of changing an LED bulb should rarely arise.

From zero to green hero: the renaissance of nanotechnology

The public image of nanotechnology – the manipulation of matter at the atomic or molecular scale – has recently become tainted. This is despite mass-market use of nanotechnology-enabled products, from smoother sun cream to portable MP3 players and faster processors[41].

The impact of nanotechnology on new or improved products and services has already been significant and its potential remains considerable. Matter behaves in fundamentally different ways on the nanometer scale. Previously inert materials can be transformed into catalysts; solids can become liquids, even at room temperature; insulators can become conductors. According to advocates, nanotechnology could even be the basis for the next industrial revolution[42].

Despite, or perhaps because of, its potential, nanotechnology has scared the public more than it has thrilled them. People are concerned about the possible malign consequences of the release, accidental or otherwise, of engineered nanoparticles into the environment. They are also uncertain about the fate and toxicity of nanoparticles and how they behave[43]. The insurance industry has debated whether some nanotechnology risks can be covered at all, as not all the potential negative impacts are known or can be quantified at this time[44].

One influential commentator has labeled nanotechnology 'gray goo', and it has been suggested that it could be more threatening than nuclear power[45]. On a personal level, there have been concerns about the long-term impact of nanotechnology-based anti-wrinkle creams[46], and some nanoparticles have been found to be carcinogenic[47]. Nanotechnology is the subject of several official inquiries around the world and has even driven the plot of a bestselling novel[48].

Along with the growing distrust of nanotechnology there has been a steady rise in concern over the environment, and it is becoming increasingly apparent that nanotechnology could have an important role to play in healing, rather than harming, the planet[49]. Therefore in 2008, the public's demonization of nanotechnology could be reversed, and a green-tinged halo could replace its horns.

Nanotechnology is already being used to address several environmental issues: generating clean power, reducing existing power consumption, providing drinkable water, cleaning contaminated land, reducing harmful emissions and enabling long-life portable power.

The combination of global population and economic growth generates an ever-increasing need for energy. Carbon-based fuel reserves may be insufficient to meet demand and, more worryingly, may have too great a negative impact on the global climate. So cleaner energies are required, one of the most appealing of which is solar power. But solar energy is not yet price-competitive with oil, gas and nuclear energy, costing up to $5 per watt. Nanotechnology could allow the manufacture of solar panels based on plastics instead of silicon[50]. This could lower production costs, allow a greater range of form factors, and more than halve the cost of solar power to just $2 per watt – far closer to 2007 prices for fossil-fuel-derived electricity[51].

Nanotechnology can also be used to conserve energy. As highlighted in 'Let there be light emitting diodes', lighting represents a significant share of domestic power consumption, and the efficiency of traditional incandescent light bulbs is very low, with most power being converted into heat rather than light. Nanotechnology could improve an alternative light source: the LED. Presently most LEDs are based on crystals; nanotechnology may allow the use of thin films of polymers or organic molecules that could improve electroluminescence efficiency fourfold[52].

As highlighted in 'The challenge and opportunity of water scarcity', another of the world's gravest concerns is water supply. One solution to this problem is the conversion of salt water into drinkable water. Nanoscale membranes based on carbon nanotubes offer a dramatic improvement in productivity compared with reverse osmosis, currently the most common form of desalination[53]. The nanotechnology approach requires less energy and fewer filter-cleaning agents than the current approach[54][55] for an equivalent volume of drinking water[56].

Similarly, a major source of emissions is motor-vehicle exhaust. Nanotechnology catalytic converters could reduce the quantity of precious metals required to make a standard three-way converter, enhance the lifetime of the converter, increase efficiency and substantially reduce its cost[57][58].

Engineered nanoparticles are showing considerable promise as a means of cleaning up contaminated land and groundwater. Various industrial processes and pesticides create a class of pollutants called chlorinated hydrocarbons, some of which are known to suppress the human immune system and have been linked to cancer[59]. Trials in the United States, Canada and Germany have indicated that nanoparticles are capable of binding to such pollutants very efficiently, enabling safe collection and separation. With one-third of the world's population obtaining its potable water from aquifers, the majority of which are polluted, nanotechnology could have a major impact by saving lives[60].

Billions of batteries, of all sizes, are manufactured every year, powering everything from toys to power tools. A large proportion of these are discarded once exhausted, which can cause the leakage of toxic residues. Nanotechnology is being used to develop an ultra-long-life battery substitute based on a reinvention of the capacitor, a centuries-old technology. Capacitors charge faster and last longer than conventional batteries; a nanotechnology-based ultracapacitor could recharge in seconds and provide power for many hours[61].
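
The physics behind the comparison is the standard capacitor energy formula, E = ½CV². The cell rating below is a hypothetical but typical large-cell figure, not one taken from the cited work:

```python
# Energy stored in a capacitor: E = 0.5 * C * V**2 (joules).
C_FARADS, V_VOLTS = 3000.0, 2.7   # hypothetical large ultracapacitor cell
energy_j = 0.5 * C_FARADS * V_VOLTS ** 2
print(f"{energy_j / 1000:.1f} kJ, i.e. {energy_j / 3600:.1f} Wh per cell")
# ~10.9 kJ (~3 Wh): far less energy than a comparable battery, but it can
# be delivered, and recharged, in seconds rather than hours.
```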

Public awareness of these issues, and of the environment in general, is growing daily. As a result, nanotechnology's public tagline may well start to shift from gray goo[62] to green good[63].

Bottom line

Like many scientifically advanced technologies, the main strength of nanotechnology is also a weakness. While nanotechnology holds enormous potential, public understanding of what the technology does is actually as minuscule as the atoms it manipulates.

Companies, research institutes and industry bodies should make a substantial, sustained effort to make nanotechnology more understandable to the layperson[64]. That should help the public feel less threatened by the technology. Part of the dialog should include direct and open responses to concerns about nanotechnology. As the impact of nanoscale innovation grows, the list of concerns is likely to expand proportionately. If the industry does not provide evidence and explanations, scaremongers may fill the gaps.

While the industry should explain where nanotechnology has made a positive difference, it should be wary of succumbing to hype or overpromising. Any emerging sector is vulnerable to exaggerated and premature claims; nanotechnology industry bodies around the world should strive to keep such diversions to a minimum.

The sector should always bear in mind that, as with any emerging technology, adoption is always going to be an economic decision, whether for consumers or for business. Only once cost is competitive with existing approaches will nanotechnology become mass market.

Footnotes

1. The Expanding Digital Universe, IDC, March 2007.

2. The approach could also improve security for mobile data devices, as it would allow standardized software to be loaded on every type of device being used while in transit, from a phone to a laptop. Each device would therefore be protected by a consistent set of security rules. Currently most devices have custom software. See: Virtualization set to boost mobile security, ZD Net UK, 21 September 2007.

3. Server utilization can be notoriously low – as little as 5 percent per server: Virtual servers a savings strategy, The Sheet News Bites, 2 August 2007; Virtualization for flexible, efficient IT resource utilization, Tech & U, 10 September 2007.

4. Security in a virtual reality, Banking Technology, 1 October 2007.

5. One estimate suggested a saving of £78,000 ($159,000) per 1,000 PCs if a company were to migrate from a full desktop PC architecture to a virtual server-hosted desktop environment. See: Virtualization could save companies millions, Silicon.com, 25 September 2007.

6. VM Security risks: phantom or menace, eWeek, 25 October 2007.

7. How to turn one computer into many, The Guardian, 8 November 2007.

8. VMware sends shivers through server world, Financial Times, 13 August 2007.

9. With virtual machines, management is key, InformationWeek, 1 October 2007.

10. Security expert: beware virtualization in 2008, ZDNet, 19 November 2007.

11. With virtual machines, management is key, InformationWeek, 1 October 2007; VM Security risks: phantom or menace, eWeek, 25 October 2007.

12. Ibid.

13. Programs written in old code pose business problem, Financial Times, 22 November 2006.

14. The IBM mainframe: alive and kicking, Computergram, 19 July 2007.

15. Data energy efficiency and productivity, The Uptime Institute, 2007.

16. See: http://www-1.ibm.com/services/ondemand/business/legacy_transformation.html

17. Long live the mainframe, Ziff Davis Enterprise, 19 February 2007.

18. Pros and cons of consolidating servers, Financial Times, 2 April 2003.

19. Ibid.

20. COBOL coders: going, going, gone? Computerworld, 9 October 2006; COBOL skills needed in the future, Data Center News, 17 May 2005; Computing's dying breed, Financial Times, 20 May 2003.

21. COBOL coders: going, going, gone? Computerworld, 9 October 2006.

22. Programs written in old code pose business problem, Financial Times, 22 November 2006.

23. According to one analyst, new software is over 100 times less efficient than legacy software: Programs written in old code pose business problem, Financial Times, 22 November 2006.

24. IBM to spend $100 million on mainframes, eWeek, 4 October 2006.

25. The IBM mainframe: alive and kicking, Computergram, 19 July 2007.

26. The nature of light, T J Keefe, 2 February 2007: http://www.ccri.edu/physics/keefe/light.htm

27. Lighting efficiency information, California Energy Commission, July 2002: http://www.energy.ca.gov/efficiency/lighting/.

28. Energy saving lamp: http://www.coffj.com/veg1/lamp.htm

29. Two years to change EU light bulbs, The Scotsman, 10 March 2007; Europe to unplug from common light bulbs, 7 March 2007.

30. Cree Announces Lighting-Class Performance for Warm White XLamp LEDs, Cree, 21 March 2007: http://www.cree.com/products/xlamp_new_warm.asp

31. For more information, see: Solid State Lighting, Lighting Research Center: http://www.lrc.rpi.edu/programs/solidstate/completedProjects.asp?ID=79

32. Cree demonstrates 131 lumens per watt white LED, Cree, 20 June 2006: http://www.cree.com/press/press_detail.asp?i=1150834953712

33. Nichia achieved 150lm/W white LED development, Nichia 2006: http://www.nichia.co.jp/about_nichia/2006/2006_122001.html

34. Philips Lumileds launches new Luxeon K2 with TFFC, the industry's first 1A LED, Philips, 15 November 2007: http://lumileds.com/newsandevents/releases/LuxeonK2TFFC_CW071115.pdf

35. For more information, see previous links from CREE, Nichia and Lumileds.

36. LEDs shine as replacement for light bulb, EE Times, 16 November 2007: http://www.eetimes.com/news/semi/showArticle.jhtml?articleID=203101640

37. LED light bulbs, C. Crane, 2007: http://www.ccrane.com/lights/led-light-bulbs/index.aspx

38. Ultra efficient lighting, LLF, 2007: http://www.llfinc.com/efficiency.htm

39. Nature photonics focus on LEDs, Nature, 2006: http://www.nature.com/nphoton/journal/v1/n1/full/nphoton.2006.78.html

40. For more information see: Fluorescent lights, Worldwise: http://www.worldwise.com/recfluorlig.html

41. The Science of Small Things, BBC Today Program. To listen to the broadcast, see: http://www.bbc.co.uk/radio4/today/reports/archive/science_nature/nanotechnology.shtml

42. Nanotechnology: the next industrial revolution, Mondaq Business Briefing, September 2003.

43. Science report – A people's inquiry into nanotechnology and the environment, Environment Agency, June 2006 (ISBN: 978-1-84432-782-9).

44. Nanotechnology: small matter, many unknowns, Swiss Re. Available at: http://www.swissre-centre-for-global-dialogue.com/resources/31598080455c7a3fb154bb80a45d76a0-Publ04_Nano_en.pdf

45. Don't fear science you can't see, Wired, 12 January 2001.

46. Safety fears over 'nano' anti-ageing cosmetics, Sunday Times, 17 July 2005.

47. Nanotech fears and the science of the future causing concern, digital journal, 14 November 2007: http://www.digitaljournal.com/article/246161/Op_Ed_Nanotech_fears_and_the_science_of_the_future_causing_concern

48. Reviewing Michael Crichton's Prey, The Business Review, 10 January 2003.

49. Guide to the future of nanotechnology, Media Planet, 26 June 2007.

50. Cheap Nano Solar Cells, Technology Review, 5 March 2007.

51. As solar gets smaller its future gets brighter, San Francisco Chronicle, 11 July 2005.

52. Nanotechnology being used in next-generation LED lights, Science Daily, 23 March 2007.

53. Cheap drinking water from the oceans, Technology Review, 12 June 2006.

54. Reverse osmosis processes force water through a semi-permeable membrane under pressure, leaving salt and other contaminants on the membrane.

55. Engineers develop revolutionary nanotech water desalination membrane, Physorg.com, 6 November 2006.

56. Ibid.

57. New catalyst for cars using nanotechnology, Technology News Daily, 1 October 2007.

58. Ibid.

59. See http://es.epa.gov/ncer/nano/research/nano_remediation.html

60. Groundwater depletion and pollution, People and Planet, 14 July 2003.

61. Super battery, ScienCentral News, 8 June 2006: http://www.sciencentral.com/articles/view.php3?type=article&article_id=218392803

62. For background on gray goo, see: Why the future doesn't need us, Wired, April 2000.

63. For an in-depth analysis of other nanotechnology impacts on the environment, see: Nanotechnology in the Environment Industry: Opportunities and trends, Final Report and Bibliography for the Nano-Environmental Cross-Sector Initiative, 4 March 2005.

64. For further detail on potential recommendations for the nanotechnology industry and its dialog with the public, see: A matter of size: Triennial Review of the National Nanotechnology Initiative.
