ITT 2002 Cover Page
ISSN 0971-7102
Vol 21 No 2 June 2002





What's in a (domain) name?

The European Commission has been working towards opening a new top-level domain (TLD) which could strengthen the European identity of the EU's Internet users and relieve congestion problems. The new TLD will be available to organisations operating in the EU, and possibly also in the EEA and candidate countries. Complementing national TLDs such as .uk and .de as well as generic ones like .com and .org, it will remove the need for companies to register in each country where they have markets.

The new TLD will also relieve the current congestion in the generic .com domain, where only 10% of registered domains are actively used: 'cybersquatting' reserves rights to a name, so genuine potential users cannot use it. Many details of the new .eu domain still need to be worked out. Among these is the possibility of sub-domains to represent specific economic or professional sectors, i.e. second-level domains such as .lex.eu.

-- Reproduced from Innovation and Technology Transfer 2001, 6(1), 5

Intel paves the way for `Terahertz' chip

Transistors, the microscopic circuits that animate semiconductors, are going to be flipping off a trillion times a second in a few years, a prospect that is forcing Intel back to the drawing board. In a presentation at the International Electron Devices Meeting next week in Washington, DC, the Santa Clara, California-based chip giant will shed light on a series of major changes coming to the design of its transistor, culminating in the "Terahertz" transistor in the second half of the decade, that ideally will keep a lid on the growing problem of power consumption.

Overall, the pending chemical and architectural changes will let Intel manufacture chips by 2007 that contain a billion transistors but consume about the same amount of power as today's processors. "If nothing is done, Moore's law is going to be throttled," said Rob Willoner, a researcher in the Logic Technology Group at Intel. "Power consumption is running away from us… This is aiming at the heart of the problem."

The presentation will also no doubt spark a technological space race with IBM. One of Intel's proposed changes involves incorporating a version of silicon-on-insulator (SOI) technology into its chips, a technology IBM already uses and that Intel has roundly criticised. IBM will also present papers at the conference on advancements to its own version of SOI.

Transistors are the red blood cells of computing, tiny units that ferry electronic signals across semiconductors that eventually get orchestrated into higher commands. Under Moore's law, the number of transistors on a chip doubles roughly every two years through, among other factors, shrinking the transistors.

The transistor explosion has allowed computers to experience continual improvements in performance. That means some awfully complex devices by the second half of the decade. The addition of the SOI layer will also lower resistance to the flow of current across the source and drain. Ultimately, this lower resistance can either let designers reduce power consumption or improve performance at a given level of energy. Other benefits are also likely to appear: free-floating alpha particles that come into contact with a transistor on current chips can switch the transistor's state unexpectedly and cause errors.

In future, the ultra-thin SOI layer will absorb these, Mr Willoner said. Much work remains: Intel is yet to determine the chemical composition of the gate oxide. Currently, gate oxides are made of silicon dioxide; future candidates include oxides of aluminium, titanium or other materials. These changes will be incorporated into processors at separate times, but at some point during the second half of the decade they will all be standard features in Intel processors, Mr Willoner said.

As far as the pending SOI debate goes, Mr Willoner said one of the reasons Intel has been sceptical of IBM's version of SOI is that it doesn't perform as well as it should. It helps reduce energy consumption, or conversely improve performance, but not economically, he said. Additionally, Intel's version of SOI is far easier to design into processors. IBM representatives could not be reached for comment, but sources inside IBM's microelectronics division said some of Intel's plans look like IBM's version of SOI "in disguise".

Michael Kanellos
-- Reproduced from The Economic Times 28 Nov 2001

New forensic journal on Internet and CD

E-journals can be used to good advantage by disciplines that are traditionally starved of money. One good example is the speciality of forensic medicine, the science which deals with the detection of crime using scientific, technical and medical knowledge. Prof Anil Aggarwal of Maulana Azad College, New Delhi has started an e-journal in forensic medicine. Its URL is http:/ In view of the good number of hits, it is now also available on CD. It is a biannual journal, with an array of specialists from all continents on its editorial board. The e-mail address is ; the Web sites are: 1) ; 2) ; 3) .

-- Reproduced from Invention Intelligence 2001, 36(4), 192

Learning organizations. APO News, May 2001

Conventional production factors such as land, capital and machinery are no longer sufficient to gain and sustain a competitive edge in global markets. The basis for competition has shifted to how well and how fast intangible assets such as knowledge, ideas and organizational capacity can be developed to reduce cost, increase quality and generate innovation to meet customer needs quickly and effectively. This gives rise to the concept of the learning organization, which concerns an organization's ability to learn and to translate that learning into action. The ability to learn faster than the competition is therefore the critical competitive advantage. In response to increasing interest in the concept and application of learning organizations among its member countries, the Asian Productivity Organization (APO) sponsored a symposium on the Learning Organization in Seoul, Republic of Korea, in April 2000. This publication is a report on the proceedings of the meeting, featuring a summary of its deliberations and conclusions. The resource papers covered the following topics: Features of Excellent Organizations and the Organization of the Future; Learning Organization as a New Management Paradigm; and Towards a Balanced Organization.

Contact: The Information and Public Relations Department, APO, Tokyo 102-0093, Japan for orders and enquiries.

-- Reproduced from WISTA: Innovation 2001, 3(2), 19

Secrets of electronic commerce. ITC 2001

Since the introduction of the first commercial Web site in 1994, electronic commerce has spread across the globe as a marketing, sales and communication phenomenon, even changing the face of some business sectors. Entrepreneurs around the world use the Internet to increase their marketing reach and improve their profitability. Yet most small and medium-sized enterprises (SMEs) in developing and transition economies remain unclear about how they can benefit from this new tool for business organization, communication, marketing and sales. They may have questions from business, organizational and governmental perspectives. It is for this reason that the International Trade Centre has decided to research and publish a practical publication, Secrets of Electronic Commerce. The book helps identify SME issues and constraints with regard to e-commerce; outlines marketing and on-line communication techniques; discusses legal and financial issues; highlights characteristics of successful Web sites; and deals with technical, policy and country-specific issues.

Contact: International Trade Centre (ITC), Palais des Nations, 1211 Geneva 10, Switzerland for orders and enquiries.

-- Reproduced from WISTA: Innovation 2001, 3(2), 19

ACM Digital Library: the ultimate online resource

ACM, the Association for Computing Machinery, is the world's first educational and scientific computing society. Since 1947, computing professionals worldwide have turned to ACM for authoritative publications, pioneering conferences, and visionary leadership. Today, with nearly 1,000 institutional members and over 80,000 individual members from over 100 countries, ACM has grown into a global electronic community, the centerpiece of which is the ACM Digital Library. The ACM Digital Library has a comprehensive collection of over 20 ACM publications online, including a fifteen-year archive of journals and magazines and a nine-year archive of ACM conference proceedings, equal to more than 250,000 pages of text! The ACM Digital Library is available to institutions through the ACM Digital Library Core Package, with various subscription options. Libraries interested in subscribing may contact:

Contact: ACM Member Services, PO Box 11414, New York, NY 10286-1414, USA. Email: ; Phone: +1-212-626-0500; Fax: +1-212-944-1318.

-- Source: ACM leaflet

Biosciences thesaurus for WINISIS users in CD

The thesaurus has been prepared by the Sarada Ranganathan Endowment for Library Science, Bangalore, with the support of NISSAT. The CD includes (i) the thesaurus called BIOTH, of 23,500 terms pertaining to the fields of biosciences, natural resources, agriculture, and environmental sciences; (ii) the thesaurus called WTHES, of about 50 terms, which is for demonstration and covers the subject of food chain/food web/energy flow in an ecosystem; and (iii) the opening page of the WINISIS database, which links to the WTHES database.

BIOTH and WTHES have the conventional thesaurus structure: descriptor, scope note (SN), use (USE), used for (UF), broader term (BT), narrower term (NT), and related term (RT). However, in WTHES the type of relation of RT terms is indicated.

In BIOTH and WTHES, every descriptor occurring as BT, NT, RT or USE may be clicked to link it to the appropriate connected term(s) in BIOTH and WTHES respectively. Additionally, a descriptor in WTHES can be clicked to cross-link it to an appropriate descriptor, if any, in BIOTH.

These thesauri have been designed and developed using WINISIS 1.4 (the Windows version of CDS/ISIS).
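The linked structure described above (descriptors connected by BT/NT/RT/UF relations, each clickable to reach the connected term) can be sketched as a small lookup table. The descriptors below are hypothetical examples in the spirit of the WTHES demo subject, not terms taken from BIOTH or WTHES:

```python
# A minimal sketch of the conventional thesaurus structure: each
# descriptor maps relation codes (SN, BT, NT, RT, UF) to connected
# descriptors. Terms here are illustrative only.
thesaurus = {
    "ecosystem":   {"SN": "A community of organisms and its environment",
                    "NT": ["food chain"], "RT": ["energy flow"]},
    "food chain":  {"BT": ["ecosystem"], "RT": ["food web"]},
    "food web":    {"UF": ["feeding network"], "RT": ["food chain"]},
    "energy flow": {"RT": ["ecosystem"]},
}

def follow(term, relation):
    """Return the descriptors linked to `term` by `relation`,
    i.e. what 'clicking' a BT/NT/RT/UF entry would lead to."""
    return thesaurus.get(term, {}).get(relation, [])
```

For example, `follow("food chain", "BT")` leads back to `"ecosystem"`, just as clicking a BT entry in WTHES jumps to the broader descriptor.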

For further information contact:

  1. Smt Sreedevi Ravindran, Department of Scientific and Industrial Research, Technology Bhavan, New Delhi - 110016. Tel: 6677373/496. E-mail: 

  2. Prof A Neelameghan, Sarada Ranganathan Endowment for Library Science, 702, 42nd Cross, III Block, Rajajinagar, Bangalore 560 010. Tel: (080) 330 5109. E-mail: 

Roomware: towards the next generation of human-computer interaction. ERCIM Newsletter, July 2001

Roomware® comprises computer-augmented room elements with integrated information and communication technology, facilitating new forms of human-computer interaction. These components are part of an approach in which the 'world around us' is the interface to information and to cooperation among people. The Roomware® components were developed at GMD's Integrated Publication and Information Systems Institute (IPSI) in Darmstadt.

The next generation of human-computer interaction is determined by a number of new contexts and challenges. One major challenge is to overcome the limits of desktop-based information environments currently in use. In the past, the introduction of information technology caused a shift away from real objects in the physical environment as information sources towards monitors of desktop computers at fixed places as the interfaces to information. Accordingly, user-interface designers developed the known types of human-computer interaction for the desktop paradigm.

In contrast to this, a new approach has been developed that re-emphasizes the relevance of physical objects. In this approach, the 'world around us' is the interface to information, and traditional human-computer interaction is transformed into human-information interaction and human-human communication and cooperation. One result of this research is the so-called Roomware® components of the workspaces of the future in so-called 'cooperative buildings'. They require new forms of interaction with information, which are supported by the software that has been developed.

The passage mechanism

Passage is a mechanism for establishing relations between physical objects and virtual information structures, thus bridging the border between the real world and the digital, virtual world. The so-called passengers (passage objects) enable people to have quick and direct access to a large amount of information and to 'carry it around' from one location to another via physical representatives that act as physical bookmarks in the virtual world. It provides an intuitive way of transporting information between roomware components, e.g. between offices or to and from meeting rooms.

A passenger does not have to be a special physical object. People can turn an arbitrary object into a passenger: a watch, a ring, glasses, a wooden block. The only restrictions are that a passenger can be identified by the bridge and that it is unique. Passengers are placed on so-called bridges, making their virtual counterparts accessible. With simple gestures, digital information can be assigned to or retrieved from the passenger via the virtual part of the bridge. The bridges are integrated in the environment to guarantee ubiquitous and intuitive access at every location in a building (a 'cooperative building').
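The essence of the mechanism, a bridge recognising a physical object and using its identity as a key into a shared virtual store, can be sketched in a few lines. This is a simplified illustration, not the IPSI implementation; the identifier and store names are invented for the example:

```python
# Sketch of the Passage idea: a bridge identifies a physical object
# (the "passenger") and uses that identity as a key to attach or
# retrieve digital information in a shared virtual store.
virtual_store = {}  # passenger identity -> attached digital information

def assign(passenger_id, information):
    """Place a passenger on a bridge and attach information to it."""
    virtual_store[passenger_id] = information

def retrieve(passenger_id):
    """Place the same passenger on any other bridge and get the
    information back; unknown passengers carry nothing."""
    return virtual_store.get(passenger_id)
```

Because only the identity travels with the object, any unique, bridge-readable object (a watch, a wooden block) can serve as the physical bookmark.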

Future work

At the beginning of 2001, a new project called 'Ambient Agoras: dynamic information clouds in the hybrid world' was started. It is funded by the European Union as part of its proactive initiative 'The Disappearing Computer'. Ambient Agoras will provide situated services, place-relevant information, and a feel for the place to its users. It aims at turning every place into a social marketplace (agora, in Greek) of ideas and information where one can interact and cooperate with people.

Contact: Norbert A Streitz, GMD. Tel: +49 6151 869 919

-- Reproduced from Science and Technology in Germany, September 2001, p4-5

Magnetic disk devices

Technology for a fourfold expansion of the data-recording capacity of magnetic disk devices, from the current 25 to 100 gigabits per square inch, has been developed. Magnetic disks that can record five hours of television programming, developed separately by technology rivals Hitachi and NEC, may be commercialised next year. The magnetic disk devices used in mainframes two decades ago had a data-recording capacity of 10 megabits per square inch, or two seconds of TV programming. Magnetic disk devices record and read data from disks rotating at high speeds by means of magnetic heads. The data is recorded as if minute magnets were lined up on the discs, and recording capacity can be enhanced by shrinking the size of each of these magnets, or the space used to record each data bit.

Hitachi achieved this through perpendicular magnetic-recording technology, which creates an effect as if the magnets were lined up vertically, unlike the conventional magnetisation method, which is equivalent to lining up magnets horizontally. Tiny magnets lined up horizontally may be deviated by magnetic repulsion and attraction, and the direction of their magnetic poles can become disordered. If they are placed vertically, the surface area of the group can be kept small without reducing the size of each magnet, because the magnets can be lengthened. NEC used a new tunnelling magnetoresistive head, which is ten times more sensitive, to boost the capacity of its magnetic disk devices. Made of metal parts sandwiching a 0.7mm thick insulating film, the head reads and writes data by using changes in the resistance of the insulating film when the metal parts pick up slight changes in the magnetic field. Magnetic disks with a capacity of one terabit per square inch are likely to be available in six years, when nanotechnology will be used to magnetize an area of 10mm square; Toshiba, IBM and MIT are working on it. Once magnetic disk capacity catches up with that of the latest optical disks, which hold 24 gigabits per square inch, the emergence of larger-volume magnetic discs may lead to hard-drive videodisk recorders for home use and the development of new Internet-based information services.

-- Reproduced from Science and Technology in Japan, October 2001, p11-12.

Optical computing

A basic technology for the fabrication of optical integrated circuits has been developed at NTT. It may lead to the realization of all-optical chips in which optical signals are processed directly, without being converted into electrical signals. The new approach uses nano-fabrication technology to define rows of extremely small, equally spaced holes on a silicon substrate. The spacing is the key: the distance between holes is roughly equal to the wavelength of the light, that is, the 1.3 to 1.6 microns conventional in optical communications.

When this silicon substrate is sandwiched between glass sheets and light is directed in from one end, the light travels straight across the silicon between the rows of holes, exhibiting almost no leakage. The design exploits a characteristic that prevents light of a given wavelength from travelling through a crystal whose structure has a periodicity equal to that wavelength. The rows of holes in the silicon act as optical waveguides to direct the path of the light. Using this technique, the researchers say, light can be turned at right angles and confined within extremely narrow spaces.

-- Reproduced from Science and Technology in Japan, October 2001, p18

E-mail turns 30, founder forgets first message

As great inventions go, e-mail had a rather ho-hum beginning back in 1971. In fact, Ray Tomlinson, the American engineer considered the `father of e-mail', can't quite recall when the first message was sent, what it said, or even who the recipient was.

"I had no idea what the first one was," he told Reuters. "It might have been the first line from Lincoln's Gettysburg Address for all I know. The only thing I know was it was all in upper case." Tomlinson, principal engineer at BBN Technologies in Cambridge, finds himself in the spotlight again after all these years, having to answer questions about the computer programme he designed as it reaches its 30th birthday in the coming weeks.

He modestly calls his baby 'no major tour de force': it was just 200 lines of code, he says. And the inspiration, one computer programme to enable file transfers and a second crude messaging programme, already existed, he said.

But the programmes had their flaws. For example, the messaging programme only enabled a user to send a communiqué to a colleague's mailbox as long as that mailbox was located on the same computer as the sender's. Tomlinson got around this by creating remote personal mailboxes that could send and receive messages via a computer network.

He also conceived the now famous `@' symbol to ensure a message was sent to a designated recipient.
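The convention that grew out of that symbol, a local user name and a host name joined by '@', is easy to sketch. This is a modern illustration of the addressing idea, not Tomlinson's original code, and the sample addresses are invented:

```python
def split_address(address):
    """Split an e-mail address into its user and host parts,
    using the '@' separator Tomlinson introduced to route a
    message to a designated recipient on a designated machine."""
    user, sep, host = address.partition("@")
    if not sep or not user or not host:
        raise ValueError(f"not a valid address: {address!r}")
    return user, host
```

For example, `split_address("ray@example-host")` yields `("ray", "example-host")`: everything before the '@' names the mailbox, everything after it names the computer holding that mailbox.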

The end product, he said, was simply the combination of the two existing programmes, enabling a person for the first time to send a message to a specified user on any computer hooked to the ARPANET, the predecessor of today's Internet, developed by the US Department of Defense. Thirty years on, e-mail has become a vital form of communication. Last month e-mail became the only reliable link for many frantic souls during the attacks in the US. Poignant e-mails from survivors have circulated around the world, filling in clues about harrowing escapes and daring rescue attempts. A week later, it was e-mail that helped spread the damaging Nimda computer virus, knocking out corporate computer networks around the globe and inflicting hundreds of millions of dollars worth of damage.

Like all essential communication devices, e-mail has a love-hate relationship with its users. For every pick-me-up message of praise or joke sent electronically, it seems there are an equal number of unsolicited e-mail reminders that we can lose weight overnight, make money working from home or earn an honorary college degree.

But back in the autumn of 1971 (he says he can't recall which month) e-mail was a relatively small success. That, he added, is simply because there were just a few hundred users of the ARPANET who could put it to use. And the top-of-the-line modem connection at the time operated at a snail-like 300 baud, roughly one two-hundredth of the speed of today's standard 56.6 kbps modem, making only the most concise messages practical.

"Reliance took a few years to happen," said Tomlinson. It wasn't until the personal computer boom of the mid-1980s that e-mail trickled into the lives of computer enthusiasts and university students.

Another major stage in its development came in the mid-90s, as the first Web browsers introduced the World Wide Web to the couch potato. As Web usage grew, so did e-mail. Over the years, Tomlinson said, complete strangers have sent him notes of thanks, and a few criticisms, for his invention, all by e-mail, of course.

Bernhard Warner
-- Reproduced from The Indian Express 3 Oct 2001

A signal of mixed success for Net telephony

Hype surrounds the government's recent decision to open up Internet telephony, which enables people to make long-distance calls at a fraction of the cost of circuit-switched POTS (plain old telephone service). Consumers have long yearned for cheaper long-distance calls, which make up the bulk of telecom profits and revenues. Internet telephony has held that promise for several years now: connect to the Net and sign up with a US-based company, Net2Phone for instance, and chat away with relatives and friends the world over for a fourth to a tenth of the cost.

Although not permitted in India, Net telephony has been offered here on the sly by 'operators' for a couple of years. Now the government plans to allow the service by April 2002, though it is yet to evolve the rules to govern it. In the several years Net telephony has been around, its success has been mixed at best; for long it has remained a trend of the future that could kill telecom companies offering long-distance (ISD/STD) services. It hasn't so far.

In the Asia-Pacific region, for instance, it will be another five years before VoIP (voice over Internet protocol) reaches a third of voice traffic. "About 5 billion minutes of VoIP traffic originated in Asia-Pacific. Excluding India, about 4.5 per cent of the voice traffic was carried over IP," according to Nitin Bhat of Frost and Sullivan, a leading research firm. China contributed about one-third of the total VoIP traffic that originated in Asia-Pacific last year. Bhat projects IP voice traffic to reach 38 per cent by 2006. In India, Skoch Consultancy MD Sameer Kochhar projects that 20 per cent of voice traffic will move to IP networks by 2005.


Steps to Web Talk

  • Users need a computer with an Internet connection

  • The PC has to be equipped with hardware accessories like a multimedia kit, speakers, headphones and a sound card.

  • On the software side, one needs to download messaging software like MSN or Yahoo Messenger; the software can be downloaded from the Web free of cost.

VoIP Advantages

  • Voice over Internet protocol is much cheaper compared to traditional circuit-switched telephony

  • A circuit-switched telephone call takes up 64 Kbps of bandwidth, while a VoIP call takes up 6-8 Kbps

  • Offers a set of new value-added services, including IP multicast conferencing and telephony, distance-learning applications, call centre applications, unified messaging, etc.
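The bandwidth figures quoted in the list above translate directly into capacity: a 64 Kbps circuit-switched channel can carry several compressed VoIP calls at once. A rough back-of-the-envelope calculation, using only the numbers given above:

```python
# Bandwidth figures quoted above: a circuit-switched call reserves
# 64 Kbps, while a VoIP call needs only 6-8 Kbps.
CIRCUIT_KBPS = 64
VOIP_KBPS_LOW, VOIP_KBPS_HIGH = 6, 8

def voip_calls_per_circuit(voip_kbps):
    """Number of whole VoIP calls that fit in the bandwidth of one
    64 Kbps circuit-switched channel."""
    return CIRCUIT_KBPS // voip_kbps
```

At 8 Kbps per call one circuit's worth of bandwidth carries eight VoIP calls, and at 6 Kbps it carries ten, which is the arithmetic behind VoIP's cost advantage over traditional circuit-switched telephony.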

This appears to be a far cry from reports suggesting that India will be swamped by Internet telephony, although some experts warn that it would hit hard the state-owned telephone companies like VSNL, BSNL and MTNL, and rewrite the rules for private telecom companies planning to enter the long-distance business.

Bhat opines, "Long-distance operators will have both sets of network, circuit-based (POTS) as well as IP-based. If there are more and more IP-based networks, the prices will fall further."

Experience suggests that if a lot of players offer Net telephony, putting pressure on prices, they will bleed. "But a few players may lead to a 20-30 per cent decline in (long-distance) tariffs," said Bhat.

One BSNL official, who did not want to be identified, said, "In Asia-Pacific the growth of Net telephony has been slower than expected, but that may not be the case in India. In Asia-Pacific, much of the ILD traffic is business traffic, which requires the high-quality, reliable communication offered by POTS. In India, however, the customer profile is different."

The reason: there is a huge latent demand for long-distance voice calls to relations and friends in the US, the UK and the Middle East. For now, the majority of calls are made to, and not from, India.

The total ISD market in India is under Rs. 8,000 crore, less than half of the STD traffic.

Sanjay Anand
-- Reproduced from The Times of India 17 Oct 2001

Select bids adieu

Select, the newsletter of the British Library National Bibliographic Service, ceased publication with issue no. 29. The topics the newsletter used to cover were increasingly becoming Web-based, making the newsletter redundant. In future, both general and technical information about British Library bibliographic services will be on the Web page at

-- Reported by Peter Robinson, Editor, Select

Working Group sees Indian IT exports crossing Rs 200,000 cr mark by 2007

India's software exports are estimated to reach Rs 200,000 crore by the end of the Tenth Five Year Plan (2002-2007), with domestic turnover expected to be Rs 67,000 crore during the same period, according to a report of the Working Group on IT. This is in line with the earlier projection of $50 billion in software and services exports by the year 2008, while the turnover of the IT industry is estimated to be over $87 billion (Rs 391,500 crore).

Stating that during the Ninth Plan (1997-2002) the overall software industry is estimated to have grown at a compounded annual growth rate (CAGR) of 52 per cent, the report said the industry grew at a CAGR of 57 per cent during 2000-2001.

Identifying IT-enabled services as an emerging area, the group's statement said that the software industry would require 22 lakh professionals by 2008, with 11 lakh in the IT sector and the remaining 11 lakh in IT-enabled services.

The report has called for a four-pronged strategy for the manufacturing sector, which includes setting up a high-level institutional framework to address policy and procedural issues on a regular basis, and supporting focused R&D in identified thrust areas for different sub-sectors.

Apart from the creation of a manufacturing infrastructure development fund, to compensate for the infrastructural handicap, and incremental support for quality assurance, prototyping is also recommended in the report. The proposed fund is also likely to initiate a manufacturing-sector promotion programme to address issues such as improvement in technology, including support for "greening technology", international marketing and business development.

The group has suggested setting up an electronic component development fund and technology incubators. Industrial and strategic electronics, as well as components, while performing below target, have been showing steady but low rates of growth. Consumer electronics had shown good performance during the first three years but is now showing signs of decline, the report said.

-- Reproduced from The Financial Express 16 Nov 2001

Make it simple, very simple: computers will obey commands as the human body does

The information technology industry loves to make possible what seems impossible. Barriers in this industry get obliterated and new records are set with astonishing regularity. But from the very core of its success now springs a problem, and if it remains unsolved it will prevent the industry from moving into the next era of computing. Interestingly, this has nothing to do with the usual barriers that confront the IT industry, such as how to keep pace with Moore's law or build systems that embody the popular concept of artificial intelligence.

What is this obstacle that, unfortunately, very few in the IT industry are worried about? It is "complexity", says Paul Horn, senior vice-president at IBM Research and one of the most revered researchers around the world. "It is our next grand challenge," he says.

The IT industry continues to create increasingly powerful computers to make individuals and businesses more productive by automating key tasks and processes. Evolution through automation, however, has created complexity as an unavoidable by-product. In the evolution of computers, from single machines to modular systems to personal computers networked with larger systems, microprocessor power, storage capacity and communication speed have gone up by factors of several thousand. These remarkable achievements brought with them increasingly sophisticated architectures governed by software built from tens of millions of lines of code. "The Internet has added yet another layer to this complex maze," says Horn, "resulting in systems that are not only difficult to manage but also difficult to use. The growing complexity of the IT infrastructure threatens to undermine the very benefits that IT aims to provide."

Until now, the IT industry has relied mainly on human intervention to manage the complexity it throws up, but with the current pace of expansion far outstripping the supply of IT professionals, there will not be enough skilled workers to keep the world's computer systems running, he says. Even if one were to assume that trained manpower could be found, complexity itself is growing beyond human ability to manage it.

Paradoxically, to solve the problem and make things simpler for administrators and users of IT, more complex systems need to be created: systems in which the complexity is embedded in the infrastructure, both hardware and software, and whose management is automated. The model is the massively complex systems of the human body, which perform tasks such as telling the heart how fast to beat or controlling the pupils to admit the right amount of light, all done seamlessly, without any conscious recognition or effort on the part of the human being.

Just as the nervous system allows humans to focus on what they want to do rather than how to do it, computing systems need to be designed and built to adjust to varying circumstances, to run themselves, and to prepare their resources to efficiently handle the workloads users put upon them.

These autonomic systems, he says, must anticipate users' needs and allow them to concentrate on what they want to accomplish rather than on figuring out how to rig the computing systems to get them there, the way a human being can make a mad dash for the train without having to calculate how much faster to breathe and pump the heart.

According to Horn, only when complexity is hidden from users' view will the next wave of IT-driven economic productivity occur. Only when it becomes simpler for users to use technology will new and unpredictable applications of IT emerge.

Horn says that, much as programming languages have moved closer to natural languages, companies should be able to instruct their computing systems in conversational terms: tell a system to monitor the competition and, if the company is ahead in a particular market, adjust product pricing and supply positions to make them attractive to customers.

According to him, such a high-level system can be described as possessing at least eight key elements: it needs to know itself; to configure and reconfigure itself under varying conditions; to optimise itself; to perform self-healing; to protect itself; to study its environment and act accordingly; to coexist with other systems; and to anticipate optimised resource needs while hiding complexity.

Can such a system be built? It will be difficult, and will call for significant exploitation of new technologies and innovations, says Horn. To make it possible, the industry should progress on two tracks: making individual system components autonomic, and optimising the entire stack of computing layers.

The IT industry, he says, is beginning to make progress in key areas. Established fields of scientific study like artificial intelligence, control theory and cybernetics are contributing to autonomic computing, as are research projects such as `cellular' chips, which can recover from failures to keep applications running.

"The next era of computing enabled progress and abilities we barely envision today. But the measure of our success will be with our customers. Think about the functioning of computing systems as often as they think about the beating of their hearts, says Horn. [Modified].

R Subramanyam
-- Reproduced from The Economic Times 17 Nov 2001

This wireless technology is no fiction

First, there was Sci Fi, then Hi Fi, and now there is Wi Fi, which seems right out of Sci Fi. Wireless connectivity has improved by leaps and bounds, standards have evolved, and what would have been science fiction (Sci Fi) a few years back is reality now. Wireless fidelity (Wi Fi) is the term used for the alphabet soup of protocols that underlies wireless connectivity.

There are two aspects of connectivity: one is connecting devices within a closed environment, the other is connecting them to the Internet. Protocols like 802.11 provide broadband connectivity within a local area network, offering speeds of up to 54 Mbps.
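The 54 Mbps quoted for 802.11 is a nominal link rate, not the throughput a user sees, since protocol overhead consumes part of it. A back-of-envelope calculation makes the figure concrete; the 50% usable fraction and the 100 MB file size below are illustrative assumptions, not figures from the article.

```python
# Back-of-envelope: time to transfer a file over an 802.11 link with a
# nominal 54 Mbps rate. The usable fraction of the link rate is an
# assumption for illustration; real overhead varies with conditions.

link_rate_mbps = 54                      # nominal 802.11 link rate
effective_mbps = link_rate_mbps * 0.5    # assumed usable fraction (~27 Mbps)
file_size_mb = 100                       # megabytes
file_size_megabits = file_size_mb * 8    # 1 byte = 8 bits

seconds = file_size_megabits / effective_mbps
print(round(seconds, 1))  # ~29.6 s for 100 MB at ~27 Mbps effective
```

Even with generous overhead, a LAN-scale transfer completes in well under a minute, which is why the article treats Wi Fi as a credible broadband substitute.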

The advantage that wireless connectivity has over other forms of Internet connectivity, like fibre or wired, is that it adds the mobility dimension to all devices. Rollout is faster, grabbing new subscribers is easy, it is convenient for the mobile worker, and it is the easiest way to connect home networks. Moreover, the spectrum used for connectivity is free and unregulated, so capital costs do not include spectrum fees, unlike third-generation licences. Secondly, with the number of devices in homes multiplying, there is a need to connect them, and wireless offers the best solution for doing so.

There are no service companies in India targeting this space, as broadband services are yet to take off. Technology companies like Wipro are working on solutions like gateways for home networks; Wipro is investing in R & D to build products specifically for home networks built around wireless connectivity. The gateway is supposed to offer a single entry point for all the devices in a home network to connect to the Internet.

Broadband players in the country are still busy digging ditches and stuffing them with plastic pipes, and have not yet "lit the fibre". But broadband connectivity has other routes, and Wi Fi is the latest one: no digging of ditches, no fibre to deploy. Some ISPs in congested areas are using fixed wireless over laser to connect two buildings, but none of them has started using 802.11 standards.

Wi Fi is also seen as a substitute for third generation (3G) mobile wireless services. As a LAN technology, it will be deployed in pockets of high usage such as airports, cafes, bars and offices.

There is no doubt that the WLAN market is booming. Worldwide sales of WLAN equipment increased by 80 per cent last year, to more than US $1 billion, and will approach US $3.2 billion by the end of 2005, according to analyst firm IDC. Traditionally, WLANs have seen greatest acceptance in vertical markets, such as health care, inventory control and warehousing, where companies could justify high equipment and integration costs because the applications provide a clear ROI (return on investment). But in the past year, use has expanded into horizontal markets, including mainstream businesses, homes and educational environments.
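IDC's two figures imply a compound annual growth rate that can be checked with one line of arithmetic. The US $1.0 billion base and the four-year horizon (2001 to end-2005, matching the article's dateline) are assumptions read into the reported numbers, not stated by IDC.

```python
# Implied compound annual growth rate (CAGR) from the IDC figures cited
# in the article. Assumes a US $1.0 billion base in 2001 growing to
# US $3.2 billion by end-2005, i.e. four compounding years.

start_usd_bn = 1.0
end_usd_bn = 3.2
years = 4

cagr = (end_usd_bn / start_usd_bn) ** (1 / years) - 1
print(round(cagr * 100, 1))  # ~33.7% per year
```

A market sustaining roughly a third more revenue every year is what the article means by "booming", even though it is slower than the 80 per cent jump reported for the single most recent year.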

K Yatish Rajawat
-- Reproduced from The Economic Times 17 Nov 2001

Information Today & Tomorrow, Vol. 21, No. 2, June 2002, p.20-p.26 & p.32