This Nanotechnology Center was being built in the spring of 1990, as Eric Drexler was midway through a hectic eight-day trip, giving talks on nanotechnology to researchers and visiting dozens of university and consortium research laboratories. A Japanese research society had sponsored the trip, and the Ministry of International Trade and Industry (MITI) had organized a symposium on molecular machines and nanotechnology around the visit. Japanese research was forging ahead, aiming to develop "new modes of science and technology in harmony with nature and human society, a new technology for the twenty-first century."
In the early hours of 26 December 2004, the world witnessed one of the most devastating natural disasters in recent times, causing the death of nearly 80,000 people. A tsunami is a powerful, fast-moving wave caused by an undersea disturbance. If a sufficient warning system had been in place to give early indications, much of this destruction could certainly have been avoided. With present technology, even a slight undersea disturbance can be detected by special detectors placed on the sea floor. These signals are picked up by a surface buoy, which sends the data to a satellite for further distribution to ground stations.
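The detection step described above can be sketched as a simple anomaly check on sea-floor pressure readings. This is only an illustration: the window size, threshold, and sample values below are invented, and real systems (such as NOAA's DART buoys) compare against a predicted-tide model rather than a running average.

```python
# Toy sketch of a bottom-pressure tsunami detector. The window size and
# 3 cm threshold are hypothetical, chosen only for illustration.

def detect_anomaly(readings, window=4, threshold_m=0.03):
    """Flag readings whose deviation from the recent average exceeds
    `threshold_m` metres of equivalent water-column height."""
    alerts = []
    for i in range(window, len(readings)):
        baseline = sum(readings[i - window:i]) / window
        if abs(readings[i] - baseline) > threshold_m:
            alerts.append(i)
    return alerts

# Simulated water-column heights (metres): steady, then a sudden jump.
samples = [5000.00, 5000.01, 4999.99, 5000.00, 5000.01, 5000.06, 5000.07]
print(detect_anomaly(samples))  # → [5, 6]
```

In a deployed system the flagged indices would trigger the buoy to switch into a high-rate reporting mode toward the satellite link.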
Can viruses infect mobile phones? It is a million-dollar question that has had no definitive answer so far. The likelihood is increasing day by day, and recent hoax viruses and spam SMS worms are evidence of it. Owing to the financial stakes in wireless communications, this subject is now under thorough research.
This system is used in hospitals because patients in the ICU need constant monitoring of their vital parameters: respiration, temperature, saline status and ECG. Our project is a working model that incorporates sensors to measure all these parameters and transfers the readings to a computer, so that the patient's condition can be analyzed by doctors anywhere in the hospital. This reduces the doctors' workload and also gives more accurate results whenever an abnormality is felt by the patient. We have also incorporated a saline monitoring system that sounds an alarm when the saline bottle is about to empty.
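The saline-alarm logic can be sketched in a few lines. The bottle volume and alarm fraction below are assumed values, not taken from the project itself:

```python
# Hypothetical saline-monitor logic: raise an alarm when the measured
# bottle level drops below a configurable fraction of its full volume.

FULL_VOLUME_ML = 500       # assumed bottle size
ALARM_FRACTION = 0.10      # alarm at 10% remaining (assumed)

def saline_alarm(level_ml):
    """Return True when the remaining saline is below the alarm level."""
    return level_ml < FULL_VOLUME_ML * ALARM_FRACTION

for level in (320, 120, 45):
    print(level, "ml ->", "ALARM" if saline_alarm(level) else "ok")
```

In the working model, the same comparison would run on each sampled sensor reading before the value is forwarded to the monitoring computer.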
Electronics and computational techniques are increasingly being used to analyze biological cells, diagnose diseases and develop methods to cure diseases inside the body. One such technology is nanotechnology. This paper emphasizes the best and most effective utilization of nanotechnology in the treatment of cancer. The design of the nanodevice is based on the continuing study of cancer cells and nanotechnology.
Wireless communication and networking on a large scale are carried out through satellites. Interplanetary communication is also made possible by satellite technology. The various kinds of space debris, or space junk, revolving around the Earth in orbit pose a threat to satellite communications. In fact, these particles, whether tiny or large, may in the near future destroy the whole system of satellites and space stations orbiting our globe. This deadly debris must be eliminated in order to retain full access to our Earth-orbiting satellites. The light pressure exerted by laser beams provides a valuable tool for eliminating these particles. The following section gives an idea of how to reduce and eliminate space debris.
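The magnitude of laser light pressure follows from the standard radiation-pressure relation, F = P/c for a fully absorbing target and 2P/c for a perfectly reflecting one. A quick order-of-magnitude sketch, with an assumed 20 kW beam power:

```python
# Radiation-pressure force on a debris fragment. The beam power is an
# assumed example value; only the formula F = P/c (or 2P/c) is standard.

C = 299_792_458.0  # speed of light, m/s

def radiation_force(power_w, reflective=False):
    """Force in newtons from a beam of `power_w` watts hitting the target.
    F = P/c for full absorption, 2P/c for perfect reflection."""
    return (2.0 if reflective else 1.0) * power_w / C

# A hypothetical 20 kW beam fully absorbed by a fragment:
print(radiation_force(20e3))  # roughly 6.7e-5 N
```

The forces involved are tiny, which is why proposed schemes rely on sustained or repeated illumination to perturb a fragment's orbit until it re-enters the atmosphere.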
In the present-day scenario of embedded applications in wireless technologies, Smart Dust mote sensors are exploring the limits of autonomous sensing and communication by packing an entire system into a cubic millimetre at relatively low cost. These volumetric constraints translate into energy constraints on the system. Therefore, the mote's "intelligence" must operate on the absolute minimum energy while providing the necessary features. The mote can be partitioned into four subsystems:
- Sensors and analog signal conditioning
- Power system
- Transceiver front end
- The core
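A back-of-envelope energy budget shows why every subsystem must run at minimum power. All the figures below are illustrative assumptions, not measured mote values:

```python
# Hypothetical energy budget for a millimetre-scale mote. The battery
# capacity and per-subsystem draws are assumed numbers for illustration.

BATTERY_J = 1.0                  # ~1 joule stored in a mm-scale cell (assumed)
subsystem_uw = {                 # average power draw, microwatts (assumed)
    "sensors_and_analog": 1.0,
    "power_system": 0.2,
    "transceiver": 2.0,
    "core": 0.5,
}

total_w = sum(subsystem_uw.values()) * 1e-6
lifetime_days = BATTERY_J / total_w / 86_400
print(f"{lifetime_days:.1f} days at continuous operation")
```

Even at a few microwatts, continuous operation exhausts the battery within days, which is why real motes duty-cycle aggressively, sleeping most of the time and waking only to sense or transmit.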
Conventional register transfer level (RTL) debugging is based on overlaying simulation results on the structural connectivity information of the hardware description language (HDL) source. This process is helpful in locating errors but does little to help designers reason about the how and why. Designers usually have to build a mental image of how data is propagated and used over the simulation run. As designs grow more complex, there is a need to facilitate this reasoning process and automate debugging. In this paper, we present innovative debug techniques to address this shortage of adequate facilities for reasoning about behavior and debugging errors. Our approach delivers significant technology advances in RTL debugging; it is the first comprehensive and methodical approach of its kind that extracts, analyses, traces, explores and queries a design's multi-cycle temporal behavior. We show how our automatic tracing scheme can shorten debugging time by orders of magnitude for unfamiliar designs. We also demonstrate how the advanced debug techniques reduce the number of regression iterations.
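The structural half of such tracing can be illustrated with a toy fan-in walk: starting from a failing signal, collect every signal that could have driven it. The netlist here is hypothetical, and the abstract's actual techniques go much further (temporal, multi-cycle analysis); this sketch shows only the kind of connectivity reasoning that gets automated:

```python
# Toy fan-in cone traversal over a hypothetical netlist: given a failing
# signal, breadth-first search collects all candidate driver signals.

from collections import deque

fanin = {                     # signal -> signals that drive it (assumed)
    "out": ["sum", "carry"],
    "sum": ["a", "b"],
    "carry": ["a", "b", "cin"],
}

def trace_cone(start):
    """Return every signal in the fan-in cone of `start`, sorted."""
    seen, queue = set(), deque([start])
    while queue:
        sig = queue.popleft()
        for drv in fanin.get(sig, []):
            if drv not in seen:
                seen.add(drv)
                queue.append(drv)
    return sorted(seen)

print(trace_cone("out"))  # → ['a', 'b', 'carry', 'cin', 'sum']
```

A temporal tracer repeats this walk per clock cycle, pruning drivers whose simulated values could not have produced the observed failure.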
Cosmologists have now accepted the notion of the big bang as the beginning of the universe. A leading contender among theories is the inflation theory, in which the universe briefly expands at speeds orders of magnitude greater than the speed of light. The Internet, with its appetite for bandwidth, is currently expanding at a rate that until recently economists considered impossible. The notion that the Internet would overtake the voice network, first in terms of bandwidth usage, and then in terms of profit and revenue, is now well accepted. Some of the theories about the Internet's expansion in the coming 2–5 years seem analogous to cosmology's inflation theory in that prior notions of the laws of physics must be suspended or modified in order to support them. Despite initial impressions that one may draw from the attempt at humour in this unusual analogy, this paper provides a serious look at the problem of scaling the Internet by one or more decimal orders of magnitude. Protocol scalability is examined, considering the potential roles of optical switching and the new class of terabit routers.
Biometrics is a technology that automatically identifies or verifies an individual based on physiological or behavioural characteristics. This is accomplished by using computer technology in a non-invasive way to match patterns from live individuals in real time against enrolled records. Commonly used biometric techniques include recognition of faces, hands, fingers, signatures, voices, fingerprints and irises for a person's identification and authentication. This paper discusses iris recognition as a biometric technique in detail.
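The matching step in iris recognition is commonly done by comparing binary iris codes with a normalised Hamming distance (the Daugman-style approach). The codes below are toy values, and real codes run to thousands of bits; the ~0.32 threshold is a typical decision point reported in the literature:

```python
# Sketch of iris matching via normalised Hamming distance between binary
# iris codes. The 8-bit codes here are toy values for illustration only.

def hamming_distance(code_a, code_b):
    """Fraction of bit positions where the two codes disagree."""
    assert len(code_a) == len(code_b)
    diff = sum(a != b for a, b in zip(code_a, code_b))
    return diff / len(code_a)

MATCH_THRESHOLD = 0.32       # typical decision point in the literature

enrolled = [1, 0, 1, 1, 0, 0, 1, 0]
probe    = [1, 0, 1, 0, 0, 0, 1, 0]
hd = hamming_distance(enrolled, probe)
print(hd, "match" if hd < MATCH_THRESHOLD else "no match")  # → 0.125 match
```

Because genuine and impostor distance distributions barely overlap at realistic code lengths, a single fixed threshold suffices for the accept/reject decision.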