Friday, August 29, 2008

Nanotechnology in Cancer

Nanoscale devices are anywhere from one hundred to ten thousand times smaller than human cells. They are similar in size to large biological molecules ("biomolecules") such as enzymes and receptors. As an example, hemoglobin, the molecule that carries oxygen in red blood cells, is approximately 5 nanometers in diameter. Nanoscale devices smaller than 50 nanometers can easily enter most cells, while those smaller than 20 nanometers can move out of blood vessels as they circulate through the body.


Because of their small size, nanoscale devices can readily interact with biomolecules both on the surface of cells and inside them. By gaining access to so many areas of the body, they have the potential to detect disease and deliver treatment in ways unimagined before now. And because biological processes, including the events that lead to cancer, occur at the nanoscale on and inside cells, nanotechnology offers a wealth of tools that provide cancer researchers with new and innovative ways to diagnose and treat cancer.

Steganography

Steganography works by replacing bits of useless or unused data in regular computer files (such as graphics, sound, text, HTML, or even floppy disks) with bits of different, invisible information. This hidden information can be plain text, cipher text, or even images.
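
To make the bit-replacement idea concrete, here is a minimal sketch in Python of least-significant-bit (LSB) steganography in an image. The Pillow library, the file names, and the 4-byte length prefix are illustrative choices for this sketch, not features of any particular tool.

```python
# A minimal LSB steganography sketch (assumes Pillow: pip install Pillow).
# The file names and the 4-byte length prefix are illustrative choices.
from PIL import Image

def embed(cover_path, out_path, message):
    """Hide `message` in the least significant bit of each pixel byte."""
    img = Image.open(cover_path).convert("RGB")
    data = bytearray(img.tobytes())
    # Prefix the message with its length so the reader knows when to stop.
    payload = len(message).to_bytes(4, "big") + message.encode("utf-8")
    bits = "".join(f"{byte:08b}" for byte in payload)
    if len(bits) > len(data):
        raise ValueError("cover image too small for this message")
    for i, bit in enumerate(bits):
        data[i] = (data[i] & 0xFE) | int(bit)   # replace the lowest bit
    Image.frombytes("RGB", img.size, bytes(data)).save(out_path)

def extract(stego_path):
    """Read the hidden message back out of the pixel bytes."""
    data = Image.open(stego_path).convert("RGB").tobytes()
    bits = "".join(str(b & 1) for b in data)
    length = int(bits[:32], 2)                   # 4-byte length prefix
    chars = bits[32:32 + 8 * length]
    return bytes(int(chars[i:i + 8], 2)
                 for i in range(0, len(chars), 8)).decode("utf-8")

embed("cover.png", "stego.png", "meet at dawn")
print(extract("stego.png"))  # -> meet at dawn
```

Note that a lossless format such as PNG matters here: saving to a lossy format like JPEG would recompress the pixels and destroy the hidden bits.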


Steganography is sometimes used when encryption is not permitted. Or, more commonly, steganography is used to supplement encryption: a file may be encrypted and then hidden with steganography, so even if the hidden data is discovered and deciphered, the message itself may still go unnoticed.
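
As a sketch of that layering, the message can be encrypted first and only the ciphertext hidden. The `cryptography` package and the `embed`/`extract` helpers from the sketch above are assumptions made for illustration.

```python
# Encrypt first, then hide the ciphertext (assumes the `cryptography`
# package and the embed/extract helpers from the previous sketch).
from cryptography.fernet import Fernet

key = Fernet.generate_key()            # shared secret, exchanged out of band
token = Fernet(key).encrypt(b"meet at dawn")

# Even if the LSB payload is discovered, it is still only ciphertext.
embed("cover.png", "stego.png", token.decode("ascii"))
recovered = Fernet(key).decrypt(extract("stego.png").encode("ascii"))
print(recovered)  # -> b'meet at dawn'
```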


Special software is needed for steganography, and there are freeware versions available at any good download site.


Steganography (literally meaning covered writing) dates back to ancient Greece, where common practices consisted of etching messages in wooden tablets and covering them with wax, and tattooing a shaved messenger's head, letting his hair grow back, then shaving it again when he arrived at his contact point.

Internet protocol


Background of Internet Protocols

In the mid-1970s, the Defense Advanced Research Projects Agency (DARPA) became interested in establishing a packet-switched network to provide communications between research institutions in the United States. DARPA and other government organizations understood the potential of packet-switched technology and were just beginning to face the problem virtually all companies with networks now have: communication between dissimilar computer systems.

With the goal of heterogeneous connectivity in mind, DARPA funded research by Stanford University and Bolt, Beranek, and Newman (BBN) to create a series of communication protocols. The result of this development effort, completed in the late 1970s, was the Internet Protocol suite, of which the Transmission Control Protocol (TCP) and the Internet Protocol (IP) are the two best known.

The Internet protocols can be used to communicate across any set of interconnected networks. They are equally well suited for local-area network (LAN) and wide-area network (WAN) communications. The Internet suite includes not only lower-layer specifications (like TCP and IP) but also specifications for such common applications as mail, terminal emulation, and file transfer.
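
As a small illustration of that layering, the Python sketch below sends a line of text over TCP, which in turn rides on IP, using the standard socket API. The port number and the message are arbitrary values chosen for the example.

```python
# A minimal TCP/IP round trip using Python's standard library sockets.
# The port number and message are arbitrary values chosen for the example.
import socket
import threading

server = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
server.bind(("127.0.0.1", 5400))        # an IP address plus a TCP port
server.listen(1)

def echo_once():
    conn, _ = server.accept()           # completes the TCP three-way handshake
    with conn:
        conn.sendall(conn.recv(1024))   # echo the bytes straight back

threading.Thread(target=echo_once, daemon=True).start()

client = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
client.connect(("127.0.0.1", 5400))     # IP routes the packets, TCP orders them
client.sendall(b"hello, internet suite")
print(client.recv(1024))                # -> b'hello, internet suite'
client.close()
server.close()
```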

Tuesday, August 26, 2008

PC HARDWARE

Tools define our culture. We aren't so much what we make as what we use to make it. Even barbarians can make holes with a bow and shaft; we have drill presses, hydraulic punches, and lasers. More importantly, the development of tools defines civilization. No culture is considered civilized unless it uses the latest tools. The PC is the tool that defines today's current age and culture.
Once a tool only for initiates and specialists, today the PC has become as common as, well, all those monitors sitting on office desks and overweight attache cases bulging with keyboard-screen-and-battery combinations. The influence and infiltration of the PC stretches beyond comparison with any other modern tool, even beyond the reach of common metaphors. No office machine is as common; none so well used; none so revered—and so often reviled. Unlike the now nearly forgotten typewriter that was restricted to secretaries and stenographic pools, the PC now resides resplendently on once bare drafting tables, executive desks, and kitchen counters. Unlike fax machines, calculators, and television sets, the PC doesn't do just one thing but meddles in nearly everything you do at work and at home. Unlike your telephone, pager, or microwave oven, the PC isn't something that you use and take for granted, it's something you wonder about, something you want to improve and expand, perhaps even something that you would like to understand.
Indeed, to use any tool effectively you have to understand it—what it can do, how it works, how you can use it most effectively. Making the most of your PC demands that you know more than how to rip through the packing tape on the box without lacerating your palms. You cannot just drop it on your desk, stand back, and expect knowledge to pour out as if you had tapped into a direct line to a fortune cookie factory.
Unfortunately, despite the popularity of the PC, the machine remains a mystery to too many people; for most of them, the only thing more baffling is programming a VCR. Everyone knows that something happens between the time your fingers press down on the keys and a letter pops up on the screen, or a page curls out of the printer, or a sound never heard before by human ears shatters the cone of your multimedia loudspeakers. Something happens, but that something seems beyond human comprehension.

Saturday, August 9, 2008

Advanced technology

Cell processor



Mainstream processor development is mostly targeted at compatibility and continuity. As a result, the processor market has been dominated by x86-compatible CPUs for more than two decades. Several new concepts have tried to gain market share, but none could overtake the old compatibility-driven designs. A group of three companies is trying another way into the market with a new idea: the Cell design. The Cell processor is a new attempt to leverage the increasing number of transistors per die in an efficient way. The new processor is targeted at the game console and consumer electronics market to enhance the quality of these devices, which should lead to wide adoption, with two or more Cell processors in every TV, game console, or PDA. This paper gives a short overview of the architecture and several programming ideas that help exploit the full processing power of the Cell processor. Above all, it explains the advances in Cell processors and their applications in modern life; it also covers the design specifications, hardware, and software of the Cell processor, and explains how the Cell represents a remarkable leap in computing technology that is now debuting in the consumer market.
Cell provides a breakthrough solution by adopting a flexible parallel and distributed computing architecture consisting of independent, multi-core floating-point processors for rich media processing. With the capability to support multiple operating systems, Cell can run both PC/workstation operating systems and real-time CE/game operating systems at the same time. The scalability offered by Cell can be utilized for broader applications, from small digital CE systems within the home, to entertainment applications such as rendering movies, to big science applications such as supercomputers.
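
The scatter-compute-gather style this architecture encourages, a control core handing chunks of media data to parallel worker cores, can be imitated in miniature. The sketch below is only an analogy using Python's multiprocessing pool, with an invented brightness-adjustment workload; it is not actual Cell (PPE/SPE) code.

```python
# A toy analogy of the Cell model: one control flow (like the PPE) splits
# a media-style workload into chunks and farms them out to parallel
# workers (playing the role of SPEs). The workload is invented.
from multiprocessing import Pool

def brighten(chunk):
    """Per-worker kernel: a simple data-parallel transform on one chunk."""
    return [min(255, pixel + 40) for pixel in chunk]

if __name__ == "__main__":
    frame = list(range(256)) * 32                    # fake 8-bit pixel data
    chunks = [frame[i:i + 1024] for i in range(0, len(frame), 1024)]
    with Pool(processes=8) as spes:                  # Cell shipped with 8 SPEs
        parts = spes.map(brighten, chunks)           # scatter, compute, gather
    result = [p for part in parts for p in part]
    print(len(result), max(result))                  # -> 8192 255
```

On real Cell hardware the same shape appears with the PPE coordinating while each SPE pulls its chunk into local store via DMA, computes, and writes the result back.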
Blu-ray Disc
When the CD was introduced in the early '80s, it meant an enormous leap from traditional media. Not only did it offer a significant improvement in audio quality, its primary application, but its 650 MB storage capacity also meant a giant leap in data storage and retrieval. Later, the DVD spec used the same form factor as the CD, allowing for seamless migration to the next-generation format and offering full backwards compatibility. Now, in the new millennium, high-definition video demands a new solution. History proved that a significant 5-10x increase in storage capacity and the ability to play previous-generation formats are key elements for a new format to succeed. This new format has arrived with the advent of Blu-ray Disc, the only format that offers a considerable increase in storage capacity with its 25 to 50 GB data capacity. This allows for the next big application of optical media: the distribution and recording of high-definition video in the highest possible quality. In fact, no other proposed format can offer the data capacity of Blu-ray Disc, and no other format will allow for the same high video quality and interactive features to create the ultimate user experience. In the future, Blu-ray Discs capable of holding 200 GB of data on a single side, using six 33 GB data layers, will be available.
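
To put those capacities in perspective, a back-of-the-envelope calculation helps. The 25 Mbit/s HD video bitrate below is an assumed figure for illustration, not a number from the Blu-ray specification.

```python
# Back-of-the-envelope Blu-ray capacity arithmetic. The 25 Mbit/s HD
# video bitrate is an assumed figure chosen for illustration.
GB = 10**9            # optical-disc capacities are quoted in decimal GB
BITRATE = 25 * 10**6  # assumed HD video bitrate, in bits per second

for label, capacity_gb in [("single layer", 25), ("dual layer", 50),
                           ("six 33 GB layers", 6 * 33)]:
    seconds = capacity_gb * GB * 8 / BITRATE
    print(f"{label:>17}: {capacity_gb} GB ~ {seconds / 3600:.1f} h of HD video")
```

At that assumed bitrate, a single 25 GB layer holds roughly two hours of HD video, which is why the 5-10x jump over DVD matters for full-length high-definition movies.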
4G technology


The ever-increasing growth of user demand, the limitations of third-generation wireless mobile communication systems, and the emergence of new mobile broadband technologies on the market have brought researchers and industries to a thorough reflection on the fourth generation. Many prophetic visions have appeared in the literature presenting 4G as the ultimate boundary of wireless mobile communication without any limit to its potential, but in practical terms not giving any design rules and thus any definition of it. In this article we give a pragmatic definition of 4G derived from a new user-centric methodology that considers the user as the "cornerstone" of the design. In this way, we devise fundamental user scenarios that implicitly reveal the key features of 4G, which are then expressed explicitly in a new framework, the "user-centric" system, that describes the various levels of interdependency among them. This approach consequently contributes to the identification of the real technical step-up of 4G with respect to 3G. Finally, an example of a potential 4G application is also given in order to demonstrate the validity of the overall methodology.

Artificial Intelligence
Artificial Intelligence is a branch of science which deals with helping machines find solutions to complex problems in a more human-like fashion. This generally involves borrowing characteristics from human intelligence and applying them as algorithms in a computer-friendly way. A more flexible or a more efficient approach can be taken depending on the requirements, which influences how artificial the intelligent behaviour appears.
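
As a toy illustration of borrowing a human strategy, the sketch below applies trial-and-error improvement (hill climbing) to a tiny search problem. The target string and the scoring rule are invented for the example.

```python
# A toy example of the idea above: trial-and-error improvement, a very
# human strategy, expressed as a hill-climbing algorithm. The target
# string and scoring function are invented for illustration.
import random
import string

TARGET = "intelligence"

def score(guess):
    """Count matching characters: a crude stand-in for 'fitness'."""
    return sum(a == b for a, b in zip(guess, TARGET))

guess = "".join(random.choice(string.ascii_lowercase) for _ in TARGET)
while score(guess) < len(TARGET):
    i = random.randrange(len(TARGET))
    mutated = guess[:i] + random.choice(string.ascii_lowercase) + guess[i + 1:]
    if score(mutated) >= score(guess):   # keep changes that do not hurt
        guess = mutated
print(guess)  # -> intelligence
```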
AI is generally associated with Computer Science, but it has many important links with other fields such as Mathematics, Psychology, Cognition, Biology and Philosophy, among many others. Our ability to combine knowledge from all these fields will ultimately benefit our progress in the quest to create an intelligent artificial being.