Thursday, July 31, 2008
Surface Computer
Microsoft is launching a new system called the surface computer. A surface computer is able to recognize physical objects, from a paintbrush to a cell phone, and allows hands-on, direct control of content such as photos, music and maps. Surface turns an ordinary tabletop into a dynamic surface that provides interaction with all forms of digital content through natural gestures, touch and physical objects.
What it was, as the world found out yesterday, was a new touch-screen computer. The 30-inch screen sits 21 inches off the ground, as though it were the top of a fancy coffee table. You manipulate objects on the screen with your fingers: drag virtual photos to sort them, flick an on-screen globe to spin it, and so on. This new "surface computer," as Microsoft calls it, has a multi-touch screen. You can use two fingers or even more; for example, you can drag two corners of a photograph outward to zoom in on it. (See the video.)
Next, you can set a cell phone down on the table and copy photos into it just by dragging them into the cell phone's zone. The same procedure works for a digital camera, too.
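The pinch-to-zoom gesture described above boils down to a simple ratio: the photo is scaled by how far apart the two touch points have moved compared with where they started. Here is a minimal sketch of that calculation in Python; the function names and the two-point input format are my own illustrative assumptions, not Microsoft Surface API calls.

import math

def distance(p1, p2):
    # Straight-line distance between two touch points given as (x, y).
    return math.hypot(p2[0] - p1[0], p2[1] - p1[1])

def pinch_zoom_scale(start_touches, current_touches):
    # start_touches / current_touches: pairs of (x, y) points where the
    # two fingers began and where they are now (hypothetical input format).
    start_dist = distance(*start_touches)
    current_dist = distance(*current_touches)
    if start_dist == 0:
        return 1.0  # fingers started on the same spot; no zoom to compute
    return current_dist / start_dist

# Example: fingers start 100 px apart and spread to 250 px apart,
# so the photo should be drawn at 2.5x its previous size.
print(pinch_zoom_scale(((100, 100), (200, 100)),
                       ((50, 100), (300, 100))))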
Monday, July 28, 2008
Supercomputer
A supercomputer is a computer that is at the frontline of processing capacity, particularly speed of calculation, at the time of its introduction. The term "super computing" was first used by the New York World newspaper in 1929. The supercomputers introduced in the 1960s were designed primarily by Seymour Cray at Control Data Corporation (CDC) and led the market into the 1970s, until Cray left to form his own company, Cray Research. He then took over the supercomputer market with his new designs, holding the top spot in supercomputing for five years (1985–1990). Cray himself never used the word "supercomputer"; a little-remembered fact is that he only recognized the word "computer".
A supercomputer generates large amounts of heat and must be cooled. Cooling most supercomputers is a major HVAC problem.
Information cannot move faster than the speed of light between two parts of a supercomputer. For this reason, a supercomputer that is many meters across must have latencies between its components measured at least in the tens of nanoseconds. Seymour Cray's supercomputer designs attempted to keep cable runs as short as possible for this reason: hence the cylindrical shape of his Cray range of computers. In modern supercomputers built of many conventional CPUs running in parallel, latencies of 1-5 microseconds to send a message between CPUs are typical.
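That per-message latency between CPUs is typically measured with a "ping-pong" test: two processes bounce a tiny message back and forth and divide the total time by the number of one-way hops. Below is a rough sketch using Python and the mpi4py library; this is my choice for illustration, since production benchmarks on real supercomputers would normally be written in C or Fortran against MPI directly.

# Ping-pong latency sketch (run with: mpiexec -n 2 python pingpong.py)
from mpi4py import MPI

comm = MPI.COMM_WORLD
rank = comm.Get_rank()
msg = bytearray(8)   # tiny message, so timing is dominated by latency, not bandwidth
reps = 10000

comm.Barrier()
start = MPI.Wtime()
for _ in range(reps):
    if rank == 0:
        comm.Send(msg, dest=1)
        comm.Recv(msg, source=1)
    elif rank == 1:
        comm.Recv(msg, source=0)
        comm.Send(msg, dest=0)
elapsed = MPI.Wtime() - start

if rank == 0:
    # Each repetition is one round trip, i.e. two one-way messages.
    print("one-way latency: %.2f microseconds" % (elapsed / (2 * reps) * 1e6))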
Supercomputers consume and produce massive amounts of data in a very short period of time. According to Ken Batcher, "A supercomputer is a device for turning compute-bound problems into I/O-bound problems." Much work on external storage bandwidth is needed to ensure that this information can be transferred quickly and stored/retrieved correctly.
Technologies developed for supercomputers include:
- Vector processing (see the sketch after this list)
- Liquid cooling
- Non-Uniform Memory Access (NUMA)
- Striped disks (the first instance of what was later called RAID)
- Parallel filesystems
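To illustrate the idea behind vector processing from the list above: instead of looping over elements one at a time, a vector unit applies the same operation to whole arrays of data at once. The sketch below uses Python with NumPy purely as an analogy of my own choosing; real vector machines did this in hardware rather than in a library.

import time
import numpy as np

n = 1_000_000
a = np.random.rand(n)
b = np.random.rand(n)

# Scalar style: one multiply-add per loop iteration.
start = time.time()
c_scalar = [a[i] * 2.0 + b[i] for i in range(n)]
scalar_time = time.time() - start

# Vector style: the same multiply-add expressed over whole arrays, which
# NumPy executes in tight compiled loops (and which a vector machine
# would map onto hardware vector instructions).
start = time.time()
c_vector = a * 2.0 + b
vector_time = time.time() - start

print("scalar loop: %.3f s, vectorized: %.3f s" % (scalar_time, vector_time))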
Programming
The parallel architectures of supercomputers often dictate the use of special programming techniques to exploit their speed. Special-purpose Fortran compilers can often generate faster code than C or C++ compilers, so Fortran remains the language of choice for scientific programming, and hence for most programs run on supercomputers.
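As a toy illustration of one such technique, splitting a numerical job across parallel workers, here is a sketch in Python (again my own choice for readability; real supercomputer codes would typically use Fortran or C with MPI or OpenMP). It estimates pi by numerical integration, dividing the interval among processes.

# Toy parallel computation: estimate pi, splitting the work across processes.
import os
from multiprocessing import Pool

def partial_sum(args):
    # Midpoint-rule integration of 4 / (1 + x^2) over one slice of [0, 1].
    start, end, step = args
    return sum(4.0 / (1.0 + ((i + 0.5) * step) ** 2) for i in range(start, end)) * step

if __name__ == "__main__":
    n = 10_000_000
    workers = os.cpu_count() or 4
    step = 1.0 / n
    chunk = n // workers
    slices = [(w * chunk, n if w == workers - 1 else (w + 1) * chunk, step)
              for w in range(workers)]
    with Pool(workers) as pool:
        pi = sum(pool.map(partial_sum, slices))
    print("pi is approximately", pi)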
Saturday, July 26, 2008
Artificial intelligence
Artificial intelligence (AI) is both the intelligence of machines and the branch of computer science which aims to create it.
Artificial intelligence has successfully been used in a wide range of fields including medical diagnosis, stock trading, robot control, law, scientific discovery and toys. Frequently, when a technique reaches mainstream use it is no longer considered artificial intelligence, sometimes described as the AI effect. It may also become integrated into artificial life.
The broad classes of outcome for an AI test are:
optimal: it is not possible to perform better
strong super-human: performs better than all humans
super-human: performs better than most humans
sub-human: performs worse than most humans
For example, performance at checkers is optimal, performance at chess is super-human and nearing strong super-human, and performance at many everyday tasks performed by humans is sub-human.
Robotics
Advances in technology have been astounding over the last decade. Electronics are starting to be built into everything from vacuums to toothbrushes and slowly, but surely, computers will become an invaluable part of every aspect of daily living. Someday, you'll be able to take a shower and the bathroom will not only detect you, it will adjust the height of the sink to your level and set the water temperature to 96 degrees, just the way you like it.
Taking care of the elderly will become much easier, as well. Remember Rosie from The Jetsons? She was able to cook and clean and even read their son, Elroy, a bedtime story. It may be a long time before we see a robot that we're actually able to communicate with, but robots that cook and clean are already in production. Think about your elderly loved ones. Instead of having to send them away to a retirement home, they'll be able to spend their remaining years at home with you, where they should be. And you won't have to worry about leaving them alone or making sure they're taking their medication.
This all may sound great, but can we really expect to see all this new technology anytime soon? The answer is yes, and no. Within our lifetimes, expect to see electronics built into just about everything. Expect to be able to use verbal commands to control most major household appliances, but don't expect them to be able to answer back with a witty remark until about 2050. We are in the electronic and digital age, but will soon be up against the robotic and nanotechnology age.
Nano Technology
Nanotechnology refers to a field of applied science and technology whose theme is the control of matter on the atomic and molecular scale.
One nanometer (nm) is one billionth, or 10⁻⁹, of a meter. To put that scale in context, the comparative size of a nanometer to a meter is the same as that of a marble to the size of the Earth. Or, another way of putting it: a nanometer is the amount a man's beard grows in the time it takes him to raise the razor to his face.
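A quick back-of-the-envelope check of that marble-to-Earth comparison, using rough round numbers that I am assuming (a marble of about 1 cm and the Earth's mean diameter of about 12,742 km):

# Sanity check: is marble-to-Earth roughly the same ratio as nanometer-to-meter?
marble_diameter_m = 0.01           # assumed: a typical marble is about 1 cm across
earth_diameter_m = 12_742_000.0    # Earth's mean diameter in meters
nanometer_m = 1e-9

print("marble / Earth    =", marble_diameter_m / earth_diameter_m)  # about 7.8e-10
print("nanometer / meter =", nanometer_m)                           # 1e-9, same order of magnitude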
Tera-scale computers, which are based on tens to hundreds of integrated processor cores, perform three workloads in model-based computing: recognition (finding an object or pattern in a database), data mining (searching the database for identical objects or related patterns that match the target object), and information synthesis (putting the mined data, patterns and objects together in a form the user can digest, so that useful solutions can be derived), as described by Vived De, a fellow at Intel.
Let us briefly compare, side by side, the main quad-core offerings from Intel and AMD. You can see that AMD's strategy is to turn its competitor's weakness into its own strength. AMD keeps mentioning its superior 2 MB L3 cache, touted as better than anything from its competitors, while Intel is in a competition with its own past achievements. Intel is staying away from a head-to-head game of chicken with AMD by producing new products much faster than its nearest competitor, even though it misses a few innovative hits like a new L3 cache.
However, Intel more than compensates by improving its microprocessors in other areas, such as SSE4 instructions and more efficient materials like hafnium metal-gate silicon technology. Intel's speed in getting new products into the consumer market is the main reason why it is the microprocessor market leader today. People usually want a new product they can use now rather than wait a few months for a marginally better one. AMD's chance to become the new market leader will come from its ability to do the same as Intel: not only talk the talk but also walk the walk.
The Dunnington six-core microprocessors from Intel are the next in line to be released to the consumer market. Users of the Dunnington microprocessors will own a piece of petaflop-level computing technology. Petaflop computing, meaning a computer that performs a thousand trillion floating-point operations per second, already exists in Intel's lab in the form of a demo of a powerful petaflop machine built from Dunnington microprocessors.
It seems that, from a marketing point of view, Intel is using its research and development arm to show consumers the full capability of its microprocessors. It is unlikely that an individual home user will buy 60 or more Dunnington microprocessors to build a petaflop home computer. Nonetheless, the appeal exists, and people will be able to say, "You know, I own a six-core Intel microprocessor that has petaflop computing potential!" One must admit that it is a very appealing prospect!
Tuesday, July 22, 2008
Faster Browsing in Windows Explorer on Network Computers
By default, a Windows XP machine connecting to a Windows 95/98/Me computer will search for scheduled tasks or enabled printers on the remote computer. Two sub-keys control this behavior, and deleting them will speed up browsing of the remote computer.

[Start] [Run] [Regedit]

Go to: HKEY_LOCAL_MACHINE\Software\Microsoft\Windows\CurrentVersion\Explorer\RemoteComputer\NameSpace

To disable scheduled task checking:
Value Name: {D6277990-4C6A-11CF-8D87-00AA0060F5BF}
Delete the sub-key in the left pane of Registry Editor.

To disable printer checking:
Value Name: {2227A280-3AEA-1069-A2DE-08002B30309D}
Delete the sub-key in the left pane of Registry Editor.

Exit the Registry Editor and reboot.
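For anyone who prefers to script the change rather than click through Regedit, here is a minimal Python sketch using the standard winreg module; the two GUIDs are the same sub-keys listed above. Back up the registry first and treat this as an illustration, not a polished tool.

# Deletes the two Explorer RemoteComputer NameSpace sub-keys described above.
# Run with administrator rights on the Windows machine; back up the registry first.
import winreg

BASE = r"Software\Microsoft\Windows\CurrentVersion\Explorer\RemoteComputer\NameSpace"
SUBKEYS = [
    "{D6277990-4C6A-11CF-8D87-00AA0060F5BF}",  # scheduled task checking
    "{2227A280-3AEA-1069-A2DE-08002B30309D}",  # printer checking
]

for guid in SUBKEYS:
    try:
        winreg.DeleteKey(winreg.HKEY_LOCAL_MACHINE, BASE + "\\" + guid)
        print("deleted", guid)
    except FileNotFoundError:
        print("already absent:", guid)
    except PermissionError:
        print("permission denied (run as administrator):", guid)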