Friday, September 5, 2008

Intel's New Chip

Has Intel's New Chip Architecture Finally Defeated AMD?

For a few years now AMD has dominated the performance PC market, but the release of Intel's new desktop chips has changed that virtually overnight. Intel's new Core 2 Duo chips do not simply outperform AMD's fastest chips; they leave them in the dust. Preliminary benchmark testing puts Intel's fastest chip, the Core 2 Extreme X6800, 17 percent ahead of AMD's Athlon 64 FX-62 on WorldBench 5, a test suite that includes a multitasking test in which a file is encoded with Windows Media Encoder while the tester browses the web with Mozilla. Even Intel's more affordable Core 2 Duo E6700 beats the Athlon 64 FX-62 by 12 percent. That is a huge margin in the microchip market, where generational gains of 10 percent or less are the norm.
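
The percentages here come straight from the benchmark scores. A minimal sketch in Python, using made-up scores purely for illustration (the published WorldBench numbers are not reproduced here):

    # Hypothetical scores; WorldBench 5 reports a score where higher is better.
    def percent_faster(score_a: float, score_b: float) -> float:
        """How much faster chip A is than chip B, as a percentage."""
        return (score_a - score_b) / score_b * 100.0

    x6800, fx62 = 140, 120   # made-up numbers for illustration
    print(f"{percent_faster(x6800, fx62):.0f}% faster")   # -> 17% faster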

Intel's Core 2 Duo line ranges from a 1.86 GHz E6300 to the 2.93 GHz Core 2 Extreme X6800. These chips are already available in Dell's XPS line of desktops, in Alienware's Area-51 and ALX series, and in ABS Ultimate computers. Many of these desktops with new chips are more affordable than you might think.

How did Intel do it? One of the major improvements in the new chip architecture is power consumption. While the Pentium Extreme Edition burned 135 watts, the new Core 2 Extreme X6800 burns only 75 watts, a 44 percent reduction in power draw. This means the new Core 2 Duo chips run a lot cooler. For hardcore gamers and those needing the ultimate performance, such a low base temperature allows for more stable overclocking, and ABS has proven this with the ABS Ultimate X9, which runs the Core 2 Extreme chip overclocked to 3.5 GHz for even stronger WorldBench 5 performance. The decrease in core temperature also means PC manufacturers will be able to build smaller, quieter systems using fewer fans.
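
The 44 percent figure is just the relative drop from 135 watts to 75 watts; a quick check of the arithmetic:

    # Verify the claimed power saving: (135 - 75) / 135 ~= 0.44.
    old_tdp_watts = 135   # Pentium Extreme Edition (figure from the article)
    new_tdp_watts = 75    # Core 2 Extreme X6800 (figure from the article)

    reduction = (old_tdp_watts - new_tdp_watts) / old_tdp_watts * 100.0
    print(f"Power reduction: {reduction:.0f}%")   # -> Power reduction: 44%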

Another major feature of Intel's new chip architecture is a wider, smarter execution engine, which makes the chip more adaptable and better able to deal with bottlenecks. A great deal of the performance increase comes from an extra issue slot: where the Pentium D could handle three instructions per clock cycle, the new Core 2 Duo can handle four. That extra width, along with a technique called macro-fusion, lets the chip treat certain pairs of instructions as if they were a single instruction. Intel has also eliminated another major bottleneck with its huge 4 MB cache. Though most dual-core chips allocate a fixed amount of cache to each core, the Core 2 Duo shares the entire cache between both cores and, remarkably, distributes it as needed to the core working on the more complex task. One core might use 3 MB of the cache to complete a difficult task while the other core uses the remaining 1 MB for a simpler, though necessary, task.
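
To make the shared-cache idea concrete, here is a toy model in Python. It is an illustration only: real hardware allocates individual cache lines on demand, not whole megabytes by formula.

    SHARED_L2_MB = 4.0   # the Core 2 Duo's shared L2 cache

    def split_shared_cache(demand0: float, demand1: float) -> tuple[float, float]:
        """Divide the shared cache between two cores in proportion to demand."""
        total = demand0 + demand1
        if total == 0:
            return SHARED_L2_MB / 2, SHARED_L2_MB / 2   # both idle: split evenly
        share0 = SHARED_L2_MB * demand0 / total
        return share0, SHARED_L2_MB - share0

    # Core 0 runs a heavy task, core 1 a light one: the busy core gets ~3 MB,
    # where a fixed allocation would cap each core at 2 MB.
    print(split_shared_cache(3.0, 1.0))   # -> (3.0, 1.0)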

So these developments raise the question: has Intel finally defeated AMD? It certainly looks that way for the remainder of 2006 and perhaps through 2007. In the meantime, AMD is attempting to stay competitive with some dramatic price cuts; the FX-62 has dropped in price by over $200. AMD also has a multi-socket initiative called "Torrenza", which will debut on servers, where multi-socket configurations are already common, but that is not its answer to the Core 2 Duo. AMD's next-generation desktop chip architecture, code-named "K8L", is due out in 2007, and no one really knows how it and the coming quad-core chips will perform. So while AMD is still in the race for the long run, for the next six months to a year Intel is the unquestioned leader in high-performance chips.


Laptop Computers Instigate Classroom Revolution!

Have you noticed that laptop computers are beginning to dictate what happens in the classroom? Mobile technology is also changing where learning takes place, especially as laptops enter the educational arena in ever-increasing numbers.

Yet not all educators are quick to jump on the notebook bandwagon. They are waiting to see whether laptop use is truly changing learning for the better. That's why schools such as Framingham State College in Massachusetts and the Myron B. Thompson Academy in Honolulu are being scrutinized: both have elected to use laptop computers as a major resource in their curriculum.

It must be noted that simply having laptop computer access does not change the learning process; the entire curriculum must change and merge technology with academics. Using a laptop is more than replacing paper and pencil with electronics. The whole relationship between teacher and students must change: teachers are no longer merely dispensers of knowledge but become facilitators of a learning process that now includes additional tools for communicating with students. This encourages greater student involvement, long recognized as a key to learning.

Professors actively engaged in education via technology see laptops as a way to give real-life experience of otherwise insipid theories. For instance, physics students can witness the decay of radioactive materials, math students can apply their knowledge of logic to projects that use spreadsheets, and English majors can evaluate peers' poetry and prose. With laptops available in the classroom, additional information and knowledge are readily at hand.

So what do students think of the laptop trend? Well, most find it invigorating. Rather than sitting through one-dimensional lectures, they can watch teachers draw on the Internet to show relevance to daily life. Out-of-date, static texts can be set aside in favor of breaking news stories, and so much more.

As schools show the link between laptops and learning effectiveness, the trend for laptop use will certainly grow.

For those opposed to the technological revolution taking place in today's classrooms, perhaps the words of Maria Montessori should be heeded.

".....If education is always to be conceived along the same antiquated lines of a mere transmission of knowledge, there is little to be hoped from it in the bettering of man's future....."

Acquaint Yourself With Computer Printers

A computer printer is a device that produces a hard copy of text or images stored in electronic form, generally on physical print media such as paper. Printers can be designed to support both local and network-connected users simultaneously.

Modern printers can also interface directly with electronic devices such as digital cameras. Printers that offer non-printing features as well are commonly known as Multi-Function Printers (MFPs) or Multi-Function Devices (MFDs); such a device integrates the functions of multiple devices into one. These printers are extremely useful for small businesses and home offices.

Unlike a traditional printer, a multi-function printer combines several devices in one: printer, scanner, photocopier, and fax machine. Beyond that, numerous other types of printers are widely available on the market. Let's take a look at them:

Laser Printer: A laser printer works in much the same fashion as a photocopier. It has a roller (drum) which is given an electrical charge, and a laser beam is scanned across it to remove the charge from selected portions. Toner powder sticks to the areas traced out by the laser and is then transferred from the roller to the paper. Finally, the toner is fused into the paper by a heater built into the printer. People generally prefer laser printers for their high-quality output and high speed.

Dot Matrix: A dot matrix printer has a print head that moves across the page and produces characters using a cluster of pins that press an inked ribbon against the paper, each impact creating a dot. Every character is built up the same way. Dot matrix printers are relatively cheap and durable, qualities that still attract businesses, which use them as invoice printers.

Ink Jet and Bubble Jet: These work in a manner similar to a dot matrix printer, except that the print head sprays liquid ink onto the page instead of pressing a dry ribbon against it. Compared with dot matrix printers, inkjet and bubble jet printers produce better image quality and run faster, and they are popular as lower-cost alternatives to laser printers.

From inkjets to monochrome and color lasers, different printers are designed to accomplish different tasks. Companies like Dell, Canon, Lexmark, Brother, Epson, and HP (Hewlett-Packard) are the most preferred when it comes to buying a printer. Nowadays, computer printer support is widely available on the Internet, which can save you a trip to an expensive technician for troubleshooting.

Computer Consulting 101 Hiring Tips

Does your business need the services of a computer consulting firm? Before you rush out and hire the first techie or slick salesperson who knocks on your door, be sure to consider these favorite Computer Consulting 101 hiring tips for screening and interviewing local computer consulting firms. In this first part of a two-part series, we'll look at the root of the problem, as well as the four most basic criteria that you'll need to ask about when searching for computer consulting vendors.

Computer Consulting 101 Preventative Medicine

Many small business owners have a tough time knowing how to deal with difficult computer consultants. However, if you're able to uncover potential problems at the start of your computer consultant/client relationship, you can avoid many of these unpleasant issues altogether.

Root of the Problem

While most entrepreneurs and small business managers know exactly what to ask when it comes to hiring for internal staff positions, hiring a computer consulting firm can be more difficult.

So on top of dealing with the myriad legal issues surrounding how you retain the services of contractors, as opposed to hiring employees on your payroll (best advice: consult with your attorney), you'll need to know how to ask the "right" questions. Don't make the all-too-common mistake of focusing on the wrong things. Use these Computer Consulting 101 hiring tips as your checklist for doing your homework before you sign on the dotted line.

Part-time or Full-time Computer Consulting

Do you have a "day job"? Are you moonlighting?

Solo Practitioner or True Computer Consulting Business

What do you mean by "we"? Are there any other people who work at your company?

Are they employees or contractors? What are their names, specialties and backgrounds? How long have they been with the company? Will they be involved with this account? (Tip: The more pointed questions you ask, the more you'll flush out the B.S. and hyperbole.)

Small Business or Large Company Computer Consulting Experience

What "size" is your typical consulting client, in terms of number of PCs, employees and annual revenue?

Generalist or Specialist Consulting Company

What industries or vertical markets have you worked with? And in what particular aspects and software applications?

What kinds of products, services, and platforms does your company shy away from? Do you work with any specialty hardware, software or services vendors?

The Bottom Line

In this first part of a two-part series of Computer Consulting 101 hiring tips, we looked at why small business owners and managers find computer consulting companies so difficult to hire, as well as four basic issues that you must confront when searching for a new computer consulting vendor. In the second installment, we'll look at how you can get your arms around the true costs of using a computer consulting firm, and how you can more objectively evaluate a firm's suitability for servicing your company's technology needs.


Thursday, September 4, 2008

Protect Your Computer

I will be the first to admit that I am addicted to the Internet. Whenever I've got a spare hour, I love to just surf around as a form of entertainment. I have also discovered the world of downloading: I can take the best in modern music and film and watch or listen while I do whatever else I want. This is the kind of thing that absolutely anyone could become addicted to. Whatever entertainment you want, for the cost of your Internet connection! Wow.

As perhaps you would expect, being online all the time I run into a few problems that I have had to deal with. When downloading, you are simply asking for trouble if files are not scanned before you actually download them. I once lost everything on my hard disk: my financial records, my photographs, and all the games, DVDs, and music I had downloaded along the way.

Before, I just had the Norton anti-virus software that came on my computer when I bought it. I thought it worked fine, but occasionally a virus would get through. Only occasionally, yes, but that is not good enough when your computer could crash at any given time. This is a problem you have to tackle before malware or spyware penetrates your hard drive or the internals of your computer. My friend's PC started switching itself off whenever it pleased as a result of this, and the same thing could easily happen to you.

So basically, what I would recommend is this: if you do not surf the Internet much, your current protection will probably be OK. However, if you do, then you absolutely have to upgrade your anti-virus package.

So now I don't rely on any rubbish package for anti-virus protection, and you should update yours as well if you want to protect your computer. Because the Internet changes every minute, so do the viruses that your anti-virus software has to tackle. That is why you need software which automatically updates itself via your Internet connection. Think about when a new virus pops along: will your two-year-old package be able to defend your computer? I do not think so, somehow.

Another thing that you need from anti-virus software is proper support. Imagine realizing that you have zero idea how to install your software; you need a call center that will ensure you can get it working on your computer. You should also get software which matches your surfing habits. If you do not, then you are just asking for problems. Look at the software and ask questions: will this be able to handle downloading from peer-to-peer networks?

Basically, if you spend your time online, or even if you work from the Internet, then you cannot afford not to invest in some quality anti-virus protection. Having some rubbish package is better than nothing, but things are far more sophisticated these days, and you really need something that is going to work. It is a wise investment; think how much money you will save once you start downloading free stuff online!

Newest Work Hazard For Computer Users: CVS

Millions of Americans go to work every day and sit at a computer for eight or more hours. While office work is hardly considered a "dangerous" job, several years ago doctors began to notice that certain afflictions were becoming increasingly common among those who use computers all day long.

First, the medical community became aware of CTS (Carpal Tunnel Syndrome), a painful and sometimes debilitating inflammation in the wrist, arms, and hands. In response to the increased number of office workers being diagnosed with CTS, computer furniture manufacturers began to develop ergonomically correct keyboards and keyboard trays, as well as computer chairs that offer more comfortable seating for computer use. Now, many employers provide ergonomically correct workstations for their office staff and CTS is on the decline.

Unfortunately, a new hazard has now taken the place of CTS. It's called CVS (Computer Vision Syndrome), and is caused by prolonged visual exposure to a computer screen. CVS has, in the past two years, become the number one health complaint of office employees.

Like CTS, Computer Vision Syndrome is preventable. There are a few things you can do to reduce the eye sensitivity and strain associated with computer work. According to optometrist Dr. Larry K. Wan, there are five key ways to reduce the effects of CVS.

Dr. Wan suggests getting regular eye exams, which means (according to NIOSH, the National Institute for Occupational Safety and Health) once a year. While many contact-lens wearers are required to get an eye exam each year to renew their lens prescription, annual exams are important for those who don't wear contacts as well. Tracking the health of your eyes will allow your optometrist to identify whether or not you're experiencing a deterioration of vision due to CVS.

Dr. Wan also contends that proper lighting is important, noting that eyestrain can be caused by excessive lighting, whether coming in through a window or from interior lighting itself. Dim your lights when using a computer.

Glare is another factor that Dr. Wan says can cause CVS. In addition to adding an anti-reflective coating to your eyeglasses, glare can be reduced with anti-glare screens attached to your monitor, or by using an LCD monitor arm. LCD monitor arms let you tilt or slide your monitor at the touch of a finger, eliminating strain as the light in the room changes. We found affordable, high-quality LCD monitor arms online at Versa Products, Inc.

Adjusting the brightness of your computer screen can also help. Optimize for comfort, which may mean either reducing or increasing the brightness; find a level that feels comfortable to your eyes.

Interestingly, Dr. Wan also tells his patients to blink more often. When staring at a computer screen, we tend to blink less than we normally do, and blinking is what keeps the eyes lubricated and comfortable. Make a point of blinking more often, and every half hour blink about ten times in a row, slowly. This will help reduce the effects of CVS as well.

Best Desktop Computer Deal

How To Find The Best Desktop Computer Deal

If you are unable to find a desktop computer deal that truly strikes your fancy, the best advice is simply to wait five minutes. After all, computers are consistently getting more powerful, not to mention cheaper, and nowadays it is no longer hard to find a decent, mid-range box for less than 400 dollars, which is a pretty good deal if you ask me. There are two important aspects to sourcing good desktop computer deals. One of them is of course patience, and the other is research. If you can combine both, you are virtually guaranteed to find a very decent machine at a great price.

The best thing is to start by researching desktop computer deals. What is it that you want inside your computer? Do you need the latest high-powered engine, or would a relatively dated computer (up to 18 months old) be good enough to start with? Before you start looking for deals on the latest, most powerful machines, take a brief moment and think about whether you really need them. Honestly, for all intents and purposes, you can often find a better desktop computer deal on a slightly older machine that will work just as well for you. Again, that depends on how you use it.

The simple fact is that the latest models are always going to be more expensive. If you want one, wait six months until the price comes down. Unless you are a really serious gamer, or a researcher involved in complex analytical work, look for middle-of-the-line desktop computer deals instead. Most other types of work will rarely require that kind of advanced electronics.

The other thing that you have to research is product quality. Which desktop computers will last, and which ones will not? Which components are the highest quality? Does the company offering you desktop computer deals provide good warranties? Finding a cheap computer is less exciting when you discover that the computer is a lemon. The fact is that desktop computer deals are not real deals if you get a crappy machine out of them.

Just as important as doing your research is being patient. Once you settle on the specific type of computer you want, don't just go out and buy the first machine you set your eyes on. Look for deals on that desktop computer for a while; compare prices and watch which ones are coming down. Think about how long you are willing to wait. The important thing to realize about desktop computer deals is that the longer you wait, the sweeter (i.e. cheaper) they become, so if you are able to put off buying your computer for just a wee bit longer, by all means you should.

Child's Computer Use


It can be difficult for parents to restrict their child's use of the computer. However, for many parents this has become an unavoidable eventuality, as computers have intruded into the lives of children to such a degree that far too many youngsters are spending an unhealthy amount of time behind them.

The word 'unhealthy' supposes a negative influence on your child's physical and mental/social development. Physical problems include bad posture due to a poor sitting position, or deteriorating eyesight from too much time spent looking at the screen. Mental/social problems include a feeling of alienation from the real world and less opportunity to develop real-life social skills.

When parents first bring up the subject, that they intend to restrict when their child can use the computer, it is vital that the child is in a calm mood, and that he or she does not have ready access to a blunt instrument. The computer might well be your child's best friend in life, and any suggestion about cutting short the supply of the daily drip feed of games, chat rooms, and web surfing could lead to wild and fervent behavior from your offspring.

So, with your child calm and empty-handed, you make them aware that you intend to restrict their use of the computer starting tomorrow. You know your child spends too much time on it. Perhaps your child comes home from school, buries him or herself in the bedroom, and instantly goes on the computer for a few hours. Perhaps he or she just uses it for a couple of hours a day, in a session which begins five minutes after bedtime. However your child misuses the computer, you have to choose the best way to restrict their use. The scenarios are numerous, but effective solutions are harder to come by.

One family told me how they recently took the computer out of their son's bedroom and put it into the sitting room, from where they could more easily restrict its use. Unfortunately, however, with a streak of sneakiness not unknown to generations of teenagers before him, their son would tiptoe down to the lounge after the parents had gone to bed and revel away on the computer all night long. The parents only began to suspect something when they found the child one morning in the lounge, snoring away with his head on the desk.

Another source told me how they'd made a verbal agreement with their child in order to restrict his computer use. It was made on Sunday morning, and by Sunday lunch time the child had already broken the agreement, citing 'aggressive monster beheading' withdrawal symptoms as the grounds for his transgression.

I do know of a considerably more effective way to restrict your child's computer use. This method is sneak-proof, and does not take withdrawal symptoms as an excuse. It is a computer control program called Chronoger, written by the acclaimed software development company SoftForYou. The program contains everything that you will need to restrict not only when your child can use the computer, but how he or she can use it too: when for doing homework, and when for entertainment.

By installing the program onto your child's computer, you will be able to select for each day of the week when and how long your child can use the Internet, play games, enter chat programs, and use the computer as a whole. Your child will not be able to change these settings, which are protected by an administrator (parent) password.

One of the best things about it is that you hardly have to know anything about computers to use the program. It is very user-friendly and simple to navigate. If you do come across a problem, the package includes access to a support center where you can get any assistance you require.

This program means that you can leave the computer in your child's room and know that they are using it when and how you would want. It also means that you can dispense with verbal agreements of dubious worth and durability.

Recording The Guitar to a Computer

You want to record your guitar: make your own riffs, music, and so on. You want to plug your guitar into your computer. But how do you connect the guitar to the computer the right way? What software? Which audio interface? What about your favorite guitar effects? What computer? Where do you start? It can be overwhelming.

The recording guitarist's computer setup should be something like this:


* Guitar (one with a 13-pin connection is recommended)
* Microphone, for recording acoustic guitar, vocals, etc.
* Quality guitar cable
* Audio interface: FireWire (recommended) or USB
* Guitar/MIDI interface
* USB/FireWire cables
* Computer (Mac recommended, or PC)
* Extra external/internal hard drive (recommended)
* Audio recording/sequencer software
* Plug-ins: effects such as reverb, compressor, delay, chorus, etc.
  o Dedicated guitar effects/amp simulation
  o Sampler/synths (for 13-pin guitarists, or guitarists who can play keyboards)
* Quality cables to speakers
* Powered speakers
* Headphones (so you do not disturb the wife and kids at those late weekend sessions)

Some recording setups:

-Setting up your whole live rig and putting a microphone in front of your cabinet: playing at loud volumes so you get the amp to sound right and push the speaker enough, putting your speaker in a closet, using a blanket to dampen the volume, etc.

-You could put a dummy load on your amp's speaker output and record direct.

-Use a preamp that is compensated for direct recording as the front end. You can plug this into outboard effects or add plug-ins

-You could record from your pedal board or any outboard multi-effects you might own. You will probably end up with a direct-type sound, though some multi-effects have amp simulation that might take the edge off it.

With any of these scenarios you are committing yourself to the recorded track. If you record with effects, you can't change them later; you would have to re-record! And if you do not re-record on the same day, you have to set up the rig all over again and hope the knob settings were written down, that the speaker is back in the same place, and so on. If you get into recording other instruments, or you are mixing the song and want to change the guitar sound, tone, or effects, you have to re-record!

With a 13-pin connection on your guitar, either built in or via a pickup installed on the body, you get 100% use of your software. You can record your guitar sound and get access to any software synths or samplers that come with the software or as plug-in add-ons.

With a computer-based software system, you can change almost everything after you have recorded: amps, effects, mics, mic placement and more. You do this with plug-ins. You can save all your presets, sounds, and settings, and you can have multiple guitar tracks with different sounds, all from recording one track. The flexibility is there! Yes, there are purists out there who say "software can't sound like tubes". Well, the software is getting pretty close!

There is a lot of software out there, you do not have to spend a lot of money, and it is always improving. If you buy a dedicated box instead (a hard disk recorder, or the other units mentioned above), you have bought a box, and you usually can't upgrade without buying a new box.

People use computers every day at work and at home: Microsoft Office (Word, Excel, PowerPoint, etc.), email, the Internet. Use yours to record your guitar and your music!

Tuesday, September 2, 2008


HP Pavilion Elite m9040n Core 2 Quad Q6600 Desktop Computer with 3GB RAM, 640GB SATA and HDMI out

A good computer is like a good sandwich. All the parts inside have to be delicious or the whole thing winds up wrong. We hope you find our HP Pavilion Elite m9040n Desktop Computer mouthwatering.

The sleek piano-black paneling is just the wax paper that holds everything in place. Underneath that is the 2.4 GHz Intel Core 2 Quad Q6600 processor. You'll have four execution cores inside a single processor, kicking out the jams for multi-threaded and multi-tasking environments. Baby ain't gettin' no plain ham sandwich tonight!

Two 320 GB SATA drives give you 640 GB of storage; you can make a RAID array if you really want. And that's good, because you'll need the room to store everything that comes off your TV tuner, with HDMI out to your display. Watch, record, and pause when you need to run to the kitchen for more mustard; you'll get up to 395 hours of recording space, plus an online Electronic Program Guide to help you pick and choose. And 3 GB of RAM (expandable to 8 GB) means you'll be full of memory for a while.
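
As a sanity check, the 395-hour claim implies a recording bitrate in the ballpark of standard-definition TV. A back-of-the-envelope calculation (the actual recording format is an assumption here):

    # What bitrate does 395 hours on 640 GB imply?
    disk_gb, hours = 640, 395
    gb_per_hour = disk_gb / hours                  # ~1.62 GB per hour
    mbit_per_sec = gb_per_hour * 8000 / 3600       # ~3.6 Mbit/s, roughly SDTV quality
    print(f"{gb_per_hour:.2f} GB/hour ~= {mbit_per_sec:.1f} Mbit/s")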

Want to add some veggies to complement the meat? How about a dual-layer, multi-format LightScribe DVD/CD burner? Up to 8.5 GB on a dual-layer disc, plus custom silkscreen-quality labels burned right onto LightScribe-enabled discs. There's a 15-in-1 memory card reader. Can you even name 15 different types of memory cards? There's the Nvidia GeForce 8400 GS with 256 MB of dedicated video memory. There's even a remote with a 16-foot range.

An FM tuner? Sure, they stuffed one of those in there! Plus Windows Media Center and Vista Home Premium, so you've got a nice entertainment center ready to go right out of the box. 7.1 audio. This ain't just lettuce and tomato, we're talking fresh olives, green peppers and those squiggly things that goats eat! Alfalfa? Whatever, that's what you're getting, a hearty stack of yummy.

The HP Pavilion Elite m9040n Desktop Computer will feed your need for entertainment. Build up that library of Top Chef and Rachael Ray! Meanwhile we're gonna run to the deli. It's lunchtime somewhere, right?

Warranty: 90 days (HP)

Features:

* Windows Vista Home Premium with Windows Media Center
* Intel Core 2 Quad Q6600 2.4 GHz Processor
* 1066MHz Front Side Bus and dual 4 MB L2 caches
* Viiv Technology: everything you need to build a PC for HD content
* 3GB of PC2-5300 RAM (4 slots: 2×1GB, 2×512MB); expandable to 8GB on a 64-bit OS, 4GB on 32-bit
* 640GB (2×320) 7200 RPM SATA Hard Drives
* 16x DVD±R/RW 12x RAM ±R Dual Layer LightScribe SATA Optical Drive
* nVIDIA GeForce 8400GS with 256MB DDR Memory, HDMI, Dual-Link DVI, and S-Video ports
* TV Tuner Card (ATSC/NTSC) with FM tuner
* PVR functionality and programming guide let you watch, pause, rewind, and record live television
* Realtek ALC 888S Audio Chipset, Supports up to 7.1 Audio Channels
* Built-in wireless 802.11b/g 54g Wi-Fi networking
* Dimensions: 16.61 x 7.0 x 15.51 inches (L x W x D); approximate weight: 24 lbs

Connectivity and Expansion:

* 6 USB 2.0 ports (2 in front, 4 in rear)
* 2 FireWire (1 in front, 1 in rear)
* Composite video, S-Video, and analog audio inputs
* Video out: S-Video, DVI, HDMI, and VGA
* Headphone and microphone jacks
* Digital SPDIF audio input/output
* Surround sound speakers--rear, side, center (subwoofer)
* 2 PS/2 ports for connecting keyboards and mice
* 15-in-1 (4-slot) media card reader (CF I/II, Microdrive, SD/mini-SD, MMC/RS-MMC/MMCmobile, Memory Stick/Pro/Duo/Pro Duo, SmartMedia, xD/Extreme Digital)
* 56K modem
* 10/100/1000 Base-T Networking Interface
* Two (2) 5.25" bays, for optical drives (one available)
* Two (2) 3.5" bays, for hard drives (none available; system already has 2 hard drives installed)
* One (1) Personal Media Drive Bay
* One (1) Pocket Media Drive Bay

In the box:

* HP Media Center m9040n desktop PC
* Wireless keyboard
* Wireless mouse
* USB Transceiver for Wireless Keyboard and Mouse
* Remote control
* Power cord
* User's manual
* Recovery discs not included. HP recommends you use HP Recovery Manager, or you can create your own recovery discs with Recovery Disc Creator.


My Favorite Computer Shop in Singapore


If you're in the market for computer hardware, there is one place in Singapore I simply need to tell you about. Curious? Well, read on ...

The shop is called Fuwell Singapore and I have been buying computers from them for the past eight years or so. The staff there are extremely knowledgeable about computers and offer excellent service.

I say that from the viewpoint of a techie who has been dabbling with computers for close to 20 years.

1. Overview

The Fuwell shop is located on the third level of Sim Lim Square, which, incidentally, is the electronics and computer haven of Singapore. Whenever I need to buy computer hardware or check out component prices, my standard practice is to zip down to Sim Lim Square and grab the price list from Fuwell. You can also print out a PDF copy from this link if you want. Extremely convenient.

I use Fuwell's prices as my benchmark before I start bargain hunting. But more often than not, I find myself coming back to Fuwell to buy my gizmo, because it is not only very reasonable in terms of price, it also stocks an astounding variety of computer hardware. And oh, did I mention the staff are knowledgeable? I say it again because many of the shop assistants in Sim Lim Square do not know the difference between an ISA motherboard and a PCI-Express motherboard, or an Nvidia card versus a Radeon card, and so on.

2. So What Do They Sell?

Typically, the stuff sold at Fuwell includes computer motherboards, CPUs, memory chips, graphics cards, sound cards, hard drives, optical drives, monitors and printers. Add to that a huge variety of computer peripherals such as video capture cards, mice, LAN cables, routers, wireless cameras, MP3 players and thumb drives. When I need to upgrade or buy a new desktop, the first place I check out is Fuwell.

3. The Staff

I know a couple of the staff there by name. One of them (a bespectacled one) is called Andy, and he is simply fabulous. If there were an award for 'Best Computer Shop Service Staff', I'd give it to him. I mean, I'll tell him I want a simple, no-frills PCI-Express motherboard with an Nvidia card and 1 GB of DDR2 RAM, along with a simple SoundBlaster card and a 19-inch LCD monitor, and he will know exactly what I'm talking about. And he gives me the best prices too, without any hard sell. I simply love it.

Conclusion

I know I've heartily recommended Fuwell to you, and I just wanna say that I am in no way affiliated with the shop. I simply want to tell you about its excellent service and how you can save yourself a lot of retail headaches when you're next hunting for computer hardware in Singapore. Trust me, Sim Lim Square has its share of crooked vendors, but Fuwell is one of the rare ones which shines through. Thanks again, Fuwell.

First computer virus

Computer virus

A computer virus is a computer program that can copy itself and infect a computer without the permission or knowledge of the user. The term "virus" is also commonly, though erroneously, used to refer to many other types of malware and adware programs. The original virus may modify the copies, or the copies may modify themselves, as occurs in a metamorphic virus. A virus can only spread from one computer to another when its host is taken to the uninfected computer, for instance by a user sending it over a network or the Internet, or by carrying it on a removable medium such as a floppy disk, CD, or USB drive. Viruses can also spread to other computers by infecting files on a network file system or a file system that is accessed by another computer.

Viruses are sometimes confused with computer worms and Trojan horses. A worm can spread itself to other computers without needing to be transferred as part of a host, while a Trojan horse is a file that merely appears harmless. Worms and Trojans may harm a computer system's hosted data, functional performance, or networking throughput when executed. In general, a worm does not actually harm the system's hardware or software, while, at least in theory, a Trojan's payload may be capable of almost any type of harm if executed. Some cannot be seen while the program is not running, but as soon as the infected code is run, the Trojan horse kicks in. That is why it is so hard for people to find viruses and other malware themselves, and why they have to rely on anti-spyware programs and registry cleaners.

Most personal computers are now connected to the Internet and to local area networks, facilitating the spread of malicious code. Today's viruses may also take advantage of network services such as the World Wide Web, e-mail, Instant Messaging and file sharing systems to spread, blurring the line between viruses and worms. Furthermore, some sources use an alternative terminology in which a virus is any form of self-replicating malware.

Some malware is programmed to damage the computer by damaging programs, deleting files, or reformatting the hard disk. Other malware programs are not designed to do any damage, but simply replicate themselves and perhaps make their presence known by presenting text, video, or audio messages. Even these less sinister malware programs can create problems for the computer user. They typically take up computer memory used by legitimate programs. As a result, they often cause erratic behavior and can result in system crashes. In addition, much malware is bug-ridden, and these bugs may lead to system crashes and data loss. Many CiD programs are programs that have been downloaded by the user and pop up every so often. This results in slowing down of the computer, but it is also very difficult to find and stop the problem.

History


The Creeper virus was first detected on ARPANET, the forerunner of the Internet, in the early 1970s.[1] It propagated via the TENEX operating system and could make use of any connected modem to dial out to remote computers and infect them. It would display the message "I'M THE CREEPER : CATCH ME IF YOU CAN." It is rumored that the Reaper program, which appeared shortly afterwards and sought out and deleted copies of Creeper, may have been written by the creator of Creeper in a fit of regret.

The first computer virus to appear "in the wild" (that is, outside the single computer or lab where it was created) is commonly held to be Elk Cloner, though earlier candidates exist; see the Timeline of notable computer viruses and worms. Elk Cloner was, in any case, the first virus to infect computers "in the home". Written in 1982 by Richard Skrenta, then a high school student, it attached itself to the Apple DOS 3.3 operating system and spread by floppy disk.[2] The virus was originally a joke, put onto a game on a floppy disk. On its 50th use the Elk Cloner virus would activate, infecting the computer and displaying a short poem beginning "Elk Cloner: The program with a personality".

The first PC virus in the wild was a boot sector virus called (c)Brain[3], created in 1986 by the Farooq Alvi brothers, operating out of Lahore, Pakistan. The brothers reportedly created the virus to deter pirated copies of software they had written. However, analysts have claimed that the Ashar virus, a variant of Brain, may have predated it, based on code within the virus.

Before computer networks became widespread, most viruses spread on removable media, particularly floppy disks. In the early days of the personal computer, many users regularly exchanged information and programs on floppies. Some viruses spread by infecting programs stored on these disks, while others installed themselves into the disk boot sector, ensuring that they would be run when the user booted the computer from the disk, usually inadvertently; PCs of the era would attempt to boot first from a floppy if one had been left in the drive. This was the most successful infection strategy, and boot sector viruses were the most common in the wild until floppy disks fell from favour.[4]

Traditional computer viruses emerged in the 1980s, driven by the spread of personal computers and the resultant increase in BBS and modem use and software sharing. Bulletin-board-driven software sharing contributed directly to the spread of Trojan horse programs, and viruses were written to infect popularly traded software. Shareware and bootleg software were equally common vectors for viruses on BBSes. Within the "pirate scene" of hobbyists trading illicit copies of retail software, traders in a hurry to obtain the latest applications and games were easy targets for viruses.

Since the mid-1990s, macro viruses have become common. Most of these viruses are written in the scripting languages for Microsoft programs such as Word and Excel. These viruses spread in Microsoft Office by infecting documents and spreadsheets. Since Word and Excel were also available for Mac OS, most of these viruses were able to spread on Macintosh computers as well. Most of these viruses did not have the ability to send infected e-mail. Those viruses which did spread through e-mail took advantage of the Microsoft Outlook COM interface.

Macro viruses pose unique problems for detection software. For example, some versions of Microsoft Word allowed macros to replicate themselves with additional blank lines; the virus behaved identically but would be misidentified as a new virus. In another example, if two macro viruses simultaneously infect a document, the combination of the two, if also self-replicating, can appear as a "mating" of the two and would likely be detected as a virus unique from its "parents".[5]

A virus may also send a web address link as an instant message to all the contacts on an infected machine. If the recipient, thinking the link is from a friend (a trusted source) follows the link to the website, the virus hosted at the site may be able to infect this new computer and continue propagating.

The newest species of the virus family is the cross-site scripting virus, which emerged from research and was academically demonstrated in 2005.[6] This type of virus exploits cross-site scripting vulnerabilities to propagate. Since 2005 there have been multiple instances of cross-site scripting viruses in the wild; the most notable sites affected have been MySpace and Yahoo.

Infection strategies

In order to replicate itself, a virus must be permitted to execute code and write to memory. For this reason, many viruses attach themselves to executable files that may be part of legitimate programs. If a user tries to start an infected program, the virus' code may be executed first. Viruses can be divided into two types, on the basis of their behavior when they are executed. Nonresident viruses immediately search for other hosts that can be infected, infect these targets, and finally transfer control to the application program they infected. Resident viruses do not search for hosts when they are started. Instead, a resident virus loads itself into memory on execution and transfers control to the host program. The virus stays active in the background and infects new hosts when those files are accessed by other programs or the operating system itself.

Nonresident viruses

Nonresident viruses can be thought of as consisting of a finder module and a replication module. The finder module is responsible for finding new files to infect. For each new executable file the finder module encounters, it calls the replication module to infect that file.
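
As a purely illustrative sketch of that two-module structure, here is a toy Python model that operates on fake in-memory records; nothing here touches a real file system.

    # Toy model of the finder/replicator split. The "files" are plain dicts.
    fake_files = [
        {"name": "a.exe", "infected": False},
        {"name": "b.exe", "infected": True},
        {"name": "c.exe", "infected": False},
    ]

    def finder(files):
        """Finder module: locate hosts that do not yet carry a copy."""
        return [f for f in files if not f["infected"]]

    def replicator(host):
        """Replication module: mark the host as carrying a copy."""
        host["infected"] = True

    # A nonresident virus runs both modules immediately, then hands control
    # back to the application it rode in on.
    for host in finder(fake_files):
        replicator(host)

    print(sum(f["infected"] for f in fake_files))   # -> 3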

Resident viruses

Resident viruses contain a replication module that is similar to the one that is employed by nonresident viruses. However, this module is not called by a finder module. Instead, the virus loads the replication module into memory when it is executed and ensures that this module is executed each time the operating system is called to perform a certain operation. For example, the replication module can be called each time the operating system executes a file. In this case, the virus infects every suitable program that is executed on the computer.

Resident viruses are sometimes subdivided into a category of fast infectors and a category of slow infectors. Fast infectors are designed to infect as many files as possible. For instance, a fast infector can infect every potential host file that is accessed. This poses a special problem to anti-virus software, since a virus scanner will access every potential host file on a computer when it performs a system-wide scan. If the virus scanner fails to notice that such a virus is present in memory, the virus can "piggy-back" on the virus scanner and in this way infect all files that are scanned. Fast infectors rely on their fast infection rate to spread. The disadvantage of this method is that infecting many files may make detection more likely, because the virus may slow down a computer or perform many suspicious actions that can be noticed by anti-virus software. Slow infectors, on the other hand, are designed to infect hosts infrequently. For instance, some slow infectors only infect files when they are copied. Slow infectors are designed to avoid detection by limiting their actions: they are less likely to slow down a computer noticeably, and will at most infrequently trigger anti-virus software that detects suspicious behavior by programs. The slow infector approach does not seem very successful, however.

Vectors and hosts


Viruses have targeted various types of transmission media or hosts. This list is not exhaustive:

* Binary executable files (such as COM files and EXE files in MS-DOS, Portable Executable files in Microsoft Windows, and ELF files in Linux)
* Volume Boot Records of floppy disks and hard disk partitions
* The master boot record (MBR) of a hard disk
* General-purpose script files (such as batch files in MS-DOS and Microsoft Windows, VBScript files, and shell script files on Unix-like platforms).
* Application-specific script files (such as Telix-scripts)
* Documents that can contain macros (such as Microsoft Word documents, Microsoft Excel spreadsheets, AmiPro documents, and Microsoft Access database files)
* Cross-site scripting vulnerabilities in web applications
* Arbitrary computer files. An exploitable buffer overflow, format string, race condition or other exploitable bug in a program which reads the file could be used to trigger the execution of code hidden within it. Most bugs of this type can be made more difficult to exploit in computer architectures with protection features such as an execute disable bit and/or address space layout randomization.

PDFs, like HTML, may link to malicious code.

It is worth noting that some virus authors give a file a double extension, such as .PNG followed by .EXE, hoping that users will stop reading at the trusted file type and not notice that the computer will execute the final extension. (Many operating systems hide the extensions of known file types by default, so for example a filename ending in ".png.exe" would be shown ending in ".png".) See Trojan horse (computing).
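
A simple defensive check for this trick flags file names whose final extension is executable while an earlier, innocuous-looking extension precedes it. A minimal sketch (the extension list is an assumption, not exhaustive):

    from pathlib import Path

    # Extensions that execute on double-click; an illustrative subset only.
    EXECUTABLE_EXTS = {".exe", ".scr", ".com", ".bat", ".pif"}

    def hides_executable(filename: str) -> bool:
        """Flag names like 'photo.png.exe' that exploit hidden extensions."""
        suffixes = Path(filename).suffixes
        return len(suffixes) >= 2 and suffixes[-1].lower() in EXECUTABLE_EXTS

    print(hides_executable("holiday.png.exe"))   # True
    print(hides_executable("holiday.png"))       # False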

Methods to avoid detection


In order to avoid detection by users, some viruses employ different kinds of deception. Some old viruses, especially on the MS-DOS platform, make sure that the "last modified" date of a host file stays the same when the file is infected by the virus. This approach does not fool anti-virus software, however, especially software that maintains and dates cyclic redundancy checks (CRCs) of files to detect changes.
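
The integrity-check idea is easy to sketch: fingerprint each file once, and treat any later mismatch as a modification, whatever the timestamp claims. A minimal Python version (the file name is hypothetical):

    import zlib

    def file_crc32(path: str) -> int:
        """CRC-32 of a file's contents, read in chunks."""
        crc = 0
        with open(path, "rb") as f:
            for chunk in iter(lambda: f.read(65536), b""):
                crc = zlib.crc32(chunk, crc)
        return crc

    # Record a baseline once, e.g. baseline = file_crc32("app.exe").
    # Later, file_crc32("app.exe") != baseline means the file changed,
    # regardless of what its "last modified" date says.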

Some viruses can infect files without increasing their sizes or damaging the files. They accomplish this by overwriting unused areas of executable files. These are called cavity viruses. For example the CIH virus, or Chernobyl Virus, infects Portable Executable files. Because those files had many empty gaps, the virus, which was 1 KB in length, did not add to the size of the file.

Some viruses try to avoid detection by killing the tasks associated with antivirus software before it can detect them.

As computers and operating systems grow larger and more complex, old hiding techniques need to be updated or replaced. Defending a computer against viruses may demand that a file system migrate towards detailed and explicit permission for every kind of file access.

Avoiding bait files and other undesirable hosts


A virus needs to infect hosts in order to spread further. In some cases, it might be a bad idea to infect a host program. For example, many anti-virus programs perform an integrity check of their own code. Infecting such programs will therefore increase the likelihood that the virus is detected. For this reason, some viruses are programmed not to infect programs that are known to be part of anti-virus software. Another type of host that viruses sometimes avoid is bait files. Bait files (or goat files) are files that are specially created by anti-virus software, or by anti-virus professionals themselves, to be infected by a virus. These files can be created for various reasons, all of which are related to the detection of the virus:

* Anti-virus professionals can use bait files to take a sample of a virus (i.e. a copy of a program file that is infected by the virus). It is more practical to store and exchange a small, infected bait file, than to exchange a large application program that has been infected by the virus.
* Anti-virus professionals can use bait files to study the behavior of a virus and evaluate detection methods. This is especially useful when the virus is polymorphic. In this case, the virus can be made to infect a large number of bait files. The infected files can be used to test whether a virus scanner detects all versions of the virus.
* Some anti-virus software employs bait files that are accessed regularly. When these files are modified, the anti-virus software warns the user that a virus is probably active on the system (a minimal version of this check is sketched after this list).
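
Here is a minimal sketch of that last technique, assuming a hypothetical bait-file name; a real product would watch many such files and keep its baselines somewhere tamper-resistant.

    import hashlib

    BAIT_PATH = "bait_0001.com"   # hypothetical bait-file name
    BAIT_BODY = b"\x90" * 64      # fixed, known contents

    def create_bait() -> str:
        """Write the bait file and return the hash of its known contents."""
        with open(BAIT_PATH, "wb") as f:
            f.write(BAIT_BODY)
        return hashlib.sha256(BAIT_BODY).hexdigest()

    def bait_modified(baseline: str) -> bool:
        """True if anything has rewritten the bait file since creation."""
        with open(BAIT_PATH, "rb") as f:
            return hashlib.sha256(f.read()).hexdigest() != baseline

    baseline = create_bait()
    print(bait_modified(baseline))   # False until something touches the bait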

Since bait files are used to detect the virus, or to make detection possible, a virus can benefit from not infecting them. Viruses typically do this by avoiding suspicious programs, such as small program files or programs that contain certain patterns of 'garbage instructions'.

A related strategy to make baiting difficult is sparse infection. Sometimes, sparse infectors do not infect a host file that would be a suitable candidate for infection in other circumstances. For example, a virus can decide on a random basis whether to infect a file or not, or a virus can only infect host files on particular days of the week.


Stealth

Some viruses try to trick anti-virus software by intercepting its requests to the operating system. A virus can hide itself by intercepting the anti-virus software's request to read an infected file and handling the request itself, instead of letting it pass to the OS. The virus can then return an uninfected version of the file to the anti-virus software, so that the file seems "clean". Modern anti-virus software employs various techniques to counter the stealth mechanisms of viruses; the only completely reliable way around stealth is to boot from a medium that is known to be clean.


Self-modification


Most modern antivirus programs try to find virus patterns inside ordinary programs by scanning them for so-called virus signatures. A signature is a characteristic byte pattern that is part of a certain virus or family of viruses. If a virus scanner finds such a pattern in a file, it notifies the user that the file is infected. The user can then delete, or (in some cases) "clean" or "heal" the infected file. Some viruses employ techniques that make detection by means of signatures difficult, though probably not impossible: these viruses modify their code on each infection, so that each infected file contains a different variant of the virus.
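
At its core, signature scanning is just a byte-pattern search. A minimal sketch with a made-up signature (real scanners use far more efficient multi-pattern matching and huge databases):

    # Made-up signature purely for illustration.
    SIGNATURES = {b"\xde\xad\xbe\xef\x13\x37": "Example.Virus.A"}

    def scan_file(path: str) -> list[str]:
        """Return the names of any known signatures found in the file."""
        with open(path, "rb") as f:
            data = f.read()
        return [name for sig, name in SIGNATURES.items() if sig in data]

    # scan_file("suspect.exe") -> ["Example.Virus.A"] if the pattern appears.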

Encryption with a variable key

A more advanced method is the use of simple encryption to encipher the virus. In this case, the virus consists of a small decrypting module and an encrypted copy of the virus code. If the virus is encrypted with a different key for each infected file, the only part of the virus that remains constant is the decrypting module, which would (for example) be appended to the end. In this case, a virus scanner cannot directly detect the virus using signatures, but it can still detect the decrypting module, which still makes indirect detection of the virus possible. Since these would be symmetric keys, stored on the infected host, it is entirely possible to decrypt the final virus, but this is probably not required, since self-modifying code is rare enough that finding some may be reason for a virus scanner to at least flag the file as suspicious.

An old but compact method involves XORing each byte of the virus with a constant, so that the same exclusive-or operation need only be repeated for decryption. Since code that modifies itself is suspicious, the code that performs the encryption and decryption may itself be part of the signature in many virus definitions.
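
The appeal of XOR here is that encryption and decryption are the same operation, as this tiny demonstration shows:

    def xor_with_key(data: bytes, key: int) -> bytes:
        """XOR every byte with a one-byte constant key (0-255)."""
        return bytes(b ^ key for b in data)

    payload = b"example payload"
    scrambled = xor_with_key(payload, 0x5A)
    print(scrambled != payload)                       # True: the bytes change
    print(xor_with_key(scrambled, 0x5A) == payload)   # True: XOR undoes itself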

Polymorphic code

Polymorphic code was the first technique that posed a serious threat to virus scanners. Just like regular encrypted viruses, a polymorphic virus infects files with an encrypted copy of itself, which is decoded by a decryption module. In the case of polymorphic viruses however, this decryption module is also modified on each infection. A well-written polymorphic virus therefore has no parts which remain identical between infections, making it very difficult to detect directly using signatures. Anti-virus software can detect it by decrypting the viruses using an emulator, or by statistical pattern analysis of the encrypted virus body. To enable polymorphic code, the virus has to have a polymorphic engine (also called mutating engine or mutation engine) somewhere in its encrypted body. See Polymorphic code for technical detail on how such engines operate.

Some viruses employ polymorphic code in a way that constrains the mutation rate of the virus significantly. For example, a virus can be programmed to mutate only slightly over time, or it can be programmed to refrain from mutating when it infects a file on a computer that already contains copies of the virus. The advantage of such slow polymorphic code is that it makes it harder for anti-virus professionals to obtain representative samples of the virus, because bait files infected in one run will typically contain identical or similar samples. This makes detection by virus scanners less reliable, so that some instances of the virus may avoid detection altogether.


Metamorphic code


To avoid being detected by emulation, some viruses rewrite themselves completely each time they infect new executables. Viruses that use this technique are said to be metamorphic. To enable metamorphism, a metamorphic engine is needed. A metamorphic virus is usually very large and complex. For example, W32/Simile consisted of over 14,000 lines of assembly language code, 90% of which is part of the metamorphic engine.[7]

Vulnerability and countermeasures

The vulnerability of operating systems to viruses

Just as genetic diversity in a population decreases the chance of a single disease wiping it out, the diversity of software systems on a network limits the destructive potential of viruses.

This became a particular concern in the 1990s, when Microsoft gained market dominance in desktop operating systems and office suites. Users of Microsoft software (especially networking software such as Microsoft Outlook and Internet Explorer) are especially vulnerable to the spread of viruses. Microsoft software is targeted by virus writers because of its desktop dominance, and it is often criticized for including many errors and holes for virus writers to exploit. Integrated Microsoft applications (such as Microsoft Office), applications whose scripting languages have access to the file system (for example Visual Basic Script (VBS)), and applications with networking features are also particularly vulnerable.

Although Windows is by far the most popular target for virus writers, some viruses also exist on other platforms. Any operating system that allows third-party programs to run can theoretically run viruses, but some operating systems are less secure than others. Unix-based OSes (and NTFS-aware applications on Windows NT-based platforms) only allow their users to run executables within their protected space, in their own directories.

Internet-based research has revealed that there are cases in which people willingly press a particular button to download a virus. The security firm F-Secure ran a half-year advertising campaign on Google AdWords that read "Is your PC virus-free? Get it infected here!". The result was 409 clicks.[8]

As of 2006, there are relatively few security exploits[9] targeting Mac OS X (with a Unix-based file system and kernel). The number of viruses for the older Apple operating systems, known as Mac OS Classic, varies greatly from source to source: Apple states that there are only four known viruses, while independent sources state there are as many as 63. It is safe to say that Macs are less likely to be targeted because of their low market share, since a Mac-specific virus could only infect a small proportion of computers, making the effort less worthwhile. The difference in virus vulnerability between Macs and Windows is a chief selling point, one that Apple uses in its Get a Mac advertising.[10] That said, Macs have also had security issues just as Microsoft Windows has, though none have ever been fully exploited successfully in the wild.[citation needed]

Windows and Unix have similar scripting abilities, but while Unix natively blocks normal users from making changes to the operating system environment, older versions of Windows such as Windows 95 and 98 do not. In 1997, when a virus for Linux was released – known as "Bliss" – leading antivirus vendors issued warnings that Unix-like systems could fall prey to viruses just like Windows.[11] The Bliss virus may be considered characteristic of viruses – as opposed to worms – on Unix systems. Bliss requires the user to run it explicitly (in this respect it behaves like a trojan), and it can only infect programs that the user has access to modify. Unlike Windows users, most Unix users do not log in as an administrator except to install or configure software; as a result, even if a user ran the virus, it could not harm the operating system. The Bliss virus never became widespread and remains chiefly a research curiosity. Its creator later posted the source code to Usenet, allowing researchers to see how it worked.[12]


The role of software development


Because software is often designed with security features to prevent unauthorized use of system resources, many viruses must exploit software bugs in a system or application to spread. Software development strategies that produce large numbers of bugs will generally also produce potential exploits.

Anti-virus software and other preventive measures

Many users install anti-virus software that can detect and eliminate known viruses after the computer downloads or runs an executable. There are two common methods that anti-virus software uses to detect viruses. The first, and by far the most common, is a list of virus signature definitions: the software examines the content of the computer's memory (its RAM and boot sectors) and the files stored on fixed or removable drives (hard drives, floppy drives), comparing them against a database of known virus "signatures". The disadvantage of this method is that users are only protected from viruses that pre-date their last virus definition update. The second method uses a heuristic algorithm to find viruses based on common behaviors; it can detect viruses for which anti-virus security firms have yet to create a signature.
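A heuristic check can be as simple as weighting suspicious indicators and comparing the total against a threshold. The sketch below uses invented indicators, weights, and a made-up threshold purely for illustration; real engines rely on much richer static and dynamic analysis.

# Toy heuristic scoring: weights and threshold are invented for illustration.
INDICATORS = {
    b"CreateRemoteThread": 3,   # API often abused for code injection
    b"WriteProcessMemory": 3,   # likewise
    b"\x60\x9c": 1,             # PUSHAD/PUSHFD prologue common in packers
}
THRESHOLD = 4

def heuristic_score(data):
    return sum(weight for pattern, weight in INDICATORS.items() if pattern in data)

def looks_suspicious(data):
    return heuristic_score(data) >= THRESHOLD

print(looks_suspicious(b"...CreateRemoteThread...WriteProcessMemory..."))  # True
print(looks_suspicious(b"plain document text"))                            # False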

Some anti-virus programs are able to scan opened files, as well as sent and received e-mails, "on the fly" in a similar manner. This practice is known as "on-access scanning". Anti-virus software does not change the underlying capability of host software to transmit viruses. Users must update their software regularly to patch security holes, and anti-virus software itself needs to be updated regularly to recognize the latest threats.

One may also minimize the damage done by viruses by making regular backups of data (and the operating system) on media that are kept unconnected to the system most of the time, read-only, or inaccessible for other reasons, such as using a different file system. This way, if data is lost to a virus, one can start again from the backup (which should preferably be recent). A notable exception is the Gammima virus, which propagates via infected removable media, specifically flash drives.[13][14] If a backup session on optical media such as CD or DVD is closed, it becomes read-only and can no longer be affected by a virus (so long as a virus or infected file was not copied onto the disc). Likewise, an operating system on a bootable medium can be used to start the computer if the installed operating systems become unusable.

Another method is to use different operating systems on different file systems; a virus is unlikely to affect both. Data backups can also be put on different file systems. For example, Linux requires specific software to write to NTFS partitions, so if one does not install such software and uses a separate installation of MS Windows to make backups on an NTFS partition, the backups should remain safe from any Linux viruses. Likewise, MS Windows cannot read file systems like ext3, so if one normally uses MS Windows, backups can be made on an ext3 partition using a Linux installation.

Recovery methods


Once a computer has been compromised by a virus, it is usually unsafe to continue using it without completely reinstalling the operating system. However, a number of recovery options exist after a computer has been infected. Which actions are appropriate depends on the severity of the infection.

Virus removal


One possibility on Windows Me, Windows XP and Windows Vista is a tool known as System Restore, which restores the registry and critical system files to a previous checkpoint. Often a virus will cause a system to hang, and a subsequent hard reboot will render a system restore point from the same day corrupt. Restore points from previous days should work, provided the virus is not designed to corrupt the restore files and is not present in previous restore points.[15] Some viruses, however, disable System Restore and other important tools such as Task Manager and Command Prompt. An example of a virus that does this is CiaDoor.

Administrators have the option to disable such tools for limited users for various reasons. A virus like CiaDoor modifies the registry in the same way, except that it blocks all users, including the administrator, from accessing the tools. When a blocked tool is launched, it displays the message "Task Manager has been disabled by your administrator", even if the user trying to open it is the administrator.

Users running a Microsoft operating system can go to Microsoft's website to run a free scan, provided they have their 20-digit registration number.


Operating system reinstallation

Reinstalling the operating system is another approach to virus removal. It involves reformatting the OS partition and installing the OS from its original media, or imaging the partition with a clean backup image (taken, for example, with Ghost or Acronis).

This method is simple to carry out, can be faster than running multiple anti-virus scans, and is guaranteed to remove any malware. The downside is having to reinstall all other software as well as the operating system. User data can be backed up first by booting from a Live CD, or by putting the hard drive into another computer and booting from that computer's operating system.


Atanasoff–Berry Computer


The Atanasoff–Berry Computer (ABC) was the first electronic digital computing device.[1] Conceived in 1937, the machine was not programmable, being designed only to solve systems of linear equations. It was successfully tested in 1942. However, its intermediate result storage mechanism, a paper card writer/reader, was unreliable, and when Atanasoff left Iowa State University for World War II assignments, work on the machine was discontinued.[2] The ABC pioneered important elements of modern computing, including binary arithmetic and electronic switching elements,[3] but its special-purpose nature and lack of a changeable, stored program distinguish it from modern computers.

John Vincent Atanasoff's and Clifford Berry's computer work was not widely known until it was rediscovered in the 1960s, amidst conflicting claims about the first instance of an electronic computer. ENIAC had been considered the first computer in the modern sense, but in 1973 a U.S. District Court invalidated the ENIAC patent and concluded that the ABC was the first "computer".

Design and construction


According to Atanasoff's account, several key principles of the Atanasoff–Berry Computer (ABC) were conceived in a sudden insight after a long nighttime drive during the winter of 1937–38. The ABC's innovations included electronic computation, binary arithmetic, parallel processing, regenerative capacitor memory, and a separation of memory and computing functions. The mechanical and logic design was worked out by Dr. Atanasoff over the next year. A grant application to build a proof-of-concept prototype was submitted in March 1939 to the Agronomy department, which was also interested in speeding up computation for economic and research analysis. A further $5,000 to complete the machine came from the nonprofit Research Corporation of New York City.

The ABC was built by Dr. Atanasoff and graduate student Clifford Berry in the basement of the physics building at Iowa State College during 1939–42. The initial funds were released in September, and the 11-tube prototype was first demonstrated in October 1939. A December demonstration prompted a grant for construction of the full-scale machine.[4] The ABC was built and tested over the next two years. It was described in a January 15, 1941 notice in the Des Moines Register. The system weighed more than seven hundred pounds (320 kg), contained approximately 1 mile (1.6 km) of wire, 280 dual-triode vacuum tubes and 31 thyratrons, and was about the size of a desk.

It was not a Turing-complete computer, which distinguishes it from more general machines such as Konrad Zuse's contemporary Z3 (1941), or later machines such as the 1946 ENIAC, the 1949 EDVAC, the University of Manchester designs, or Alan Turing's post-war designs at NPL and elsewhere. Nor did it implement the stored-program architecture that made practical fully general-purpose, reprogrammable computers.


The machine was, however, the first to implement three critical ideas that are still part of every modern computer:

1. Using binary digits to represent all numbers and data
2. Performing all calculations using electronics rather than wheels, ratchets, or mechanical switches
3. Organizing a system in which computation and memory are separated.

In addition, the system pioneered the use of regenerative capacitor memory, as in the DRAM still widely used today.

The memory of the Atanasoff–Berry Computer was a pair of drums, each containing 1600 capacitors that rotated on a common shaft once per second. The capacitors on each drum were organized into 32 "bands" of 50 (30 active bands and 2 spares in case a capacitor failed), giving the machine a speed of 30 additions/subtractions per second. Data was represented as 50-bit binary fixed point numbers. The electronics of the memory and arithmetic units could store and operate on 60 such numbers at a time (3000 bits).
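These figures are internally consistent, as a quick arithmetic check shows:

capacitors_per_band = 50            # one 50-bit number per band
bands_per_drum = 32                 # 30 active bands plus 2 spares
print(capacitors_per_band * bands_per_drum)   # 1600 capacitors per drum

numbers_in_memory = 30 * 2          # 30 active bands on each of two drums
print(numbers_in_memory)            # 60 numbers held at once
print(numbers_in_memory * 50)       # 3000 bits of working storage

rotations_per_second = 1            # the shaft turned once per second
print(30 * rotations_per_second)    # 30 additions/subtractions per second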

The AC power line frequency of 60 Hz was the primary clock rate for the lowest level operations.

The arithmetic logic functions were fully electronic, implemented with vacuum tubes. The family of logic gates ranged from inverters to two and three input gates. The input and output levels and operating voltages were compatible between the different gates. Each gate consisted of one inverting vacuum tube amplifier, preceded by a resistor divider input network that defined the logical function. The control logic functions, which only needed to operate once per drum rotation and therefore did not require electronic speed, were electromechanical, implemented with relays.

Although the Atanasoff–Berry Computer was an important step up from earlier calculating machines, it was not able to run entirely automatically through an entire problem. An operator was needed to work the control switches to set up its functions, much like the electro-mechanical calculators and unit record equipment of the time. Selecting the operation to be performed (reading, writing, converting between binary and decimal, or reducing a set of equations) was done with front panel switches and, in some cases, jumpers.

There were two forms of input and output: primary user input and output, and an intermediate-results output and input. The intermediate-results storage allowed operation on problems too large to be handled entirely within the electronic memory. (The largest problem that could be solved without the intermediate output and input was two simultaneous equations, a trivial problem.)

Intermediate results were binary, written onto paper sheets by electrostatically modifying the resistance at 1500 locations to represent 30 of the 50-bit numbers (one equation). Each sheet could be written or read in one second. The reliability of the system was limited to about 1 error in 100,000 calculations by these units, primarily attributed to lack of control over the sheets' material characteristics. In retrospect, a solution could have been to add a parity bit to each number as it was written. This problem was not solved by the time Atanasoff left the university for war-related work.
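A single parity bit, as the paragraph suggests, would have caught such single-bit read errors. Here is a sketch of the idea in Python; the representation is illustrative, not Atanasoff's.

def parity_bit(n):
    """Even-parity bit: 1 if the number of 1-bits in n is odd, else 0."""
    return bin(n).count("1") & 1

def write_with_parity(n):
    return (n, parity_bit(n))            # store the number plus its parity

def read_ok(stored):
    n, p = stored
    return parity_bit(n) == p            # any single-bit flip is detected

record = write_with_parity(0b1011_0010)
corrupted = (record[0] ^ (1 << 3), record[1])    # flip one stored bit
print(read_ok(record), read_ok(corrupted))       # True False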

Primary user input was decimal, via standard IBM 80 column punched cards and output was decimal, via a front panel display.

Supercomputer


A supercomputer is a computer that is at the frontline of processing capacity, particularly speed of calculation, at the time of its introduction. The term "Super Computing" was first used by the New York World newspaper in 1929[1] to refer to large custom-built tabulators that IBM had made for Columbia University.

Supercomputers introduced in the 1960s were designed primarily by Seymour Cray at Control Data Corporation (CDC), which led the market into the 1970s, until Cray left to form his own company, Cray Research. He then took over the supercomputer market with his new designs, holding the top spot in supercomputing for five years (1985–1990). Cray himself never used the word "supercomputer"; a little-remembered fact is that he recognized only the word "computer". In the 1980s a large number of smaller competitors entered the market, in a parallel to the creation of the minicomputer market a decade earlier, but many of these disappeared in the mid-1990s "supercomputer market crash". Today, supercomputers are typically one-of-a-kind custom designs produced by "traditional" companies such as Cray, IBM and HP, which purchased many of the 1980s companies to gain their experience.
The term supercomputer itself is rather fluid, and today's supercomputer tends to become tomorrow's ordinary computer. CDC's early machines were simply very fast scalar processors, some ten times the speed of the fastest machines offered by other companies. In the 1970s most supercomputers were built around a vector processor, and many of the newer players developed their own such processors at lower prices to enter the market. In the early and mid-1980s, machines with a modest number of vector processors working in parallel became the standard, with typical processor counts in the range of four to sixteen. In the late 1980s and 1990s, attention turned from vector processors to massively parallel processing systems with thousands of "ordinary" CPUs, some of them off-the-shelf units and others custom designs. Today, parallel designs are based on "off the shelf" server-class microprocessors, such as the PowerPC, Opteron, or Xeon, and most modern supercomputers are highly tuned computer clusters using commodity processors combined with custom interconnects.

Common uses

Supercomputers are used for highly calculation-intensive tasks such as problems involving quantum mechanical physics, weather forecasting, climate research (including research into global warming), molecular modeling (computing the structures and properties of chemical compounds, biological macromolecules, polymers, and crystals), physical simulations (such as simulation of airplanes in wind tunnels, simulation of the detonation of nuclear weapons, and research into nuclear fusion), cryptanalysis, and the like. Major universities, military agencies and scientific research laboratories are heavy users.

A particular class of problems, known as Grand Challenge problems, consists of problems whose full solution requires computing resources beyond what any practical system can provide.

Relevant here is the distinction between capability computing and capacity computing, as defined by Graham et al. Capability computing is typically thought of as using the maximum computing power to solve a single large problem in the shortest amount of time; often a capability system is able to solve a problem of a size or complexity that no other computer can. Capacity computing, in contrast, is typically thought of as using efficient, cost-effective computing power to solve somewhat large problems or many small problems, or to prepare for a run on a capability system.


Hardware and software design
Supercomputers using custom CPUs traditionally gained their speed over conventional computers through the use of innovative designs that allow them to perform many tasks in parallel, as well as complex detail engineering. They tend to be specialized for certain types of computation, usually numerical calculations, and perform poorly at more general computing tasks. Their memory hierarchy is very carefully designed to ensure the processor is kept fed with data and instructions at all times — in fact, much of the performance difference between slower computers and supercomputers is due to the memory hierarchy. Their I/O systems tend to be designed to support high bandwidth, with latency less of an issue, because supercomputers are not used for transaction processing.

As with all highly parallel systems, Amdahl's law applies, and supercomputer designs devote great effort to eliminating software serialization, and using hardware to address the remaining bottlenecks.
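Amdahl's law bounds the speedup of a program by its serial fraction: if a fraction p of the work parallelizes perfectly across n processors, the speedup is at most 1 / ((1 - p) + p/n). A quick illustration in Python:

def amdahl_speedup(p, n):
    """Upper bound on speedup when a fraction p of the work runs on n processors."""
    return 1.0 / ((1.0 - p) + p / n)

# Even with 95% of the work parallelized, no processor count can push
# the speedup past 1/0.05 = 20x, hence the effort spent on eliminating
# the serial remainder.
for n in (16, 1024, 65536):
    print(n, "processors:", round(amdahl_speedup(0.95, n), 1), "x")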

Supercomputer challenges, technologies

* A supercomputer generates large amounts of heat and must be cooled. Cooling most supercomputers is a major HVAC problem.
* Information cannot move faster than the speed of light between two parts of a supercomputer. For this reason, a supercomputer that is many meters across must have latencies between its components measured at least in the tens of nanoseconds (a back-of-the-envelope check follows this list). Seymour Cray's supercomputer designs attempted to keep cable runs as short as possible for this reason: hence the cylindrical shape of his Cray range of computers. In modern supercomputers built of many conventional CPUs running in parallel, latencies of 1–5 microseconds to send a message between CPUs are typical.
* Supercomputers consume and produce massive amounts of data in a very short period of time. According to Ken Batcher, "A supercomputer is a device for turning compute-bound problems into I/O-bound problems." Much work on external storage bandwidth is needed to ensure that this information can be transferred quickly and stored/retrieved correctly.
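As promised above, a back-of-the-envelope check of the speed-of-light bound:

# Light travels about 0.3 m per nanosecond, so a signal crossing a
# machine a few meters wide needs at least several nanoseconds one way.
c = 3.0e8                                    # speed of light, m/s
for meters in (1, 3, 10):
    ns = meters / c * 1e9
    print(meters, "m ->", round(ns, 1), "ns minimum one-way latency")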

Technologies developed for supercomputers include:

* Vector processing
* Liquid cooling
* Non-Uniform Memory Access (NUMA)
* Striped disks (the first instance of what was later called RAID)
* Parallel filesystems


Processing techniques

Vector processing techniques were first developed for supercomputers and continue to be used in specialist high-performance applications. Vector processing techniques have trickled down to the mass market in DSP architectures and SIMD processing instructions for general-purpose computers.
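NumPy gives an everyday taste of this trickle-down: a single whole-array expression is dispatched to compiled loops that can use the CPU's SIMD instructions, instead of a Python-level loop over elements. A small sketch:

import numpy as np

a = np.arange(1_000_000, dtype=np.float32)
b = np.ones(1_000_000, dtype=np.float32)

# One vectorized expression over the whole arrays; the equivalent
# element-by-element Python loop would be orders of magnitude slower.
c = 2.0 * a + b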

Modern video game consoles in particular use SIMD extensively, and this is the basis for some manufacturers' claim that their game machines are themselves supercomputers. Indeed, some graphics cards have the computing power of several teraFLOPS. The applications to which this power could be applied were limited by the special-purpose nature of early video processing. As video processing has become more sophisticated, graphics processing units (GPUs) have evolved to become more useful as general-purpose vector processors, and an entire computer science sub-discipline has arisen to exploit this capability: General-Purpose Computing on Graphics Processing Units (GPGPU).


Operating systems

Supercomputer operating systems, today most often variants of Linux or UNIX, are every bit as complex as those for smaller machines, if not more so. Their user interfaces tend to be less developed, however, as the OS developers have limited programming resources to spend on non-essential parts of the OS (i.e., parts not directly contributing to the optimal utilization of the machine's hardware). This stems from the fact that because these computers, often priced at millions of dollars, are sold to a very small market, their R&D budgets are often limited. (The advent of Unix and Linux allows reuse of conventional desktop software and user interfaces.)

Interestingly, this has been a continuing trend throughout the supercomputer industry, with former technology leaders such as Silicon Graphics taking a back seat to companies such as AMD and NVIDIA, which have been able to produce cheap, feature-rich, high-performance, and innovative products thanks to the vast number of consumers driving their R&D.

Historically, until the early-to-mid-1980s, supercomputers usually sacrificed instruction set compatibility and code portability for performance (processing and memory access speed). For the most part, supercomputers of this period (unlike high-end mainframes) had vastly different operating systems; the Cray-1 alone had at least six proprietary OSes largely unknown to the general computing community. Similarly, different and incompatible vectorizing and parallelizing Fortran compilers existed. This trend would have continued with the ETA-10 were it not for the initial instruction set compatibility between the Cray-1 and the Cray X-MP, and the adoption of UNIX operating system variants (such as Cray's Unicos and today's Linux).

For this reason, in the future, the highest performance systems are likely to have a UNIX flavor but with incompatible system-unique features (especially for the highest-end systems at secure facilities).


Programming

The parallel architectures of supercomputers often dictate the use of special programming techniques to exploit their speed. Because Fortran has relatively few features and a simple programming model, special-purpose compilers can often generate faster code than C or C++ compilers[citation needed], so Fortran remains the language of choice for scientific programming, and hence for most programs run on supercomputers[citation needed]. To exploit the parallelism of supercomputers, programming environments such as PVM and MPI for loosely connected clusters, and OpenMP for tightly coordinated shared-memory machines, are used.
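For a flavor of message passing, here is a minimal sketch using mpi4py, the Python bindings for MPI. The filename and payload are invented for the example; assuming mpi4py and an MPI runtime are installed, it would be launched with something like: mpiexec -n 2 python hello_mpi.py

from mpi4py import MPI

comm = MPI.COMM_WORLD
rank = comm.Get_rank()

if rank == 0:
    comm.send({"work_item": 42}, dest=1, tag=0)   # point-to-point send
elif rank == 1:
    data = comm.recv(source=0, tag=0)             # matching receive
    print("rank 1 received:", data)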


Software tools

Software tools for distributed processing include standard APIs such as MPI, PVM and VTL, as well as open-source software solutions such as Beowulf, WareWulf and openMosix, which facilitate the creation of a supercomputer from a collection of ordinary workstations or servers. Technology like ZeroConf (Rendezvous/Bonjour) can be used to create ad hoc computer clusters for specialized software such as Apple's Shake compositing application. An easy programming language for supercomputers remains an open research topic in computer science. Several utilities that would once have cost several thousand dollars are now completely free thanks to the open source community, which often creates disruptive technology in this arena.



Monday, September 1, 2008

Networking and the Internet


Computers have been used to coordinate information between multiple locations since the 1950s. The U.S. military's SAGE system was the first large-scale example of such a system, which led to a number of special-purpose commercial systems like Sabre.

In the 1970s, computer engineers at research institutions throughout the United States began to link their computers together using telecommunications technology. This effort was funded by ARPA (now DARPA), and the computer network that it produced was called the ARPANET. The technologies that made the ARPANET possible spread and evolved. In time, the network spread beyond academic and military institutions and became known as the Internet.

The emergence of networking involved a redefinition of the nature and boundaries of the computer. Computer operating systems and applications were modified to include the ability to define and access the resources of other computers on the network, such as peripheral devices, stored information, and the like, as extensions of the resources of an individual computer. Initially these facilities were available primarily to people working in high-tech environments, but in the 1990s the spread of applications like e-mail and the World Wide Web, combined with the development of cheap, fast networking technologies like Ethernet and ADSL, saw computer networking become almost ubiquitous. In fact, the number of computers that are networked is growing phenomenally. A very large proportion of personal computers regularly connect to the Internet to communicate and receive information. "Wireless" networking, often utilizing mobile phone networks, has meant networking is becoming increasingly ubiquitous even in mobile computing environments.

Further topics
Hardware


The term hardware covers all of those parts of a computer that are tangible objects. Circuits, displays, power supplies, cables, keyboards, printers and mice are all hardware.
Software
Software refers to parts of the computer which do not have a material form, such as programs, data, protocols, etc. When software is stored in hardware that cannot easily be modified (such as BIOS ROM in an IBM PC compatible), it is sometimes called "firmware" to indicate that it falls into an uncertain area somewhere between hardware and software.

Programming languages


Programming languages provide various ways of specifying programs for computers to run. Unlike natural languages, programming languages are designed to permit no ambiguity and to be concise. They are purely written languages and are often difficult to read aloud. They are generally either translated into machine language by a compiler or an assembler before being run, or translated directly at run time by an interpreter. Sometimes programs are executed by a hybrid method of the two techniques. There are thousands of different programming languages—some intended to be general purpose, others useful only for highly specialized applications.

Professions and organizations


As the use of computers has spread throughout society, an increasing number of careers involve working with them. Following the theme of hardware, software and firmware, the brains of the people who work in the industry are sometimes known irreverently as wetware or "meatware".