Saturday, June 1, 2013

LOSSES IN TRANSMISSION LINES

Copper Losses
One type of copper loss is I²R LOSS. In rf lines the resistance of the conductors is never equal to zero. Whenever current flows through one of these conductors, some energy is dissipated in the form of heat. This heat loss is a POWER LOSS. Copper braid, which has a higher resistance than solid tubing, suffers a correspondingly higher power loss.
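The I²R relationship above can be sketched numerically; the current and resistance values below are illustrative only, not taken from any particular line:

```python
def copper_loss_watts(current_a: float, resistance_ohms: float) -> float:
    """Power dissipated as heat in the conductor resistance: P = I^2 * R."""
    return current_a ** 2 * resistance_ohms

# Example: 2 A of line current through 0.5 ohm of conductor resistance.
print(copper_loss_watts(2.0, 0.5))  # 2.0 W dissipated as heat
```

Doubling the current quadruples the heat loss, which is why conductor resistance matters so much at high power levels.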
Another type of copper loss is due to SKIN EFFECT. When dc flows through a conductor, the movement of electrons through the conductor's cross section is uniform. The situation is somewhat different when ac is applied. The expanding and collapsing fields about each electron encircle other electrons. This phenomenon, called SELF INDUCTION, retards the movement of the encircled electrons. The flux density at the center is so great that electron movement at this point is reduced. As frequency is increased, the opposition to the flow of current in the center of the wire increases. Current in the center of the wire becomes smaller and most of the electron flow is on the wire surface. When the frequency applied is 100 megahertz or higher, the electron movement in the center is so small that the center of the wire could be removed without any noticeable effect on current. You should be able to see that the effective cross-sectional area decreases as the frequency increases. Since resistance is inversely proportional to the cross-sectional area, the resistance will increase as the frequency is increased. Also, since power loss increases as resistance increases, power losses increase with an increase in frequency because of skin effect.
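The skin-effect argument can be made concrete with the standard skin-depth formula, δ = √(ρ / (π f μ)). This equation is not given in the text above, and the copper values below are typical handbook numbers used for illustration:

```python
import math

def skin_depth_m(freq_hz, resistivity=1.68e-8, mu=4e-7 * math.pi):
    """Depth at which current density falls to 1/e of its surface value.
    Defaults: copper resistivity (ohm-metres) and a non-magnetic conductor."""
    return math.sqrt(resistivity / (math.pi * freq_hz * mu))

# Skin depth shrinks as the square root of frequency:
for f in (1e6, 1e8):
    print(f"{f:.0e} Hz: {skin_depth_m(f) * 1e6:.1f} micrometres")
# At 100 MHz the current flows in a layer only ~6.5 micrometres deep, so the
# effective cross-sectional area (and hence the resistance) is set by the
# wire's surface, not its full cross section.
```

This is also why silver plating works: nearly all of the rf current flows in the thin plated layer.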
Copper losses can be minimized and conductivity increased in an rf line by plating the line with silver. Since silver is a better conductor than copper, most of the current will flow through the silver layer. The tubing then serves primarily as a mechanical support.


Dielectric Losses
DIELECTRIC LOSSES result from the heating effect on the dielectric material between the conductors. Power from the source is used in heating the dielectric. The heat produced is dissipated into the surrounding medium. When there is no potential difference between two conductors, the atoms in the dielectric material between them are normal and the orbits of the electrons are circular. When there is a potential difference between two conductors, the orbits of the electrons change. The excessive negative charge on one conductor repels electrons on the dielectric toward the positive conductor and thus distorts the orbits of the electrons. A change in the path of electrons requires more energy, introducing a power loss.
The atomic structure of rubber is more difficult to distort than the structure of some other dielectric materials. The atoms of materials, such as polyethylene, distort easily. Therefore, polyethylene is often used as a dielectric because less power is consumed when its electron orbits are distorted.
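The heating described above is usually quantified with the dielectric's loss tangent, tan δ. That formula (P = 2πfCV²·tan δ) is not stated in the text, and the loss tangents below are rough order-of-magnitude values used only to illustrate why polyethylene dissipates far less power than rubber:

```python
import math

def dielectric_loss_w(freq_hz, capacitance_f, v_rms, tan_delta):
    """Standard loss-tangent model: P = 2*pi*f * C * V^2 * tan(delta)."""
    return 2 * math.pi * freq_hz * capacitance_f * v_rms ** 2 * tan_delta

# Rough, illustrative loss tangents: polyethylene ~0.0002, rubber ~0.01.
for name, tan_d in (("polyethylene", 2e-4), ("rubber", 1e-2)):
    p = dielectric_loss_w(1e7, 100e-12, 50.0, tan_d)
    print(f"{name}: {p * 1e3:.2f} mW")
# With the same line capacitance and voltage, the rubber dielectric
# dissipates 50 times the power of the polyethylene one.
```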


Radiation and Induction Losses
RADIATION and INDUCTION LOSSES are similar in that both are caused by the fields surrounding the conductors. Induction losses occur when the electromagnetic field about a conductor cuts through any nearby metallic object and a current is induced in that object. As a result, power is dissipated in the object and is lost.
Radiation losses occur because some magnetic lines of force about a conductor do not return to the conductor when the cycle alternates. These lines of force are projected into space as radiation and this results in power losses. That is, power is supplied by the source, but is not available to the load.

SPECIFYING OR DESIGNING RADIATED MEASUREMENT SYSTEMS

When specifying or designing any measurement receiver system, one should consider that the "system" will include other devices such as antennas, amplifiers, cabling, and possibly filters. Because a receiver's selectivity (the ability to select frequencies or frequency bands) is primarily a function of the receiver's tuner design and will chiefly depend on the individual receiver selected, selectivity will not be specifically addressed in this text. Receiver system sensitivity, however, presents one of the greatest challenges when designing or specifying receiver measurement systems. Therefore, the sensitivity of the two basic types of receiver systems, one with a pre-amplifier and one without, will be addressed in some detail.

Because antennas are not perfect devices and have associated "losses," the following examples will include explanations of these error corrections. As mentioned previously, amplifiers will amplify not only the emissions being measured but also ambient electromagnetic noise, and these ambient conditions can drastically change the overall sensitivity of a measurement system. Another potential problem is that amplifiers, being active devices, also generate internal electromagnetic noise, which they introduce into the receiver system, again influencing the total system's noise level and thus its sensitivity. Corrections for these problems are necessary to accurately calculate both the receiver's signal input sensitivity and (more importantly) the total system's ambient sensitivity. Without knowing the total measurement system's ambient sensitivity, measurements may not be possible down to anticipated emission levels. In electromagnetic measurement work, terms such as ambient sensitivity, system sensitivity, and receiver sensitivity have been used interchangeably.

Equally confusing are commonly used terms such as "receiver noise floor" and "system noise floor."

THE RECEIVER AND AMPLIFIER

A receiver is an electronic device that receives electromagnetic energy captured by the antenna and then processes (extracts) the information, or data, contained in the "signal." The basic function of all receivers is the same regardless of their specific design: broadcast radio receivers receive and reproduce commercial broadcast programming, and likewise, TV receivers detect and reproduce commercial television programming. Special, or unique, receivers are sometimes needed to detect and measure all types of radiated, or transmitted, electromagnetic emissions. These specialized receivers may be called tuned receivers, field intensity meters (FIMs), or spectrum analyzers.

Radiated emissions that receiver systems may be required to measure can be generated by intentional radiators or unintentional radiators. Intentionally radiated signals may carry analog information, such as audio, or digital data, such as radio navigation beacon transmissions. Television transmissions, for example, contain both analog and digital information. This information is placed on the transmitted emission, called the "carrier," by a process called "modulation." There are many different types of modulation, the most common being amplitude modulation (AM) and frequency modulation (FM). Receivers detect, or extract, the information from radiated emissions by a process called "demodulation," the reverse of modulation.
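As a rough sketch of the modulation/demodulation idea for the AM case, the snippet below amplitude-modulates a tone onto a carrier and then recovers it by envelope detection (rectify, then low-pass). The sample rate, frequencies, and modulation depth are arbitrary toy values chosen for clarity:

```python
import math

fs, fc, fm, depth = 100_000, 10_000, 500, 0.5  # sample rate, carrier, tone, AM depth
n = 2000
t = [i / fs for i in range(n)]
message = [math.sin(2 * math.pi * fm * ti) for ti in t]
carrier = [math.cos(2 * math.pi * fc * ti) for ti in t]

# Modulation: the message rides on the carrier's amplitude.
am = [(1 + depth * m) * c for m, c in zip(message, carrier)]

# Demodulation: rectify, then average over one carrier period (a crude
# low-pass filter). The result tracks (1 + depth * message), i.e. the
# original tone riding on a DC offset.
rectified = [abs(x) for x in am]
win = fs // fc  # samples in one carrier period
envelope = [sum(rectified[i:i + win]) / win for i in range(n - win)]
```

Real receivers use tuned filters and proper detectors rather than a moving average, but the rectify-and-smooth structure is the essence of AM envelope detection.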

Many radiated emissions requiring measurement do not contain any useful information or data at all. For example, radiated emissions from unintentional radiators, such as computer systems, are essentially undesired byproducts of electronic systems and serve no useful purpose. These undesired emissions can, however, cause interference to communications systems and, if strong enough, to other unintentional radiating devices. Radiated signals (if strong enough) can also present possible health hazards to humans and animals. Because these emissions must be measured to determine any potential interference problems or health hazard risks, specialized receiver systems must be used.

An important parameter for any receiver is its noise figure, or noise factor. This parameter will basically define the sensitivity that can be achieved with a particular receiver.
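A common way to express this (not given in the text above) starts from the thermal noise floor, -174 dBm/Hz at room temperature, and adds the measurement bandwidth and the receiver's noise figure. The 10 dB noise figure and 120 kHz bandwidth below are illustrative assumptions:

```python
import math

def sensitivity_dbm(noise_figure_db, bandwidth_hz, required_snr_db=0.0):
    """Minimum detectable signal: kTB floor (-174 dBm/Hz at 290 K) plus
    bandwidth, receiver noise figure, and any SNR the measurement requires."""
    return -174.0 + 10 * math.log10(bandwidth_hz) + noise_figure_db + required_snr_db

# A receiver with a 10 dB noise figure in a 120 kHz measurement bandwidth:
print(f"{sensitivity_dbm(10.0, 120e3):.1f} dBm")  # about -113.2 dBm
```

Narrowing the bandwidth or lowering the noise figure both push the sensitivity floor down, which is exactly the trade the system designer is making.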
An amplifier, usually called a pre-amplifier, is sometimes required when attempting to measure very small signals or emission levels. Because these devices amplify signals, they will also amplify ambient electromagnetic noise. If improperly used, amplifiers can detract from the overall system's sensitivity and can overload the receiver's tuner input stage. Overloading a tuner's input stage simply means supplying a larger signal amplitude than the tuner's input circuitry can handle, thus saturating the input stage.
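The benefit of a pre-amplifier can be estimated with Friis's cascade formula, F = F1 + (F2 - 1)/G1 (in linear terms), which is not stated in the text above; the stage values below are illustrative:

```python
import math

def cascade_noise_figure_db(nf1_db, gain1_db, nf2_db):
    """Friis's formula for two cascaded stages, evaluated in dB."""
    f1 = 10 ** (nf1_db / 10)   # pre-amplifier noise factor (linear)
    g1 = 10 ** (gain1_db / 10) # pre-amplifier gain (linear)
    f2 = 10 ** (nf2_db / 10)   # receiver noise factor (linear)
    return 10 * math.log10(f1 + (f2 - 1) / g1)

# A 3 dB-NF, 20 dB-gain pre-amplifier ahead of a 10 dB-NF receiver:
print(f"{cascade_noise_figure_db(3.0, 20.0, 10.0):.2f} dB")
# The pre-amp's gain masks the receiver's noise, so the system noise figure
# lands near the pre-amp's 3 dB rather than the receiver's 10 dB.
```

The same formula shows the risk: a lossy cable ahead of the pre-amp (negative gain) raises the whole system's noise figure.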

THE ANTENNA

Measuring radiated emissions, or electromagnetic energy, begins with the antenna. Antennas are devices that receive (capture) electromagnetic energy traveling through space; they can also be used to transmit it. There are many different types of antennas: some are designed to be "broad-banded," receiving or transmitting over a large frequency range, and some are designed to receive or transmit at specific frequencies. In any case, all receive antennas are intended to capture "off-air" electromagnetic energy and deliver these "signals" to a receiver. For this discussion, electric fields (E) will mainly be addressed.

Because antennas can capture only a small portion of the radiated power, or energy, a correction factor must be added to the detected emission levels to accurately determine the radiated power being measured. The actual power received by an antenna is determined by multiplying the power density of the emission by the effective receiving area of the antenna, Ae. This antenna correction factor is called the "antenna factor."
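In practice the antenna factor is applied in decibel form: the field strength at the antenna is the receiver's voltage reading plus the antenna factor plus any cable loss. The numbers below are illustrative values, not calibration data for any particular antenna:

```python
def field_strength_dbuv_m(reading_dbuv, antenna_factor_db, cable_loss_db=0.0):
    """E (dBuV/m) = receiver reading (dBuV) + antenna factor (dB/m) + cable loss (dB)."""
    return reading_dbuv + antenna_factor_db + cable_loss_db

# A 40 dBuV receiver reading through 2 dB of cable, antenna factor 12 dB/m:
print(field_strength_dbuv_m(40.0, 12.0, 2.0))  # 54.0 dBuV/m
```

Antenna factors vary with frequency, so a calibrated table for the specific antenna is applied point by point across the measured band.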

Sunday, May 12, 2013

Cutting Costs Sees an Increase in Profits for Dell

Good news recently came out of Dell as the computer company reported that its net income for the last quarter nearly tripled as Dell benefited from lower computer component costs and growth in certain areas of its more profitable product lines.

Dell's shares rose 5% in extended trading as the company beat analysts' adjusted net income estimates but fell a bit short on revenue. For its first fiscal quarter, which ended on April 29th, Dell earned $945 million, or about $0.49 per share, up from $341 million, or $0.17 per share, a year earlier.

Excluding one-time items, Dell earned $0.55 per share, which easily beat Wall Street's expectations; analysts polled by FactSet had estimated adjusted earnings of $0.43 per share. Revenue rose only 1% to $15.02 billion from $14.9 billion last year, short of the predicted $15.4 billion. Product revenue was flat at $12.1 billion, with services revenue rising 6% to $3.0 billion.

Dell's consumer division, which accounts for nearly 20% of the company's revenue, dropped 7% to $3.0 billion. Consumer demand also fell more than anticipated, and in an interview CFO Brian Gladden attributed some of the decline to "the market for consumer PCs being saturated in developed countries." He added that "while tablet computers are still a small portion of the PC market, there's clearly an impact for them on consumer demand for traditional PCs."

Revenue from large enterprises increased by 5% to $4.5 billion with revenue from small and medium-sized businesses increasing 7% to $3.8 billion. Public sector revenue, on the other hand, saw a decline of 2% to $3.8 billion. Dell saw the biggest gain in servers and networking. In this category revenue rose 11% to $2.0 billion. Sales of desktop PCs fell 8% to $3.3 billion with mobile PCs rising 3% to $4.7 billion.

Dell has been working hard to increase the proportion of server computers, data storage devices, and technology consulting services it sells. According to Dell, these areas are more profitable than the company's base PC business. However, compared with one year ago, most of Dell's product categories accounted for nearly the same percentage of revenue, and computers for consumers and businesses continued to make up over half of Dell's revenue.

However, Dell's gross margin, a key indicator of the efficiency of its business, came in at 22.9%, higher than the 20.4% expected by analysts polled by Reuters. According to Gladden, Dell's strategy of focusing on more profitable areas of business and cutting back on lower-margin offerings is working extremely well.

Andy Hargreaves, an analyst for Pacific Crest, thinks that Dell's gross margin is "impressive" and stated that "Dell should be able to keep it up for now." Hargreaves also stated, "They do have the potential to sustain margins long-term, but in order to do so they have to drive toward more services-oriented businesses."

Taking a look at this current quarter, Dell is predicting that revenue will rise by a percentage in the mid-single digits over the first quarter, slightly faster than its seasonal 2% to 3% growth. Analysts are expecting somewhere around $16 billion. Dell continues to expect revenue to grow 5% to 9% for the full fiscal year which implies a total of $64.6 billion to $67 billion with analysts expecting around $64.4 billion.

Dell saw shares rise $0.86, or roughly 5.4%, to a total of $16.76 in extended trading. The stock finished regular trading down $0.10 to $15.90.

PayPal's Peter Thiel Pays Students to Skip College

Senior year is stressful for a lot of students. Most concentrate on getting good grades and academic honors so they can get into a good college and have a better life someday, and many work hard to earn money for college. However, two dozen students from around the country will, instead of going to college, be paid not to go to school.

That's right, 24 gifted technical students from around the country will each be given a $100,000 scholarship by San Francisco tech tycoon Peter Thiel, with one catch: they must not go to college this coming fall. Instead, these students are receiving the $100,000 so they can chase their dreams for the next two years.

"It seems like the perfect point in our lives to pursue this kind of project," stated Nick Cammarata, a gifted computer programmer who was recently accepted into the esteemed computer science program at Carnegie Mellon University. He, along with 17-year-old David Merfield, will be working on software designed to upend the standard approach to high school teaching. Merfield is turning down an opportunity to attend Princeton University in order to participate in the program.

Each applicant for the scholarship was asked to design a project to change the world. Thiel personally hand-picked the winners based on these projects. While all the ideas span different disciplines, they all have a high technology angle to them. According to Thiel, "One winner wants to create a mobile banking system for the developing world. Another is working to create cheaper biofuels. One wants to build robots that can help around the house."

This scholarship could not have come at a more interesting, and quite possibly more crucial, time, as the debate over higher education's value heats up. Thousands of new graduates are swimming in student loan debt while facing one of the hardest job markets in decades, and many people are pondering whether a college education is worth it given rising tuition and diminishing prospects.

"Turning people into debt slaves when they're college students is really not how we end up building a better society," Thiel added. Thiel made his fortune as co-founder of PayPal shortly after graduating from Stanford Law School; he then became the first major investor in Facebook. Thiel is adamant in his belief that innovation has stagnated in the United States and that radical solutions are needed to push civilization forward.

One such effort is the "20 Under 20" fellowship. Thiel believes that the brightest young minds are able to contribute more to society by skipping college and bringing their ideas to the real world right away. However, not everyone can be as fortunate as Thiel and Mark Zuckerberg of Facebook.

Vivek Wadhwa, Director of Research at Duke University's Center for Entrepreneurship, disagrees with Thiel and sees his new program as sending a message that anybody can be Mark Zuckerberg. "Silicon Valley lives in its own bubble. It sees the world through its own prism. It's got a distorted view," Wadhwa stated.

Wadhwa also added, "All the people who are making a fuss are highly educated. They're rich themselves. They've achieved success because of their education. There's no way in hell we would have heard about Peter Thiel if he hadn't graduated from Stanford."

Thiel retorted that "20 Under 20" should not be judged on the basis of his own educational background or the merits of his critique of higher education, and he has urged critics to wait and see what these individuals achieve over the next two years.

Studies from the past few years have noted that individuals who received a college degree were laid off during the "Great Recession" at a much lower rate than individuals without college degrees. In addition to that, individuals with college degrees were also more likely to be rehired.

Could this be a new revolution in higher education? Or will the world push these students, as well as their ideas, away due to their lack of college education?

Lockheed-Martin Purchases D-Wave's First Quantum Computer

D-Wave, out of Canada, has just sold the first of its commercial quantum computers, and the buyer is Lockheed-Martin. It wasn't your average sale, however: D-Wave closed the deal amid an ongoing debate over whether its machine truly is a quantum computer.

Back in February 2007, D-Wave demonstrated a machine that could, in principle, solve problems regular computers are incapable of solving. The qualifier "in principle" matters because the tests actually run on the machine were not impossible for a regular computer, which created a fair bit of doubt among some observers that the chip was actually performing quantum-mechanical computations.

The computer works differently from the regular "gate model" of quantum computing, in which each quantum bit can be encoded as 0, 1, or both simultaneously. D-Wave's machine instead uses an approach researchers call "adiabatic quantum computing," or "quantum annealing." However, some disagree that this process is actually, truly quantum computing.

But despite all this, Lockheed-Martin wasn't turned away. The company just recently signed a deal with D-Wave to purchase a quantum computer for an estimated $10 million. This agreement will span multiple years and include system maintenance as well as various other professional services.

As of right now, it is unclear what Lockheed-Martin plans on doing with the computer. However, according to D-Wave's President and CEO Vern Brownell, "Our combined strength will provide capacity for innovation needed to tackle important unresolved computational problems of today and tomorrow. Our relationship will allow us to significantly advance the potential of quantum computing."

This is the second biggest deal the company has signed in the past couple of years with the biggest being a tie-up with Google in order to improve image search algorithms. Despite the fact that D-Wave's technology has not been 100% proven, Lockheed-Martin has still seen it as worthy of a $10 million investment. If anything, it gives them first access to this kind of technology.

Dell Computers Are a Green Choice

In today’s society, many are encouraged to ‘go green’ and help save our planet’s resources. These days our society is focusing on reusing, reducing, and recycling materials. Dell computers are a great choice for those who want to become more eco-friendly. There are many different things about Dell computers that qualify them as ‘green’.



One thing that qualifies Dell computers as green is what they're made of. The company consistently finds new ways to incorporate recycled and less harmful materials into its products. Much of the plastic used to make Dell computers is recycled, as is the plastic used in their packaging. Dell has also replaced many of the harmful chemicals used to produce its computers with less harmful or harmless materials. One major step was introducing more computers with LCD screens, which reduce the use of mercury.



Another great thing about Dell computers is the packaging that they come in. Besides using recycled plastic, Dell also uses bamboo when packaging its computers. Bamboo is biodegradable and is a natural, renewable resource. It also has a lot of tensile strength, which makes it ideal for packaging and protecting the computer.

Not only are the products that Dell creates eco-friendly, but so is the way in which they are manufactured. The company is one of the top five purchasers of renewable energy in the U.S. and is number one in the computer industry. Dell also strives to reduce its water usage: many Dell facilities have installed more efficient water fixtures and re-use water for landscaping irrigation after it is treated by on-site sewage treatment facilities. Dell also does not create any industrial wastewater.



Outside of creating and shipping its products, Dell is also eco-friendly in that it has set up a partnership with Goodwill that encourages people to donate their old computer equipment, no matter what brand, in order to help people in economic crises. Dell has also partnered with The Conservation Fund and Carbonfund.org to support the Plant a Tree Program, which uses donations to plant trees in deforested areas in order to offset greenhouse gas emissions.

Fake MacDefender Malware Originating from Russian Payment Processor

For about a month, a piece of malware called MacDefender, which poses as anti-virus software, has been circulating and plaguing Apple computer owners. No one seemed to know where it was coming from, but on Friday, May 27, a computer security researcher made the claim that the malware could be traced back to an online Russian payment processor called ChronoPay.

"Some of the recent scams that used bogus security alerts in a bid to frighten Mac users into purchasing worthless security software appear to have been the brainchild of ChronoPay, Russia's largest online payment processor and something of a pioneer in the rogue anti-virus business," wrote security researcher Brian Krebs on his KrebsonSecurity blog.

MacDefender and the strikingly similar scareware programs MacProtector and MacSecurity tended to spread through vectors like infected Google Image search results. Once a computer is infected, the malware is incredibly difficult for Mac users to remove: it doesn't have a dock icon, and it attaches itself to the computer's launch menu.

Krebs was able to trace the newest strains of the scareware back to ChronoPay by examining the two domains to which the software directs Mac users for a paid security "solution." While investigating, he found that both mac-defence.com and macbookprotection.com were associated with the e-mail address fc@mail-eye.com. According to leaked ChronoPay documents, this e-mail address is owned by Alexandra Volkova, the company's financial controller.

According to Krebs, both of the Mac domains listed above have been suspended by Webpoint.com, a Czech registrar; however, the fc@mail-eye.com account was recently used to register appledefense.com and appleprodefense.com. So far, Mac users have not reported being directed to either of these sites by malware like MacDefender.

"ChronoPay has been an unabashed 'leader' in the scareware industry for quite some time," Krebs writes. Back in 2008, it was the core payment processor for trafficconvertor.biz, a rogue "anti-virus" site associated with the first strain of the Conficker worm, a destructive piece of malware that went on to infect millions of computers across the globe.

"In the coming days, Apple will deliver a Mac OS X software update that will automatically find and remove MacDefender malware and its known variants," Apple wrote. "The update will also help protect users by providing an explicit warning if they download this malware."

Apple also released a document with detailed instructions for Mac users on ways to eliminate MacDefender from their computers.

Saturday, May 11, 2013

Insane Demand Causes Google to Shut Down Invite Process for Google+

Just a single day after unveiling Google+, its brand-new social networking service, Google decided late Wednesday afternoon to open up the invitation process to everyone who had already been invited to participate in the service.

Before, the keys to Google+ were given only to a select few who were allowed in at launch, so naturally an invite was a precious and prized commodity. Google has stressed that Google+ is in its very early stages, but whatever Google was aiming for with its field trial must have been pretty darn successful: every single person who was invited to the service became able to invite other people as well.

But that wasn't all: the people who were invited by the initial users were able to invite friends of their own too. Even better, people writing about the decision to open up the invitation process were suddenly struck with a lot of popularity among readers. MG Siegler from TechCrunch wrote on his Google+ page, about his own story, "I'm not sure any TechCrunch post has gotten comments at such a fast rate."

This amount of attention seems like it must have been a bit more than Google had asked for. After nearly six hours, Google shut the invitation process down over what Senior Vice President of Engineering at Google Vic Gundotra called "insane demand". Gundotra posted on his Google+ page late on Wednesday night, "We need to do this carefully and in a controlled way." Gundotra did not specify when the invite functionality would be back up and running.

Google+ Making Some Big Developments

Just a few weeks out of the gate and Google+ is already becoming highly popular among its early adopters. It is estimated that as many as 5 million users are already a part of the service's user base with many websites opting to adopt the +1 button. However, despite all that, it is still unclear as to whether or not Google+ is here to stay or if it will fail just like Google's other attempts at social networking.

In the meantime, things are looking positive for Google+. If you have not signed up yet, or have been unable to, then you might want a little peek at what is going on inside the pages of the service. Here are five notable developments pertaining to Google+ since it launched.

Increase in Google+ Share Buttons
While Google+ may be nowhere near Facebook, it is giving Twitter a run for its money. According to different sources, there are already more Google+ Share buttons, or +1 buttons, on the internet than Twitter share plugins. In a search of the 10,000 most viewed websites, nearly 4.5% have +1 buttons while only 2.1% have Twitter buttons.

MySpace and Digg Heavy on Google+
Kevin Rose, founder of Digg, recently reported that he would be moving his domain, KevinRose.com, to his Google+ page, which will move his online activity to that network as well. In addition to Kevin Rose, MySpace co-founder Tom Anderson also has an extensive Google+ presence. Anderson's presence is so prevalent that he even used the service to write a blog post praising Facebook and CEO Mark Zuckerberg for the site's recent integration of Skype.

Invites Stabilizing 
Google+ was so inundated with invite requests that Google got into the habit of randomly turning the invite feature on and off, which was getting rather annoying. Users also griped that by the time the friends they invited finally got around to checking out the network, the invite feature would be turned off. However, all that seems to have stabilized, for now at least: the invite link has remained unchanged and working since Saturday.

Tips and Tricks from Users
Some of the more dedicated users of Google+ have already joined forces and created a tips and tricks guide known as Google+: A Collaborative Document, which can be found on Google Docs. The document is over 40 pages long and covers everything from creative ways to use circles and tips on privacy to how to send private messages, plus an extensive guide for Chrome and Safari. In addition, the guide, which is in English, is being translated into Chinese, German, and Russian.

Business Pages Coming Soon
A lot of enterprises are reportedly very anxious to get onto Google+, and Google is planning to open up its social experiment to business pages in the "near future." Many people are wondering whether, when Google+ offers business pages, Google Offers (Google's answer to Groupon and Facebook Deals) will expand into Google+ as well.

So there you have it: five of the top things that have been happening on Google+ since it launched. If you haven't gotten an invite yet, you might want to start making more friends or find some way onto the network. Even though Google+ is still in somewhat of a trial stage, it is already gaining immense popularity.

Java Standard Edition 7 Finally Released by Oracle

Oracle has finally shipped Java Platform Standard Edition 7, otherwise known as Java SE 7, in what is the first major update to the programming language in over five years. Oracle let news of this out in a company announcement yesterday. This is also the very first release of Java SE under the ownership of Oracle.

According to Oracle Chief Java Architect Mark Reinhold in a webcast earlier in the month, "We all know for various business and political reasons that this release has taken some time."

According to an estimate by Oracle, some 9 million developers from around the globe use Java. Tiobe Software also estimates that Java is the most widely used programming language in the world, bumping off C and obliterating C++ with twice as many users. Over 3 billion devices around the world run Java and it is deployed by 97% of enterprise desktops worldwide. In addition to that, the Java runtime is downloaded over a billion times each year.

Since Oracle acquired Java as part of its January 2010 acquisition of Sun Microsystems, the company has come under a lot of scrutiny from a plethora of different quarters for its management. Back in December, the Apache Software Foundation withdrew its participation from the Java Community Process, stating that Oracle did not govern Java as a truly open specification. Oracle has also sued Google for "inappropriate use of Java" in Google's Android mobile OS.

According to Mark Little, Senior Director of Engineering for Red Hat's Middleware Business and Red Hat's primary liaison to the JCP, however, "The new release is solid, though it is more of an incremental release than anything else."

The new version of Java addresses many of the trends that have overtaken the field of computer programming over the past 10 years. It offers improved support for the growing number of non-Java dynamic languages designed to run on the Java Virtual Machine. It also features an API that simplifies running a program across multiple processor cores, and the range of actions programs can take with file systems has been vastly expanded.

New Finger and Stylus Gestures Could Be Coming to Windows 8

A lot of people are excited about the upcoming operating system from Microsoft, known for now as Windows 8. Windows 8, or whatever it is named upon release, is the highly anticipated successor to the immensely popular Windows 7 operating system currently out from Microsoft. However, some new developments are leaking their way onto the internet, including several recently published Microsoft patent filings that point to new finger and stylus gestures that could be incorporated into Windows 8 tablets.

These patents were actually filed back in 2010, but Microsoft only published them last week. They show that Microsoft believes finger and stylus gestures can work not only separately but together, offering a user-friendly means of input on new touch screen devices. A system recognizing both would treat the finger as the first input method and the stylus as the second. Combining the two could then create a variety of gestures, many of which seem geared toward image editing, though they could be put to other uses as well.

For example, a copy gesture would allow you to tap on an object with your finger and then move it around with your stylus. A cut gesture would allow you to split an object in half using fingers, a stylus or both at the same time. A brush gesture would assist you in removing part of an image and then storing it somewhere else on the screen. A staple gesture may be used to stack multiple objects or images on top of each other and a stamp gesture would create duplicates of whatever you wanted.

The upcoming Windows 8 operating system for tablets was unveiled by Microsoft at the D9 Conference back in June. Microsoft is expected to release more information and details about the OS at the upcoming Build Conference in September, with a beta possibly ready for developers to test out at that time as well.

Firefox 6 Ready for August 16 Release

Mozilla says it is on track to release Firefox 6 next week, according to meeting notes posted on the company's website. Per those notes, developers have signed off on Firefox 6 and anticipate no problems that could delay the August 16 release of the latest upgrade to Mozilla's web browser.

According to the notes, "On track with a few bugs still remaining. No concerns for Tuesday." Mozilla has used a new rapid-release schedule since the spring of this year. The schedule delivers a new version of Firefox every six weeks, a move that many analysts and critics compare to the one Google has used to update its Chrome browser for the last year.

Firefox 5 was released back on June 21, six weeks ago next Tuesday. Mozilla is already working on Firefox 7 and plans on releasing it on September 27. In addition to that, if the rapid-release schedule works like it is supposed to, Firefox 8 will become available on November 8 with Firefox 9 becoming available on December 20.

Firefox 6 includes multiple, noticeable changes to the browser, including highlighting domain names in the address bar. Both Google Chrome and Internet Explorer 9 do a similar thing by making domain names boldfaced. In addition to that, Firefox 6 also reduces start-up time when users rely on Panorama, the multi-tab organizer for Firefox.

Some users, however, are upset by Mozilla's change of pace, including corporations like IBM that have installed the open-source browser on tens of thousands of Windows PCs. That has not frightened Mozilla, which has not backed off the rapid-release schedule at all. In response to enterprise complaints and concerns, though, Mozilla has established a committee to take feedback from users.

As well as releasing Firefox 6 next week, Mozilla also plans on releasing Firefox 3.6.20, an update that will include security patches and other fixes to the 2010 edition retained by nearly 1 in 3 users of Firefox. When Firefox 6 ships, users running Firefox 4 or Firefox 5 will be offered the newest edition via the browser's update mechanism that is triggered when the "About Firefox" dialog is opened.

As of the end of July, only 11% of Firefox users were still running Firefox 4 and 48% were running Firefox 5. What do you think? Are you excited about the release of Firefox 6? What are your thoughts on Mozilla's new rapid-release schedule? Let me know in the comments section below. 

IBM Develops Computer Chip That Acts Like a Brain

Artificial Intelligence is something that science fiction movie directors love to utilize. Many directors have envisioned a future with robots and things that can act and behave like humans. However, something like that has always been out of reach for us, until now.

IBM recently introduced a new experimental neurosynaptic computer chip that emulates brain function in areas like cognition, perception and action. According to statements from IBM, these new chips will use algorithms and silicon circuitry in order to recreate spiking neurons and synapses in the brain.

These new chips will be embedded into cognitive computers. Unlike the computers you and I use every day, these machines will not be programmed to perform specific actions. Instead, they will learn through experience, create hypotheses and remember outcomes.

According to Project Leader for IBM Research Dharmendra Modha, "These chips are another significant step in the evolution of computers from calculators to learning systems, signaling the beginning of a new generation of computers and their applications in business, science and government."

The chips are being designed and created by IBM, which is working alongside multiple university collaborators. DARPA, the Defense Advanced Research Projects Agency, has already awarded IBM $21 million in funding for the research as part of Phase 2 of the Systems of Neuromorphic Adaptive Plastic Scalable Electronics, or SyNAPSE (I hope they pay their acronym creators handsomely), project.

SyNAPSE is a project designed to create a computer system that can analyze complex information from multiple sensors and adapt itself automatically based on its interaction with the environment. Regardless, this technology is extremely impressive, if rather scary. If they are already making computer chips that can think, I wonder how long it will be before they make ones that take over humanity? I guess we will just have to wait and see.

HP Launches New Compaq 8200 Elite All-In-One PC

A few days after announcing that it was planning to shut down its PC business, HP has launched an all new all-in-one desktop aimed specifically at business customers.

Branded the HP Compaq 8200 Elite, this computer comes with a 23-inch HD LED display as well as your choice of either Windows 7 Professional, Ultimate or Home Premium as your operating system.

While all-in-one computers like this aren't really breaking stories, this one is a bit of a surprise given HP's previously reported stance on computers. HP announced last week that it would discontinue its TouchPad tablet and basically shutter its WebOS operation.

In addition to that, HP also stated that it is looking to find a new direction for its PC business, the Personal Systems Group (PSG), as it refocuses the attention of its business around software solutions instead.

According to a statement that was recently released by HP, "HP will consider a broad range of options that may include, among others, a full or partial separation of PSG from HP through spin-off or other transactions."

The HP Compaq 8200 Elite is the first device from the company to use Intel's second generation Core vPro technology in order to boost performance and hard drive access. Moreover, the device comes with a one year license to HP Virtual Rooms which allows you to set up online conference centers for meetings or presentations.

The HP Compaq 8200 Elite will also come with up to 8GB of memory, the option of solid state drives and hardware-based encryption. Employees will also find an integrated webcam as well as integrated speakers with SRS premium sound.

The HP Compaq 8200 Elite all-in-one computer is available now for a starting price of $999. This price nets you an Intel Pentium dual-core G850 processor as well as all the other standard features. 

Mozilla's Rapid Release Schedule for Firefox Comes Under Scrutiny

Mozilla's new rapid release schedule for its Firefox internet browser, which was intended as a positive change for the company, has come under a lot of scrutiny in the past few weeks, and over the weekend even more fuel was added to the fire.

The criticism this time came from a former volunteer for the project, Tyler Downer. Downer recently left the project after three years after becoming increasingly frustrated with what he describes as a "broken" triage process for finding and fixing bugs.

According to a blog post from Downer, "Triage as we know it today is NOT ready to handle the Rapid Release process." Under the old model, with which a new major version of the browser would be released once every year, "triage had a bit more time to go through a massive pile of bugs to find regressions and issues, and there was a pretty good chance that most bugs would get caught just because we had time on our side, and we could afford to miss a bug for six weeks, because we would most likely get around to it," Downer added.

However, Downer asserts that with the new, faster process, triage has been caught off guard. "We currently have 2,598 [unconfirmed] bugs in Firefox that haven't been touched in 150 days. That is almost 2,600 bugs that have not been touched since Firefox 4 was released. And how many more bugs have been touched but not really triaged or worked on? Every day this number grows."

Despite his comments, however, Downer did make a point to note that he wasn't criticizing the rapid release process itself. "I love the idea of rapid release. Rapid release is going to be awesome if done properly. I have always been so frustrated by the continual late releases that hold back awesome new features from the web."

In addition to that, Downer also added that he doesn't think the situation is hopeless. "I have been in talks over the past few days, and I see a good possibility that Mozilla means business in improving triage." However, when Downer decided to leave, it was due to a general lack of interest in doing anything substantial to improve the triage process.

HP Releases Seven All-in-One PCs

Even though Hewlett-Packard might stop selling WebOS devices and move its PSG (Personal Systems Group) into a separate company, that hasn't stopped it from unveiling seven new PCs today, updating its Omni, TouchSmart and HP Pro lineups. The new machines carry starting prices ranging from $400 to $900. So even though HP might not keep its PSG division around (the unit responsible for desktop and laptop PCs), it is making clear it still plans to be a leader in the PC market. HP expects all-in-one PCs to become a huge seller next year, citing projections that interest will grow from 9.9 percent to 15.7 percent in the next 12 months.

The first two PCs being rolled out are the Omni 120 and the Omni 220. The Omni 120 has a 20-inch screen and will offer up to 750 GB of hard drive space and your choice of an AMD or Intel processor. The 21.5-inch Omni 220 will have a unique cantilever design and Intel's 'Sandy Bridge' Core processors. The Omni 220 will be available September 11 starting at $800, followed by the Omni 120 on September 21 with a starting price tag of $400.

HP also plans to release four new TouchSmart PCs around the same time. The TouchSmart 320 has a 20-inch screen and won't be in stores until October 2, with a starting price of $600. The 21.5-inch TouchSmart 420 and 23-inch TouchSmart 520 will be available September 11 with starting prices of $700 and $900. All of these PCs come with touchscreen displays and include Beats Audio technology.

The fourth TouchSmart being launched is the TouchSmart Elite 7320 Business PC. This all-in-one computer comes with a 21.5-inch HD LED backlit display. You will be able to choose a Core i3, i5 or i7 processor, and HP plans to launch this model September 21, starting at only $850.

The last all-in-one PC, which has no confirmed launch date but should hit stores sometime in October, is the HP Pro 3420. This computer features a 20-inch display, Core i3 processor, up to 8GB of RAM, and up to 2TB of hard drive space. The starting price of this PC is set to be $600.

Windows 8 Developer Preview

Windows 8, the highly anticipated followup to the hugely successful Windows 7 operating system from Microsoft, was displayed, albeit briefly, way back at D9. However, Anaheim is where the new OS is really going to kick off. Microsoft is starting its Build conference with a full-on developer preview of the new OS, code-named Windows 8 at the moment.

According to Microsoft's President of Windows and Windows Live Division Steven Sinofsky, Microsoft has been completely re-imagining the Windows operating system. In doing so the company has brought a lot of new capabilities that coders will be able to dive into sooner rather than later.

The new "Metro-styled" user interface is right up front and brings graphical elements of Windows Phone 7 to your desktop, laptop or tablet. In addition, Windows 8 will come with Internet Explorer 10 pre-installed as well as a stronger focus on apps that can communicate with each other.

If you have been using Windows 7 for a while and are used to it, you should have no problem making the switch to Windows 8. Windows 8 is built on largely the same foundation as Windows 7, though the retooled Task Manager and Windows Explorer should pique your interest a little more.

The new Windows Store will allow developers to present their apps in any country where Windows 8 is available, and support for ARM-based chipsets is proudly included alongside x86 compatibility. What this basically means is that everything from a small tablet to a large custom PC will be able to handle what Windows 8 has to offer.

Microsoft has also confirmed backwards compatibility with "devices and programs" that support Windows 7. In addition to that it has also been said that developers will be able to download the Windows Developer Preview from the new Windows Dev Center later on in this week, though no official date has been specified.

I don't know about you but I am thoroughly excited for Windows 8. I can't wait to get my hands on this new operating system and start exploring all the new features it has to offer!

Toshiba Introduces Tiny Enterprise Hard Drives

Toshiba's Storage Products Business Unit has just announced a high-capacity 2.5" high-performance enterprise-class drive. Known as the Toshiba MK01GRRB/R series, this drive supports the exacting requirements of compute-intensive environments with a 15,000 RPM spin speed, a 6Gb/s SAS interface and a maximum capacity of 300GB. In addition, the drive offers drive-based encryption to help companies manage data security.

According to Vice President of Marketing at Toshiba's Storage Products Business Unit Joel Hagberg, "Enterprise customers are increasingly satisfying their performance and capacity needs with power efficient small form factor drives. Enterprise drives with the latest self-encryption features are helping data centers to more cost-effectively achieve compliance with information security mandates. Toshiba small form factor enterprise drives deliver the performance, capacity and security features IT administrators require for today's mission critical server storage and cloud appliance markets."

The third-generation 2.5-inch 15,000 RPM enterprise drives leverage an enhanced power condition state that reduces the drive's spin speed when idle. This significantly lowers power consumption, which in turn means less heat dissipation, greater system stability and lower energy usage. As part of Toshiba's commitment to improved security, the drive also features self-encryption technology designed to the Trusted Computing Group "Enterprise SSC" specification.

According to John Rydning, IDC's Research Vice President for Hard Disk Drives, "Increasing the capacity of the 2.5-inch enterprise class HDDs is expanding the market opportunity for this form factor given its inherent power and data density advantages as compared to 3.5-inch models. Toshiba's new MK01GRRB/R series drives give server and storage system customers the capacity they want with the performance they need, as well as the ability to secure data on the drive with Toshiba's SED technology option." Shipments of the Toshiba MK01GRRB/R Series are scheduled to begin in volume in Q1 2012.

Samsung Introduces New Galaxy Tab 7.0 Plus

Samsung has just confirmed that another variation of its Samsung Galaxy Tab will be hitting stores soon. This new device, known as the Samsung Galaxy Tab 7.0 Plus, will begin selling on November 13, 2011 at many U.S. retailers for a surprisingly reasonable $399.99 according to a statement released by Samsung.

The 16GB tablet, complete with 7" screen, is focused more on home entertainment than business usage, coming complete with a Peel Smart Remote TV app that allows you to tap on the screen in order to quickly find and watch television shows on any manufacturer's TV or home entertainment system.

In addition to that, built-in infrared in the tablet allows you to control a home theater or television setup. The best part is that you don't need any extra cables or hardware, just what you already have in your living room and the tablet, in order to control the TV functions. The Peel remote app allows you to have full control over your TV or entertainment system regardless if it is Samsung brand or not.

Aside from controlling your DVD player, Blu-Ray player, set-top box or TV, the Galaxy Tab 7.0 Plus will also connect to Facebook and Twitter allowing you to share information about what is being watched. According to Chief Experience Officer for Peel Greg Lindley, "Peel's vision is eliminating the barriers between you and your favorite shows."

The Samsung Galaxy Tab 7.0 Plus runs on a 1.2GHz dual-core processor as well as Android 3.2 Honeycomb for an operating system. The device is also only .39" thick and only weighs 12.1 ounces.

Samsung has a plethora of tablets that come with either WiFi only or WiFi and 3G/4G connectivity. According to the website, the Galaxy Tab 7.0 Plus features both cellular and WiFi capabilities though Samsung has yet to announce which U.S. wireless carriers will support the tablet.

The original Galaxy Tab ran on Android 2.2 and was upgraded to 2.3 back in May. It too had a 7" touch screen but was slightly larger and heavier and only came with a 1GHz processor. Sales of the Samsung Galaxy Tab 7.0 Plus will begin on November 13 nationwide at Best Buy, Amazon, Tiger Direct, Fry's and other outlets, with pre-orders available starting October 23.

Will March 2012 Give Us the iPad 3?

I wrote recently about the iPad possibly getting a discounted release and a new name (iPad Mini?) in the early part of 2012. While that was sure to spark the interest of iPad and tablet users a little, a new report should make that excitement grow exponentially.

According to a report from Macotakara, which cited "a reliable Asian source", the next generation iPad is being rushed into production thanks to Chinese New Year celebrations lasting from January 22nd through the 28th. Factories in China are all gearing up to produce the iPad 3 by the end of January 2012.

Assuming that the source is reliable (An unreliable source on the internet? I don't believe it...), the new iPad is said to include a redesigned dock connector that features the same number of pins but in a smaller shape than the one the current iPad uses. If this is true, the new connector could pose a serious problem with existing products that plug into the dock.

Beyond the new connector, the tablet's screen is said to remain the same size, though the source didn't mention whether Apple will increase the resolution to Retina Display levels or continue with the current one.

This report isn't the first, however; it joins a small number of previous iPad 3 rumors that have already surfaced. Different sites and sources have been in constant debate as to whether or not the new iPad will offer a 2,048 x 1,536 display. A move like that would give the tablet a Retina-class display like the current iPhone 4 and 4S as well as the iPod Touch 4G.

As far as a launch window goes, you are probably looking at March 2012, especially considering that Apple unveiled the original iPad in April 2010 and the iPad 2 in March 2011. However, the Linley Group, a chip consulting firm, believes that Apple will outfit the iPad 3 with an A6 processor, which could potentially push the launch back to June 2012 or later.

School Computers to Be Replaced by School iPads in 2016

With the technological age in full force, computers have become a staple in schools. It seems every grade level is using computers for learning and for entertainment while teaching. But could the age of the classroom computer be over? Is there a new piece of technology that will dethrone the computer in the classroom? Maybe, as many think tablets will eventually replace desktop computers in classrooms.

A recent survey of district tech directors discovered that all of them were testing or deploying tablet devices. What's more is that the survey also discovered that these directors expect tablets to outnumber computers in the classroom by the year 2016.

Analyst for Piper Jaffray Gene Munster recently surveyed 25 educational IT directors at a conference about the integration of technology in classrooms. Munster's survey, which was titled "Tablets in the Classroom", revealed that all 25 directors were using Apple's iPad in their schools while none of the participants were testing or deploying Android-based tablets. Munster went on to explain that this trend in education may be due to a familiarity with Apple devices among students and school employees.

According to Munster, "Within the next five years, our respondents expect to have more tablets per student than they currently have computers." Considering iPads represent the majority of tablets in schools, Munster believes the word "tablet" has become synonymous with "iPad". The school districts represented in the survey currently have about 10 students per computer; within the next five years, however, IT directors expect that ratio to drop to about six students per iPad. Devices like the iPad are more desirable than traditional computers in the classroom because they provide a more individualized learning experience.

Tim Cook, the new CEO of Apple, stated earlier this year that demand for the iPad is strong among education customers. Back in February Georgia Senate President pro tem Tommie Williams proposed a plan to replace conventional textbooks in middle schools with the iPad. Williams met with Apple to talk about a plan to make the iPad a central component in the state's education system.

The iPad is slowly creeping its way into schools. Many students can rent the devices from their campus library in college and some schools, like the ones surveyed by Munster, are actually using them in the classroom. If expectations are correct, students in 2016 are going to have some fun times in the classroom.

Japan's K Computer Remains the World's Most Powerful Supercomputer

The latest list of the Top 500 most powerful computers in the world has been released, and once again Japan's K Computer takes the number one spot, with its performance rising from 8.162 to 10.51 petaflops. That works out to 10.51 quadrillion floating-point operations per second. The letter "K" is short for the Japanese word "kei", which denotes 10 quadrillion. Coincidence? I think not.

On the Top 500 list from June, the K Computer took the number one spot out of the hands of China's Tianhe-1A system. The Tianhe-1A remains in the number two spot at the moment at 2.57 petaflops. Following in the number three spot is the Oak Ridge National Lab's Jaguar supercomputer. After that, the rest of the list looks pretty much the same as the last, with many companies holding the same spots as they previously did. According to TOP500 editor Erich Strohmaier, "This is the first time since we began publishing the list back in 1993 that the top 10 systems showed no turnover."

The K Computer, which is installed at the RIKEN Advanced Institute for Computational Science (AICS) in Kobe, Japan in partnership with Fujitsu, uses 705,024 SPARC64 processing cores. If you want to know just how many that is, it is more than the rest of the top five on the list combined. In addition to that, the K Computer uses 12.66 megawatts of power, four times more than its nearest competitor. This is also more than the 9.89 megawatts of power the computer recorded in June. However, despite these high numbers, the K Computer is still one of the most efficient supercomputers on the list, delivering 830 Mflops per watt.
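That 830 Mflops-per-watt figure is easy to sanity-check: sustained performance divided by power draw gives flops per watt. A quick back-of-the-envelope calculation, purely illustrative, using the numbers quoted above:

```java
// Sanity check of the K Computer's efficiency figure:
// sustained performance divided by power draw.
public class Efficiency {
    public static void main(String[] args) {
        double flops = 10.51e15;  // 10.51 petaflops = 10.51e15 floating-point ops/second
        double watts = 12.66e6;   // 12.66 megawatts of power draw
        double mflopsPerWatt = flops / watts / 1e6;  // convert flops/W to Mflops/W
        System.out.printf("%.0f Mflops/W%n", mflopsPerWatt); // prints "830 Mflops/W"
    }
}
```

The same arithmetic applied to a less efficient machine's numbers would make clear why raw speed and efficiency rankings on the list can diverge.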

The K Computer was installed in 672 racks back in June. An expansion of 800 cabinets allowed the 10 petaflop achievement to be accomplished. Just a week ago, Fujitsu announced the expansion possibilities of the K Computer as it could possibly grow to a theoretical 23 petaflops.

Aside from the top rankings remaining unchanged, other characteristics evolved with the November 2011 list. The entry point for the Top 500 now stands at 50.9 teraflops, with combined performance across all 500 supercomputers at 74.2 petaflops, up from 58.7 petaflops in June. Gigabit Ethernet is still the most popular internal system interconnect, used in 223 systems, though InfiniBand use increased to 213 systems.

Average power consumption continued to rise, with 29 systems on the list confirmed as using more than one megawatt of power. The most energy-efficient supercomputers are IBM's BlueGene/Q systems, at 2,029 Mflops per watt. IBM also just filed a patent for a massive supercomputer that could potentially reach 107 petaflops. In total, the systems that reported power consumption combined for more than 159 megawatts.

Microsoft Kinect for PC Coming Soon

On Tuesday, November 22 Microsoft announced that it is working on getting its Kinect motion-sensing device ready to use on PCs that run Windows. The company currently sells Kinect for its Xbox 360 gaming console; however, it is hard at work and says that it will have the PC version ready by early 2012.

"Coupled with the numerous upgrades and improvements our team is making to the Software Development Kit (SDK) and runtime, the new hardware delivers features and functionality that Windows developers and Microsoft customers have been asking for," wrote Craig Eisler, general manager of Kinect for Windows, in a blog post.

"Simple changes include shortening the USB cable to ensure reliability across a broad range of computers and the inclusion of a small dongle to improve coexistence with other USB peripherals," Eisler continued. "Of particular interest to developers will be the new firmware which enables the depth camera to see objects as close as 50 centimeters in front of the device without losing accuracy or precision, with graceful degradation down to 40 centimeters. 'Near Mode' will enable a whole new class of 'close up' applications, beyond the living room scenarios for Kinect for Xbox 360. This is one of the most requested features from the many developers and companies participating in our Kinect for Windows pilot program and folks commenting on our forums, and we're pleased to deliver this, and more, at launch."

This announcement about the PC version of the Kinect comes on the exact same day that Microsoft officially acquired VideoSurf. Microsoft acquired the company, which was founded in 2006, for $70 million. It hopes to incorporate the California-based company’s online video search technology into the Xbox Live system.

According to a statement released by the companies on Tuesday, November 22, VideoSurf "offers a back-end computer vision technology that 'sees' frames inside videos to make discovering content fast, easy and accurate."

"VideoSurf's content analytics technology will enhance the search and discovery of entertainment content across our platform," said the director of Xbox Live for Microsoft's Interactive Entertainment Business Alex Garden. "This holiday we will launch voice search across our entertainment partners on Xbox Live. Over time, as we integrate VideoSurf's technology into our system, we are excited about the potential to have content tagged in real time to increase the speed and relevance of the search results.”

According to Microsoft, the acquisition would "make it easier for world-class video partners to take full advantage of advanced features such as voice search enabled by Kinect for Xbox 360."

Microsoft also said that in the next few months it will bring "nearly 40 world-leading TV and entertainment providers to Xbox Live." These will include Bravo, Comcast, HBO GO, Verizon FiOS, and Syfy in the U.S.; BBC in the U.K.; Telefónica in Spain; Rogers On Demand in Canada; Televisa in Mexico; ZDF in Germany; and Mediaset in Italy.

"Microsoft's Interactive Entertainment Division is at the leading edge of connected entertainment," said Lior Delgo, the CEO and co-founder of VideoSurf, in the statement that they released with Microsoft. "We are incredibly excited to be working together on our mutual passion for creating amazing consumer experiences and reinventing how consumers search, discover and enjoy content on their televisions."

Western Digital's Deal to Acquire Hitachi's HDD Business Approved

Western Digital has just secured conditional approval from the European Union's competition regulator to purchase Hitachi's hard disk drive business for a grand total of $4.3 billion. The approval only came, however, after Western Digital agreed to sell several of its production operations.

Western Digital, the world's second largest competitor in the hard disk drive (HDD) sector, and Hitachi, the third largest, unveiled the deal back in March. The deal is aimed at giving the United States company a competitive edge in developing next-generation information storage technology.

This decision by the European Commission confirmed a story published last week by Reuters. According to European Union Commissioner Joaquin Almunia in a recent statement, "The proposed divestiture will ensure that competition in the industry is fully restored before the merger is implemented."

Western Digital promised to sell essential production assets for the manufacture of 3.5-inch disk drives, including a production plant, according to the Commission, which cited reduced competition in the sector after Seagate Technology's recent purchase of Samsung Electronics' hard disk drive business. The company also agreed to transfer or license intellectual property rights to the business being sold off, and will transfer staff and the supply of HDD components to the unit as well.

Western Digital is unable to complete the deal until it finds an appropriate buyer for the unit, which will then need to be approved by the regulator, according to the commission. If Western Digital is unable to find a suitable buyer, it stands to reason that this deal, although approved, will never actually come to fruition, which could be disappointing for both companies.

Computer History Museum Opens New Online Exhibit Dedicated to Steve Jobs

In what is perhaps the biggest show of remembrance for him since his death, the Computer History Museum has just launched an online exhibit dedicated to Apple co-founder and former CEO Steve Jobs. Jobs, as many of you know, passed away back in October, a great loss for Apple and the world.

The exhibit, known as "Steve Jobs: From Garage to World's Most Valuable Company", features a plethora of photos and descriptions of objects from the museum's permanent collection. In addition to that, visitors will find vintage footage of Jobs from his younger years and Apple's humble beginnings.

One item of particular interest in the exhibit is a 22-minute video of Jobs from 1980. In the video Jobs talks about the early days of Apple and, at one point, after citing examples of how Apple computers were being used in agriculture and schools, confesses that he and co-founder Steve Wozniak "had absolutely no idea what people were going to do with these things when we started out. As a matter of fact, the two people it was designed for was Woz and myself because we couldn't afford to buy a computer kit on the market."

The Computer History Museum's Senior Curator Dag Spicer said in a statement, "In Jobs' own words, we hear how luck as well as skill played big roles in Apple's founding. We also see how focused, articulate and convincing Jobs could be, even at this early stage."

The exhibit also runs through Jobs' entire life in pictures and lets visitors see historic documents such as one of the newsletters of Jobs and Wozniak's famed Homebrew Computer Club. Visitors can also look through a confidential memorandum outlining Apple's public offering plans, as well as its original business plan for the Macintosh.

There are quite a few gems in this little collection to look through, especially if reading Steve Jobs' biography didn't fully satisfy you. I highly recommend checking it out if you are a fan of Apple, Jobs himself or simply the story of what two ambitious minds can accomplish.
