Past Meeting Archive Los Angeles ACM home page National ACM home page Click here for More Activities this month
Check out the Southern California Tech Calendar

Joint Meeting of
the Los Angeles Chapters of
ACM and Association of Information Technology Professionals (AITP)

Wednesday, September 13, 2006

"PETER COFFEE’S ANNUAL FORECAST"
About key technologies in Computing, Communications, and Business Applications

Peter Coffee
eWeek

(Note that this meeting is on the second Wednesday of the month!)

With products like Windows® Vista and chip-based security, the battle of the PC is still going strong. Long-awaited breakthroughs in nano-scale manufacture and high-speed communication are finally leaving the laboratory; the Internet is maturing into a distributed computing platform with industry standards for remote Web services interaction. Major players in every industry are moving aggressively to exploit the efficiencies of being online.

Making sense of it all in an evening of analysis and comment is Peter Coffee; he will provide an insider’s view and forecasts for key technologies in computing, communications, and business and technical applications. His talk will span the range from nanometer chip fabrication techniques to worldwide multi-gigabit networks and will include time for questions.

Peter Coffee is Technology Editor at eWeek (formerly PC Week), the national newspaper of enterprise computing. He serves as an internal consultant to the editorial staff and as the technical liaison between eWeek and its advisory panel of corporate IT architects. He is the author of "How To Program Java" (ZD Press) and "Peter Coffee Teaches PCs" (Que) and has received many professional awards. He has assisted CBS News, MSNBC, and the PBS News Hour in covering events as diverse as the Microsoft antitrust trial and the worldwide attacks against high-profile Internet sites. His commentaries appear on eWeek's website at www.eweek.com.

Before becoming a full-time writer in 1989, Peter held project management and technical positions with Exxon and The Aerospace Corporation. Peter holds degrees from MIT and the Pepperdine School of Business and is active as an amateur radio operator, community web site operator, and euphonium soloist and composer.

Come at 6:00 PM to network with your fellow professionals. A round table with Peter will start promptly at 6:30 PM followed by dinner and talk.
prepared by Paul Schmidt
 

~Summary~

LA ACM Chapter Meeting
Held Wednesday, September 13, 2006


This is an enhancement of the report in the October DATA-LINK.

The presentation was "Annual Forecast" by Peter Coffee, Technology Editor for eWeek magazine. This was a joint meeting of the Los Angeles Chapter of ACM and the Los Angeles Chapter of the Association of Information Technology Professionals (AITP) and was held at the LAX Plaza Hotel (formerly the Ramada Hotel) in Culver City. This year AITP, once again, did the hard work of making the arrangements for the meeting and deserves congratulations for successful accomplishment of this task.

Peter Coffee started out taking questions at the "Round Table" before dinner. He was asked to comment on Europe's disapproval of Microsoft's security systems. In the US, the concern was over what counts as an acceptable enhancement to an operating system product and what is anti-competitive bundling and tying of products. Europe went after Microsoft not from the desktop but from the server side. This was not so much about functionality; Europe wants full disclosure of the application programming interfaces so other people can compete with Microsoft. Does Vista represent excessive bundling? Peter said that Vista doesn't exist yet, and he won't bet on it shipping as a usable product in January. Vista is an extremely complicated product. How important is the non-US market? It used to be that if you could sell your product in the US market you would do fine. Five or six years ago HP moved the worldwide center of its personal computer business to Europe because that was where the growth was. The US market was full, and there were no more large profits to be made.

PCs have become a commodity product with not much improvement available that would increase their value. Scott McNealy said, "Dell is in the same position as your local supermarket; the only thing you can add to a banana is a bruise," and this is the attitude of most PC customers, who don't really care about performance improvements and won't pay for them. Fast gaming machines appeal to the hot-rodder, and there are people still making money selling to gamers who care about the clock rate. Peter doesn't remember the last time he bought a machine and worried about the clock rate. He recently bought a new PC and did not even realize he had purchased a dual-processor machine. He was concerned with whether it had slots for all the cameras in his house, whether it would burn DVDs, and whether it was small and quiet. A company that is still making money is Apple. The Mac is facing the cold wind of commoditization on the back of its neck and will have to struggle to maintain its market share. After the move to the Intel processor, a lot of people asked how well a Mac would run Windows; you can now buy a Mac with Windows on it. Also, the Mac's reputation for high quality and superior design has taken a hit because of heat problems and battery issues. Peter bought a Dell XPS machine instead of a Mac because it was $200 cheaper.

How important is the non-US market? The non-US market is huge, Asia especially. In Asia, intellectual-property issues are different, manufacturing processes are different, and there are big infrastructure issues. It's a new business, and neither Intel nor Microsoft seems well prepared to compete. They are trying to maintain demand, with Microsoft buying into content producers. AMD has been giving Intel problems. Twelve years ago HP and Intel started to work on a new architecture, and they are still working with it. In fact, HP had been working on long-instruction-word architectures for 18 years. Instead of having complicated hardware that determined parallelism at run time, hardware people thought software people could easily determine it at compile time. The chip could be optimized for sheer speed since it didn't require a great deal of complexity. But this meant a large number of instructions crossing processing boundaries and requiring huge caches. In 1988 HP started work on explicitly parallel computing. Peter has charts from 1995 to compare with today's reality. There was a worldwide assumption that you could not push the 386/486 architecture to much higher clock rates and that you were going to have to find a different solution. In 1994 Intel and HP announced they were going to work together on this. In 1999 IA-64 was offered as the solution to advanced computing, but somebody went ahead with x86 processors. People liked this because they didn't have to change their software. Intel promised IA-64-to-x86 compatibility. The compatibility was there, but the software did not run very fast. Every time the Itanium was delayed, x86 got faster. By the time Itanium shipped, x86 was very competitive in performance and was much cheaper. In 2001 Intel shipped Merced, the first Itanium product, at the same time a 2 GHz version of the Pentium 4 was being delivered.
In 2002 Microsoft demonstrated a 64-bit x86 version of Windows running on an AMD chip and said they would not do two versions of 64-bit Windows. Then Intel came out with its own 64-bit extension of x86, which runs Windows fine. In 2004 HP ceased being a partner with Intel in development of the Itanium hardware. In 2006 there are 1.6 GHz dual-core Itanium processors with 4 gigabytes of memory and 18 megabytes of level-3 cache that have been combined into a very fast machine, but it costs $12,500. How do you make money? Not the way Intel and Microsoft are doing it. In 2003 Microsoft's Vista was supposed to be out in six months, and they are still working on it. They are getting killed by open source and clones from AMD. AMD is defining the hardware platform, and Linux is defining where system software is going. Office 2007 is such a big change that a company using earlier versions of Microsoft Office needs more retraining to adopt it than to shift to OpenOffice, which is free. In the marketplace, Europe has rules; the Asian market has no rules, and Peter has no idea what will happen there. China has been very good on manufacturing infrastructure, while India had better intellectual infrastructure. Now China is improving in the computer science area in collaboration with Microsoft, and India is improving on hardware.

Peter was asked, "If Alpha had been developed, would it have made a difference?" There is no evidence that anything could have competed successfully with the 64-bit extension of the x86 instruction set. He said that once AMD showed how to extend the x86 instruction set, it made all the competing architectures uneconomical. Intel spread its resources and lost technical credibility.

Everyone now understands the unbelievably huge importance of virtualization solutions as opposed to emulation solutions. Vista plugs a hole in Microsoft's revenue stream. There are two selling propositions remaining for Vista at this point: it will be more secure, and PC makers need a new system such as Vista to sell new hardware. Hollywood is the principal Vista driver because it can be used to better implement DRM (Digital Rights Management). This has been coming for a while, and Vista's 64 bits can be used to allow locking. It makes a big difference in the definition of what a PC is. There will be limitations on what the PC can do, enforced by content providers. Consumers don't like it, but it could be forced on them. The FCC says that all TV will be HDTV someday, and no one wants to buy both a 64-bit PC and an HDTV receiver because they are the same hardware. The convergence will result in a single device controlled by content providers. China has not been interested in complying with copyrights. Peter says prices eventually drop to what things are actually worth rather than what producers want to get for them, so content-owner controls will fail. Places to download videos, such as YouTube, are becoming very popular. The 18-34-year-old market is shifting from TV and movies to Internet downloads. Yahoo is making money and is investing in infrastructure for doing Internet TV. They televised a Shuttle launch with 34 cameras and allowed the viewer to select the camera viewpoint. This may also be the future for sports reporting and for dramatic presentations. A movie called Time Code provides different viewpoints, and the movie was shot in one take. Future movies can give you different viewpoints on events occurring at the same time and let you see something new every time you watch them. There will be devices marketed that are not DRM-protected, and people interested in producing new products for wide distribution.
There was a Broadway play that had three different points of view, different on each night. There will be no profit in hardware, but huge profits in content. The home will be networked, probably by wireless. What will happen in the future? Everything will be tied up. The bad news is old medium producers are plugging holes by buying hours. The good news is that creative people are working around it. There are new avenues for creativity and originality available on the Internet.

How does online retail need to change? The current model is the horseless-carriage analogy: online retail that still mimics brick-and-mortar stores. A needed function is being able to find out why other people like or don't like products. Amazon does a better job than most by providing reviews. Your own site should give out bad news about your new product, including the availability of a used copy at a lower cost. If you don't give that information on your site, people may look elsewhere and you may lose the opportunity for the sale. Peter had doubts about Amazon's concept earlier because he thought they were overextending, but now it seems to be working for them. For next-generation retail, shouldn't we inform the shopper of the details on new products and conversions when a new product is installed on a machine? When you plug a device in for use, it should inform you of needed downloads, recalls, and the availability of upgrades. There would be interaction with the vendor on how you are using the device. This involves some loss of privacy but provides a useful service.

How long can free stuff like OpenOffice survive and make money? Peter referred to Windows NT and how IBM has made more money by providing support for it than Microsoft made by selling it. Peter said the life-cycle cost of power and cooling for a typical server is more than the cost of the software and the hardware box. The law is like open source: you can get all the knowledge you need by yourself, but you still need a well-trained lawyer who has mastered this open knowledge. More applications are economical to write when you spend more money developing the idea and less money on the hardware and system software. Free and open software will have more people writing code, and most open source code is being written by people interested in profiting from it. The question "How can you make money by giving software away?" is a misfocused question. More people will be writing code and making money than before; there will be a different focus of effort. Most people who run open source software are doing it for profit. IBM is making immense investments in Linux, and they are not doing it for the fun of it: anything that saves money for customers frees more money for them to buy IBM products and services. Sun is schizoid: you can download Solaris free or buy an expensive version. Sun sells its willingness and ability to support it, and people are willing to pay for this if reliability and availability of the system are important to them. People will always pay to eliminate downside risk.

A dinner break was next, followed by Peter Coffee's main presentation. He revisited his 1995 predictions, in which his "quantum leap" projections for the next ten years turned out to be very conservative; every prediction was laughably modest. In 1995 IBM was still shipping IBM PCs. At that time what IBM considered a home PC had eight megabytes of memory and 270 megabytes of hard drive storage, and people thought Microsoft was over-reaching with its version of Windows. Recent machines have 125 times that memory and 1000 times the storage. The growth in storage has radically outpaced the growth of bandwidth and processing speed. PC speed has had a compound growth of 15% per year over many years; compound growth of memory has been 25%, and compound growth of storage has been 80% per year.
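As a sanity check, the quoted annual rates can be compounded over the 1995-2006 span. The 11-year horizon is an assumption for illustration; note that the compounded memory figure comes out lower than the 125x comparison above, which was based on one specific pair of machines rather than the long-run trend rate.

```python
# Compound the per-year growth rates quoted in the talk
# (15% speed, 25% memory, 80% storage) over 1995-2006.

def growth(rate_percent, years):
    """Total growth factor after compounding an annual percentage rate."""
    return (1 + rate_percent / 100) ** years

years = 11  # 1995 to 2006 (an assumed horizon for this illustration)
speed = growth(15, years)    # roughly 4.7x
memory = growth(25, years)   # roughly 11.6x
storage = growth(80, years)  # roughly 640x

print(f"speed:   {speed:.1f}x")
print(f"memory:  {memory:.1f}x")
print(f"storage: {storage:.1f}x")
```

The point of the comparison survives the rough numbers: storage growth dwarfs the other two curves, which is exactly the imbalance the talk returns to later.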

Why doesn't my PC feel as fast as it used to? Vastly more data of different types is being processed, and that is outpacing the increases in processing speed. Continue to expect that compression algorithms and caching strategies will be used more and more. Routing strategies that give path directions for faster transmission are more and more in use on the Internet. Net neutrality will not work; someone will always find a way to charge you what something is worth. Peter doesn't believe in it; he would like to be able to buy the level of service he wants instead of only being able to buy one level of service and hoping it is good enough.
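The point about compression earning its CPU cost is easy to demonstrate: repetitive, text-heavy traffic (markup, logs) shrinks dramatically under a standard algorithm. This sketch uses Python's zlib; the sample payload is invented for the example.

```python
# Repetitive text-heavy data, typical of markup on the wire,
# compresses dramatically with a stock DEFLATE implementation.
import zlib

payload = b"<reading sensor='42' value='20.5'/>" * 200
compressed = zlib.compress(payload)

ratio = len(payload) / len(compressed)
print(f"{len(payload)} bytes -> {len(compressed)} bytes ({ratio:.0f}:1)")
```

The more redundant the stream, the better the trade: a few cycles of decompression at the endpoint buys a large reduction in the bandwidth the data occupies in transit.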

Processing power is hugely out of proportion with what we want to do. Anything we can do to get the right content to the right place at the right time is a very high return on investment. Peter extrapolated from an 870 megacycle PC available in 1995 and predicted improvement by a factor of 3. The latest PC he purchased has a clock rate of 2.2 GHz on a machine with two processors, and while he isn't exactly sure how that matches against a typical 486 instruction, the improvement has been about a factor of five. This was not a high-end home machine. The prediction that "all the low-hanging fruit is gone" was wrong. Parallel processing snuck up on us, and the growth in bandwidth has been huge, although not as much as the growth in storage. Verizon is putting in optical fiber to the house; the "last mile" is now here. The Verizon download capability is very high: 2½ times a normal TV video feed without any compression. The content to make good use of these capabilities will be provided eventually. Peter's predictions were off by a factor of 4 on performance and a factor of 8 on cost improvement. Fiber has been laid in the ground everywhere. Is that fiber ready to be used as bandwidth? Not quite, as it requires lasers and paying trained people to install them, but there is an immense amount of capacity available. There are interesting dynamics in the competition to provide bandwidth to users. Ten years ago Peter talked about gallium arsenide versus optical processing, but that has turned out to be a very small fraction of the cost. It is hard to unwrap a photon envelope to look at what is inside it; you need to get it into bits to process it anyway. In the last ten years we have gone from traditional data structures to XML.

It is almost impossible to overstate the importance of XML, even with its verbose syntax. It increases traffic greatly, but is worth it. It has a great effect on the usability of user devices when things are parsed by the device at the spot of use, allowing dynamic, task-oriented, user-oriented systems. System development requires coding talent, and the US has not been producing much of it. Enrollments in computer science courses are down, the number of people learning programming is down, and an outstanding Torrance high school no longer has an AP Computing course: the person who was teaching it retired and has not been replaced. What are we doing? We are trying to make do with typeless scripting languages that don't require as much depth of knowledge instead of very demanding languages that require more knowledge of computing to use. BASIC was a very simple language without data types, but it made it very easy to write sloppy and undisciplined code. Scripting has much of the same design, but there are richer and safer environments where it can be run. There is not enough investment in testing tools. The tools aren't used because finding a bug doesn't make your day any better; what is needed are tools that not only find the bug but provide solutions for fixing it. New test tools improve workflow and are oriented toward both finding bugs and fixing them, which makes them more popular.
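The trade-off described above, verbose but self-describing, can be made concrete by encoding one record both ways. The record fields here are invented for the example.

```python
# The same record as a compact delimited string and as XML.
import xml.etree.ElementTree as ET

# Compact, position-dependent encoding: the meaning of each field
# lives in an out-of-band agreement between sender and receiver.
csv_record = "1138,widget,4.25"

# Self-describing XML: each value carries its field name, so any
# device can parse it at the point of use without a prior agreement
# on layout -- at the price of far more bytes on the wire.
item = ET.Element("item")
ET.SubElement(item, "id").text = "1138"
ET.SubElement(item, "name").text = "widget"
ET.SubElement(item, "price").text = "4.25"
xml_record = ET.tostring(item, encoding="unicode")

print(xml_record)
print(f"CSV: {len(csv_record)} bytes, XML: {len(xml_record)} bytes")
```

The several-fold size increase is the "increases traffic greatly" part; the named fields are what make parsing at the spot of use possible.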

Parallel processing requires new code for threaded processes. Intel was usually going for faster clock speed in place of parallelism, but now they have moved to hyper-threading and multi-core processors. Intel has become interested in helping developers parallelize things. Intel brought out three tools: Thread Checker, Thread Profiler, and Threading Building Blocks. The building blocks are a C++ library with sophisticated classes and specialized processing tools. Will these tools work with AMD processors? Hyper-threading won't, but the programming model is the same. A harder part is matching the tools with the Linux and Windows threading models; these are trickier issues. For parallel processing, the hardware is here, the skills required are complex, and the infrastructure of tools is coming. There are companies providing simulations that will help developers reduce time to market. There is greater understanding that simulations really matter.
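The programming model these libraries push can be sketched briefly: express the loop body, split the range into chunks, and let a runtime map chunks onto threads. Python's concurrent.futures stands in here for the role a C++ library like Intel's Threading Building Blocks plays; this is an illustration of the style, not of TBB itself.

```python
# Data-parallel loop in the chunked style that threading libraries
# encourage: the programmer supplies the per-chunk work, the pool
# decides which thread runs which chunk.
from concurrent.futures import ThreadPoolExecutor

def process_chunk(chunk):
    # Stand-in for real per-element work (the "loop body").
    return [x * x for x in chunk]

data = list(range(1000))
chunk_size = 250
chunks = [data[i:i + chunk_size] for i in range(0, len(data), chunk_size)]

with ThreadPoolExecutor(max_workers=4) as pool:
    partial_results = pool.map(process_chunk, chunks)

# Reassemble in order; map() preserves chunk order even though the
# chunks may have executed on different threads concurrently.
squared = [x for chunk in partial_results for x in chunk]
print(squared[:5])
```

The appeal is exactly what the talk describes: the same source works whatever processor is underneath, because the hardware-specific scheduling lives in the library, not in the application code.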

Users will not tell developers things that are critical to them because it never occurs to them that something could be badly implemented; they think the state of the art is the way it is represented in the movies and on TV. Marketing people don't get technology, and technology people don't understand ROI (Return on Investment). There is an amazing gap between the people paying the bills and the people writing the code because they have disjoint success criteria. What has happened since 9/11? Corporations are asked about physical security provisions such as smart cards and the safety of data centers. Do corporations have a broader plan? They agree that in the last five years there have been enormous problems, but their biggest concern is not physical security but the requirements and costs of implementing Sarbanes-Oxley. If that is done wrong and you can't certify the financial processes, your boss's boss goes to jail. There has been a lot of work on disaster planning, but the biggest problem has been corporate governance. Almost every single vendor Peter has talked to has not understood this: they have been selling their tools for the purpose they were originally designed and not showing how they apply to corporate governance. Corporate governance tools should be emphasized, and technical upgrades should be sold for their applicability to corporate governance problems such as auditability and levels of access to data. One corporation passed a Sarbanes-Oxley compliance test, but their computer systems had been infected by the Slammer worm, so unknown people could control their systems and they really couldn't ensure compliance with anything. Vendors should emphasize how their tools can help companies with corporate governance problems, because these are what companies are worried about. Sarbanes-Oxley has overshadowed all other problems.

XML is largely about making web services possible. You want a direct connection of application software to the web, so that your spreadsheet interacts directly over the web. The wire coming into the back of the computer is a more important source of input than the keyboard. You need to think of spreadsheets as collaborative decision-support and analysis tools. Microsoft has promised it won't sue anyone for using its patented technology to implement a web services standard. There is a good reason for doing this: web services connect applications to applications, especially for large clients. Microsoft knows how to sell rich client platforms, and this helps them do it.

In 1995 Peter predicted that by the year 2020 there would be a capability of 10 tera-operations per second, just about the level that Hans Moravec predicted as human capability. Peter said that prediction had better be off by a few orders of magnitude, because we are nearly there now. In 2008 a petaflop machine with thousands of processors will be up and running, using a very complicated programming model. What can you do with a petaflop system? Currently you can model weather on 120-kilometer squares, updating every 30 minutes. With a petaflop machine you can model 10-kilometer squares every 6 minutes. Peter predicts that people will be willing to pay for more reliable weather predictions; enormous bets are made on what the weather will be. That is just one example; earth resources management and advanced simulation are others. Cars get crashed many times on computers before physical tests. The Boeing 787 Dreamliner is the first aircraft completely designed by computer. Transformations change the way we do things: it's not just about doing things faster but about doing them more completely and in a different manner. It is about being able to run simulations immediately and give immediate answers to customers. It is about doing things we have never done before.
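The weather-modeling comparison can be checked on the back of an envelope: refining the grid from 120 km to 10 km multiplies the horizontal cell count by 144, and updating every 6 minutes instead of every 30 is 5 times as often, about 720 times the work, which is broadly in line with the jump to a petaflop-class machine. This sketch assumes cost scales with cell count times update rate and ignores vertical resolution and time-step constraints.

```python
# Back-of-envelope cost ratio for refining a weather model grid.
# Assumes work scales with horizontal cell count times update rate;
# vertical levels and solver time-step limits are ignored here.

def relative_cost(old_km, old_min, new_km, new_min):
    cells = (old_km / new_km) ** 2   # 2-D grid refinement factor
    updates = old_min / new_min      # update-frequency factor
    return cells * updates

factor = relative_cost(120, 30, 10, 6)
print(f"~{factor:.0f}x more work")  # ~720x
```

In practice finer grids also force smaller time steps, so the true cost grows even faster than this estimate, which only strengthens the talk's point about what a petaflop machine buys.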

Question and Answer session.

What is enabling part-timers to get jobs done? Companies have been able to use technology to make jobs easier to do, and the technology doesn't require as much knowledge to operate. A lot of middle-level workers are being displaced by technology, and this is part of the reason that outsourcing has been successful. Insurance adjusters can get pictures of damage and make their estimates without going to the scene of the accident. Initially the result has been a race to the bottom, as you get less service than you did before; as time goes on it can increase service. McDonald's has been taking orders remotely, leaving the cooks to do the cooking and food preparation. What if an order comes out wrong? Well, is it any worse than before, when on-site personnel took the order and passed it back to the cooks? Self-service in supermarkets has sped up the processing but reduced impulse sales at the checkout line.

It has been said that the Windows mono-culture is a threat to national security because too many people are using the same system. Has the mono-culture become any better? Anything that adds complexity increases vulnerability. Peter says things have become better and that given the low level of IT expertise perhaps the standardization of the mono-culture is actually saving us from further harm.

Computer environments and system-visible complexity are out of control, and people don't like it. The fastest growing messaging use is text messaging, which is very simple, with fewer complex forms. People want simplicity and ease of use. Even the IRS has become simpler with online forms. There is more complexity behind the scenes. Web services are wonderful, but you don't control them. Now other people you don't know are writing and using the code, and systems have become non-deterministic. Software developers have to become more sophisticated about failure scenarios, and users have to give consideration to how to recover from errors. We are seeing more outsourced services that help users. There is more opportunity to centralize capabilities and sell them over the Internet.

Peter doesn't recommend Office 2007. He himself liked it because many of the things he used regularly became more accessible. But there are a lot of people who don't use those things, and a user interface that puts them directly in front of the users does not help them at all. Peter said he made the error of assuming that because he liked having the things he needed out in the open, other people would like it too. Instead they found features they were not using pushed in front of them, confusing and annoying them.

The medical profession is the poster child for technology lag. Doctors are still writing notes and prescriptions that can readily be misunderstood, and there are billing and health insurance portability issues. Peter thinks that as the baby boomer generation ages and more people become unhappy about medical service, improvements will be made. There is greater regulatory scrutiny that will require better auditing and control, and improvements in health care information technology will enable things to be done more easily and at lower cost, so improvements will come.

This was a great talk by Peter Coffee. I hope that this expanded article is a better report. I did find a few places where corrections were needed, and I was able to add details from his talk. There were still areas that I didn't understand well enough to write up, and Peter gave quite a few examples and witty remarks that I was unable to cover properly. If you were there you got the full effect of his presentation and will know how limited this report really is.

This was the first meeting of the LA Chapter year and was attended by about 60 people.
Mike Walsh, LA ACM Secretary 

The next meeting will be Oct. 4th, for an exciting talk from JPL about one of their current or recently completed projects.
Join us in October!


The Los Angeles Chapter will meet this September on the second Wednesday, Sept. 13th, at the LAX Plaza Hotel (formerly the Ramada), 6333 Bristol Parkway, Culver City, California 90230, (310) 484-7000.

Directions to the Hotel from the San Diego (405) Freeway:

Southbound: Exit on the Howard Hughes Parkway, turn right on Sepulveda, turn right on Centinela, go under the freeway to the first signal light, and left onto Bristol.

Northbound: Exit on the Slauson/Sepulveda exit, turn right at the first signal light (Green Valley), and right at the first signal light onto Bristol.

There is a Parking Fee: $4 with validation

The Schedule for this Meeting is

6:00 p.m.  Networking

6:30 p.m.  Round Table Q&A with Peter Coffee

7:00 p.m.  Dinner    The dinner will be a buffet

            Avoid a $5 surcharge!!
            Reservations must be made by Sept. 7th to avoid the surcharge.

            Make your reservations early.

8:00 p.m.  Presentation

 
Reservations

Please join us for dinner. Call Mike Walsh at (818) 785-5056, or send email to Mike Walsh with your name and telephone number by Sept. 9th.

Reservations are not required for the 8:00 p.m. program, for which there is no charge. Paid dinner guests will be eligible for a drawing at the end of the program.

For membership information, contact Mike Walsh, (818)785-5056 or follow this link.

Other Affiliated groups

SIGAda  SIGCHI  SIGGRAPH  SIGPLAN

LA SIGAda


LA  SIGGRAPH

Please visit our website for meeting dates, and news of upcoming events.

For further details contact the SIGPHONE at (310) 288-1148 or at Los_Angeles_Chapter@siggraph.org, or www.siggraph.org/chapters/los_angeles



 Last revision: 2006 1124 [Webmaster]
 Page posted:   2006 1124