Check out the Southern California Tech Calendar

Joint Meeting of
the Los Angeles Chapters of
ACM and Association of Information Technology Professionals (AITP)

Wednesday, September 14, 2005

"The State of Computing"

Peter Coffee
eWeek

(Note that this meeting will be on the second Wednesday in September)

Peter Coffee will deliver his annual talk on the state of the PC and computing. Some lingering questions remain from last year: Are web services the computing of the future? Will Microsoft ever be able to go a week without finding another security hole? Will the Internet get buried in spam? On the Linux side of things: What is the future for Linux? Will surviving its legal battles keep Linux a low-cost server operating system, or will its price go up? Will Linux ever make it as a desktop OS that competes with Windows?

On the PC side of things, the big question this year concerns Windows XP SP2. Has it solved the security problems? Will Windows Vista (Longhorn) really be a giant leap forward? Will dual-core CPUs be the norm? Apple has announced it is going with Intel, and Leopard (OS X for Intel) is slated to be released in the same time frame as Vista. Many reviewers are saying Vista looks a lot like OS X Tiger. Will Apple improve on Tiger with Leopard?

Peter will hopefully answer some of these questions and give us his ideas of what the future holds.

Peter Coffee has been covering IT developments for 13 years as a product reviewer, technology analyst, and opinion columnist for the national newspaper of electronic business, eWEEK. With an engineering degree from MIT and an MBA from Pepperdine University, he combines technical and managerial perspectives in his examinations of emerging technologies ranging from cryptography to software development tools and high-speed microprocessors. He has authored three books: "Peter Coffee Teaches PCs," "How to Program Java," and "How to Program Java Beans." He has assisted CBS News, MSNBC, and the PBS News Hour in covering events as diverse as the Microsoft antitrust trial and the worldwide attacks against high-profile Internet sites. His weekly column and other writings appear both in print and on eWEEK's Web site at www.eweek.com.

Come at 6:00 PM to network with your fellow professionals. A round table with Peter will start at approximately 6:30 PM, followed by dinner and the main talk.
 

~Summary~

LA ACM Chapter Meeting
Held Wednesday, September 14, 2005


The presentation was "The State of Computing" by Peter Coffee, Technology Editor for eWEEK magazine. This was a joint meeting of the Los Angeles Chapter of ACM and the Los Angeles Chapter of the Association of Information Technology Professionals (AITP), held at the LAX Plaza Hotel (formerly the Ramada Hotel) in Culver City. This year AITP did the hard work of making the arrangements for the meeting and deserves congratulations for successfully accomplishing this task.

Peter Coffee started out with his round-table discussion before dinner. His first question was about Windows Vista. Peter said that was a good question because he was currently attending a large Windows conference and his work day had begun at 6:00 A.M. the day before. He had a triple report to make: on the operating system, on Office 12, and on the arrival of mass-market 64-bit PCs. Microsoft is projecting that it will sell 475 million units of Windows Vista, and Peter believes they may be right. Vista has extended itself into improving applications, and people want Office 12 now; they don't even want to wait for the beta. Peter said he had had no inclination to upgrade from Office 97 before, but Office 12 has the most fundamental reengineering of the user interface since 1984. The top portion of the screen is dynamic, and it is easy to determine what you have available. The improvement is outstanding and the dynamism is enormous. Previously, people using PowerPoint who shared the same briefing charts had to be careful to keep them consistent. Office 12 is much better; PowerPoint references the same chart that everyone else is using. The underlying store is XML-based, and it can go and ping Excel for the same data. In 2003 Indigo was supposed to be important, and an improved file system was promised as well; that file system has now slipped into the future.

On the presentation side, there is the Avalon display with an underlying graphics model that allows you to composite things. Vector-based video with overlays is provided in Vista. Sizes can be changed easily, and there is the ability to overlay a document on a shape. Windows will have the same scaling abilities as the Mac as well as additional capabilities. This will generate a lot of hardware sales, but you won't mind because of the ease of navigating through the database. Is this going to make gamers of all of us? Peter said he can understand the reaction, but after you use it you don't want to give it up.

Most of us know that just carrying a handheld PDA doesn't work; most of the time you need a full keyboard and a notebook to do anything really meaningful. However, if you only want to verify some data, you don't want to boot up the full machine. Microsoft is working on bringing up a PDA-like region on a notebook quickly. Microsoft's usability research is excellent; no detail is too small for them. They have an enormous number of intelligent, motivated people. Peter says the number of people writing books on how to find and use Microsoft applications will probably shrink because of the improved discoverability of applications in Vista. The underlying core of Mac OS X is better code, but this is an enormous upgrade. It is a compelling release. The hardware requirements for Vista will be major, but we will buy the hardware and be happy to have it. Developer tools will be out in November 2005. People are already expecting to write code for the communications foundation, which will be back-ported to Windows 2000.

There were demonstrations of communications on Microsoft's Windows Indigo two years ago. One demonstration showed an application on a camera cell phone where you could photograph an accident scene, have GPS tag the location, and have the network tag the time. The last six hours of weather data were pulled in, and the entire package was provided to an insurance investigator for his report. From start to finish, the entire application was about 40 lines of C#.
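
As a rough illustration of the kind of composition described above (this is not the Indigo demo itself, which was not shown; the function names and data sources below are hypothetical), a sketch in Python of stitching the pieces into one report might look like this:

    from dataclasses import dataclass
    from datetime import datetime, timedelta

    @dataclass
    class AccidentReport:
        photo: bytes
        latitude: float
        longitude: float
        timestamp: datetime
        weather_history: list   # last six hours of observations

    def fetch_weather_history(lat, lon, end, hours=6):
        # Placeholder for a call to a weather web service (hypothetical).
        return [{"time": end - timedelta(hours=h), "summary": "unknown"}
                for h in range(hours)]

    def build_accident_report(photo: bytes, lat: float, lon: float) -> AccidentReport:
        # Compose the photo, GPS position, network time, and recent weather
        # into a single package for an insurance investigator.
        now = datetime.utcnow()                     # network-supplied timestamp
        weather = fetch_weather_history(lat, lon, now)
        return AccidentReport(photo, lat, lon, now, weather)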

Another demonstration was of a patient with a sensor who has a problem and uses a pager to beep for a doctor. A web service then goes to a data queue, gets cardiac data from the patient, and delivers it to the doctor. Internet data handling will improve greatly and become more asynchronous in about two years. The communication function does not need a new release of Windows. People are going from dial-up to broadband everywhere, and this is changing our ideas about Internet connectivity. Broadband has reached over 50% of users, and in the next few years the other 50% will probably sign up. Kids automatically use the computer to look up things in the dictionary and get results from several different dictionaries with all of the comparative meanings. That's not a Vista thing, but it is important in indicating how use of the Internet is advancing. The connectivity aspect does not depend on Vista; it is back-portable to earlier platforms, but Vista provides creative ways of handling the data.

Last year Sun provided an example of how Amazon makes it easy to track shipping. You can find items and easily track things that are shipped. It doesn't just happen in a browser; it is done in the background. This isn't new, but Vista provides much improved connectivity to accomplish the task.

What is the time frame for the 64-bit processor? The 64-bit processor may be driven by the deadline on HDTV processing. You need to look at the economics of this: why would anyone want to buy a stand-alone HDTV processor when they can get it with major additional functions? Will the HDTV mandate trigger the fusion of TV and computer? If you already have wireless, then any display anywhere in the house can be connected. There are a lot of interesting applications that can be done using this. Peter said he had predicted that this fusion would occur earlier, so he is reluctant to predict when it is going to happen. The underlying trend is in that direction.

If you have HDTV on IP, you can do what Yahoo has done, with 32 different camera views of the last shuttle launch available on Yahoo TV. On eBay you can get in on bidding in real time. You can be watching TV and be notified if you have been outbid on something; then you can decide whether or not to increase your bid. This integration of entertainment and active use of the net has been around as an idea for some time. The 64-bit tipping point may show up because of HDTV and FCC requirements. The next year could be a very interesting time.

Microsoft has handed out six DVDs of early Vista builds and will release two more tomorrow. Have they fixed the Office document numbering problem in Office 12? Peter said that he did not know, and he doesn't know whether Microsoft considers technical documentation a significant market. What is causing the decline in technical conferences? Peter answered that web conferencing really works. You don't have to get on a plane to go somewhere. Who wants to get on a plane today?

Open Source: what is happening and how are they dealing with it? Peter said that now, instead of considering Open Source a risk, you have to ask "Why not use it?" Linux is the default choice for many appliances. There are a lot of Linux users who don't know it; they use Linux when they access applications on other systems over the Internet. The old model is running a set of Microsoft products. The new model is to open a browser and access Google mail. We get Google changes automatically. It runs on everything and is an enormous threat to Microsoft. Microsoft's comeback with Vista is to make sure your experience of Internet access is better on Vista than on anything else. Open Source matters hugely. Some countries, such as Brazil, China, and Germany, and also the state of Massachusetts, won't buy Office because its code is not open and easily available. How does this differ from PDF? PDF is open to all; only the copyright is held by the owner. Open Source places tremendous pressure on everyone, including Apple, Microsoft, and a lot of other people.

Microsoft reengineered its entire engineering process to provide greater usability. This led to a comment from the audience: "I'll believe it when I see it." Google is very good, but Microsoft doesn't need to kill Google to win. It isn't clear what Google's long-term business plan really is. It isn't clear how to sell advertising-based content.

Are there any differences around the world? Peter believes that things will converge. Salaries in India are going up 14% to 20% per year. At this rate it only takes about 20 years to close the salary gap with the U.S. Markets reach equilibrium states. Japan was getting beaten by Korea in producing steel and changed its approach to producing specialty steels. China is breathing down everyone's neck; however, China's standard of living is rising. Peter doesn't think the cost differential can be maintained, and he doesn't believe that we will all end up at the economic level of a Pakistani bricklayer.
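
To see why a figure of roughly 20 years is plausible, here is a back-of-the-envelope calculation; the starting salary ratio and the U.S. growth rate are assumptions for illustration, not figures from the talk:

    import math

    indian_growth = 0.17   # midpoint of the 14%-20% range mentioned in the talk
    us_growth = 0.03       # assumed U.S. salary growth rate
    gap = 10.0             # assumed starting ratio of U.S. to Indian salaries

    # Years n until (1 + indian_growth)^n overtakes gap * (1 + us_growth)^n
    years = math.log(gap) / math.log((1 + indian_growth) / (1 + us_growth))
    print(f"Gap closes in about {years:.0f} years")   # roughly 18 years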

Can Office productivity suites be provided successfully over the Internet? Office relies on Microsoft server capabilities, and Office 12 now provides positive results that tie in with those server capabilities. Rather than writing an application by interviewing people, you begin by flowcharting the work flow. The high-level abstractions of this document show how you go from one state to another after particular actions are taken, and you build an organization-specific work flow. You can set breakpoints on these transitions the way you can set breakpoints in code now. You can war-game your work flow to see what works and what doesn't. At no point do you leave the design tool and go to a separate software development tool. This is an important integrated capability, dependent on interaction between Microsoft software and servers. Over the next six months Office 12 looks very good; it provides more usability and is a big step forward for Microsoft.
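
As an illustration of the work flow idea (a toy sketch only; the states, transitions, and breakpoint mechanism are assumptions, not the Office 12 tooling), a work flow can be treated as a small state machine on which "breakpoints" pause execution so a person can inspect the document:

    # Toy document work flow as a state machine with breakpoints.
    transitions = {
        ("draft", "submit"): "under_review",
        ("under_review", "approve"): "approved",
        ("under_review", "reject"): "draft",
        ("approved", "publish"): "published",
    }

    breakpoints = {"under_review"}   # pause here for human inspection

    def run(state, actions):
        for action in actions:
            state = transitions.get((state, action), state)
            if state in breakpoints:
                print(f"Breakpoint hit at state '{state}' after action '{action}'")
        return state

    print(run("draft", ["submit", "approve", "publish"]))   # -> published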

This was the end of the "Round Table" and the next part of Peter's talk occurred at the main presentation after dinner.

Peter said that before dinner he had talked about things in the present day, and now he would focus on what he believes will happen in the next 18 months to ten years. He said there is now a two-way street between fundamental research laboratories and the enterprise world. Some things that were not issues in research, but were in the enterprise world, are now at a point where the research world needs solutions to problems that enterprises discovered and solved earlier.

Nanoscale fabrication is coming, and the semiconductor industry is having the willies because the rate of improvement is going down. Part of the problem is that atoms come in only one size. Gate oxides, the insulating part, are about 1 nanometer thick, which is not much of a barrier for a free electron. Meeting Moore's Law's prediction is possibly not even desirable, because at some point fabrication plants get so expensive that they can only be paid for by nation-states. We have to examine the basic assumption that we just need to continue what we have been doing, only more so. Natural systems don't use nanometer devices; they work just well enough to accomplish their functions and not a whole lot more. We must look at affordably fast and affordably small. Now we are looking at new hybrid approaches between biological and silicon systems. There is a nanowire transistor technique that fabricates devices using a hybrid approach: it lays down the channels and fills in the molecules, and the molecules finish the job of constructing nanoscale devices. This is the key, because memory isn't a terrible challenge at the nanoscale level, and the technique could be available on a flash card in the next few years. Putting it into effect is a business decision, dependent on costs and the availability of competing techniques that meet requirements, rather than a technical decision. Memory just needs to have state, but logic devices must have gain; you have to have something that functions like a transistor. When people ask "What are the implications of nanoscale devices?" you have to ask them "Do you mean memory and state, or logic and microprocessor devices?" because they are fundamentally different problems.

Rather than trying to build denser and denser, more and more complex cores, there is an indication that the pendulum is moving toward not pushing clock rate and density so hard. Sun says the sweet spot might not be a hypothetical SPARC 5 but multiple SPARC 3s on a chip, along with an efficient way to use them. They are working on a 32-CPU system in one rack. It will be able to run thousands of applications simultaneously on the Solaris operating system.

If that doesn't impress you, then you can look at IBM's Blue Gene system. IBM said one rack of Blue Gene would be moderately sized, and the physical size of the cabinet is part of the cooling design. They actually slow down the chips to create fewer cooling problems. They put a rich set of networks into a single cabinet with a high density of input/output hardware. The rack is smaller than most people's refrigerators, has 2400 processors and 12 gigabytes of physical RAM, and delivers 5.7 teraflops of processing power. It is air cooled, not liquid cooled. Peter has seen pictures of a 24 x 24 grid of these racks. IBM has racks in its labs that come and go on the Top 500 supercomputer list while being tested before they ship to users; 4 or 5 of these racks are being tested before they go out to be integrated into a 64-rack system. They are happy to have one of these Blue Gene racks going into production this month. People on a network will be able to bang on a Blue Gene rack with some of their research applications. If an application works on one rack with 2400 processors, then it is probably scalable to additional racks. This is an extremely cost-effective architecture that uses air-cooled modules with teraflop performance, and there is an incredible emphasis on input/output richness.

The other really significant machine is a computer that once again fills a room: DataStar, down at San Diego, which uses 1024 and 2048 processing nodes. The system has a great deal of input and output capability. There are many applications waiting for the chance to run on a 2048-CPU version of DataStar. The system is there and is ready to run. The Director of User Services there told Peter that it used to be that writing code to run on a 256-node system was considered ambitious, and that scaling beyond that was something very difficult. Now many applications are ready for an extension from a 1024-node to a 2048-node system.

These machines are used for fluid dynamics, crash testing of automobiles, and other tests that are very expensive to run. You can run tests to confirm answers developed on these machines and not have to run multiple physical tests. Even while people are lining up to use Blue Gene modules, the pace of turnover in the PC market is steep. There are research clusters of Pentium III processors, and when people go home they are using Pentium 4s. The research cluster built on a research grant becomes more obsolete as time goes by. In the meantime, unused PCs can be used for computing online and can outmatch the older research systems.

The rate at which storage on desktop PCs has increased has been 90% per year, compounded, for 20 years. The equivalent of 57 miles of punched cards is stored in the 40-gigabyte disk of an iPod. Processing speeds have been growing by 60% per year and bandwidth by 35% per year. There is a great imbalance: a vast amount of data accumulating on individual machines outpaces the ability to move it around. Computing centers need a lot of high-speed memory to run jobs fast. The problem now is that there is data from costly, destructive experiments that you don't want to rerun. Sometimes there is data from things like a deep-space experiment that cannot be rerun. High-speed computing centers are being asked to become archivers as well as snap-shotters of data. The data centers are moving away from Fibre Channel storage toward cheaper Serial ATA such as you find in a good PC; it meets their needs for a relatively low duty cycle but very large data requirements. These are new requirements for high-speed computing centers, but they are problems that enterprises have been dealing with for decades. Continuity of data storage hasn't been a problem in research areas before, but it is now. On the other hand, enterprises are frequently requiring the high-speed computing processes previously needed only by research groups. Bandwidth has been growing much more slowly. This bites you hard when large amounts of data must be processed soon and it takes hours to send terabytes across the lines. The problem is very application specific, and you have to match your needs to the application.
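
A quick compounding exercise shows how fast the imbalance opens up; the 90/60/35 percent rates are the figures from the talk, and the ten-year horizon is just an example:

    years = 10
    storage_growth, cpu_growth, bandwidth_growth = 0.90, 0.60, 0.35

    storage = (1 + storage_growth) ** years       # about 613x
    cpu = (1 + cpu_growth) ** years               # about 110x
    bandwidth = (1 + bandwidth_growth) ** years   # about 20x

    print(f"After {years} years: storage x{storage:.0f}, "
          f"processing x{cpu:.0f}, bandwidth x{bandwidth:.0f}")
    # Stored data grows roughly 30 times faster than the capacity to move it.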

Application developers can no longer afford to think of the net as a pipe. Sun has claimed that the network is the computer, but now you have to debug the network. You have to think of the network as an actual part of your application. Your biggest problem is memory bandwidth, and the speed-of-light delay becomes the largest delay component. Even if your computer had an infinite clock rate, the speed of light doesn't change and the transmission rate doesn't increase.
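
A minimal sketch of the propagation-delay point: even with an infinitely fast processor, a round trip across a continent costs tens of milliseconds. The distance and the fiber slowdown factor below are assumed values for illustration:

    distance_km = 4000        # assumed one-way distance, e.g. coast to coast
    c_km_per_s = 300_000      # speed of light in vacuum
    fiber_factor = 1.5        # light in optical fiber travels at roughly c/1.5

    one_way_ms = distance_km / (c_km_per_s / fiber_factor) * 1000
    print(f"One way: {one_way_ms:.0f} ms, round trip: {2 * one_way_ms:.0f} ms")
    # Roughly 20 ms one way and 40 ms round trip, before any queuing or processing.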

Algorithm design is a good choice for a future career and is going to be more complex than ever. There are horrendous problems to be resolved by algorithm development. More and more data is being produced and stored at the enterprise level, and wait until RFID tags are placed on nearly everything. Application processing has to be pushed out to the point where the data originates, because where you need the data is where you do the work. Many bits are collected that have no value, but how do you determine which bits are valuable and which are not? New interactive tools are needed, and some very good tools are being developed. A lot more data is going into databases as opposed to flat files. What do you do with a directory containing 1,000 files? How do you come up with 1,000 meaningful file names? Intelligent database engine processing is required to reduce the problem of transmitting very large amounts of data. Things have to become more closely coupled with hardware and more closely coupled with networks, which is the opposite of the conventional wisdom of the last 20 years. The programming task is getting harder, not easier. Trends are reversing; you can't ignore network effects.

A Sun researcher said, "It's not hard to find the point where people don't want things to be automatic." People want system transparency, and they want the system to warn a person to take action. They want it to give suggestions, but they want to be able to override those suggestions. It is now important, in designing computer systems, to determine where to draw the line between decisions made by the computer and those that are reserved for humans.

Sun has hardware where a single blade has dozens of physical system sensors. They have a software process that can predict failures weeks ahead of time using continuous monitoring and pattern recognition. They are making use of older sensors designed to detect immediate failure modes and using the data to predict longer term effects.

There are many interesting new regulatory and statutory mandates for transparency and specifiability of the systems you have running; one example is health services systems. New, better-instrumented systems are much better at disclosing their states and providing capabilities for auditing, which is necessary to meet the mandates.

We know most of the frustration and anger of users is caused by software, not hardware. Microsoft is experimenting with a safe operating system, called Singularity, that runs in a completely self-aware environment and makes many self-checks. It isolates things from each other so that a problem in one area cannot propagate into another. They built the operating system in C# and have it all in managed code. You can't pass pointers into other processes' spaces under Singularity. This is a research technology; they are not trying to claim it is ready or that it meets performance criteria. If a process dies, or you kill it, other processes have to recover, but they recover in more of a contractual way because they can know their own state at the time of the failure. Microsoft believes this is an important direction to go in.

People used to worry about Java overhead; now they worry about the productivity and reliability problems caused by not using Java. Writing code is still a frustrating business, and people keep trying to find ways to do it more cheaply. There is a desire to have coding done by very good people for skilled, high-level items and to get it done cheaply for lower-level needs. This will require program management techniques that work across time zones and across gaps of culture and language. There are development tools that make this easier by allowing people to collaborate on files; people a continent apart can work as if they were at adjacent workstations. This won't resolve things if the people have bad ideas or differing concepts of what the software should do. New testing tools are being developed that use semantic criteria to examine source code. Tools are more complex, but developers are finding the results more relevant. End-to-end performance at runtime is still difficult to predict and is becoming more important to analyze. People working on multiple teams need a more unified project environment that can display things to people in different locations at the same time.

MIT set a goal a year ago of becoming end-to-end wireless and got there a few months early. A few decades ago MIT had rooms called Athena clusters, using MicroVAX workstations, that you could drop in on and work at. This has become a totally irrelevant model of public shared computing. Now those rooms are being replaced with furniture that provides space for you to set down a wireless laptop and log into the system. MIT's advice to entering students is to buy a WiFi laptop, and they are redesigning the physical spaces in which people have meetings around that assumption. They are getting interesting dividends from this, in that they don't need to replace cables with more advanced cables; they just put wireless nodes in the buildings.

Wireless brings up questions of security. Wireless started out either without security or with security that was not easily manageable. A system called Sparcle is now being worked on that lets you write English-language statements of security policy, digests them, and generates a matrix of what it believes to be the allowable interactions. This provides much more rapid discovery of stupid mistakes in complex interacting policies. It could be a product within a year.
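
To illustrate the idea (this is a toy sketch, not the Sparcle system itself; the policy wording and matrix format are assumptions), restricted-English policy statements can be digested into an access matrix that people can then inspect for surprising combinations:

    import re
    from collections import defaultdict

    # Toy policy statements in a restricted English form (hypothetical format).
    policies = [
        "Doctors can read patient records",
        "Nurses can read patient records",
        "Doctors can write patient records",
        "Guests can read the visitor directory",
    ]

    def digest(statements):
        # Build a matrix of (role, resource) -> allowed actions from simple
        # "<Role> can <action> <resource>" sentences.
        matrix = defaultdict(set)
        pattern = re.compile(r"(\w+) can (read|write) (.+)", re.IGNORECASE)
        for s in statements:
            m = pattern.match(s)
            if m:
                role, action, resource = m.groups()
                matrix[(role.lower(), resource.lower())].add(action.lower())
        return matrix

    for (role, resource), actions in sorted(digest(policies).items()):
        print(f"{role:8s} -> {resource:25s}: {sorted(actions)}")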

Trends we have taken for granted for the last 20 years are wrong. Continual increases in clock speed and device density are not the way to get more processing power. The degree of abstraction that let you ignore every level of implementation other than the one you were working at has turned out to be unsustainable. You have to worry about the limitation of the speed of light and the continuing growth in the size of the data being transmitted. All the things that you thought were solved problems are not. Every fundamental statement about where computing is going between 1990 and 2010 has to be considered subject to question, as the things people want to do continue to push up against whatever boundaries technology creates. When you are banging hard against those boundaries, you discover you need to rethink your assumptions.

This was the end of Peter Coffee’s prepared remarks and the program moved on to a question and answer section.

Will Fibre Channel be replaced in 3 to 5 years? Peter answered, "No, not in that time span. There is too much investment in hardware, tools, and skills for that kind of a change." Bill Gates has said that we overestimate what we can do in 3 years and underestimate what we can do in 10. IP is subsuming other network standards, and no new capacity will be built that does not use Internet standards. Eventually all the non-Internet stuff becomes a backwater and a legacy corner.

When will the Internet be safe? When people are ethical and smart. It won't be safe, and no one should be told that it will be.

Are PDAs going away? Peter says his is not. The ideal model is not having to carry a device at all. We should not be carrying around storage and worrying about whether we have backed up our laptops. You can carry a smart card that will allow you to get to your desktop from other locations, and the backups can be handled by professionals. Peter doesn't want to have to worry about whether or not he is backed up, and he doesn't want to carry a hard disk and a keyboard with him. He wants to be in an environment that recognizes him and directs him to the proper equipment.

What about wireless security problems? Encryption is becoming cheaper and better for users. SIGINT is still interesting even against encrypted traffic, because information can be determined by traffic analysis; systems provide random chatter to counter this. There are trade-offs between limiting the allowable places to log in and providing the capability to log on remotely in case of an emergency. This is a cultural problem, not a technical problem.

There are a lot of security tools out there to prevent problems. Firewalls protect against the outside, but systems are easy to compromise from inside the firewall. All of these tools fail against the trusted employee who has access to information as part of his job but is actually not trustworthy. Peter doesn't believe in software-internal firewalls except at the level of the operating system.

How do you see IPv6 rolling out? Peter sees the areas of the world with the most rapid growth in connectivity installing IPv6 because it makes more sense there. People who have not put it into effect, because they didn't believe they needed it, will need to move to IPv6 to stay competitive.

This was another of the superb presentations that Peter Coffee has provided to us year after year. He presented a huge amount of material within a short period of time. This report is fairly long, but it doesn't include many of the examples and asides that made the speech both interesting and entertaining.

This was the first meeting of the LA Chapter year and was attended by about 50 people.
Mike Walsh, LA ACM Secretary 

The next meeting will be Oct. 5th. Check back in September for details about the meeting.
Join us in October!


The Los Angeles Chapter will meet this September on the second Wednesday, Sept. 14th, at the LAX Plaza Hotel (formerly the Ramada), 6333 Bristol Parkway, Culver City, California 90230, (310) 484-7000. (Click here for a map.)

Directions to the Hotel from the San Diego (405) Freeway:

Southbound: Exit on the Howard Hughes Parkway, turn right on Sepulveda, turn right on Centinela, go under the freeway to the first signal light, and left onto Bristol.

Northbound: Exit on the Slauson/Sepulveda exit, turn right at the first signal light (Green Valley), and right at the first signal light onto Bristol.

There is a possible PARKING fee of $5.00.

The Schedule for this Meeting is

6:00 p.m.  Cocktails & Social

6:30 p.m.  Q&A with Peter Coffee

7:00 p.m.  Dinner    The dinner will be a buffet

            Avoid a $3 surcharge!!
            Reservations must be made by Sept. 9th to avoid the surcharge.

            Make your reservations early.

8:00 p.m.  Presentation


*****

Please Note: The Council Meeting for September will be held on Wed., Sept. 7th, at 5:30 PM at LMU, University Hall, Room 1767 (Executive Dining Room).

*****

 
Reservations

Please join us for dinner. Call Mike Walsh at (818) 785-5056, or send email to Mike Walsh with your name and telephone number by Sept. 9th.

Reservations are not required for the 8:00 p.m. program, for which there is no charge. Paid dinner guests will be eligible for a drawing at the end of the program.

For membership information, contact Mike Walsh, (818)785-5056 or follow this link.

Other Affiliated groups

SIGAda   SIGCHI SIGGRAPH  SIGPLAN

LA SIGAda


LA SIGGRAPH

Please visit our website for meeting dates, and news of upcoming events.

For further details contact the SIGPHONE at (310) 288-1148 or at Los_Angeles_Chapter@siggraph.org, or www.siggraph.org/chapters/los_angeles



Last revision: 2005 0917