April 3, 2006
PLEASE NOTE: This is the fifth part in my series of essays on the history of Systems Development. This week's issue will discuss events prior to and including the 1990's.
As the PC gained in stature, networking became very important to companies so that workers could collaborate and communicate on a common level. Local Area Networks (LANs) and Wide Area Networks (WANs) seemed to spring up overnight. As the PC's power and capacity grew, it became obvious that companies no longer needed the burden of mainframes and minis. Instead, dedicated machines were developed to control and share computer files, hence the birth of "client/server computing," where client computers on a network interacted with file servers. This did not completely negate the need for mainframes and minis (which were also used as file servers), but it did have a noticeable impact on their sales. Companies still needed mainframes for voluminous transaction processing and heavy number-crunching, but the trend was to move away from big iron.
Thanks to the small size of the PC, companies no longer required a big room to house the computer. Instead, computers were kept in closets and under desks. This became so pervasive that companies no longer knew where all of their computers were located. In a way, the spread of computers and networks came to resemble the nervous system of the human body.
One of the key elements that made this all possible was the introduction of Intel's 80386 (or "386") chip, which allowed 32-bit processing. To effectively use this new technology, new operating systems had to be introduced, the first being IBM's OS/2 in the late 1980's. OS/2 provided such things as virtual memory, multitasking and multithreading, network connectivity, crash protection, a new High Performance File System, and a slick object-oriented desktop. Frankly, there was nothing else out there that could match it. Unfortunately, Microsoft bullied its way past OS/2 with Windows 95 and NT. By the end of the 1990's, OS/2 was all but forgotten by its vendor, IBM. Nevertheless, it was the advent of 32-bit computing that truly made client/server computing a reality.
Another major milestone during this decade was the adoption of the Internet by corporate America. The Internet actually began in the late 1960's under the Department of Defense and was later opened to other government and academic bodies. But it wasn't until the 1990's that companies started to appreciate the Internet as a communications and marketing medium.
The first web browser was developed by Tim Berners-Lee in 1990 as part of his World Wide Web project, which layered a simple, universal protocol over the Internet. Early web browsers included Mosaic, Netscape Navigator, and Microsoft's Internet Explorer, among others. The beauty of the web was that any computer could now access it regardless of operating system, making it a truly universal approach to accessing data. To write a web page, a simple tag language was devised, the HyperText Markup Language (HTML), which the browser interprets at the time of request to display the page. HTML was fine for simple static web pages (not much interaction, just viewing the page). Developers then invented new techniques to make web pages dynamic, allowing people to input data and interact with files, which ultimately made the merchandising of products over the Internet possible.
Wanting to do something more sophisticated through the web browser, Sun Microsystems released the Java programming language in 1995. Java was a universal programming language that could run under any operating system; Sun's mantra was "Write once, run anywhere." This was a radical departure from the past, when programs had to be recompiled to suit the peculiarities of a particular operating system. Basically, Java made the operating system irrelevant, much to Microsoft's chagrin, and Microsoft ultimately fought the propagation of the language. Further, Java could be used in small pocket devices as well as in the new generation of computers powering automobiles.
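As a simple illustration of the portability idea (my own example, not one from Sun), the little program below compiles once to Java bytecode and then runs unchanged on any machine with a Java Virtual Machine, whether that machine runs Windows, OS/2, or a Unix variant:

    // Compiled once to bytecode (HelloPortable.class), this program runs on any
    // platform with a Java Virtual Machine; no recompilation is needed.
    public class HelloPortable {
        public static void main(String[] args) {
            String os = System.getProperty("os.name");  // reported by the local JVM
            System.out.println("Hello from the same bytecode, running on " + os);
        }
    }

The resulting HelloPortable.class file is identical on every platform; only the JVM underneath it differs.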
By the 1990's, the Structured Programming movement had fizzled out. Instead, "Object Oriented Programming" (OOP) gained in popularity. The idea behind OOP was to bundle data and the code that operates on it into "objects" modeling real-world entities such as customers, products, and transactions. OOP shaped both the C++ programming language and Java.
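To make the idea concrete, here is a small sketch in Java (the class and its fields are my own invention, purely for illustration) of how a real-world entity such as a customer might be modeled as an object:

    // An object bundles data (fields) with the code that operates on it (methods),
    // modeling a real-world entity; here, a customer with an account balance.
    public class Customer {
        private final String name;
        private double balance;

        public Customer(String name, double openingBalance) {
            this.name = name;
            this.balance = openingBalance;
        }

        // Behavior lives alongside the data it affects.
        public void recordPurchase(double amount) {
            balance += amount;
        }

        public String summary() {
            return name + " owes " + balance;
        }
    }

Any program that needs a customer works through these methods rather than poking at raw record fields, which is what gives objects their modeling power.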
During this time, source code generators faded from view. True, companies were still using report writers and 4GL's, but the emphasis turned to "Visual Programming": programming workbenches with screen-painting tools to lay out inputs and outputs.
The Relational DBMS movement was still in high gear, but the use of Repositories and Data Dictionaries dropped off noticeably. Of interest, though, was the introduction of "Object Oriented Data Base Management System" (OODBMS) technology. As in OOP, data in an OODBMS was organized around real-world entities. Regardless, the relational DBMS continued to dominate the field.
Also during this decade "Data Mining" became popular whereby companies were provided tools to harvest data from their DBMS. This effort was basically an admission that companies should learn to live with data redundancy and not be concerned with developing a managed data base environment.
Because of the radical changes in computer hardware and software, companies became concerned with their aging "legacy" systems, developed over the previous thirty years. To migrate to the new technology, a movement was created called "Business Process Re-engineering" (BPR). This was encouraging in the sense that companies were starting to think again in terms of overall business systems as opposed to just programs. I'm not sure I agree with the use of the term "re-engineering" though; it assumes that something was engineered in the first place, which was hardly the case with these older systems.
Nonetheless, CASE-like tools were introduced to define business processes. Suddenly, companies were talking about such things as "work flows," "ergonomics," and "flowcharts," topics that had not been discussed for twenty years during the frenzy of the Structured Programming movement. Ultimately, this all led to the rediscovery of systems analysis: the realization that there is more to systems than just software. But by this time, all of the older corporate Systems Analysts had either retired or been put out to pasture, leaving a void in systems knowledge. Consequently, the industry had to relearn systems theory, with a lot of missteps along the way.
Companies at this time were still struggling to devise a suitable development environment. Most were content with just maintaining their current systems in anticipation of the pending Y2K (Year 2000) problem, in which two-digit year fields rolling over from 99 to 00 could potentially shut companies down. However, a few companies began to consider how to apply more scientific principles to the production of systems. Since people were already talking about "Software Engineering," why not apply engineering/manufacturing principles to the development of total systems?
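(As a brief aside on Y2K for readers who never wrestled with it: the trouble boiled down to arithmetic on two-digit years. The toy sketch below, my own and not drawn from any particular legacy system, shows how a routine interval calculation goes wrong once "00" follows "99".)

    // Toy illustration of the Y2K problem: years stored as two digits.
    public class Y2kDemo {
        public static void main(String[] args) {
            int issued = 97;   // meaning 1997
            int current = 99;  // meaning 1999
            System.out.println("Years since issue: " + (current - issued)); // 2, fine

            current = 0;       // January 1, 2000 stored as "00"
            System.out.println("Years since issue: " + (current - issued)); // -97, nonsense
        }
    }

Multiply that kind of two-digit arithmetic across millions of files and programs, and you have what companies were racing to find and fix.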
Back in the early 1980's, Japan's Ministry of International Trade & Industry (MITI) coordinated a handful of Japanese computer manufacturers in establishing a special environment for producing system software, such as operating systems and compilers. This effort came to be known as the Japanese "Software Factories," and it captured the imagination of the industry. Although the experiment ended with mixed results, they discovered that organization and discipline could dramatically improve productivity.
Why the experiment? Primarily because the Japanese recognized there are fundamentally two approaches to manufacturing anything: "one at a time" or mass production. Both are consistent approaches that can produce a high-quality product. The difference is that mass production offers increased volume at lower cost, and workers can be easily trained and put into production. The "one at a time" approach, on the other hand, is slower, usually costs more, and requires workers to be intimate with all aspects of the product.
MBA took it a step further by introducing its concept of an "Information Factory" in the early 1990's. The Information Factory was a comprehensive development environment which implemented MBA's concept of Information Resource Management. Basically, it drew an analogy between developing systems and running an engineering/manufacturing facility, complete with assembly lines, materials management, and production control. These concepts proved effective in companies throughout Japan, most notably in Japan's BEST project, sponsored by the Ministry of Finance. As background, the ministry wanted to leapfrog the West in terms of banking systems. To do so, they assembled a team of over 200 analysts and programmers from four of the top trust banks in Japan: Yasuda Trust & Banking, Mitsubishi Trust & Banking, Nippon Trust & Banking, and Chuo Trust & Banking. By implementing MBA's concepts they were able to deliver over 70 major integrated systems in less than three years. Further, because they had control over their information resources using a materials management philosophy, the Y2K problem never surfaced.
In terms of infrastructure, development organizations essentially went unchanged, with a CIO at the top of the pyramid supported by Software Engineers and DBA's. But there was one slight difference: instead of being called an MIS or IS department, the organization was now referred to as "IT" (Information Technology). Here again, the name hints at the direction most organizations were taking.
Finally, the 1990's marked a change in the physical appearance of the work force. Formal suits and ties gave way to casual polo shirts and Dockers. At first, casual attire was only allowed on certain days (such as Fridays), but it eventually became the normal mode of dress. Unfortunately, many people abused the privilege and dressed sloppily for work. This had a subtle but noticeable effect on work habits, including how we build systems.
THUS ENDS OUR DISCUSSION ON THE 1990's. NEXT WEEK, WE'LL HAVE PART VI - WITH A LOOK AT THIS DECADE AND MY CONCLUDING COMMENTS.
OUR BRYCE'S LAW OF THE WEEK therefore is...
"The word 're-engineering' implies something was 'engineered' in the first place,
which is rarely the case."
IN OUR "DOWN THE ROAD" SECTION
The Quality Assurance Institute will be holding its 26th Annual Quality Conference at the Rosen Plaza Hotel in Orlando, FL on April 24th - 28th. For information, contact the Institute in Orlando at 407/363-1111.
The World Conference on Quality and Improvement will be held May 1st-3rd at the Midwest Airlines Center in Milwaukee, WI. For information, contact the American Society for Quality at 800-248-1946 or 414/272-8575.
The 15th World Congress on Information Technology will be held May 1st - 5th in Austin, TX. For information, call 512/505-4077.
The 17th International Conference of the Information Resource Management Association will be held May 21st-24th at the Wyndham Hotel in Washington D.C. For information, call IRMA headquarters in PA at 717/533-8879.
The National Association of State Chief Information Officers (NASCIO) will be holding its 2006 Midyear Conference at The Capital Hilton in Washington, DC on May 31st-June 2nd. For information, contact NASCIO headquarters in Lexington, KY at 859/514-9153.
If you have an upcoming IRM-related event you want mentioned, please e-mail the date, time and location of the event to timb001@phmainstreet.com
MY "PET PEEVE OF THE WEEK" IS "QUICK AND DIRTY DEVELOPMENT"
Today you hear a lot about "Agile Methodologies" for software development. To their proponents' credit, they admit their products are aimed only at software, not major systems. They also use a lot of techniques derived from what was called Joint Application Development (JAD) and Rapid Application Development (RAD). There is nothing new here. They are simply saying, let's sit down with an end-user, interrogate him, and then quickly deliver some sort of software to solve his problem. Two things bother me about this approach. First, they are suggesting an iterative approach to development whereby an initial program is developed and delivered to the user in 30 days, followed by updates each month. This is scary to me; as a user, I don't want to be running my business on half-baked software. Second, it is doubtful the software being developed for one user will interface with software being developed for another user; in other words, I question their ability to share and reuse data. Agile might be fine for developing a single program, but it is not the way to go for a major systems solution. This is why I refer to Agile as "QAD" - Quick and Dirty development. Is this progress? I hardly think so.
Such is my Pet Peeve of the Week.
AND FINALLY...
I received an e-mail from a Jon Harris in New York regarding last week's essay, "Part IV of the History of Systems Development." Jon writes:
"Two things: first, you mention that relational DBMS' have replaced the hierarchical model and the network model. I still use such products to this day. I hardly consider them obsolete. You also mention that CASE is on the way out. I also see them being actively used."
Thanks, Jon, for your note.
First, I never said there was anything wrong with the hierarchical or network model for DBMS. I just noted the transition to the relational model. Products such as IMS and IDMS are still out there and being actively used. One of the main benefits in using these products is their ability to handle heavy transaction volume. This is where they excel. It is also the reason why IBM now refers to IMS as a "transaction processor."
As to CASE tools: yes, they are still out there and I'm sure there has been a lot of progress made in their development. But make no mistake, their ability to build enterprise-wide systems is doubtful; their forte is software only. Frankly, I am finding more people using programmer's workbenches as opposed to CASE tools.
Again, thanks for your e-mail. Keep those cards and letters coming.
Folks, don't forget to check out our BRYCE'S CRASH COURSE IN MANAGEMENT which is a free on-line multimedia presentation offering pragmatic advice on how to discharge the duties of a manager, whether it be for a commercial or non-profit enterprise. Frankly, for someone aspiring to be a manager or for a new manager, it will be the best 45 minutes you can invest in yourself. Check it out on the cover of our corporate web page at: http://www.phmainstreet.com/mba
For a complete listing of my essays, see the "PRIDE" Special Subject Bulletins section of our corporate web site.
MBA is an international management consulting firm specializing in Information Resource Management. We offer training, consulting, and writing services in the areas of Enterprise Engineering, Systems Engineering, Data Base Engineering, Project Management, Methodologies and Repositories. For information, call us at 727/786-4567.
Our corporate web page is at: http://www.phmainstreet.com/mba
Management Visions is a presentation of M. Bryce & Associates, a division of M&JB Investment Company of Palm Harbor, Florida, USA. The program is produced on a weekly basis and updated on Sundays. It is available in versions for RealPlayer, Microsoft Media Player, and MP3 suitable for podcasting. See our web site for details. You'll find our broadcast listed in several Podcast and Internet search engines, as well as Apple's iTunes.
If you have any questions or would like to be placed on our e-mailing list to receive notification of future broadcasts, please e-mail me at timb001@phmainstreet.com
For a copy of past broadcasts, please contact me directly.
We accept MP3 files with your voice for possible inclusion in the broadcast.
Management Visions accepts advertising. For rates, please contact yours truly directly.
Copyright © 2006 by M&JB Investment Company of Palm Harbor, Florida, USA. All rights reserved. "PRIDE" is the registered trademark of M&JB Investment Company.
This is Tim Bryce reporting.
Since 1971: "Software for the finest computer - the Mind."
END