Saturday, January 19, 2002
A few other interesting resources I have bookmarked include:
- Transition to Production Plan in MS Word format. This document outlines what is needed to bring an application into production.
- Guide to the software operations and maintenance phase. Specifically addresses maintenance issues after an application has been released into production.
- Software Maintenance, Reusability, Portability & Documentation, which is a PowerPoint presentation that covers both applications and service delivery.
- Software maintenance and customization processes in MS Word format and a collection of accompanying Visio diagrams. These are wonderful resources!
For my friend Muthukumar U I hope the following nugget proves useful: Automated Life-Cycle Approach to Software Testing in the Finance & Insurance Sector
Applications delivery is, for the most part, project-oriented, while service level management is operations-oriented. The practitioners who work within each of these domains often seem to live on different planets, and in some organizations, in different galaxies. However, there are many common touchpoints, including quality, issue management and change control. Once an application has been released into production another touchpoint is maintenance, which is performed either to add enhancements or to correct defects. Maintenance activity is an indicator of quality: too much maintenance may indicate poor development and QA practices on the applications delivery side, but it can also indicate responsiveness to user requests for enhancements (a good thing) or little control over requirements (a bad thing).
Internet development projects, especially those that are designed to acquire and/or retain customers in support of a B2C strategy, are unique in that most applications and systems are in a near-constant state of maintenance from the standpoint of features, adjusting to customer dynamics and other factors that marketing deems important. These systems pose challenges to traditional applications delivery methods because of their 24x7 nature and the immense pressures of business imperatives and competitive advantage that internal business applications normally don't impose.
With respect to initial systems development, getting it right the first time is a common-sense goal. That, of course, is easier said than done, but Six Sigma for Internet Application Development provides some excellent guidelines. Another article that addresses some of the challenges faced by teams that need to quickly develop and deploy applications is Rapid Application Development: Project Management Issues to Consider. QA and release management, regardless of the type of system, are two closely related areas that bridge applications delivery and operations. Since requirements ultimately flow from the business (end users - the reason we have jobs in the first place) to operations, and are then communicated to the applications delivery side, understanding how requirements relate to releases of enhancements and fixes is essential. A good starting point for gaining this understanding is An Examination of the Effects of Requirements Changes on Software Releases.
If you want to cast a pall over a social gathering of developers all you have to do is bring up the topic of software maintenance. This is probably the least sexy topic, aside from QA, for applications delivery folks. On the other hand, much of what is inflicted on end users is due to poor software maintenance practices. Did I say poor practices? In many cases they border on criminal - developers wildly patching production systems with no plan, no approval and no business ever touching a production system in the first place. Good practices, on the other hand, incorporate software configuration management, unit and integration testing, dependency analysis and other tasks that most developers hate. Also stirred into this mixture are regression testing, release management and change control. A good introduction to software maintenance is Tutorial on Software Maintenance, which is in PDF format. Another document that covers the basics and adds structure is Software Maintenance Process: A Seven-step approach in PowerPoint format. Once you have the basics down you'll probably want to examine some advanced techniques that tie maintenance to SQA, such as Applying Quantitative Methods to Software Maintenance and a related article titled Measurements to Manage Software Maintenance. If you're in the medical device industry then Architecture Level Prediction of Software Maintenance is essential reading because there is a world of difference between a business application and the software embedded in a dialysis machine.
As you dig into the software maintenance body of knowledge one name keeps popping up: George Stark. He has a lot of software maintenance material on his web site. One that I particularly like is Software Maintenance Management Strategies: Observations from the Field. He also wrote an interesting paper for MITRE titled Return on Investment from a Software Measurement Program. You can also download the paper in MS Word format.
MITRE has another document that you'll find useful: A Management Guide to Software Maintenance in COTS-Based Systems. COTS is Commercial Off The Shelf software, and, yes, there are maintenance strategies that can be applied to shrink-wrap software that comprises portions of business systems.
In a future entry I'll discuss in more detail my thoughts on configuration, change and release management; application acceptance processes and topics related to bridging the gap (chasm?) between applications delivery and operations.
Friday, January 18, 2002
- The 7 Deadly Sins of CRM
- Getting Ready for CRM: A Pre-Implementation Checklist
- Data Warehouses, Metadata and Middleware
- Why CRM and BI Aren't Created Equal
- The Data Warehouse Request for Proposal
- Choosing a Data Warehouse Consultant, Part 1 and Part 2
- Clickstream Analysis
Reading Jill's articles is like reading her books - you marvel at how one person can know so much and be so articulate and straightforward when it comes to explaining it. She is, indeed, world class in data quality, data warehousing and CRM. If you haven't read them already, you should rush out and get copies of her latest book, The CRM Handbook: A Business Guide to Customer Relationship Management and her first book e-Data: Turning Data into Information with Data Warehousing.
My research (remember, I had a purpose before I allowed myself to become sidetracked) led to some valuable finds, among them: a collection of presentations and papers on topics ranging from methodologies to data warehouses, a whitepaper in PDF format titled An Enterprise-Wide Data Delivery Architecture and an article that project managers, business stakeholders and project team members should read titled Why Data Warehouse Projects Fail.
During the sales cycle, challenge prospective vendors to exclude from their acts-of-God (Force Majeure) clauses any services that they claim, in their pitches or marketing materials, are particularly robust. Whatever their reaction, protect yourself against acts of God by purchasing insurance. Here are a few thoughts:
- Even with exclusions to the Force Majeure clause there are events that could protect the vendor anyway. First, if the Force Majeure clause does not explicitly exclude (as opposed to merely not citing) an act or event, the vendor may still have protection by claiming and proving impossibility and/or frustration of purpose. These are valid legal claims. [See (1) below]
- Insurance is another tricky area. Consider the typical Act of War clause and then ask whether or not it applies to terrorist acts. In the US we claim to be in a war against terrorism, yet no war has been officially declared. Insurance companies may not honor a claim for damages and losses due to a terrorist attack. [See (1) below]
- Even with Force Majeure clauses there may be limitations - if such a clause cites a large and encompassing number of example acts and events, anything not cited may actually be excluded as a legitimate Force Majeure event during litigation. [See (2) and (3) below]
- Newsletter article on Outsourcing Law
- Professor Smith's Contracts Tutorial
- Outsourcing Primer (PDF format)
1 Effect of Acts of War and Terrorism on Contractual Obligations (includes detailed discussion of Force Majeure)
2 Paper on Force Majeure
3 New Developments Re: Force Majeure Clauses
Thursday, January 17, 2002
Some pages and papers on service level management and service level agreements that are worth visiting and reading are: A Structured Approach to Service Level Management and an Online Service Level Management Assessment. A plethora of service level agreement articles is provided at Business 2.0's Service Level Agreement links.
There are whitepapers, all in PDF format, that add to the service level management body of knowledge:
- Best Practices in Service Level Management
- Service Level Definitions
- IT Service Management Whitepaper
- Service Level Management: A Requirements View
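Behind all of these documents sits the same bit of arithmetic: translating an availability target into the downtime a provider is actually allowed. As a minimal sketch (the targets below are my own illustrative numbers, not values from any of the papers above):

```python
# Convert an SLA availability target into allowed downtime.
# The targets in the loop are illustrative examples only.

MINUTES_PER_YEAR = 365 * 24 * 60  # 525,600

def allowed_downtime_minutes(availability_pct, period_minutes=MINUTES_PER_YEAR):
    """Minutes of permitted downtime for a given availability percentage."""
    return period_minutes * (1 - availability_pct / 100)

for target in (99.0, 99.9, 99.99):
    print(f"{target}% availability -> "
          f"{allowed_downtime_minutes(target):,.0f} min/year downtime budget")
```

Running this makes the gulf between "two nines" and "four nines" concrete: roughly 5,256, 526 and 53 minutes of permitted downtime per year, respectively.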
With respect to UCITA and software quality, Cem Kaner's articles on Court Cases and Papers on the Law of Software Quality are essential reading for anyone who provides services or software. If you're following UCITA check for updates and developments on UCITA Online.
Another side of service level management deals with web site visitor experience. A site called Quality of Experience has articles and commentary on the subject. This is directly related to usability, a topic with which I ended today's research. Flash is Evil is an interesting and highly opinionated page dedicated to the dark side of Macromedia Flash in particular and usability in general. QA City is a good site for web testers and usability experts. There is something here for everyone who is involved in software quality.
Function point analysis has matured over the years and has evolved into an interrelated collection of methods called Functional Size Measurement. However, there are other methods and techniques, the most promising of which is Practical Software Measurement (PSM). PSM is supported by a large body of knowledge and software at the Practical Software Measurement site. Among the resources are an estimating application called PSM Insight, and documentation about the Practical Software Measurement methodology.
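Functional Size Measurement grew out of function point analysis, and the core counting mechanics are easy to sketch. The weights below are the standard IFPUG complexity weights; the component inventory is a made-up example, not drawn from any system discussed here:

```python
# Unadjusted function point (UFP) count using the standard IFPUG
# complexity weights. The inventory is a hypothetical example.

WEIGHTS = {
    "EI":  {"low": 3, "avg": 4,  "high": 6},   # external inputs
    "EO":  {"low": 4, "avg": 5,  "high": 7},   # external outputs
    "EQ":  {"low": 3, "avg": 4,  "high": 6},   # external inquiries
    "ILF": {"low": 7, "avg": 10, "high": 15},  # internal logical files
    "EIF": {"low": 5, "avg": 7,  "high": 10},  # external interface files
}

def unadjusted_fp(inventory):
    """inventory: list of (component_type, complexity, count) tuples."""
    return sum(WEIGHTS[t][c] * n for t, c, n in inventory)

# Hypothetical small order-entry application:
inventory = [("EI", "avg", 12), ("EO", "low", 8), ("EQ", "avg", 5),
             ("ILF", "avg", 4), ("EIF", "low", 2)]
print(unadjusted_fp(inventory))  # 48 + 32 + 20 + 40 + 10 = 150
```

A full count would go on to apply a value adjustment factor derived from general system characteristics, but the unadjusted count above is the input most estimating methods start from.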
No list of software estimating resources would be complete without mentioning the constructive cost model (COCOMO), which is now in version 2. The COCOMO II site, which is managed by Dr. Barry Boehm, the father of software estimating, is resource-rich, including offshoots of the COCOMO methodology and free tools (see the COCOMO Suite page).
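The core COCOMO II effort equation is compact enough to sketch. A = 2.94 and B = 0.91 are the published COCOMO II.2000 calibration constants; the nominal scale-factor sum and the hypothetical 50 KSLOC project are my own placeholder inputs:

```python
import math

# Sketch of the COCOMO II effort equation:
#   PM = A * Size^E * product(EM_i), where E = B + 0.01 * sum(SF_j)
# A and B are the published COCOMO II.2000 constants; the scale-factor
# sum (all-nominal ratings) and the project size are placeholders.

A, B = 2.94, 0.91

def cocomo_effort(ksloc, scale_factor_sum=18.97, effort_multipliers=()):
    """Return estimated effort in person-months."""
    exponent = B + 0.01 * scale_factor_sum
    em_product = math.prod(effort_multipliers) if effort_multipliers else 1.0
    return A * ksloc ** exponent * em_product

print(round(cocomo_effort(50)))  # roughly 217 person-months at nominal ratings
```

Note the exponent: because E is slightly greater than 1 at nominal ratings, doubling the size more than doubles the estimated effort, which is the diseconomy of scale the model is famous for.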
I've made two software estimating resources available for download from my site. They are in a single Zip archive, which contains:
- Software Estimating Techniques (PDF format)
- Estimating Process Guidelines (MS Word document)
Other metrics and estimating resources I dug up include:
- Impact of Experience on Maintenance - this paper has implications for maintenance programming.
- Web Development: Estimating Quick-to-market Software
- Miscellaneous metrics - this page has links to various business-related e-commerce metrics
- Requirements Change - an interesting paper that thoroughly covers the impact of requirements change on projects. This one is in PDF format, but you can also download it in RTF format
- EC Institute Body of Knowledge for Electronic Commerce - not specifically about metrics, but is essential reading for anyone involved in e-commerce projects
- System Quality Measures is a content-rich collection of articles and resources on quality as it relates to IT.
- A collection of data-related articles on Business Rules, Data Warehouses, Knowledge Management, general Methodologies, Data Modeling and Objects.
- For my friend and colleague, Gaiyasudeen Syed, at Kuwait National Petroleum Company: IT Applications/Developments in Oil and Gas
In its original form this act could have hurt the US software industry by not holding it accountable for quality. If you haven't noticed, India's software industry has earned a well-deserved reputation for high quality and value, in no small part because Indian companies have embraced the Capability Maturity Model (CMM) and other quality frameworks as their development paradigm. Contrast this with the US approach, where the CMM is not well received outside of the Government and DoD contracting domains. It's clear that any law that further relaxes product liability and quality only shifts the competitive advantage off-shore. That consumers lose under the original provisions of UCITA is a given.
I liken the US software industry of today to the TV industry of days gone by. When I was growing up the top TV brands were all made in the US. Today they are all made by Japanese companies and assembled in Mexico.
What happened? After WW II Japan was rebuilding with a lot of help from the US. We sent teams of experts to assist. Among those experts were Juran, Deming and their colleagues, who had sound ideas about quality. Those ideas were not embraced by US companies - it was too hard for one thing, and for another, there was no compelling reason because consumers were buying everything that US companies could produce. The Japanese, however, had a horrible reputation for quality that extended into the 1950s. Anything marked Made in Japan was clearly a product to avoid because chances were that it would fall apart within days. However, by the 1960s Japan's quality-oriented approach, courtesy of US techniques (that US companies did not value), began to pay off. The rest is history.
I see history repeating itself: US software companies cannot be bothered to implement quality methodologies and frameworks because they're too busy releasing new versions, which in reality are maintenance releases or, too often, patches to correct defects in the previous version. In the meantime, off-shore companies are espousing and embracing US developed quality approaches, such as the CMM, and are quietly establishing themselves as producers of quality software. To that end, the original provisions of UCITA would have hastened the migration of competitive advantage to off-shore companies.
In order to understand UCITA, if you are not already familiar with it, you should read the series of articles written by M.E. Kabay. His four articles will get you up to speed: Part 1, Part 2, Part 3 and Part 4 paint a dismal picture. However, on 15 January Mr. Kabay's newsletter article, titled The Latest on UCITA, brought a ray of hope. Since this article is not yet in the archives I'm going to quote extensively from it:
"On Dec. 17, the UCITA Standby Committee of the National Conference of Commissioners on Uniform State Laws issued a report to its executive committee recommending changes to the draft Uniform Computer Information Transactions Act."
The sources of Mr. Kabay's optimism are: REPORT of UCITA Standby Committee December 17, 2001, and a press release from the National Conference of Commissioners on Uniform State Laws. Also cited is a Computerworld article titled UCITA backers agree to changes.
"The Standby Committee's report explicitly acknowledged that the 'majority of the amendments were submitted by AFFECT, an organization comprised of diverse interest groups and some individual companies for the purpose of opposing UCITA.' AFFECT is the Americans For Fair Electronic Commerce Transactions."
"In my opinion, the most significant changes to the draft of UCITA that will be sent to state legislatures in the future are:
- UCITA does not supersede any consumer-protection laws in force and applicable to the purchase or licensing of software.
- Software sold through mass-market distribution must not be inactivated by the vendor (the so-called "self-help" provisions of the previous version) in cases of breach of license or contract.
- Software licenses for products distributed to the public in final form (i.e., not as test versions) cannot extinguish First Amendment rights of consumers to discuss, report, or criticize flaws in those products.
- Explicit recognition that UCITA "does not displace the law of fraud, misrepresentation and unfair and deceptive practices as they may relate to intentional failure to disclose defects that are known to be material."
- Explicit rejection of open-source software licenses (and also shareware licenses) from UCITA coverage. UCITA applies only to transactions involving the exchange of money.
- Reverse engineering is accepted as a legitimate method for ensuring interoperability of licensed software with other products."
This brings me to a few observations about the state of IT as a profession:
- IT has no business leading web projects. I came to this conclusion years ago, after watching one disaster unfold after another. The reasons are simple: there is a vast difference between the way IT views the world and the way business process areas view it.
- IT, as a rule, shies away from techniques that its business counterparts routinely use. Among these techniques are quantitative analysis, decision methodologies, and sound project management approaches.
- The common IT solution to almost any problem is technology. This, in my experience and opinion, only exacerbates the problem (not to mention squandering shareholder value).
- A Cost Performance Model for Assessing WWW Service Investments and Making Smart IT Choices. Both of these will augment Chapter 9, Risk Management, as well as give the IT side of projects some business-oriented tools that will bridge the gap between IT and business.
- Scaling for E-Business: Technologies, Models, Performance, and Capacity Planning augments Doug's Chapter 11, Traffic Models, as does Capacity Planning for Web Performance: Metrics, Models, and Methods. Note: both of these books are written by the same authors. Each book has an accompanying web page that provides spreadsheets and other tools cited in the book. Scaling for E-Business spreadsheets and supporting information for Capacity Planning for Web Performance are worth visiting. Also worth reading is a presentation given by one of the authors (Daniel A. Menascé) titled Challenges in Scaling E-Business Sites.
- Modeling the Real World for Load Testing Web Sites by Steve Splaine. This article from Stickyminds fully supports Chapter 9, Risk Management, in that somebody needs to think about testing and QA both in the site planning stage and after the site is in production. Again, who is managing the site is moot - due diligence dictates that load testing be performed to establish a baseline, as well as to model changes to production sites. The author of this article has also written, in my opinion, one of the best books on web testing and QA titled The Web Testing Handbook.
- For the business and IT members of a web project I highly recommend Internet Commerce Metrics and Models, which will get both factions to speak the same language and approach estimating using the same assumptions and techniques. Had the stakeholders in Doug's case study been familiar with this book perhaps their initial outlandish requirements for credit card transactions would have been more reasonable. After a web system is implemented (again, without regard for whether or not it's outsourced or run in-house), I highly recommend Measuring the Impact of Your Web Site. This book addresses technical and business issues. On the topic of metrics and testing, Mining Gold from Server Logs: Using Existing Site Data to Improve Web Testing by Karen Johnson is an interesting look at how the web is changing the testing and QA profession.
Wednesday, January 16, 2002
I think I've finally learned my lesson with Sony - I've had 3 Sony laptops, 2 of their Vaio desktop systems, their first digital camera and a few other Sony products. Here's the lesson: in my experience, they charge a premium price for less than premium quality. Each of the desktop systems had issues within a year of purchase, and two of the laptops developed power problems in less than a year of ownership, as did the digital camera. Worse, my F580 succumbed to the infamous keyboard problem that manifested itself on my laptop with a non-functioning left shift key. My conclusions are:
- Sony products are overpriced.
- Sony does not know how to engineer power supplies.
- I'll avoid buying Sony in the future.
- META Report: Room at the Data Center?
- Planning and Building a Data Center: Meeting the e-Business Challenge
- Enhancing e-Business Hosting Opportunities in the Internet Service Economy
- SSL Benchmarking: Designing a Secured Website
- e-Business Asset Management and Capacity Planning
- Retool Your Data Center
- Sanyo's Digital Memory Recorder, which features the world's first direct-to-card 64 MB Secure MultiMediaCard recorder. It supports the MP3/WMA formats and offers 1 hour of high-quality stereo recording (2 hours mono) and 3 hours of playback. What impressed me was that it can record directly from any audio equipment, with or without a PC. You can even connect it to a cell phone. I don't think this is on the market yet. As an aside, this has positive and negative privacy implications.
- Panasonic's audio player records from CD's or website onto your PC then transfers to an SD Memory Card. This one is for someone who likes to run or workout at the gym. The compact (tiny?) form factor is an engineering marvel and the sound quality is fabulous.
- Samsung's NEXiO S150 is an amazing entry into the WinCE family of PDAs. Although I have always preferred the PalmOS simplistic approach to PDAs, I was enchanted by the NEXiO and spent almost an hour playing with the demo model. It has 64 MB SDRAM/32 MB Flash ROM, a 5.1" wide color VGA screen and, of course, mobile office applications plus high-speed wireless voice and data network access. Surfing the web with this baby is actually practical because the 5.1" reflective LCD screen, with its WVGA (800 x 480) resolution, recreates a desktop viewing experience. In addition, its built-in wireless cdma2000 module makes the Internet available 24/7 to mobile users at speeds that exceed today's traditional landline access capabilities.
- The Penguin emerges in the Linux-based, Java-enabled Sharp Zaurus SL-5500 PDA. This cool device should become the darling of the open source crowd. It has a built-in keyboard that is similar to the Blackberry's, so I just had to try it. I had no problems typing on it, but I have small, slim fingers, so I'm not sure whether it would work for everyone. With a 206 MHz processor and 64 MB of memory there is power to spare. See the Linux Devices article for more information. The Penguin rules!
Tuesday, January 15, 2002
I took a little side trip to CIO Magazine's IT Value section after reading Jill's Q&A, and found enough articles and papers on operations management, measuring IT value and related topics to keep me occupied and out of mischief for many hours.
2002 may prove to be the year CIOs finally wake up and make the lawyers wealthy. I am basing this on a January 14, 2002 eWeek article written by Dennis Fisher titled Software Liability Gaining Attention. My personal take: it's about time.
Governance is a topic that, like TCO, is little understood. In fact, it's too often ignored, which accounts for IT running amok with half-baked projects that are eventually scrapped or, if they succeed, add no real value to the business in the first place. Some companies attempt to implement a program management office to oversee projects, which is fine once projects are approved and initiated, but a PMO is no replacement for governance. One problem in the US is that we are so enamored with the Project Management Institute's PMBOK (Project Management Body of Knowledge) that we overlook its many shortcomings, not the least of which is its emphasis on process to the exclusion of the real objective of project management - delivering something. I much prefer the UK PM standard called PRINCE2, which stands for PRojects IN Controlled Environments, version 2. If you're not familiar with PRINCE2, these articles from the PRINCE User Group, or my Amazon reviews of Prince 2: A Practical Handbook (also reviewed by Linda Zarate) and Managing Successful Projects with PRINCE2, summarize it. In a nutshell, PRINCE2 makes provisions for both governance and a PMO.
Among the best governance resources I've compiled are:
- IT Governance Portal - this should be your first stop on the web for governance material. Some interesting documents I found here include: IT Scorecard Presentation, The Balanced Scorecard and IT Governance, and a plethora of others in their resources section. The collection of links points to a wealth of material.
- The development and implementation of an IM/IT governance framework
- The Chief Information Officer Branch, Treasury Board of Canada, which has a coherent framework that supports governance called Enhanced Management Framework. This site also has a large collection of key documents that further describe the components of the Enhanced Management Framework.
When governance doesn't work the next step is outsourcing. Yes, that's an extreme statement that is tainted with more than a little hyperbole. At this hour I'm entitled to indulge myself. Outsourcing is one business decision that should be made with care and copious amounts of due diligence. Templeton College, University of Oxford has a large and valuable collection of Research Papers that will support the decision making process with respect to outsourcing. Among the papers that address outsourcing I found the following particularly useful: Developing A Risk Framework For IT Outsourcing Strategy (supports due diligence), Relationships in IT Outsourcing: A Stakeholder Perspective (aligned to PRINCE2) and Exploring Information Technology Outsourcing Relationships: Theory and Practice.
One final gem to share: Systems Engineering Process is a collection of software engineering and production support processes, most in MS Word format, that cover the full spectrum of application and service delivery.
Monday, January 14, 2002
TechTarget is Whatis' parent site. This site is a treasure trove for competitive intelligence specialists, researchers and industry analysts. If you like quotations in general you should visit (and bookmark) About Quotations.
There are a few other links that I find myself going to when I'm researching particular topics: IT Director, CIO Insight and Baseline Magazine are excellent resources for IT management and operations professionals. Silicon.com and IT World are general news and content sources that cover the full spectrum of IT, development and related topics.
There are some reasons, in my opinion, for this: (1) too many IT professionals give lip service to business issues, but are more interested in technology (there are exceptions, but they're rare), and (2) their knowledge, such as it is, comes from Microsoft, PC manufacturers or "industry analysts". Considering the spate of security and availability problems with Microsoft's products that are reported daily in industry rags and e-news, I have to surmise that the only folks in Redmond who know what TCO is are the marketing folks. PC manufacturers, on the other hand, have a better story, but that story is slanted more towards asset management - not very total in my opinion. Regarding the "industry analysts", I've not seen anything that wasn't a rehash of the 1987 GartnerGroup work that launched the TCO voodoo in the first place.
I do have a few links that attempt to explain TCO and put it into its context. One of the best is TCO Models, which is focused on the GartnerGroup approach.
Gentronics has some interesting slants on TCO in a marketing blurb about building e-infrastructure. Their value proposition is based on Total Value of Ownership, which is based on information advantage. Try measuring or quantifying that. They tell a compelling story, it makes sense at an intuitive level, and I can see why one would go charging off to buy into it. However, if you ask a few basic questions, such as "How can I quantify the factors?", you'll see that this stuff is pure marketing hype.
I have to admit that Compaq's materials (from desktop and infrastructure points of view) are sensible and take a straightforward approach. If your TCO focus is either desktop or infrastructure (or both), the TCO PowerPoint Presentation and TCO whitepaper (in MS Word format) are valuable.
Just bear in mind that the costs of ownership are not total if you're not looking at the total picture. I do have better resources on my old Infrastructure, Life Cycle and Project Management page that will help you get to the big picture. Look in the Tools & Documents section.
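For a sense of what "total" should mean, the Gartner-style models essentially roll direct (budgeted) costs and indirect (unbudgeted) costs up into a per-seat figure. A minimal sketch, with entirely invented numbers and category names chosen for illustration:

```python
# Minimal sketch of a Gartner-style TCO roll-up: direct (budgeted) plus
# indirect (unbudgeted) costs per seat per year. All figures are invented
# placeholders, not from any published TCO model.

direct = {
    "hardware_software": 1200.0,  # amortized purchases and licenses
    "operations":         900.0,  # help desk, administration, support
    "administration":     400.0,  # procurement, asset tracking
}
indirect = {
    "end_user_self_support": 700.0,  # peer support, self-taught fixes
    "downtime":              300.0,  # lost productivity
}

def tco_per_seat(direct_costs, indirect_costs):
    """Sum every cost category into one per-seat annual figure."""
    return sum(direct_costs.values()) + sum(indirect_costs.values())

print(tco_per_seat(direct, indirect))  # 3500.0
```

The point of the exercise is the indirect bucket: a model that stops at the direct costs is exactly the "asset management, not very total" view I complained about above.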
Sunday, January 13, 2002
Exploring Data Modeling's web site led to two other treasures: Success Instruments, which is a collection of IT critical success factors that is based on another book that Sanders coauthored titled Information Systems Success Measurement. The other treasure is a web site named Information Systems Effectiveness. This site is a goldmine for service delivery and IT operations professionals and consultants.
I was taking another mental trip to times past to recount some of the highlights in my life. One such event was a communications symposium I attended in 1986 in Indian Head, MD. It was there that I was treated to one of Admiral Grace Hopper's presentations. She was very frail when I saw her, and needed a little help getting to the podium, but once there an inner fire fueled her passion. That day the iconoclast who spent her life doing the right things for the right reasons, bureaucracy be damned, inspired us to always think out of the box and to eschew the phrase, we've always done it this way. I also took home one of her famous nanoseconds, which was a piece of wire approximately a foot long that represented the distance electricity traveled in a nanosecond. Her speech was pretty much what she had given thousands of times before, but hearing it from the lady herself was an honor and one of my life's memorable events.
In one of my 4 January entries I cited Laura Brown's Integration Models: Templates for Business Transformation as one of the top 5 books I read in 2001. You can get a PDF presentation that was compiled from materials in the book to get a glimpse of Laura's approach. Additional material is available from her web site.