Saturday, January 19, 2002

 
Unlike Mike, I am focused entirely on service delivery and production processes. True, one cannot avoid the applications delivery side, and I agree that there is a chasm between the two groups. One document that attempts to draw the applications and service delivery groups together is Software Maintenance from a Service Perspective. I also like the way that the ITIL (IT Infrastructure Library, the UK standard for service delivery) links both sides. A paper by Interpromusa titled ITIL Roadmap to SLAs illustrates the links. Deriving a Software Quality View from Customer Satisfaction and Service Data further reinforces the fact that end-user satisfaction, which is the reason for IT's existence, should be the overriding reason to integrate applications and service delivery into a coherent organization. Mike and I have done some work at this end, which I'm sure he'll address in a future entry - and I'll leave it to him to do that.

A few other interesting resources I have bookmarked include:

I have also collected a few other documents that are more suited to applications delivery, but are valuable to both groups: IEEE Std 1219-1998, which is the IEEE Software Maintenance standard, Software change management processes in the development of Embedded Software (I thought I'd toss this one in for anyone who found Mike's medical device resource useful), and for my software engineering brothers and sisters in applications delivery, Automating Three Modes of Evolution for Object-Oriented Software Architectures is an olive branch to hasten the bridge-building process.

For my friend Muthukumar U I hope the following nugget proves useful: Automated Life-Cycle Approach to Software Testing in the Finance & Insurance Sector

 
I must be bipolar because I have two passions: software engineering processes (a.k.a. applications delivery) and service level management.

Applications delivery is, for the most part, project-oriented, while service level management is operations-oriented. The practitioners who work within each of these domains often seem to live on different planets, and in some organizations, in different galaxies. However, there are many common touchpoints, including quality, issue management and change control. Once an application has been released into production, another touchpoint is maintenance, which is performed either to add enhancements or to correct defects. Maintenance is an indication of quality. Too much maintenance may indicate poor development and QA practices on the applications delivery side. However, it can also indicate responsiveness to user requests for enhancements (a good thing) or little control over requirements (a bad thing).

Internet development projects, especially those that are designed to acquire and/or retain customers in support of a B2C strategy, are unique in that most applications and systems are in a near-constant state of maintenance from the standpoint of features, adjusting to customer dynamics and other factors that marketing deems important. These systems pose challenges to traditional applications delivery methods because of their 24x7 nature and the immense pressures of business imperatives and competitive advantage that internal business applications normally don't impose.

With respect to initial systems development, getting it right the first time is the common-sense solution. That, of course, is easier said than done, but Six Sigma for Internet Application Development provides some excellent guidelines. Another article that addresses some of the challenges faced by teams that need to quickly develop and deploy applications is Rapid Application Development: Project Management Issues to Consider. QA and release management, regardless of the type of system, are two closely related areas that interface applications delivery and operations. Since requirements ultimately flow from the business (end users - the reason we have jobs in the first place) to operations, and are then communicated to the applications delivery side, understanding how requirements relate to releases of enhancements and fixes is essential. A good starting point for gaining this understanding is An Examination of the Effects of Requirements Changes on Software Releases.

If you want to cast a pall over a social gathering of developers, all you have to do is bring up the topic of software maintenance. This is probably the least sexy topic, aside from QA, for applications delivery folks. On the other hand, much of what is inflicted on end users is due to poor software maintenance practices. Did I say poor practices? In many cases they border on criminal - developers wildly patching production systems with no plan, no approval and no business ever touching a production system in the first place. Good practices, on the other hand, incorporate software configuration management, unit and integration testing, dependency analysis and other tasks that most developers hate. Also stirred into this mixture are regression testing, release management and change control. A good introduction to software maintenance is Tutorial on Software Maintenance, which is in PDF format. Another document that covers the basics and adds structure is Software Maintenance Process: A Seven-step Approach, in PowerPoint format. Once you have the basics down you'll probably want to examine some advanced techniques that tie maintenance to SQA, such as Applying Quantitative Methods to Software Maintenance and a related article titled Measurements to Manage Software Maintenance. If you're in the medical device industry, then Architecture Level Prediction of Software Maintenance is essential reading because there is a world of difference between a business application and the software embedded in a dialysis machine.
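The measurement papers above advocate metrics you can start computing from nothing more than a change log. Here is a minimal sketch of one such measure; the request data is invented, and the corrective/adaptive/perfective taxonomy comes from the general maintenance literature rather than from any one of these papers:

```python
# Classify closed change requests and track the corrective (bug-fix)
# share of maintenance effort. A rising corrective share suggests
# quality problems upstream; a rising perfective share suggests
# responsiveness to enhancement requests. Data below is illustrative.

closed_requests = [
    # (type, effort_hours): corrective = bug fix, adaptive = environment
    # change, perfective = enhancement
    ("corrective", 6), ("perfective", 20), ("corrective", 3),
    ("adaptive", 8), ("perfective", 12), ("corrective", 9),
]

def effort_by_type(requests):
    """Sum effort hours per maintenance category."""
    totals = {}
    for kind, hours in requests:
        totals[kind] = totals.get(kind, 0) + hours
    return totals

totals = effort_by_type(closed_requests)
total_effort = sum(totals.values())
print(f"corrective share: {totals['corrective'] / total_effort:.0%}")
```

Tracked release over release, this one ratio already answers the "too much maintenance, or the right kind?" question raised earlier.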

As you dig into the software maintenance body of knowledge one name keeps popping up: George Stark. He has a lot of software maintenance material on his web site. One that I particularly like is Software Maintenance Management Strategies: Observations from the Field. He also wrote an interesting paper for MITRE titled Return on Investment from a Software Measurement Program. You can also download the paper in MS Word format.

MITRE has another document that you'll find useful: A Management Guide to Software Maintenance in COTS-Based Systems. COTS is Commercial Off The Shelf software, and, yes, there are maintenance strategies that can be applied to shrink-wrap software that comprises portions of business systems.

In a future entry I'll discuss in more detail my thoughts on configuration, change and release management; application acceptance processes and topics related to bridging the gap (chasm?) between applications delivery and operations.

Friday, January 18, 2002

 
It's about the data, right? While doing research on data warehouses and business intelligence I kept bumping into a familiar name: Jill Dyché. I decided to make things fun and see how many gems with her name attached bubbled up. Here's a short list of what I consider to be among her best work that is available on the web:

She was also cited in Business Intelligence Improves The Information Value Chain and Dynamic Analysis of Relationship Profitability: A White Paper.

Reading Jill's articles is like reading her books - you marvel at how one person can know so much and be so articulate and straightforward when it comes to explaining it. She is, indeed, world class in data quality, data warehousing and CRM. If you haven't read them already, you should rush out and get copies of her latest book, The CRM Handbook: A Business Guide to Customer Relationship Management, and her first book, e-Data: Turning Data into Information with Data Warehousing.

My research (remember, I had a purpose before I allowed myself to become sidetracked) led to some valuable finds, among them: a collection of presentations and papers on topics ranging from methodologies to data warehouses, a whitepaper in PDF format titled An Enterprise-Wide Data Delivery Architecture and an article that project managers, business stakeholders and project team members should read titled Why Data Warehouse Projects Fail.

 
Mike and I were discussing one of Doug Kaye's tips from his book Strategies for Web Hosting and Managed Services. The tip reads:
Challenge prospective vendors to exclude from acts of God any services that they claim are particularly robust during their sales cycles or in their marketing materials. Whatever their reaction, protect yourself against acts of God by purchasing insurance.
Here are a few thoughts:

Force Majeure clauses and insurance coverage should always be negotiated by an attorney. However, we technical professionals should be familiar with the legal aspects of contracting and outsourcing. To that end there are some resources that will get us up to speed if we're not already there. Do bear in mind that laws change rapidly, as do interpretations of the law. Knowing something about contracting law does not absolve one from the responsibility of seeking the guidance of an attorney. That said, here are a few resources:

Some additional resources worth reading include a comprehensive list of risks that should be examined for outsourcing and project management. I also like Ernst & Young's whitepaper titled E-Business Leading Practices to Consider and the Managed Service Provider Association's list of whitepapers.

Citations:

1 Effect of Acts of War and Terrorism on Contractual Obligations (includes detailed discussion of Force Majeure)
2 Paper on Force Majeure
3 New Developments Re: Force Majeure Clauses

Thursday, January 17, 2002

 
Mike's entries raise some interesting points about service levels, legal issues, software quality and other factors that, combined, are our world of delivering and managing systems. Some interesting articles and papers on service level management that I've filed away include The Truth About Service-Level Management, written by Rick Sturm. In this article Mr. Sturm debunks five myths of service level management. I first encountered his work in a book he coauthored titled Foundations of Service Level Management. This was the very first book I reviewed on Amazon, on 27 December 2000. The book has an associated website that covers service level management in great detail.

Some pages and papers on service level management and service level agreements that are worth visiting and reading are: A Structured Approach to Service Level Management and an Online Service Level Management Assessment. A plethora of service level agreement articles is provided at Business 2.0's Service Level Agreement links.
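One piece of arithmetic that underpins every service level agreement is translating an availability target into allowed downtime. A minimal sketch, with common "how many nines" targets as invented examples:

```python
# Convert an SLA availability percentage into the downtime budget it
# implies. Useful for sanity-checking vendor claims: "five nines"
# leaves barely five minutes of outage per year.

MINUTES_PER_YEAR = 365.25 * 24 * 60

def downtime_minutes(availability_pct, period_minutes=MINUTES_PER_YEAR):
    """Allowed downtime (minutes) over a period for a given availability %."""
    return period_minutes * (1 - availability_pct / 100)

for target in (99.0, 99.9, 99.99, 99.999):
    print(f"{target}% -> {downtime_minutes(target):.1f} min/year")
```

A useful exercise is to run the same arithmetic over a month instead of a year, since many SLAs measure (and credit) availability monthly.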

There are whitepapers, all in PDF format, that add to the service level management body of knowledge:

Not to be missed is an article from tele dot com titled Service Level Disagreement. Also, anyone who is involved with, or interested in, service level management should bookmark and frequently visit the IT Service Capability Maturity Model® page.

With respect to UCITA and software quality, Cem Kaner's articles on Court Cases and Papers on the Law of Software Quality are essential reading for anyone who provides services or software. If you're following UCITA check for updates and developments on UCITA Online.

Another side of service level management deals with web site visitor experience. A site called Quality of Experience has articles and commentary on the subject. This is directly related to usability, a topic with which I ended today's research. Flash is Evil is an interesting and highly opinionated page dedicated to the dark side of Macromedia Flash in particular and usability in general. QA City is a good site for web testers and usability experts. There is something here for everyone who is involved in software quality.

 
My previous entry addressed some aspects of estimating and metrics, which were slanted towards web site capacity and performance. From software engineering and project management perspectives estimating takes a different form. My preferred technique is function point analysis (FPA), which is succinctly described in Function Point Analysis Overview. Another document that clearly explains FPA is FPA fundamentals. Any gaps in those documents are clarified in the Function Point FAQ.
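To make the FPA mechanics concrete, here is a minimal sketch of an unadjusted function point count using the standard IFPUG component weights; the component counts and characteristic ratings are invented for illustration:

```python
# Unadjusted function point (UFP) count with the standard IFPUG
# complexity weights, then the value adjustment factor (VAF) from the
# 14 general system characteristics (GSCs), each rated 0-5.

WEIGHTS = {
    "EI":  {"low": 3, "avg": 4, "high": 6},    # external inputs
    "EO":  {"low": 4, "avg": 5, "high": 7},    # external outputs
    "EQ":  {"low": 3, "avg": 4, "high": 6},    # external inquiries
    "ILF": {"low": 7, "avg": 10, "high": 15},  # internal logical files
    "EIF": {"low": 5, "avg": 7, "high": 10},   # external interface files
}

def unadjusted_fp(counts):
    """counts: {component: {complexity: number_of_items}} -> UFP."""
    return sum(WEIGHTS[comp][cx] * n
               for comp, by_cx in counts.items()
               for cx, n in by_cx.items())

def adjusted_fp(ufp, gsc_ratings):
    """Scale UFP by the value adjustment factor, 0.65 + 0.01 * sum(GSCs)."""
    return ufp * (0.65 + 0.01 * sum(gsc_ratings))

counts = {
    "EI":  {"low": 10, "avg": 5},
    "EO":  {"avg": 7},
    "EQ":  {"low": 4},
    "ILF": {"avg": 3},
    "EIF": {"low": 2},
}
ufp = unadjusted_fp(counts)         # 30 + 20 + 35 + 12 + 30 + 10 = 137
print(ufp, adjusted_fp(ufp, [3] * 14))
```

The real skill in FPA is classifying the components and rating their complexity consistently, which is what the counting-practices guidance in the documents above covers; the arithmetic itself is this simple.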

Function point analysis has matured over the years and has evolved into an interrelated collection of methods called Functional Size Measurement. However, there are other methods and techniques, the most promising of which is Practical Software Measurement (PSM). PSM is supported by a large body of knowledge and software at the Practical Software Measurement site. Among the resources are an estimating application called PSM Insight, and documentation about the Practical Software Measurement methodology.

No list of software estimating resources would be complete without mentioning the Constructive Cost Model (COCOMO), which is now in version 2. The COCOMO II site, which is managed by Dr. Barry Boehm, the father of software estimating, is resource-rich, including offshoots of the COCOMO methodology and free tools (see the COCOMO Suite page).
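For flavor, here is the original Basic COCOMO form of the model as a sketch. COCOMO II adds cost drivers and scale factors that this deliberately omits, and the 32 KLOC project is an invented example:

```python
# Basic COCOMO (Boehm, 1981): effort = a * KLOC^b person-months,
# schedule = c * effort^d elapsed months, with constants chosen by
# development mode.

MODES = {
    # mode: (a, b, c, d)
    "organic":      (2.4, 1.05, 2.5, 0.38),  # small teams, familiar problems
    "semidetached": (3.0, 1.12, 2.5, 0.35),  # mixed experience
    "embedded":     (3.6, 1.20, 2.5, 0.32),  # tight hardware/operational constraints
}

def basic_cocomo(kloc, mode="organic"):
    """Return (effort in person-months, schedule in months)."""
    a, b, c, d = MODES[mode]
    effort = a * kloc ** b
    schedule = c * effort ** d
    return effort, schedule

effort, months = basic_cocomo(32, "organic")
print(f"{effort:.1f} person-months over {months:.1f} months")
```

Note the exponents: effort grows faster than linearly with size, which is the diseconomy of scale that makes padding a size estimate so expensive.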

I've made two software estimating resources available for download from my site. They are in a single Zip archive, which contains:

  1. Software Estimating Techniques (PDF format)
  2. Estimating Process Guidelines (MS Word document)
Both of these documents provide detailed guidance on estimating.

Other metrics and estimating resources I dug up include:


 
Some gems from this morning's research:

A topic that is near and dear to my heart is the ongoing developments with respect to the Uniform Computer Information Transactions Act (UCITA), which I consider to be harmful to the US software industry and to consumers.

In its original form this act could have hurt the US software industry by not holding it accountable for quality. If you haven't noticed, India's software industry has earned a well-deserved reputation for high quality and value, in no small way due to Indian companies embracing the Capability Maturity Model (CMM) and other quality frameworks as their development paradigm. Contrast this with the US approach, where the CMM is not well received outside of the Government and DoD contracting domains. It's clear that any law that further relaxes product liability and quality only shifts the competitive advantage off-shore. That consumers lose under the original provisions of UCITA is a given.

I liken the US software industry of today to the TV industry of days gone by. When I was growing up the top TV brands were all made in the US. Today they are all made by Japanese companies and assembled in Mexico.

What happened? After WW II Japan was rebuilding with a lot of help from the US. We sent teams of experts to assist. Among those experts were Juran, Deming and their colleagues who had sound ideas about quality. Those ideas were not embraced by US companies - it was too hard for one thing, and for another, there were no compelling reasons because consumers were buying everything that US companies could produce. The Japanese, however, had a horrible reputation for quality that extended into the 1950s. Anything marked Made in Japan was clearly a product to avoid because chances were that it would fall apart within days. However, by the 1960s Japan's quality-oriented approach, compliments of US techniques (that US companies did not value), began to pay off. The rest is history.

I see history repeating itself: US software companies cannot be bothered to implement quality methodologies and frameworks because they're too busy releasing new versions, which in reality are maintenance releases or, too often, patches to correct defects in the previous version. In the meantime, off-shore companies are espousing and embracing US developed quality approaches, such as the CMM, and are quietly establishing themselves as producers of quality software. To that end, the original provisions of UCITA would have hastened the migration of competitive advantage to off-shore companies.

In order to understand UCITA, if you are not already familiar with it, you should read the series of articles written by M.E. Kabay. His four articles will get you up-to-speed: Part 1, Part 2, Part 3 and Part 4 paint a dismal picture. However, on 15 January Mr. Kabay's newsletter article, titled The Latest on UCITA, brought a ray of hope. Since this article is not yet in the archives I'm going to extensively quote parts of it:

"On Dec. 17, the UCITA Standby Committee of the National Conference on Commissioners on Uniform State Laws issued a report to its executive committee recommending changes to the draft Uniform Computer Information Transactions Act.
The Standby Committee's report explicitly acknowledged that the "majority of the amendments were submitted by AFFECT, an organization comprised of diverse interest groups and some individual companies for the purpose of opposing UCITA." AFFECT is the Americans For Fair Electronic Commerce Transactions."

"In my opinion, the most significant changes to the draft of UCITA that will be sent to state legislatures in the future are:

  • UCITA does not supersede any consumer-protection laws in force and applicable to the purchase or licensing of software.
  • Software sold through mass-market distribution must not be inactivated by the vendor (the so-called "self-help" provisions of the previous version) in cases of breach of license or contract.
  • Software licenses for products distributed to the public in final form (i.e., not as test versions) cannot extinguish First Amendment rights of consumers to discuss, report, or criticize flaws in those products.
  • Explicit recognition that UCITA "does not displace the law of fraud, misrepresentation and unfair and deceptive practices as they may relate to intentional failure to disclose defects that are known to be material."
  • Explicit rejection of open-source software licenses (and also shareware licenses) from UCITA coverage. UCITA applies only to transactions involving the exchange of money.
  • Reverse engineering is accepted as a legitimate method for ensuring interoperability of licensed software with other products."
The sources of Mr. Kabay's optimism are: REPORT of UCITA Standby Committee December 17, 2001, and Press release from National Conference of Commissioners on Uniform State Laws. Also cited is a Computerworld article titled, UCITA backers agree to changes.
 
My recurring topic lately is Strategies for Web Hosting and Managed Services. I've made a complete pass through the book, but am re-reading parts before I write a review and post it on Amazon. I just completed my second pass through Chapters 9 (Risk Management) and 11 (Traffic Models) and am impressed with the way Doug Kaye organized and wrote them. I especially like his case study in Chapter 11 that illustrates how a project can get out of control if the business and IT factions aren't approaching web design projects as a team, and if IT doesn't perform a quantitative analysis of assumptions. Of course, the assumptions need to have some basis in reality.

This brings me to a few observations about the state of IT as a profession:

  1. IT has no business leading web projects. I came to this conclusion years ago, after watching one disaster unfold after another. The reasons are simple: there is a vast difference between the way IT views the world and the way business process areas view it.
  2. IT, as a rule, shies away from techniques that their business counterparts routinely use. Among the techniques are quantitative analysis, decision methodologies, and sound project management approaches.
  3. The common IT solution to almost any problem is technology. This, in my experience and opinion, only exacerbates the problem (not to mention squandering shareholder value).
Nick Flor's book, Web Business Engineering, validates my conclusions and provides a coherent and effective approach to developing the business approach to a web-based system, be it outsourced or managed by internal IT. The approach that Nick Flor proposes extends and complements Doug Kaye's approach. Some additional reading that augments Chapters 9 and 11 in Doug's book includes:

I'll have more to say about Doug's book in future entries, as well as in the web-hosting discussion forum that supports his book. In the meantime I have a day to start and a list of real-life items that need attending to.
 
A friend and associate, Manisha Saboo of eRunway, suggested that I open links in a new browser window so the main page of this weblog remains available. Starting now, a second window will be opened and any links you click will open in that window. I thought about opening a new window for each link, but that can become unwieldy. Prior entries will not be changed because I just don't have time to do it.

Wednesday, January 16, 2002

 
Linda's CES report got me thinking about the number of gadgets I have lying around - many of which don't see much use. Two of my monuments to impulse buying are my Jornada 545 and Jornada 680. I use one or the other when I'm on a project that is filled with meetings, action items and such, but neither is integrated into my daily life. In fact, last May I bought a ThinkPad X20 to replace the PDAs, and lately I've been using it as a laptop while my Sony F580 sits untouched.

I think I've finally learned my lesson with Sony - I've had 3 Sony laptops, 2 of their Vaio desktop systems, their first digital camera and a few other Sony products. Here's the lesson: they charge a premium price for less-than-premium quality, in my experience. Each of the desktop systems had issues within a year of purchase, and two of the laptops developed power problems in less than a year of ownership, as did the digital camera. Worse, my F580 succumbed to the infamous keyboard problem, which manifested itself on my laptop as a non-functioning left shift key. My conclusions are:

  1. Sony products are overpriced.
  2. Sony does not know how to engineer power supplies.
  3. I'll avoid buying Sony in the future.
That said, I was more than a little tempted by the newest Sony digital camcorders, one of which uses the Memory Stick technology that Linda wrote about. If you wish to ignore my caveats about Sony you might want to check out the Sony DCR-IP7BT camcorder. It's slick ... it's tempting ... it's not for me.
 
Mike has been sharing his thoughts about Doug Kaye's book, Strategies for Web Hosting and Managed Services, and until he's finished with it and lends it to me, I have been tantalized with bits and pieces. These inspired me to look at a few factors and do some research, which snared some excellent material on data centers, SLAs, facilities management and the like. If you're interested in hosting and managed services you'll be interested in the following:

Additional material that is of interest includes notes on facilities management that Mike and I compiled when we were developing processes and procedures for a CLEC, and a succinct description of failure modes and effects analysis (FMEA) that I discovered during the research. Also, NetworkWorld Fusion has research areas that contain case studies, whitepapers and articles that are of immediate interest, including:
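Since FMEA came up: its core calculation is simple enough to sketch. Each failure mode gets a risk priority number, RPN = severity x occurrence x detection, with each factor rated 1 (best) to 10 (worst). The failure modes and ratings below are invented for illustration:

```python
# Rank candidate failure modes for, say, a hosting facility by risk
# priority number so mitigation effort goes where it matters most.
# High detection scores mean the failure is HARD to detect before
# it affects service.

failure_modes = [
    # (description, severity, occurrence, detection)
    ("primary power feed loss",        9, 3, 2),
    ("HVAC failure in server room",    8, 4, 5),
    ("single disk failure in array",   3, 6, 2),
    ("DNS misconfiguration on change", 7, 5, 7),
]

def rpn(severity, occurrence, detection):
    """Risk priority number: the product of the three 1-10 ratings."""
    return severity * occurrence * detection

ranked = sorted(failure_modes, key=lambda fm: rpn(*fm[1:]), reverse=True)
for desc, s, o, d in ranked:
    print(f"RPN {rpn(s, o, d):3d}  {desc}")
```

Note how the mundane change-control failure outranks the dramatic power loss once detectability is factored in; that counterintuitive reordering is the whole point of the technique.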
 
I had an opportunity to attend the Consumer Electronics Show in Las Vegas last week and I took it. Having never attended a consumer-oriented show, I was amazed and overwhelmed. You cannot see everything because there's so much, and as the day wore on my energy waned - although as the day wore on my geek factor increased exponentially. The fun, consumer products that caught my attention were:

Next, I had to check out the Palms/pocket managers/Pocket PCs. I've always been drawn to these things, to the point of having a PDA fetish - my trusty Palm VII is thoroughly integrated into my life. There were a few PDAs that stood out:

Oh, and Sony's Memory Stick got its share of exposure, not to mention a growing number of companies that are now supporting it. This very small memory device is so versatile you can mix and match all sorts of content. For instance, you can view pictures on a compatible big-screen TV or AV system, you can show the pictures on an office projector, and it links to computers, printers, cell phones, digital cameras, PDAs, voice recorders and lots of other stuff. Go to the Memory Stick Consortium for more information about Sony's growing clout in the portable memory market. Personally, the Memory Stick vs. CompactFlash vs. Flash Card stuff conjures up memories of Betamax vs. VHS. We'll see who is left standing next year - and I'll definitely be at CES 2003 to find out.

Tuesday, January 15, 2002

 
I'm an unabashed fan of Jill Dyché (see some of my earlier entries here) and one reason why is that she is in-your-face when it comes to doing the right things the right way and for the right reasons. The following Q&A with Jill in CIO Magazine shows why I am such a fan: The CRM Backlash. You can also read an excerpt from Jill's latest book, The CRM Handbook: A Business Guide to Customer Relationship Management.

I took a little side trip to CIO Magazine's IT Value section after reading Jill's Q&A, and found enough articles and papers on operations management, measuring IT value and related topics to keep me occupied and out of mischief for many hours.

 
Not to beat the subject of TCO to death, but I did come across what I consider to be a well conceived plan by the University of Colorado in A Vision and Plan for Implementing Total Cost of Ownership Strategies. Somebody apparently does understand TCO.

2002 may prove to be the year CIOs finally wake up and make the lawyers wealthy. I am basing this on a January 14, 2002 eWeek article written by Dennis Fisher titled Software Liability Gaining Attention. My personal take: it's about time.

Governance is a topic that, like TCO, is little understood. In fact, it's too often ignored, which accounts for IT running amok with half-baked projects that are eventually scrapped or, if they succeed, add no real value to the business in the first place. Some companies attempt to implement a program management office to oversee projects, which is fine once projects are approved and initiated, but a PMO is no replacement for governance. One problem in the US is that we are so enamored with the Project Management Institute's PMBOK (Project Management Body of Knowledge) that we overlook its many shortcomings, not the least of which is its emphasis on process to the exclusion of the real objective of project management - delivering something. I much prefer the UK PM standard called PRINCE2, which stands for PRojects IN Controlled Environments, version 2. If you're not familiar with PRINCE2, these articles from the PRINCE User Group, or my Amazon reviews of Prince 2: A Practical Handbook (also reviewed by Linda Zarate) and Managing Successful Projects with PRINCE2, summarize it. In a nutshell, PRINCE2 makes provisions for both governance and a PMO.

Among the best governance resources I've compiled are:

For my friends at Kuwait National Petroleum Company: Putting Quality Information in the Hands of Oil and Gas Knowledge Workers is related to governance, and is also an industry-specific article that addresses business/IT alignment in the oil industry.

When governance doesn't work the next step is outsourcing. Yes, that's an extreme statement that is tainted with more than a little hyperbole. At this hour I'm entitled to indulge myself. Outsourcing is one business decision that should be made with care and copious amounts of due diligence. Templeton College, University of Oxford has a large and valuable collection of Research Papers that will support the decision making process with respect to outsourcing. Among the papers that address outsourcing I found the following particularly useful: Developing A Risk Framework For IT Outsourcing Strategy (supports due diligence), Relationships in IT Outsourcing: A Stakeholder Perspective (aligned to PRINCE2) and Exploring Information Technology Outsourcing Relationships: Theory and Practice.

One final gem to share: Systems Engineering Process is a collection of software engineering and production support processes, most in MS Word format, that cover the full spectrum of application and service delivery.

Monday, January 14, 2002

 
Phil Wolff's klog (Knowledge Weblog) is a breathtaking source of information and thoughts on topics ranging from public policy to project management. Phil's approach and philosophy seem to be closely aligned to what TEAM Zarate-Tarrani holds sacred. A few highlights of gems tucked away are: the strategic use of klogs, thoughts and articles about klogs, which led me to a piece titled Invest in a Manager's Klogging Toolkit that covered all conceivable issues, including business case, TCO (there's that topic again) and project management tools and templates. Whew! Interestingly, Phil has made his klog multilingual - you can read the klog topics in a number of languages, including Arabic. That's a nice touch, especially for some of my visitors who read this from Kuwait and the UAE.
 
There are a few noteworthy links worth sharing. Whatis (which was provided by my friend Muthukumar U in Dubai) is a handy encyclopedia of technical terms, plus news and pointers to other link collections. One of my favorite features is the Favorite Technology Quotations. The quotations are a delight and the site is quite useful - thank you Muthukumar!

TechTarget is Whatis' parent site. This site is a treasure trove for competitive intelligence specialists, researchers and industry analysts. If you like quotations in general you should visit (and bookmark) About Quotations.

There are a few other links that I find myself going to when I'm researching particular topics: IT Director, CIO Insight and Baseline Magazine are excellent resources for IT management and operations professionals. Silicon.com and IT World are general news and content sources that cover the full spectrum of IT, development and related topics.

 
Doug's book (see previous entry) inspired me to dig up a few resources on IT costs. While TCO (total cost of ownership) has long since entered into IT jargon, most people who I've heard toss the term about with knowing looks don't appear to understand it.

There are some reasons, in my opinion, for this: (1) too many IT professionals give lip service to business issues but are more interested in technology (there are exceptions, but they're rare), and (2) their knowledge, such as it is, comes from Microsoft, PC manufacturers or "industry analysts". Considering the spate of security and availability problems with Microsoft's products that are reported daily in industry rags and e-news, I have to surmise that the only folks in Redmond who know what TCO is are the marketing folks. PC manufacturers, on the other hand, have a better story, but that story is slanted more towards asset management - not very total, in my opinion. Regarding the "industry analysts", I've not seen anything that wasn't a rehash of the 1987 GartnerGroup work that launched the TCO voodoo in the first place.

I do have a few links that attempt to explain TCO and put it into its context. One of the best is TCO Models, which is focused on the GartnerGroup approach.

Gentronics has some interesting slants on TCO in a marketing blurb about building e-infrastructure. Their value proposition is based on Total Value of Ownership, which is based on information advantage. Try measuring or quantifying that. They tell a compelling story, it makes sense at an intuitive level, and I can see why one would go charging off to buy into it. However, if you ask a few basic questions, such as "how can I quantify the factors?", you'll see that this stuff is pure marketing hype.

I have to admit that Compaq's materials (from desktop and infrastructure points of view) are sensible and take a straightforward approach. If your TCO focus is either desktop or infrastructure (or both), the TCO PowerPoint presentation and TCO whitepaper (in MS Word format) are valuable.

Just bear in mind that the costs of ownership are not total if you're not looking at the total picture. I do have better resources on my old Infrastructure, Life Cycle and Project Management page that will help you get to the big picture. Look in the Tools & Documents section.
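To illustrate why a purchase-price view isn't "total", here is a minimal multi-year TCO sketch. The cost categories and dollar figures are invented, not drawn from the Gartner or Compaq materials:

```python
# A toy total cost of ownership model: each category has a one-time
# cost and an annual recurring cost. The point it makes is that the
# purchase price is only a fraction of what ownership costs.

def tco(costs, years):
    """costs: {category: (one_time, annual)} -> total over `years`."""
    return sum(one_time + annual * years for one_time, annual in costs.values())

desktop = {
    "hardware":          (1200, 0),
    "software":          (400, 150),   # licenses plus annual maintenance
    "support":           (0, 600),     # helpdesk, repairs
    "administration":    (0, 450),     # patching, asset tracking
    "end-user downtime": (0, 300),     # lost productivity
}

total = tco(desktop, years=3)
purchase = 1200 + 400
print(f"3-year TCO: ${total}; purchase price is {purchase / total:.0%} of it")
```

Even with these made-up numbers, the acquisition cost is roughly a quarter of the three-year total, which is why the knowing looks that equate TCO with "what we paid for the boxes" miss the picture.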

 
I spent a few hours reading Doug Kaye's excellent book, Strategies for Web Hosting and Managed Services, and am impressed by how thorough he is, his focus on subjects dear to my heart (operations management, service levels and contract management), and Doug's clear and engaging writing style. This book stands apart because there is a dearth of books that address any of the topics, much less all three. Bravo Doug!

Sunday, January 13, 2002

 
Muthukumar U shared a web site and PowerPoint presentations that support Data Modeling by G. Lawrence Sanders. The site has a wealth of material and the book, besides coming highly recommended by a trusted friend, received glowing reviews on Amazon. Muthukumar also provided a link to a promising new site that is still in early growth stages: Enterprise Acceleration. Registration is free and the articles are interesting. Note: Registration is not working correctly, but you can enter as a registered user regardless of whether or not you've registered. I suspect this backdoor will disappear as soon as their registration process is fixed.

Exploring Data Modeling's web site led to two other treasures: Success Instruments, which is a collection of IT critical success factors that is based on another book that Sanders coauthored titled Information Systems Success Measurement. The other treasure is a web site named Information Systems Effectiveness. This site is a goldmine for service delivery and IT operations professionals and consultants.

 
An XML resource worth bookmarking is zvon.org. This site has everything from the basics to advanced topics.

I was taking another mental trip to times past to recount some of the highlights in my life. One such event was a communications symposium I attended in 1986 in Indian Head, MD. It was there that I was treated to one of Admiral Grace Hopper's presentations. She was very frail when I saw her, and needed a little help getting to the podium, but once there an inner fire fueled her passion. That day the iconoclast who spent her life doing the right things for the right reasons, bureaucracy be damned, inspired us to always think out of the box and to eschew the phrase, we've always done it this way. I also took home one of her famous nanoseconds, which was a piece of wire approximately a foot long that represented the distance electricity traveled in a nanosecond. Her speech was pretty much what she had given thousands of times before, but hearing it from the lady herself was an honor and one of my life's memorable events.
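For anyone curious about the length of that wire, the arithmetic behind Admiral Hopper's visual aid is a one-liner:

```python
# The distance a signal travels in one nanosecond at the speed of
# light in a vacuum - the theoretical maximum length of her wire.

C = 299_792_458             # speed of light, metres per second
distance_m = C * 1e-9       # metres per nanosecond

print(f"{distance_m * 100:.1f} cm, about {distance_m / 0.0254:.1f} inches")
```

That works out to just under 30 cm, which is why the wire was "approximately a foot long" (in copper the signal propagates somewhat slower, so a real-world nanosecond is shorter still).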

 
My special friend Muthukumar U in Dubai shared this wonderful page titled, How to understand Java by looking at pretty colors. It's a wonderful read on a lazy Sunday and a delightful break from weighty matters of life in general.
 
If CRM is your game, then you should bookmark CRM Community. This site requires free registration in order to read articles and whitepapers, but the content is rich and the array of topics is wide.

In one of my 4 January entries I cited Laura Brown's Integration Models: Templates for Business Transformation as one of the top 5 books I read in 2001. You can get a PDF presentation that was compiled from materials in the book to get a glimpse of Laura's approach. Additional material is available from her web site.
