Monday, December 12, 2016

You Can’t Teach Experience.
Al Richardson, VP of Technology

I’m sure all readers know Murphy’s Law: if anything can go wrong, it will. And probably most of you have had one or more instances where it has happened to you personally or on a project. Books have been written on project management and quality management, and from time to time I read one or two of them to see what, if anything, has changed. While reading one of these books I came across a saying that caught my eye:

Horner’s Five-Thumb Postulate: Experience varies directly with equipment ruined.

Let me take a little poetic license: instead of equipment ruined, you might say schedules missed, programs over budget, cables faulted by sea creatures, and so on. If you have been in the business long enough, things have gone wrong. So why is this important? Currently R&R System Solutions is under contract to “teach” cable design, mechanical/electrical/optical performance, and related topics. Providing cable system training is one of our favorite business services. The list of what can be taught is almost endless. The easier elements are wire size, insulation thickness, fiber strain, weight in sea water, bending stiffness, and the like. Knowing, and teaching, what questions to ask customers, along with checklists and lessons learned, is also a critical part of the training. But since there is no medical cure for ignorance, the cliché may also apply: You Can’t Teach Experience. Many don’t believe that. You can teach the lessons of experience, but you cannot normally teach the actual experience of a given situation. You can be a teacher, a mentor, a coach, but one thing still eludes the education process: actual experience.
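
To give a flavor of the “easier elements,” here is a minimal sketch of one of them, weight in sea water, computed from component cross-sections and densities. The dimensions and densities below are illustrative assumptions, not any real R&R design.

```python
# Sketch: weight in sea water, per metre, of a simple two-layer cable.
# All dimensions and densities here are illustrative assumptions.
import math

RHO_SEAWATER = 1025.0  # kg/m^3, nominal
G = 9.81               # m/s^2

def annulus_area(od, id_=0.0):
    """Cross-sectional area of an annulus; diameters in metres."""
    return math.pi / 4.0 * (od**2 - id_**2)

def weight_in_seawater(layers):
    """layers: list of (outer_dia_m, inner_dia_m, density_kg_m3), inner first.
    Returns submerged weight per metre in newtons (negative means it floats)."""
    mass = sum(annulus_area(od, id_) * rho for od, id_, rho in layers)
    od_overall = max(od for od, _, _ in layers)
    displaced = annulus_area(od_overall) * RHO_SEAWATER
    return (mass - displaced) * G

# A 2 mm copper conductor under a polyethylene jacket to 6 mm overall:
cable = [(0.002, 0.0, 8960.0), (0.006, 0.002, 940.0)]
print(weight_in_seawater(cable))  # positive, so this cable sinks
```

The arithmetic is the easy part; knowing which number the customer will actually care about is the experience part.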

Let’s take Calculus as an example. A few years back I had the “opportunity” to teach Calculus to a group of high school seniors. I spent a semester imparting knowledge but not doing a lot of teaching. You can show students the methods of integration (parts, substitution, partial fractions, and so on), but you can’t “teach” them when to use each method. And that is the key: knowing when to use the methods taught or related by more experienced personnel.
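
To make the point concrete, consider two integrals that differ by a single exponent yet call for entirely different methods. Recognizing which is which is the experience part:

```latex
% Two integrals that look alike but call for different methods:
\int x\,e^{x^{2}}\,dx = \tfrac{1}{2}\,e^{x^{2}} + C
  \qquad \text{(substitution, } u = x^{2}\text{)}

\int x\,e^{x}\,dx = (x - 1)\,e^{x} + C
  \qquad \text{(integration by parts)}
```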

The same can be said for cable engineering. It is extremely hard to teach what R&R defines as Cable System Engineering: the life history of a cable. Everything from the first set of requirements, through design, manufacture, testing, and deployment, to finally ending up in service. Cable System Engineering is everything the cable comes in contact with: every interface, every environmental concern, everything. Years and years of working on ships, in cable plants, and behind a desk have taught me one very important thing: You Can’t Teach Experience.


In the end, the learning experience is reflected in money lost or gained for a company and in the stability of its workforce. Experience is what is needed to recognize where the best solution may lie.

Wednesday, September 14, 2016

Mom, Kelp Ate My Cable!
Al Richardson, VP Technology

Yes, it really did, as strange as it may seem. One could understand underwater zombies, maybe, but kelp? Who would ever consider kelp a threat to a cable? Let me explain. The other week, during one of R&R’s status meetings, we wandered off topic and started discussing the weird and not-so-weird causes of cable faults. As a company and as individuals, our experience in design, fabrication, testing, and installation is hard to beat. Over the years we have seen some pretty “interesting” reasons why cables have faulted, everything from Southwestern jackrabbits to Dungeness crabs, just to name a couple. One of the culprits you would least expect is kelp.

Here is how it happened. During field testing of a very small powered optical cable known as the 132 cable (the diameter is 0.132”), a study was performed on how the cable would behave on the sea bottom. I had the assignment to deploy the cable over various bottom conditions and observe its status and performance. An area was found where the bottom had sand, mud, rock, gravel, and, yes, kelp. After deployment and initial observations by divers, all was well. After a week went by, I set up an inspection schedule and departed for home. Then, two months into the one-year test, I received a call at my office. I was told, ‘The kelp ate your cable.’ Right, sure it did. It took a bit to convince me, but when we did the inspections and analysis, the facts showed that kelp was the main contributing factor in the fault. As we learned, at various stages of its life cycle, kelp floats. If you have a very small cable in a kelp bed, you run the risk of the kelp bringing the cable to the surface, where the prop of a boat engine, or some other surface hazard, can break it. And that’s what happened!
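
A back-of-the-envelope calculation shows why a cable this small is vulnerable. The effective cable density below is an assumed illustrative value, not the actual 132 cable spec:

```python
# Back-of-the-envelope: how firmly does the 0.132" cable sit on the bottom?
# The effective cable density is an assumed illustrative value.
import math

d = 0.132 * 0.0254            # cable diameter: inches to metres
area = math.pi / 4 * d**2     # cross-section, m^2
rho_cable = 2000.0            # assumed effective density, kg/m^3
rho_sea = 1025.0              # nominal seawater density, kg/m^3

submerged_weight = (rho_cable - rho_sea) * area * 9.81  # newtons per metre
print(f"submerged weight: {submerged_weight * 1000:.0f} mN per metre")
```

Under these assumptions, less than a tenth of a newton per metre is holding the cable down, well within reach of a few buoyant kelp fronds.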

Who knew? Now, I do. Who knew about underwater marching sand dunes? Who knew that a small tsunami could take out a shore landing? Who knew that a skate would unbury a cable to take a bite out of it? Who knew? Well, R&R System Solutions does, and has had the pleasure (as the Chinese saying goes) of “living in interesting times” involving undersea cable.

Tuesday, July 19, 2016

Final Program Quality
Al Richardson, VP & Head of Technology Requirements

In our previous blogs R&R has addressed program and quality items to point out where a program has a better chance of success. Like most, R&R System Solutions defines Program Quality as how well a program operates in delivering its product or service on time, within budget, and meeting all requirements. We measure it by:
  1.      The program’s ability to develop and manage all requirements
  2.      The program’s use of Lessons Learned and checklists
  3.      The effectiveness of peer inspections / reviews
  4.      The techniques and procedures for validation and verification

There is no doubt that excellent program quality needs excellent program management. Everything from program planning to risk management needs to be of high quality. And even with everything in place, things can still go wrong. If you are familiar with Murphy’s Law, then you should know about O’Toole’s Law, which states that ‘Murphy was an optimist.’

Let us share a quick story of the ½ inch bolt.

At one time there was a requirement for the rapid deployment of a very small electro-optical cable. The design teams started, and things progressed smoothly for quite some time. Designs were created, checked, re-checked, fabricated, and tested. All was good. All the elements were assembled, and off to the sea trial we went. D-day for deployment came, and within minutes after the system was deployed, it failed! After a bit of head scratching, everything was checked and re-checked again. Assembly instructions and deployment procedures were re-reviewed. As-built books were examined, quality records were investigated, and nothing came immediately to light. Finally, after several inspections, the cause was determined to be none other than the ½ inch bolt. Yup, one little old ½ inch bolt.

In the design, a ½ inch bolt was specifically called for in various places. The bolt needed to be of different lengths in different places and, you guessed it, a long bolt was used where a short bolt was required. The longer bolt extended into the cable pack and ripped the jacket, allowing water to penetrate the cable, and the cable shorted to sea. As a result, checklists were updated and a new item was added to Lessons Learned.
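
This is exactly the kind of error a simple automated check in the build checklist can catch. A sketch, with hypothetical location names and lengths:

```python
# Sketch: a build-step check that each installed bolt matches the length the
# design calls out for that location. Names and values are hypothetical.
SPEC = {                         # location -> required bolt length, inches
    "termination_plate": 0.75,
    "cable_pack_cover": 0.50,    # a longer bolt here reaches the cable pack
}

def check_fasteners(installed):
    """installed: dict of location -> bolt length used.
    Returns a list of (location, used, required) mismatches."""
    return [(loc, used, SPEC[loc])
            for loc, used in installed.items()
            if abs(used - SPEC[loc]) > 1e-9]

errors = check_fasteners({"termination_plate": 0.75, "cable_pack_cover": 0.75})
# The 0.75" bolt in the 0.50" location gets flagged on the bench, not at sea.
```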

As you have read in previous blogs, from the color of the jacket, to sharks, skates, and rays, to Lessons Learned, to misused acronyms, to the ½ inch bolt and beyond, even the best planning cannot always anticipate what can occur. Our experience has taught us much. While R&R has not seen everything in the world of undersea programs, we have experienced and seen a lot!


Sunday, June 26, 2016

The Corners of the Box
Al Richardson VP & Head of Technology

Requirements? Let me share a short story. Currently R&R is working with a customer that has an “interesting” set of requirements. They would like a certain amount of power, a certain minimum breaking strength, and they would like the cable to float. Let’s examine these items for a second. Power: copper works, and copper sinks. Strength: steel works, and steel sinks. Floats: some insulation floats, but I bet there is a requirement for maximum size. Yes, there is a maximum size requirement, and now we are in a box. In fact, we are deep in the corner of the box, where the edges of the requirements meet.

We have all heard that the customer is always right, so we labor away attempting to produce the perfect design. Take it from me, the customer is not always right, and in many instances the user had, or has, very little input into the requirements. Sometimes what you see in a Request for Quote or Request for Proposal doesn’t make sense. Sometimes the partially conflicting requirements are as simple as min/max size versus min/max strength. Sometimes they are as subtle as strength versus size versus hydrogen generation (that was a fun design). Either way, you can end up in the corner of the box with seemingly no way out.
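
One way to make the corner of the box visible is to code the requirements as a feasibility check and let it report which ones a candidate design violates. A sketch in Python, with illustrative limits and densities standing in for the customer’s actual numbers:

```python
# Sketch: checking a candidate cross-section against partially conflicting
# requirements. All limits and densities are illustrative assumptions:
# copper core, steel strength layer, foam flotation jacket.
import math

RHO_SEA = 1025.0  # kg/m^3, nominal seawater density

def area(od, id_=0.0):
    """Annular cross-section area; diameters in metres."""
    return math.pi / 4 * (od**2 - id_**2)

def failed_requirements(d_cu, d_steel, d_over, max_dia=0.020):
    """Return the requirements a candidate design violates."""
    a_cu = area(d_cu)                # conductor
    a_steel = area(d_steel, d_cu)    # strength member
    a_foam = area(d_over, d_steel)   # flotation jacket
    mass = a_cu * 8960 + a_steel * 7850 + a_foam * 500  # kg per metre
    fails = []
    if a_cu < 7e-6:
        fails.append("power")        # assumed minimum copper area
    if a_steel < 2e-5:
        fails.append("strength")     # assumed minimum steel area
    if d_over > max_dia:
        fails.append("size")
    if mass > area(d_over) * RHO_SEA:
        fails.append("floats")
    return fails

# The corner of the box: the same core either sinks (within the size cap)
# or needs more foam than the size cap allows.
print(failed_requirements(0.003, 0.006, 0.020))  # -> ['floats']
print(failed_requirements(0.003, 0.006, 0.025))  # -> ['size']
```

With these assumed numbers, no overall diameter satisfies all four requirements at once; that intersection is the corner of the box.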

There is a way out, and you can take it without appearing arrogant. Arrogant in this case means believing you know more about the customer’s needs than the customer does, and telling them so. Most of the time you don’t know which user requirements are really important and which are just there to fill in a blank. How do you tell the difference? The answer is simple: you ask. You ask in a way that shows concern for the customer: for the customer’s needs, time, and money. Sometimes it works, sometimes not. Either way, you gain a better understanding of the customer, and the customer may gain greater respect for your organization.


R&R is in that situation right now and we are working with the customer.  Will it work out?  Maybe, maybe not.  Since we are engineers, nothing is impossible.  It just takes a little longer and maybe costs more money to get out of that corner.

Sunday, May 22, 2016

Do It Right the First Time

Sometime during the testing phase of your project, if not later, a defect, an error, is found. The project is put on hold until the problems are corrected. Now you are behind schedule and possibly over budget. You hear the staff saying, “We never have time to do it right but always seem to have time to fix it.” I’ve said it as an engineer and heard it numerous times as a manager. It got me wondering what there is in the program quality toolbox that could help this all-too-common situation. Engineers will want to take forever, and program managers will want it yesterday, or tomorrow if you are lucky.

The answer: peer inspections. These are sometimes called reviews, audits, or walk-throughs. It doesn’t matter what they are called. It matters that you have them and that they are set up and done correctly. I will use the term inspection, as it sounds more formal, and that is just what it needs to be: formal. Having inspections in the development phase allows for rapid design with the knowledge that errors and defects will be caught before anything is built.
There are three main parts to an inspection:
1)      Set up. Possibly the most critical step. Picking the participants will lead to success or failure. Giving the participants preparation time with the inspection material is also critical.
2)      Conduct the inspection. Seems simple enough, and if you have a good inspection leader, it is. Once again, I like what CMMI® says about inspections: “The focus of the peer review should be on the work product in review, not on the person who produced it.”
3)      Review the results. This should be easy, but it isn’t. The easy part: errors and defects have been found and can be corrected before time and money are spent on fabrication. The hard part: inappropriate use of the inspection data. Want to know how to stifle inspections and cause the process to be abandoned? Have some manager, functional or program, start to use the results in performance evaluations. And that is just one inappropriate use I have seen.

Inspections are not the be-all and end-all of Program Quality, but without them, your chances of doing it right the first time are slim and none.

Tuesday, March 22, 2016

FAT means WHAT???? The use and misuse of acronyms
Alfred Richardson, VP R&R System Solutions
We have all been there: sitting in a meeting, reading a document, or writing a report ourselves, and we use or hear an acronym. Acronyms can be misleading. Why? Because your industry’s meaning may differ from the speaker’s, the reader’s, or, in the worst case, the client’s. (FAT = File Allocation Table) I have always considered the use of acronyms lazy, time wasting, and dangerous. As consultants we cannot assume the meaning of any acronym we read or hear. We always ask for the definition to ensure clarity. (FAT = Fluorescent Antibody Test)
Why do we consider it lazy? Let’s look at the acronym TS for Test Set. Really? That saves five letters. Have you been in a meeting where the people seem to talk in acronyms? You have no idea what they are saying. Don’t worry, they don’t know either. (FAT = Fully Automatic Timing)
Why do we consider it time wasting? Take, for example, a new person on your team or project. How much time is wasted trying to figure out what some or all of the acronyms mean? It can take months for someone to come up to speed. Why add to that? (FAT = Fatty Acid Transferase)
Why do we consider it dangerous? When reading a specification, bidding a proposal, or developing a program plan, it is easy to misinterpret the meaning of an acronym. I am a hardware person, so I will use FAT as an example. If you look at “Acronym Finder,” you will find 55 active meanings of FAT and 154 more in the “Acronym Attic.” (FAT = Flight Aptitude Test) Most of these do not apply in the development world, but some do. Does FAT mean:
1.       Factory Acceptance Test(ing) or
2.       First Article Test(ing) or
3.       Final Acceptance Test(ing) or
4.       Facility Acceptance Test or
5.       Field Acceptance Test or
6.       Functional Acceptance Test(ing) or
7.       ????

You see what I mean with just these six examples. I once had to halt a kickoff meeting between a customer and their supplier because I knew they had different meanings for an acronym. You don’t need them, so don’t use them. Use words, not letters. One misreading can cost you and your company big time. As a final thought, the fourth meaning of FAT in Acronym Finder is Faithful and True. Think about it!
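
If you must use acronyms, one lightweight defense is a per-project glossary that allows exactly one meaning per acronym. A sketch of the enforcement idea, not any particular tool:

```python
# Sketch: a per-project acronym glossary that refuses ambiguity.
# Entries are illustrative; the point is one written-down meaning per acronym.
GLOSSARY = {}

def define(acronym, meaning):
    """Register an acronym; raise if the project already uses it differently."""
    existing = GLOSSARY.get(acronym)
    if existing is not None and existing != meaning:
        raise ValueError(f"{acronym} already means '{existing}' on this project")
    GLOSSARY[acronym] = meaning

define("FAT", "Factory Acceptance Test")
# define("FAT", "First Article Test")  # would raise, forcing the team to resolve it
```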

Thursday, February 18, 2016

The Myth of Lessons Learned

We’ve all been there. We are starting a project, working on a project, or about to finish a project, and we know, we just know, someone has done something like this before. So what do we do? Well, all good companies have a “Lessons Learned Knowledge Base”: a store of historical information and lessons learned about both the outcomes of previous project decisions and previous project performance. So says the Project Management Institute. But wait, you can’t seem to find yours. People say we did something like that a few years back. So why is it so hard to find out what happened?

It’s so hard because most companies do not have an easy way to find lessons learned if they keep them at all.  Do you know where to find them in your company? If so, you are one of the lucky ones.  If not, join the crowd.  Of course we are all responsible.  This is not just a Program Management or Quality responsibility, although they deserve much of the credit or blame. When you finished your last project there were actions and decisions that worked well and others, not so well.  Where did you record these actions so others could benefit from your experience? And can others find these results?
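
What would recording those actions look like? At minimum, a lessons-learned record that others can search by keyword. A sketch with illustrative field names and one entry borrowed from an earlier post:

```python
# Sketch: the minimum that makes a lessons-learned base findable: a record
# with searchable tags. Field names and the entry are illustrative.
from dataclasses import dataclass, field

@dataclass
class Lesson:
    project: str
    what_happened: str
    what_to_do: str
    tags: set = field(default_factory=set)

LESSONS = [
    Lesson("132 cable sea trial",
           "Kelp floated the cable to the surface, where it was cut",
           "Survey for kelp beds before routing very small cable",
           {"kelp", "routing", "small-cable"}),
]

def find(keyword):
    """Match a keyword against tags and incident descriptions."""
    kw = keyword.lower()
    return [les for les in LESSONS
            if kw in les.tags or kw in les.what_happened.lower()]

print(len(find("kelp")))  # the kelp lesson is findable
```

The tooling hardly matters; what matters is that the record exists and that a keyword search actually turns it up.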


In most companies / organizations, small, medium and especially large, Lessons Learned are a Myth.  Do something about it. In both the short and long run, using tools and techniques to make decisions that have been proven to work before makes money; making the same mistakes over again doesn’t.