Published in January 2007

Zero Defects In AV
By Mario J. Maltese, CTS-D, CTS-I

Exploring the value of defining quality.
Case One: These instruments, referenced in the author’s book, perform most of the tests included in the Staging and Commissioning Tests.

Editor’s Note: Longtime Sound & Communications contributor Mario Maltese has written a book, AV 9000: Defining Quality in Engineered Audio Visual Systems, which outlines a “zero defects” process for AV integrators, with the goal of simplifying the work of designers, installers and technology managers. This material is adapted/excerpted from that publication; for more information, go to www.avres.com/id28.html.

    Like many of Sound & Communications’ readers, I can recall the go-go days of AV when it was just starting to penetrate the corporate market. The corporate culture was just acquiring the technology. I also can recall gross profit margins of 40%. Customers barely knew what they were buying, other than that it was all the latest whiz-bang. Those who had the (much simpler) operational arts down pat could deliver many jobs per month with the efficiency of an assembly line. Both AV consultants and integrators prospered.
    Over the years, the corporate culture not only grasped the technology, it drove it. As technological improvements accelerated, it was the AV culture that lagged. There was increased pressure to learn more, and InfoComm International stepped up with new courses and certifications to help its members keep up.

Finish the Contract
    But did it? It seemed as though the integrators never could finish the contract, let alone adhere to a schedule. The customer grew more and more impatient, and the trust in industry consultants and integrators eroded. Consultants complained that the integrators never read their specifications. Integrators complained that the consultants really didn’t know what they were specifying. Everyone was busy, but busy putting out fires, and not making any money. Profits were down everywhere you looked.
    As progress marched on, something interesting happened. Customers began hiring industry professionals from both the consultant and the integrator labor forces. These “Technology Managers” set up their own “shops,” which became populated with certified pros, many holding advanced certifications (CTS-D, CTS-I) as well. In fact, they dominated the classes at InfoComm International’s Academy. Why? Certainly not for economic reasons! When asked, my customers respond that, except for the larger projects, they feel they can do the job better themselves. The industry let the customer down. It failed to give the customer what he or she expected.

Recognizing the Trend?
    Of course, our industry recognizes this trend and is making haste toward corrective action, right? Wrong! Human nature being what it is, sinking profits bring on the inevitable paranoia. Consultants fear that greedy contractors are pushing the design-build approach to maximize their profits. Contractors fear that consultants can’t wait to open the bidding circles to newer, smaller contractors. Both appear to be hitting the nail right on the thumb.
    It’s really and truly all about quality. And most can’t even define what it means.
    OK, then, what do you mean by “Quality?”
    Quality is, perhaps, the most misunderstood concept in existence. The word itself has been overused, misused and downright abused in marketing literature and commercial advertising.
    Quality in its simplest sense means that customers are getting what they expect to get when they order something. It’s about consistency. Quality does not mean luxury unless, of course, the customer expects luxury at the time of purchase.
    In another sense, quality is an intellectual discipline, a way of thinking about the business. How does the business go about assuring that the customer gets what’s expected, each and every time an order is placed? In the AV industry, I submit that there is a major disconnect about quality, and there is a need for a dramatic paradigm shift to take place.

Case Two: Additional instruments for audio and video testing. Together with Case One and a laptop, all the tests listed in the book can be performed with sufficient accuracy.

'Substantial Completion'...
    Many think in terms of a project reaching the heights of “substantial completion.” Think what this would mean to the buyer of a new car, with the salesperson trying to convince him to take possession of a vehicle missing headlights or the ability to go in reverse! In these terms, the word “quality” takes on a whole new meaning. The phrase “zero defects” begins to describe what the industry should really be striving for. Indeed, the ubiquity of defects just might be the cause of customers’ migration away from the status quo.
    Let’s imagine that this paradigm shift takes place, and the AV industry achieves awareness of the goal of customer satisfaction and “zero defects.” What happens next?
    Leadership naturally will go about placing available resources in position to ensure that the goal is achieved every time. It won’t take long before people realize that technical training will not, in itself, bring about the goal. More is needed.
    It won’t take long for the devoted to figure out that one of the first things required is a way of verifying the end result: a checklist, one that addresses the Product (what was ordered), the Performance (How loud should the sound system be? How stable is the voice reinforcement? What resolution should the displays be able to handle satisfactorily?) and the Practice (Is the system serviceable? Is it wired in compliance with applicable codes and best industry practices?).
    Further, this checklist should be practical: easy to verify conformance with a minimum of instrumentation and training, and focused on results. The system either meets the criteria or it does not, and corrective action must take place before presenting the system to the customer.

Boilerplates Fall Short
    I submit also that the proliferation of “boilerplate” performance verbiage found in most specifications falls far short of being of any practical use. Much of it is misleading. I undertook a most unscientific survey of design consultants, all of whom were in full agreement with my assertion, and none could recall a single integrator who ever completed the written test report included in their specifications.
    Many of these specifications are actually a rehash of manufacturers’ specifications, many of which do not apply to a system. For example, a signal-to-noise test for an amplifier uses the maximum rated output for its signal level; a usable system would, one hopes, operate 20dB or more below that level.
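    To make that arithmetic concrete, here is a minimal sketch in Python. The 100dB spec and 20dB headroom figures are illustrative only, not drawn from any particular product:

def effective_snr_db(rated_snr_db, headroom_db):
    # S/N referenced to the actual operating level, assuming the
    # amplifier's noise floor stays fixed as the signal level drops.
    # rated_snr_db: manufacturer's S/N, referenced to maximum rated output
    # headroom_db: how far below maximum rated output the system operates
    return rated_snr_db - headroom_db

# An amplifier spec'd at 100dB S/N, operated 20dB below rated output,
# delivers only 80dB S/N relative to the level the system actually uses.
print(effective_snr_db(100.0, 20.0))  # 80.0
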
    Once this definition of the end game is established, leadership can then set out to establish the means by which it always takes place. They start to define processes and procedures to assure the successful outcome, and soon realize that it starts in the proposal phase, even before the job is initiated. They start to provide a structure wherein information and materials flow in a manner that achieves not only the goal, but also the profits that inevitably follow when the job is done right the first time.
    Currently, these concepts are not being addressed by industry associations, despite their obvious critical importance.
    Wouldn’t it be great if there were a generic definition of a completed AV project, one that just requires the “metrics” or particulars for the project at hand, on which everyone can agree? What if designers and contractors alike understood this definition, and were trained and equipped to prove that the resultant system meets this definition?

Define Requirements
    Designers then could concentrate on establishing and defining customers’ particular requirements in clear, unambiguous terms. Design reviews could then take place to confirm that the system will meet these needs even before it is bid or built. Integrators, and third-party Testing and Verification Service providers for that matter, then can certify that the system meets the established criteria. Customer satisfaction is much more likely to result.
    Consultants and integrators alike actually might be paid on time! What a concept!
    AV 9000: Defining Quality in Engineered Audio Visual Systems is both a proposed standard definition of a completed AV system and a series of checklists that can be used to provide structure in an AV company. Our company, Audio Visual Resources, Inc., has been using these checklists for years, incorporating them into our ISO-9001:2000-certified Quality Management System. The book presents the end-game-defining Commissioning Checklist that can be found at www.avres.com, but goes well beyond that. It also includes a primer on how to perform the checklist, and what instrumentation is needed.
    There is movement by InfoComm International to establish an ANSI standard, the first for the AV industry. That will not happen until comments are received from as many interested industry participants as possible.
    Wouldn’t it be great?

 

Chapter One: “AV 9000”—The Commissioning Checklist

“Cheshire Puss, would you tell me which way to go from here?” asked Alice.
“That depends a great deal on where you want to get to,” said the Cat.
“I don’t much care where,” said Alice.
“Then it doesn’t matter which way you go,” said the Cat.
from Lewis Carroll’s Alice in Wonderland


    I will discuss “process” later on. As important as focusing on the process really is in this industry, it is the outcome that must be defined first. That is, after all, what the customer really ordered. How the desired outcome came to be is of tangential interest to the customer, if there is any interest at all.
    We will focus only on the successful completion of the installation. Project completion likely will have additional deliverables, such as training, system documentation, preventive maintenance visits and warranty periods, as well as requisite documents that may require signatures. These deliverables may be part of a proposal or specification, or part of a service provider’s quality plan. The completion of the installation is an important milestone to the customer because, more than likely, he can begin using the system. The other deliverables just mentioned can be addressed elsewhere.
    How would we know if a custom-engineered system actually is fully installed, working and with zero defects? With a custom-engineered checklist, of course. Each system is unique in one way or another. Some even go so far as to say that there is no way to specify performance. That does not mean, however, that there are no similarities between systems.
    The fundamentals don’t change. They are common to all systems. They all have to be loud enough, intelligible and stable for all listeners hearing the sound. The images all have to be bright enough, big enough and legible to all viewing them. None should exhibit noticeable distortion, noise, hum or distracting artifacts. They should all be complete in every way that is tangible: product, audiovisual functions, promised accessories. Finally, they all should be serviceable, so they can last well beyond the point at which the user receives a return on his investment.
    Some of the checklist items can, and should, be objective. The quantitative value chosen should be based on a value that the average user would prefer, because we are dealing with perceived loudness, brightness, etc., and perceptions do vary between observers. Some of the test values have to be qualitative because the nature of the test actually may be too complex to quantify. As such, we would wish that the person evaluating the test have the experience to know what would be satisfactory, and unsatisfactory for that matter, to the majority of users.
    The best way to make the outcome as precise as possible is to involve the users in setting the test values from the beginning of the project. At all costs, “rules of thumb” should be avoided. Customers vary widely. They have different images to display, operate at different sound pressure levels and work in different cultures.
    The key is to educate the user about the test outcomes desired, and use demonstrations, if necessary. Once the outcome is well defined, and the numbers are chosen, it forms the basis of not only the Commissioning Checklist, but also all the intermediate quality confirmations applied during the development of the project.
    Also avoided are tests that generally are performed by a manufacturer’s quality control. The focus is on how the manufactured equipment is chosen for the application and comes together in a system. When the parts do come together, the resultant performance may change drastically. Stated another way: It’s very easy to choose perfectly good equipment and put it together in a way that performs miserably. The Commissioning Tests should be chosen carefully to confirm that this is not the case. Therefore, the tests should focus on the things that the designer and integrator could choose inappropriately, install and configure incorrectly, or mis-wire altogether.
    If it is suspected that a manufactured item is not performing up to its own stated specifications, the testing team would require the ability to “bench” the item, or remove it from the system altogether, to verify its performance. This would also remove any influence on the system from the product itself. As customers demand new products, and manufacturers are under pressure to release their latest and greatest, “infant mortality” issues do occur, and the integrator must be trained, equipped and motivated to deal with them quickly and efficiently.
    Each test in the checklist has a test number and a test description. A “Result” entry of “1” is applied if the system being tested passes, or a “0” if it does not. The “Max” column receives a “1” if the test applies, and a “0” if it does not. Not all tests apply.
    If there is no computer video in the audiovisual system, for example, tests involving computer resolutions rate a “0” Result with a “0” Max. The objective is that the customer gets a system that scores “X/X,” i.e., 46/46 in the case where all the tests apply.
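    A minimal Python sketch of that scoring arithmetic follows; the test names and values here are hypothetical:

# Each test contributes a (result, max) pair: result is 1 for a pass,
# 0 for a fail; max is 1 if the test applies to this system, 0 if not.
tests = {
    "stray_ac_voltage": (1, 1),        # applies and passed
    "computer_resolution": (0, 0),     # no computer video: does not apply
    "uniformity_of_coverage": (0, 1),  # applies but failed: needs correction
}

# Result and Max sum independently; the goal is a score of X/X.
score = sum(result for result, _ in tests.values())
possible = sum(mx for _, mx in tests.values())
print(f"{score}/{possible}")  # prints 1/2 here; zero defects prints X/X
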
    The “Apparent Process Owner” column suggests a possible process owner for corrective action during the punch-list stage: the Architect, Communications Services (IP, POTS, ISDN, cable, etc.), a Construction Trade (i.e., General Contractor (GC) or Electrical Contractor (EC)), the Designer, Integrator, Manufacturer, Programmer or Other. Finally, the Comments column provides the detail for recording measured results, and should contain sufficient information for corrective action.
    Absolute values shown are suggestions only, defaulting to the typical conference room with a 40dB SPL, A-weighted (slow) ambient noise level. The system designer would have to insert the customer’s absolute values as they pertain to that particular system, for that particular customer.
    Finally, references are made to the accuracy of the “paper model.” Although this should not require an explanation, I am always surprised at council meetings when I hear of an integrator objecting to this deliverable. I shudder to think of the profits that are lost when this step is sidetracked. An unambiguous depiction of the system is a must: so the design can be reviewed adequately for performance and compliance, so the fabrication, installation and programming team members are on the same page, so changes and improvements can be managed, and so the system can be serviced in a timely manner. Without complete and accurate documentation, an engineered system is not being provided.
    (Note: for the latest version, go to www.avres.com and follow the “Downloads” link.)

TEST #    TEST
1    Record all equipment not present, and why.
2    Have no stray AC voltages on any equipment accessible to a user relative to ground.
3    Have no sharp or jagged surfaces accessible to a user.
4    Thermal gradient inspected; all equipment operating within manufacturers’ guidelines.
5    Visual inspection: labeling, cable dress, signal separation, cable stress, serviceability, no over-tight tie wraps (none at all on Cat5). Cable labeling is positioned and oriented in a consistent manner, is legible and unambiguous.
6    Be complete. Demonstrate the full inventory to be all new equipment, in full compliance with the specification, or as modified by approved submission. Record test results as pass/fail, and list exceptions.
7    Confirm that rack elevation and flow drawings, cable and other labels and engravings are an accurate paper model of the furnished system, and are in compliance with latest revised specifications. Record test results as pass/fail.
8    RJ terminations are solid in their connectors.
9    Coax cables respect a bend radius of at least five cable diameters, or as recommended by the manufacturer.
10    Record ambient noise, A-weighted, slow.
11    No power amplifier shall have its rated load exceeded. Record the impedance (and at what frequency) of each loudspeaker line on each power amplifier. 63, 250 and 1000Hz are recommended, if available (“Loudspeaker Impedance Test”).
12    Produce a nominal operating level of __(65) dB SPL (Sound Pressure Level) for conference speech and __(60) dB SPL for program, “A” weighted, at all listeners’ ears +/-__(2) dB (“Uniformity of Coverage”), or at least __(15) dB above the ambient noise, A-weighted, whichever is greater, with the control system volume control indicating “normal” volume. Level test (acoustic) at sample seat locations: record results for program loudspeakers as measured at the middle of the table and, for ceiling loudspeakers, at three different locations. Record the samples taken and their values. (A worked sketch of this target-level arithmetic follows the checklist.)
13    Be capable of producing an additional __(15) dB above this level (__(80) dB SPL), with less than 0.5% THD (Total Harmonic Distortion) plus noise. Measure THD plus noise when source is at __(15) dB above nominal operating level at each “destination,” for all sources selected.
14    Develop a noise level that is electrically __(55) dB below the normal operating level. “Noise” refers to hum, electrostatic noise, RF interference, etc. Measure and record Signal to Noise (“signal” measured electrically at nominal operating level) at each destination, for all sources selected.
15    Program loudspeakers shall be connected in the same polarity, and speech reinforcement systems shall be polarized such that a positive acoustic pressure on a microphone results in a positive acoustic pressure at the loudspeaker (“Polarity Test”).
16    Produce no more than a __(1) dB variance in program source levels, when each program source is playing a calibrated media (CD, video tape, setup test tone, etc.).
17    There shall be no audible vibration caused by improper mechanical installation. Use a continuous sweep signal (from a generator or test CD); record pass/fail, or which device vibrates at what frequencies (“Buzzes and Rattles Test”).
18    The speech reinforcement system shall be stable (no feedback).
19    For audio conference systems, adjust microphone input gain so as to demonstrate that a “standard talker,” positioned at each talker position in the room, produces a 0dBu level at the output bus of the audio conference DSP device. Record test results as pass/fail. Record level across the analog line. Inspect DSP mixer telephone line levels, both transmit and receive, when normal speech is encountered in the room.
20    For conferencing mode, at the __(65) dB SPL listening level, be able to demonstrate full duplex operation, with no reports of echo (as detected from the far end).
21    Equalizers shall be adjusted for best intelligibility, and in accordance with the preferred acoustic level response curves. For installations with equalizers, record the “house curve” before equalization, as well as after the equalizers have been tuned, with and without microphone input filters. If requested by the consultant, produce this documentation for systems without equalizers, because this test may apply to the preamp filter settings in cases where intelligibility can be improved.
22    Be intelligible, with a RASTI (Rapid Speech Transmission Index) greater than 0.85. (If requested only) Measure RASTI using TEF or SIA Smaart tools. For systems where early reflections may cause intelligibility problems, or when multiple drivers are used, an ETC (Energy-Time Curve) may be requested.
23    For NTSC sources, placing a test generator at each source shall produce 1 volt peak-to-peak to each destination +/-10% (or 1dB). If requested only, record results at each destination using NTSC bars, peak white and 5-step multiburst (0.5, 1.0, 2.0, 3.0, 3.58 and 4.2MHz).
24    Also for NTSC sources, confirm optimum brightness, contrast and color in displays using SMPTE source with PLUGE (Picture Line Up Generation Equipment) display.
25    When several NTSC displays are visible, demonstrate consistencies in displays using NTSC bars with PLUGE signal to all.
26    For RGB sources, demonstrate 700mV +/-10% (or 1dB) from each source to each destination. (If requested only, record results using a flat-field pattern signal at the highest resolution specified, or at least 1024x768 resolution (VESA 8).) For RGB sources, measure and record peak-to-peak voltage using a 200MHz oscilloscope at each destination when a test generator with either a multiburst or H pattern is at each source location. Record “peaking” and “level” control settings on any interface at the positions whereby the 700mV levels were attained.
27    Displays are focused, centered and evenly illuminated. If requested, confirm using a calibrated light meter that the brightest locations measure no more than 10% above the average, and the dimmest locations no more than 5% below the average. Also, if requested, document that geometric distortion is within a 2% tolerance. Take actual measurements if necessary (top, bottom, left, right dimensions of the white portion of the screen), and photograph if necessary.
28    Display stable images, with no scaling-related visual artifacts when switching between, at a minimum, _____(1024x768), (1280x1024) and (1280x720) sources, and/or all those specified in the performance criteria for this system. Record test results as pass/fail, and certify that sources have been satisfactorily scaled to the native resolution by the digital display device.
29    The Control System performs all the functions as indicated on the function list “(PDK)” provided, with stability, and in sync with the equipment being controlled without the need to reset any item of equipment.
30    Be serviceable. This means equipment can be pulled easily for repair by one person; cables are neatly dressed, bundled in forms, with no excessive pressure at termination points and connectors; service loops are used; and each cable number agrees with the as-built drawings. This includes the equipment rack itself. All switches and receptacles shall be labeled logically and permanently.
31    Image size relative to furthest viewer:____(1:6). Record each, compare to recommended multiplier.
32    Confirm all nomenclature for consistency: drawings, touchscreen, etc.
33    Patch cables have cable numbers.
34    Inspect camera image quality.
35    Camera presets are programmed as specified by the user.
36    Confirm acceptable TV levels.
37    Confirm that all codec options have been installed.
38    All e-controlled equipment properly configured with IP addresses, host names, time servers, Gatekeeper addresses, network configurations, as applicable.
39    Displays have OSDs (On Screen Display), “OFF,” or as specified by the user.
40    Video projector, if any, must have “blue screen” off, or as directed by the user.
41    Log all telephone numbers tested—include time, number, which line, success of connection, who we spoke with, success of full duplex, success of auto disconnect; note if auto disconnect takes too long—as specified.
42    “White Purity” test (__(7) bad pixels maximum). Note the number of bad pixels.
43    Check for excessive vibration on VC camera at full telephoto position.
44    Video record non-conformances and anomalies.
45    Sanity check: Would the user object to anything about this system?
46    Prepare document report, certifying that the product, performance and practices are in compliance, and noting any exceptions. Distribute accordingly.
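
    The target-level arithmetic referenced in Test 12 reduces to a simple rule: the operating level is the specified SPL or the ambient noise plus the specified margin, whichever is greater, and each sampled seat must land within the uniformity tolerance of that target. A minimal Python sketch follows; the values are the suggested defaults and the function names are hypothetical:

def target_level_db(nominal_db, ambient_db, margin_db=15.0):
    # Specified SPL, or ambient plus the margin, whichever is greater.
    return max(nominal_db, ambient_db + margin_db)

def seat_passes(measured_db, target_db, tolerance_db=2.0):
    # "Uniformity of Coverage": within +/- tolerance at the listener's ear.
    return abs(measured_db - target_db) <= tolerance_db

# A noisier room (55dB SPL ambient) pushes the 65dB SPL default up to 70dB SPL.
target = target_level_db(65.0, 55.0)
print(target)                     # 70.0
print(seat_passes(68.5, target))  # True: within the +/-2dB tolerance
print(seat_passes(72.5, target))  # False: corrective action before handoff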

 

Mario J. Maltese received an MSEE from Stevens Institute, New Jersey, and a BSEE from Pratt Institute, New York. He developed a self-contained intercom system for use across bulletproof glass, currently in use in subway tollbooths and movie theaters, that was later patented. He is a member of the Audio Engineering Society, the Institute of Electrical and Electronics Engineers and the Amateur Satellite Corporation, and a senior member of the American Society for Quality. A senior faculty member at the InfoComm International Academy, he was named Educator of the Year 2004-2005.
