IT/AV

Analyzing Analytics

I’ve got all this data…now what?

AV9000 Checklist Item Under Test: 6.4.4: Any web-based system control or monitoring features, and other IP functionality of the system (analytics, time servers, system-generated email, etc.), are functioning.

Reasoning: Analytics engines are being rolled out in enterprise-wide control systems to collect data on how the system is used. Data are being generated and collected by many major AV manufacturers as a default. That data can be extremely valuable to planning teams deciding how resources are allocated for current and future conference spaces. However, someone needs to translate those data sets into meaningful directives for technology managers. Interpreted data sets can be incredibly effective when deployed in large commercial buildings and offices.

The Story: As an Eagle Scout, I’ve been into preparedness for quite some time. As a kid, it was all about carrying around a small first-aid kit and a little cheat sheet with some important phone numbers. As an adult, in the age of the internet, I apparently need a bit more: a fixed-blade knife, a backup fixed-blade knife, water filters, a water barrel, three weeks’ worth of shelf-stable food for the entire family, solar chargers, power inverters, generators…the list of “essential” gear goes on and on.

There was a period when I would make some pretty outrageous purchases in the name of being prepared. I accumulated a decent pile of stuff in the basement. As luck would have it, a storm knocked out our power one night, and I sprang into action. I went for the generator first, but it had no gas. I went for the tactical flashlights, and I couldn’t find enough of the proper batteries. I went for the candles, and I couldn’t find the matches. It was a bust. I had all this great stuff, and I wasn’t prepared to use it. But that failure gave me plenty of time to think…in the dark…about AV systems.

Analytics seem to be everywhere. Simple, inexpensive huddle-space controllers report back to a cloud service about how often the room is used, which aspects of the room are used and, potentially, how well the room functioned for the meeting. As system complexity increases, so does the data that can be collected about it. If the room is linked to a scheduling and occupancy system, data can be collected about how often the room is booked, as well as how often it is actually occupied. If the room is tied to the user’s Active Directory, additional information can be gathered about who is booking the room, as well as who didn’t show up for their booked meetings. This is incredibly powerful. My concern is that all of that data is being collected…but is anyone interpreting it for the client? If so, how is it being interpreted?

As a commissioning agent, I love numbers. I love target values and tolerances. Putting a number on things removes the wishy-washy subjectivity of observation or memory, replacing it with measured, documented objectivity. I would love to see some metrics for room-usage analytics. I know the data is being collected, but I would like to see what metrics are used to interpret it. Here are some examples:

  1. Room Usage: How are people currently planning the number and size of their conference spaces? I know there is a big push for high-efficiency workplaces, with fewer offices and more “leased” cubicles. That means people need a place to meet. How many huddle rooms does a floor require? How many conference rooms? How many seats should be allowed for each? Analytics would allow us to measure that, as well as to set targets. I would think a starting target room-usage metric would be 85 percent in use during business hours. I’m sure the accountants would want 100 percent (they’re paying for it), but that doesn’t leave any time for ad-hoc meetings, maintenance, training, etc. I think 85-percent room usage is still a good ROI, while also providing some flexibility for “real life.” Any more than that, and you need more meeting spaces; any less, and rent is being wasted.
  2. Functions Used: It takes a lot of resources to add a source to the system. Sure, a Blu-ray player might only cost $30 on its own, but now the system needs a larger switcher, a power outlet, something to provide transport controls, a maintenance plan to clean the disc reader, someone to test it periodically, etc. It’s a lot more expensive than “just a $30 Blu-ray player.” And how often will it be used? I have the same questions and concerns about VGA cables. If they are rarely used, I can save a lot of money and time by not having to support them; however, if I take them away too early, I risk upsetting my users. Analytics can be used to measure how often each source is selected. Is it safe to say that, if a source is selected less than 10 percent as often as the average source, it can be removed from the standard catalog of designs?
  3. System Performance: One of the best additions in analytics-equipped systems is the user review at the end of each use. It consists of a one- or two-question survey about how well the system performed. The first question is a general rating of system performance: Poor, Fair, Good or Excellent. If the user selects Poor or Fair, a second question appears to find out what went wrong: Audio, Video, Control or Everything. It takes two seconds to complete, but it gives the user a voice, and it gives the technology manager hard data about how well the systems are meeting users’ needs. I think Good or Excellent reviews 90 percent of the time would be a good target metric; we need to account for device failure, user error and aging technology within that remaining 10 percent of unsatisfied users. Above 90 percent, the room is doing its job. Below 90 percent, something should be addressed ASAP. (A rough sketch of how these three checks might be run follows this list.)
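
To make the arithmetic concrete, here is a minimal sketch, in Python, of how those three checks might be run against exported usage data. Everything in it is hypothetical: the field names, data shapes and the weekly-hours figure are my own assumptions, not any manufacturer’s analytics API; the thresholds are simply the targets proposed above.

```python
# Hypothetical sketch -- data shapes, field names and the business-hours figure are
# illustrative assumptions, not any particular manufacturer's analytics export.
from dataclasses import dataclass

BUSINESS_HOURS_PER_WEEK = 45   # assumed: 9 hours/day x 5 days
USAGE_TARGET = 0.85            # item 1: proposed room-usage target
SOURCE_CUTOFF = 0.10           # item 2: below 10% of the average source's use
SATISFACTION_TARGET = 0.90     # item 3: share of Good/Excellent reviews

@dataclass
class RoomWeek:
    hours_occupied: float      # measured occupancy during business hours
    source_selections: dict    # e.g., {"HDMI": 42, "VGA": 3, "Blu-ray": 1}
    reviews: list              # e.g., ["Excellent", "Good", "Poor", ...]

def interpret(week: RoomWeek) -> None:
    # 1. Room usage: occupied hours vs. available business hours
    usage = week.hours_occupied / BUSINESS_HOURS_PER_WEEK
    if usage > USAGE_TARGET:
        print(f"Room usage {usage:.0%}: above target - consider more meeting spaces")
    elif usage < USAGE_TARGET:
        print(f"Room usage {usage:.0%}: below target - rent may be going to waste")
    else:
        print(f"Room usage {usage:.0%}: right on target")

    # 2. Functions used: flag sources selected far less often than the average source
    counts = week.source_selections
    average = sum(counts.values()) / max(len(counts), 1)
    for source, count in counts.items():
        if count < SOURCE_CUTOFF * average:
            print(f"{source}: selected {count} times (avg {average:.1f}) - removal candidate")

    # 3. System performance: share of Good/Excellent post-meeting reviews
    if week.reviews:
        score = sum(r in ("Good", "Excellent") for r in week.reviews) / len(week.reviews)
        verdict = "room is doing its job" if score >= SATISFACTION_TARGET else "address ASAP"
        print(f"Satisfaction {score:.0%}: {verdict}")

interpret(RoomWeek(hours_occupied=39.5,
                   source_selections={"HDMI": 42, "VGA": 3, "Blu-ray": 1},
                   reviews=["Excellent", "Good", "Good", "Poor", "Good"]))
```

In practice, a pass like this would run per room and per week, and a decision such as dropping VGA from the standard design would rest on a long trend rather than a single week of data.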

These are the types of interpretations of room analytics that I think can bring tremendous value to our clients. It’s also a new service that can help weather the storm in this world of shrinking conference-space technology. Most importantly, it brings the AV team to the “adult table” during the programming phase of projects, as opposed to being relegated to the “kid table” as an afterthought once construction has begun. Having a system with a room-usage analytics engine and no one translating the data is the same as being a “prepper” who has a generator for power outages, but no gas: it sounds cool…but it’s not very useful.

Editor’s Note: James Maltese would love to hear your thoughts on the proposed metrics, and others that you think would be useful. Feel free to email jmaltese@avres.com with any comments.

