A significant part of my job managing RecycleMania involved irritating people. Over the twelve years I was involved with the higher ed competition (now Campus Race to Zero Waste), an intern and I would spend hours each week scrolling through the trash and recycling weights submitted by participating schools looking for anomalies. We’d email or call dozens of recycling coordinators to question how they were coming up with their numbers.
While many used actual weights, most relied on volume-to-weight formulas for some or all of their data. Pressed on how those formulas had been calculated, many confessed they’d simply multiplied the number of times their dumpsters were emptied by the dumpsters’ (presumed full) capacity. Others acknowledged they had no idea, having simply forwarded numbers provided by their hauler with little or no context.
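Most of those "formulas" boil down to the arithmetic below. A minimal sketch, with placeholder constants (the density figure and fullness assumption are illustrative, not vetted conversion factors):

```python
# Rough volume-to-weight estimate of the kind many coordinators used.
# All constants here are illustrative placeholders; a real program should
# use published volume-to-weight conversion factors and verified fullness.

CUBIC_YARDS_PER_DUMPSTER = 8    # container capacity
PULLS_PER_MONTH = 12            # times the dumpster was emptied
LBS_PER_CUBIC_YARD = 100        # assumed density of loose mixed recycling
ASSUMED_FULLNESS = 1.0          # the "presumed full" assumption

est_lbs = (CUBIC_YARDS_PER_DUMPSTER * PULLS_PER_MONTH
           * LBS_PER_CUBIC_YARD * ASSUMED_FULLNESS)
est_tons = est_lbs / 2000
print(f"Estimated: {est_tons:.1f} tons/month")  # Estimated: 4.8 tons/month
```

Notice that halving the fullness assumption halves the reported tonnage, which is exactly why "presumed full" numbers are so fragile.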
Refereeing the competition rankings at times required disqualifying schools that couldn’t explain why their recycling tonnage had tripled over the past year, but whose coordinators were nonetheless convinced they legitimately belonged in the top five of an international recycling competition.
Lack of Data Holds Programs Back
Of all the challenges that have bedeviled recycling and waste reduction programs at colleges, universities, and other large institutions, perhaps none has been more intractable than the simple inability to measure what they’re doing.
At the most basic level, many schools don’t have access to campus-wide tonnage weights for trash or recycling collection streams. 2018 data suggested less than 45% of RecycleMania participants were able to report actual weights for most or all materials. I’m not aware of any stats, but we can assume it’s a much smaller percentage of schools that can get reliable numbers at a building level, let alone the individual bin level.
The same cloud of mystery extends to what’s in the waste stream. CURC’s 2017/18 benchmarking survey of campus recycling programs revealed that even as 75% of schools said they made a “strong effort” to educate and encourage correct recycling, only 40% said they’d done the kind of formal waste audits that provide a detailed understanding of how to prioritize and frame educational messaging. And to put waste in the wider circular economy context, precious few schools have done the kind of life-cycle analysis research to prioritize upstream packaging and recovery program decisions (think compostable vs recyclable vs. reusable food service packaging arrangements) based on concrete environmental benefits.
While energy management and other building services have undergone a revolution in recent decades, using analytics to realize significant efficiency gains and cost savings, too many waste and recycling programs continue to base program decisions on assumptions and guestimates. Beyond the missed opportunity for operational efficiency and the resulting savings, the inability to track waste quantities by exact location prevents or makes it much more difficult to meter and assign costs to individual generators. Likewise, the lack of granular detail about which non-recoverable items are contaminating specific recycling and organics bins prevents educators from being able to target precise information to the relevant audiences.
And of course, you can’t improve what you can’t measure. Without good data, you’re reduced to anecdotal impressions of whether recycling rates in a particular building are going up or down over time. You’re left to conjecture whether new signage is having an impact. And you can’t meaningfully benchmark your school’s efforts against peer campuses, whether for competitive glory or professional knowledge.
Perhaps the least appreciated issue, and I’d argue most problematic, is that lack of performance data makes it harder for waste reduction programs to be taken seriously by other facilities staff and decision makers. Without impact metrics, it’s difficult for these programs to get credit for the value they deliver. It makes it harder to show a track record when proposing new initiatives. It allows recycling and zero waste to be dismissed as feel-good efforts whose purpose is to please student activists. This goes part of the way to explaining why waste reduction rarely gets mentioned from AASHE’s conference main stage, let alone APPA’s.
Why Is It So Hard?
So what are the obstacles to better data? Let’s start with every recycling manager’s favorite bête noire, waste haulers. While some haulers are good faith partners supporting a school’s diversion efforts, the simple truth is that transparency with collection data generally runs counter to their business interest.
On-board truck scales cost money. Routing collection trucks to isolate campus waste from other clients’ adds costs. Volunteering volume and weight data that reveals patterns of excess dumpster capacity is… well, anti-capitalist. No one in the private hauling business is against capitalism. Not to paint with broad strokes, but the only way you get most haulers to provide clear, detailed data is to require it in a contract. And even then, good luck.
In many cases, the institutional dynamics of higher ed institutions are an equal or bigger obstacle. For too long, many university administrators have treated trash like death and taxes, an inevitability to just accept and budget for. While diversion programs are often at the front of the line when budget cuts hit, hikes in trash tipping fees or service rates are met with a shrug. Requests for scales and tracking software flounder without a guaranteed ROI. Reporting requirements get stripped from contract language when bids come back with a price premium. Procurement officials decide it’s not worth rocking the boat when service providers don’t follow through on reporting requirements. Administrators have traditionally had low expectations for trash. Because after all, it is just trash.
Alas, we also can’t let recycling and waste managers entirely off the hook – granted, many lack the authority, bandwidth, or budget to significantly change the situation. Nonetheless, I would argue we’ve not made data analytics enough of a priority, myself included back when I worked on a campus. Stretched thin just keeping programs running, many of us find it easy to accept the status quo. Whatever free time exists gets poured directly into setting up new or expanded programs. But we miss opportunities when we don’t carve out time to measure and do the analysis that could lead to greater resources and administration support. While a growing number of schools are shifting to centralized / self-service models for indoor waste collections, it’s very hard to find instances where the impact on contamination (an existential threat to many recycling programs) was measured.
Dumpster capacity is another example. One of the ways I annoyed RecycleMania coordinators was to require they send interns out before the collection trucks to confirm how full their dumpsters actually were. Without firm weight data from their hauler, many had taken on faith the assurances they were all full. Over the years I heard back a dozen or more instances where these manual verification checks revealed systematic over-capacity, resulting in reductions to service levels and thousands of dollars in savings.
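The savings arithmetic behind those verification checks is simple. A hedged sketch with hypothetical service rates and fullness findings:

```python
# Right-sizing estimate based on manual dumpster fullness checks.
# The rate and fullness figures are hypothetical, for illustration only.

pulls_per_week = 3
cost_per_pull = 95.00        # hypothetical hauler charge per pull
observed_fullness = 0.5      # from pre-collection spot checks

current_annual_cost = pulls_per_week * 52 * cost_per_pull
# Same volume of material fits in fewer, fuller pulls:
needed_pulls_per_week = pulls_per_week * observed_fullness
rightsized_annual_cost = needed_pulls_per_week * 52 * cost_per_pull

print(f"Current:    ${current_annual_cost:,.0f}/yr")
print(f"Rightsized: ${rightsized_annual_cost:,.0f}/yr")
print(f"Savings:    ${current_annual_cost - rightsized_annual_cost:,.0f}/yr")
```

Dumpsters averaging half full at pickup mean roughly half the pulls are unnecessary, which is where those thousands of dollars in savings came from.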
Of course, everyone deserves some slack given how difficult it’s traditionally been to get and use reliable data. Roll-offs and compactors have always had to cross a scale, but generally still required someone doing manual data entry from weight tickets. Two other options arrived with the new millennium, but with cost or technical challenges that have prevented widespread use. On-board truck scales offered the ability to track waste by individual dumpster, but until recent years were dogged by questions of accuracy. Big Belly introduced the concept of tracking waste generation by specific bin in real-time, but at $4,000+ a pop they remain too expensive for most schools to deploy widely. With one exception that we’ll come back to, these were basically the only options for tracking waste data short of student interns running around carrying a pen and clipboard.
You Say Potato, I Say… Waste-to-Energy
Seen from a 60,000 ft perspective, these challenges fit into a much wider, industry-wide landscape of data dysfunctionality. Aggregate campuses with their surrounding community or state, and we start getting a better idea how much waste is being generated. But then we face another intractable problem that has less to do with measuring and more to do with parochial interests.
The waste and recycling industry has been notoriously unable to agree on what to count or how to reliably aggregate data. With no meaningful federal authority to set definitions or standards, everyone from commodity-specific trade groups to local governments has mostly been left to define performance metrics on their own.
Should waste recovery efforts be measured based on diversion (% of overall waste kept out of landfills), capture rate (% of recoverable items actually recovered), or another way? Which is better for comparing programs: weight or volume? If we agree weight-based diversion is the way to go, what should be counted? Trash that is incinerated to create electricity? The state of Florida says yes, but California and many others say no. Should aluminum and steel cans be tracked separately or as a single “metals” category? Does one get credit for waste that wasn’t generated at all due to source reduction efforts? I can go on, but you get the point.
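For readers new to these metrics, the two most common ones differ as follows. A minimal sketch with invented tonnages:

```python
# Diversion rate vs. capture rate, computed on invented monthly tonnages.

trash_tons = 60.0
recycled_tons = 40.0            # material actually recovered
recoverable_in_trash = 20.0     # recyclables found in the trash via audit

total_generated = trash_tons + recycled_tons

# Diversion: share of all waste kept out of the landfill.
diversion_rate = recycled_tons / total_generated                       # 0.40

# Capture: share of recoverable material actually recovered.
capture_rate = recycled_tons / (recycled_tons + recoverable_in_trash)  # ~0.67
```

Note that whether incinerated trash counts in the denominator (the Florida-vs-California question) changes the diversion rate but leaves the capture rate untouched, which is one reason programs disagree over which metric to report.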
(Quick tangent: I once asked a state official how I should account for operational changes in my university’s required diversion reports. The tons of composted grass clippings I’d always reported from our football field disappeared after it was converted to artificial turf. Technically we’d eliminated a source of waste, yet our diversion rate would suffer without the composting credit. The reply I got back was to use my best judgement.)
To answer these questions, CURC (College & University Recycling Council) formed a committee nearly 25 years ago that produced the Campus Refuse Profile Workbook, which came to be referred to as the CURC Standards. Schools were encouraged to fill out and email back to CURC an Excel file pre-programmed with formulas to calculate a diversion rate. The worksheet included guidelines on what materials to include and which not to (waste-to-energy, construction and demolition waste, materials eliminated through source reduction). Despite the excitement about being able to compare programs side-by-side, the project atrophied without an effective way to collect and display the information in a user-friendly fashion.
Ten years later, AASHE stepped in to fill the void with the waste component of its STARS sustainability rating system. Following similar definitions and standards for comparing diversion, STARS offers the added ability to review background details about the initiatives behind the numbers.
Campus Race to Zero Waste (RecycleMania) remains the other opportunity to compare one’s school against a meaningful number of peer campuses across the US and Canada. With a range of different categories, it allows schools to look beyond diversion to compare recovery efforts for specific materials like food organics and electronics. The fundamental problem, though, is that it was not designed to be the professional benchmarking tool that many want it to be. It’s designed to be a platform for leveraging school spirit to engage students. To accomplish this, the program only looks at waste recovery during a short two-month window of time, and sets loose measurement standards that present questionable data alongside quality numbers.
With prodding from the US EPA, and the efforts of organizations like The Recycling Partnership and programs like SWEEP and the Municipal Measurement Program, the broader recycling industry has begun in recent years to coalesce around standardized systems. There’s a lot of work to be done, but these efforts suggest a path forward to be able to coherently track and compare the progress of individual programs.
The Future Is Arriving
The ability to accurately benchmark programs is important, but we still need to resolve how we accurately get the underlying numbers.
The good news is that technology is beginning to fill the void in a meaningful way. Sensors that first arrived with Big Bellys and higher-end compactors in the early 2000s, and more recently with companies like Senonseo and Enevo that can adapt roll-off dumpsters, are evolving at an ever-quicker pace. Using infrared and other technologies, these sensors use WiFi or cell signals to send real-time data that can feed collection routing software or be used to track and analyze quantities over time. With costs coming down and the technology improving, we’ll increasingly see affordable sensors available with regular recycling and trash bins (in fact, Busch Systems will be introducing sensors in the near future).
Beyond enabling more efficient collection routing, sensors will – for the first time – allow an efficient way to track how different waste streams are generated throughout a facility. This will in turn open up a myriad of ways recycling and waste system managers can improve recovery programs, target education outreach, and more.
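Once bin-level data exists, the analysis itself is straightforward. A minimal sketch of rolling hypothetical sensor readings up by building and stream (the reading format is invented, not any vendor’s actual API):

```python
# Aggregate hypothetical bin-level sensor readings into per-building,
# per-stream totals. The (building, stream, liters) format is invented
# for illustration; real sensor platforms each have their own schema.
from collections import defaultdict

readings = [
    ("Science Hall", "recycling", 120),
    ("Science Hall", "trash",     240),
    ("Library",      "recycling",  90),
    ("Science Hall", "recycling",  60),
]

totals = defaultdict(float)
for building, stream, liters in readings:
    totals[(building, stream)] += liters

for (building, stream), liters in sorted(totals.items()):
    print(f"{building:12s} {stream:10s} {liters:6.0f} L")
```

Tracked over time, totals like these are what let a manager see which buildings generate what, and where an intervention actually moved the needle.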
The other big deal is the introduction of AI software by companies like Intuitive AI, Compology, and Zabble that can be used to analyze what is being tossed in the waste stream. Paired with weight and volume sensors, AI will put granular, real-time waste composition data at the fingertips of recycling managers. Using this data, analytics will give insight into broader trends that can be used to optimize signage and inform upstream packaging design, among other things. And as this pilot project from the Vancouver Airport shows, AI holds the potential to identify the waste items in people’s hands as they approach bins, guiding them toward the correct bin opening.
These technologies are still in their early days. But with improvements and economies of scale, they hold the promise of fundamentally revolutionizing waste and recycling collections over the next few decades.
Be an Advocate for Data
All this brings us back to the role of recycling and zero-waste managers. Regardless of what the actual job description says, a fundamental part of the position is to be an advocate for change.
In many ways, the effect of the last 30+ years has been to simply graft recycling and a few reuse and waste prevention programs onto a legacy waste management system that still ultimately gives the landfill a competitive edge. It’s been a long, hard ride, with years of work yielding only incremental progress. For zero waste programs to move beyond slogans and have a shot at achieving goals, it’s necessary to redesign that legacy system.
Trends such as the move toward centralized collection arrangements and a focus on circular economy planning are promising steps in that process. Using data more effectively is another key step. New technologies will increasingly make this easier, but the onus is on program managers to make data a greater priority. We need to better understand how it can be used and push where we can to get it and then put it to use.
That means continuing to press for reporting requirements in hauling contract language. It means engaging professors to build class projects around waste audits and other ways to benchmark and test improvements to waste reduction initiatives. It means pushing decision-makers to think beyond near-term budgets and give the new technologies a try. And it means working to frame the perception with facilities managers and administrators that waste is a “system” too.
We talk in terms of “zero waste” and environmental gains as a communication strategy to engage students, but we need to understand waste is no different than other building services. It is a system that can be optimized to better serve the needs of the institution. And data is the first step to doing that.
What are your experiences or thoughts about the use of data for recycling and waste programs? I’m interested to hear your perspective. Contact me at email@example.com.