“Data is the oxygen that powers our ability to detect and respond to threats to health, and we are at a pivotal moment in the modernization of the public health data infrastructure.” – Dr. Mandy Cohen, former CDC Director
Public health agencies today face a paradox: more data than ever, but fewer resources to analyze it. After the COVID-19 infusion of funding, the pendulum is swinging back – in some cases dramatically. In March 2025, the federal government clawed back $11.4 billion in unspent public health funds, forcing state and local departments to halt programs and lay off staff. Former CDC officials warn of proposals that could slash the agency’s budget by over 50% by 2026. According to a NACCHO survey, 17% of local health departments saw budget cuts in FY2024, and 23% anticipate cuts in FY2025. The fiscal belt-tightening is real, even as communities grapple with ongoing outbreaks and health disparities.
How can public health professionals continue to drive insights and impact under these constraints? Here are five practical tips to do effective analytics on a shoestring – drawn from real-world lessons, federal data modernization guidance, and on-the-ground experience helping health departments modernize in lean times.
When funding is tight, analytics must prove its worth. Prioritize projects that clearly demonstrate health outcomes or cost savings. By showing how your work prevents illness, improves efficiency, or saves dollars, you build the case to preserve (or even expand) your budget. In fact, grant writing and funding strategies now hinge on data-driven proof of impact – competitive proposals are expected to demonstrate ROI and plan for long-term sustainability.
For example, a community coalition in Northern Kentucky used local data to strengthen grant applications for urban health initiatives, resulting in three grants awarded (totaling $60,000) to revitalize underserved neighborhoods. The team leveraged an open-data dashboard (the NKY Atlas) to highlight needs and opportunities, which made a compelling case to funders. Likewise, many health departments are investing in program evaluation components despite limited staff. (In 2024, 84% of local health departments had some programs with evaluation, though nearly 60% reported lack of staff time to do it well.) The takeaway: use your data to tell a story – whether it’s a dashboard for policymakers or a one-page impact brief for a grant, make it clear how every analytics effort contributes to community health and operational efficiency. This not only justifies existing funding but can unlock new streams. Don’t just collect data; translate it into outcomes that matter (e.g. reduced overdose deaths, faster outbreak containment, dollars saved by prevention). Public health leaders and elected officials are more likely to protect programs that show tangible results.
Statistic: “23% of local health departments anticipate budget cuts in FY2025” – proving your program’s impact is now essential for survival.
Under budget constraints, technology choices matter. Large proprietary systems and expensive software licenses can quietly drain budgets. Now is the time to embrace open-source and cloud-agnostic tools that are low-cost (or free) and flexible. Modern public health analytics can thrive with a lean tech stack: for instance, R and Python (free programming languages) are widely used for epidemiological analysis and visualization, replacing costly statistical packages. Tools like Posit’s Shiny can power interactive dashboards without hefty vendor fees. Many health departments have also adopted free or open-source platforms like DHIS2, CommCare, or REDCap for data collection – solutions originally designed for low-resource settings that work anywhere. These tools are cloud-agnostic: you can run them on any cloud or local server, preventing vendor lock-in and allowing you to shop for the best hosting deals or use existing government infrastructure. Some departments leverage data collection tools included in their existing Microsoft 365 or Google Workspace subscriptions, maximizing the return on investments they have already made.
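To make the lean stack concrete, here is a minimal sketch of a classic epidemiologic product – a weekly epidemic curve – built entirely with free tools (pandas and matplotlib). The file name and column names are placeholders for your own line list.

```python
# Minimal sketch: weekly epidemic curve from a line list, using only free tools.
# "line_list.csv" and its columns are placeholders for your own data.
import pandas as pd
import matplotlib.pyplot as plt

cases = pd.read_csv("line_list.csv", parse_dates=["report_date"])

# Count cases per week to build the epi curve.
weekly = (
    cases.set_index("report_date")
         .resample("W")
         .size()
         .rename("cases")
)

ax = weekly.plot(kind="bar", figsize=(10, 4), title="Weekly reported cases")
ax.set_xlabel("Week ending")
ax.set_ylabel("Cases")
ax.set_xticklabels([d.strftime("%Y-%m-%d") for d in weekly.index], rotation=45)
plt.tight_layout()
plt.savefig("epi_curve.png", dpi=150)
```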
Desktop automation doesn’t require expensive tools. Built-in system utilities like Windows Task Scheduler or cron on Unix/Mac can trigger your R and Python scripts to run on schedules – daily, weekly, or even hourly. For example, a county epidemiologist could set up a Task Scheduler job that runs an R script every morning at 6 AM to download the latest case data, generate updated charts, and email a situation report to stakeholders before they arrive at work. These native schedulers cost nothing yet provide robust automation capabilities that many health departments overlook.
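As a sketch of that morning-report idea, the script below pulls a case file, summarizes it, and emails a one-line situation report; the cron entry in the comment shows the scheduling. The data URL, email addresses, and SMTP host are all placeholders to adapt to your environment.

```python
#!/usr/bin/env python3
# morning_report.py - a hypothetical 6 AM situation report.
# Schedule with cron:  0 6 * * * /usr/bin/python3 /opt/reports/morning_report.py
# (On Windows, point a Task Scheduler task at the same script.)
import smtplib
from email.message import EmailMessage

import pandas as pd

DATA_URL = "https://example.org/daily_cases.csv"  # placeholder data feed

cases = pd.read_csv(DATA_URL, parse_dates=["report_date"])
latest = cases["report_date"].max().date()
new_today = int((cases["report_date"].dt.date == latest).sum())

msg = EmailMessage()
msg["Subject"] = f"Situation report {latest}: {new_today} new cases"
msg["From"] = "epi-reports@health.example.gov"   # placeholder addresses
msg["To"] = "leadership@health.example.gov"
msg.set_content(
    f"As of {latest}: {len(cases)} cumulative cases, {new_today} reported today."
)

# Assumes a departmental SMTP relay; adjust host and authentication as needed.
with smtplib.SMTP("smtp.example.gov") as server:
    server.send_message(msg)
```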
When workloads grow beyond desktop capacity, the same code can be migrated to cloud environments. Tools like Docker containers make this transition nearly seamless – your analysis environment gets packaged with all dependencies, ensuring it runs identically anywhere. Additionally, cloud providers offer serverless options (AWS Lambda, Azure Functions, Google Cloud Functions) that can run your scripts on schedule without managing servers. For instance, a vaccination dashboard that outgrew a local computer could be migrated to run on cloud infrastructure, with the same R or Python code pulling data, performing analysis, and pushing results to a web-accessible location.
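For instance, here is a minimal sketch of that same daily job rewritten as an AWS Lambda handler, triggered by an EventBridge schedule rule instead of cron. The bucket and feed names are hypothetical, and pandas would be supplied through a Lambda layer or a container image built from your existing Docker environment.

```python
# Sketch: a scheduled serverless job (trigger: EventBridge cron(0 6 * * ? *)).
# Bucket name and data feed are placeholders.
import boto3
import pandas as pd

s3 = boto3.client("s3")

def handler(event, context):
    cases = pd.read_csv(
        "https://example.org/daily_cases.csv",  # placeholder feed
        parse_dates=["report_date"],
    )
    # Summarize cases per day and push the CSV to a web-accessible location.
    summary = (
        cases.groupby(cases["report_date"].dt.date)
             .size()
             .rename("cases")
             .to_frame()
             .to_csv()
    )
    s3.put_object(
        Bucket="example-dashboard-bucket",  # placeholder bucket
        Key="daily_summary.csv",
        Body=summary,
    )
    return {"rows": len(cases)}
```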
Cloud-based services, used smartly, can also cut costs. Rather than investing in on-premises hardware, consider using cloud storage and computing on a pay-as-you-go basis. (Be mindful of governance and security, but most major clouds have government-compliant options.) A cloud approach can be scaled up or down with your needs. During COVID-19, for example, CDC stood up a cloud-based platform to handle the massive influx of data, accelerating analyses and publication of findings. At a local level, this might mean using a basic cloud database or even Google Sheets and free APIs for simpler projects. The key is being agnostic and nimble: use whatever tool gets the job done cheaply and reliably. If a small health department can automate a weekly surveillance report with a Python script on a $5/month server instead of manual work in a pricey software suite, that’s a big win.
Finally, tap into free public data and shared resources. There’s a wealth of open data (CDC’s open data portal, census data, etc.) and open-source analytic scripts/shared code from other jurisdictions. Don’t reinvent the wheel – adapt what’s already available in the public domain. The Data Modernization Initiative encourages sharing technology resources and open data to foster innovation, which can particularly benefit cash-strapped teams. Collaboration in open-source isn’t just cost-effective; it also builds a community of practice that you can lean on.
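As one example, CDC’s open data portal (data.cdc.gov) exposes its datasets through the Socrata API, so a few lines of Python can pull public data straight into your analysis. The dataset ID below is a placeholder; browse the portal and substitute the identifier from the dataset’s API documentation.

```python
# Sketch: pulling a public dataset from CDC's Socrata-based open data portal.
# "xxxx-xxxx" is a placeholder; every dataset page lists its real identifier.
import pandas as pd
import requests

DATASET_ID = "xxxx-xxxx"
url = f"https://data.cdc.gov/resource/{DATASET_ID}.json"

# Socrata accepts SoQL query parameters such as $limit and $where.
resp = requests.get(url, params={"$limit": 5000}, timeout=30)
resp.raise_for_status()

df = pd.DataFrame(resp.json())
print(df.head())
```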
When every staff hour is precious, automation is your friend. Identify any repetitive data tasks – daily case updates, weekly report generation, data cleaning, sending alerts – and see if they can be automated or at least simplified. Even small efficiencies add up. Automation can range from simple (macros, scheduled database queries) to advanced (AI-assisted data processing), but the goal is the same: reduce manual effort so your limited staff can focus on higher-value analysis.
Prioritize automation over aesthetics in your workflows. Many health departments maintain beautifully formatted spreadsheets with complex formulas and visual elements that look impressive but become maintenance nightmares. These manually updated documents are not only time-consuming to maintain but also prone to human error – a misplaced decimal or copy-paste mistake can compromise data integrity. Instead, focus on developing standardized, script-driven reports that might look plainer but update automatically and reliably. For example, replace that elaborate Excel dashboard with an R Markdown or Python notebook that generates consistent outputs from raw data (see the sketch below). The time saved and error reduction far outweigh the loss of decorative elements. Remember: in public health analytics, accuracy and timeliness trump visual polish every time.
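Here is a minimal sketch of that plain-but-reliable style: a Python script (the same idea as an R Markdown document) that renders a Markdown report from raw data, ready for a scheduler to regenerate. The file and column names are placeholders.

```python
# Sketch: a script-driven weekly report rendered as plain Markdown.
# "raw_cases.csv" and its columns are placeholders.
from datetime import date

import pandas as pd

cases = pd.read_csv("raw_cases.csv", parse_dates=["report_date"])
by_week = cases.set_index("report_date").resample("W").size()

lines = [
    f"# Weekly surveillance report - {date.today()}",
    "",
    f"* Cumulative cases: {len(cases)}",
    f"* Cases in the most recent week: {by_week.iloc[-1]}",
    "",
    "| Week ending | Cases |",
    "|---|---|",
]
lines += [f"| {wk.date()} | {n} |" for wk, n in by_week.tail(8).items()]

with open("weekly_report.md", "w") as f:
    f.write("\n".join(lines))
```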
For those looking to move beyond Excel, André shares a practical guide to getting started with R (or Python) and building automated, reproducible workflows – read it here.
AI assistants are emerging as powerful allies for resource-constrained teams. Tools like GitHub Copilot, ChatGPT, or Claude can help generate code snippets, troubleshoot bugs, or even build entire scripts based on plain language descriptions. For example, an epidemiologist with limited coding experience can describe a data transformation need to an AI assistant and receive working R or Python code in seconds. These tools can also help create and maintain documentation – a critical but often neglected task when teams are stretched thin. Well-documented processes ensure continuity when staff turnover occurs and make onboarding faster for new team members.
Standardizing processes goes hand-in-hand with automation. If every analyst does things differently, it’s hard to automate. Instead, develop standard data pipelines and templates, potentially using AI to help design these frameworks. For example, create a reusable analytics pipeline for outbreak response: once configured, it can be quickly applied to the next outbreak with minimal tweaks. Intersect Collaborations (a consultancy focused on low-resource public health settings) emphasizes automated workflows and standardized processes that can scale securely. Even small health departments can build simple “analytics assembly lines” – e.g., a weekly disease report that auto-fetches the latest data, populates a template in R or Excel, and emails it to stakeholders (one possible shape is sketched below). AI assistants can help maintain and enhance these pipelines over time, suggesting optimizations or adapting to new data sources.
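One way such an assembly line might look in code: a single parameterized function, so pointing the pipeline at the next outbreak means changing a small config rather than rewriting the analysis. All names and URLs below are hypothetical.

```python
# Sketch: a reusable, config-driven outbreak report pipeline.
import pandas as pd

def run_outbreak_report(config: dict) -> pd.DataFrame:
    """Fetch a line list, filter to one disease, and save a weekly summary."""
    cases = pd.read_csv(config["source_url"], parse_dates=[config["date_col"]])
    cases = cases[cases[config["disease_col"]] == config["disease"]]

    summary = (
        cases.set_index(config["date_col"])
             .resample("W")
             .size()
             .rename("cases")
             .to_frame()
    )
    summary.to_csv(config["output_path"])
    return summary

# Re-pointing the pipeline at a new outbreak is a config change, not a rewrite.
measles_config = {
    "source_url": "https://example.org/line_list.csv",  # placeholder
    "date_col": "report_date",
    "disease_col": "condition",
    "disease": "Measles",
    "output_path": "measles_weekly.csv",
}
run_outbreak_report(measles_config)
```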
The bottom line: work smarter, not harder. Every hour of staff time saved by a script, an app, or an AI assistant is an hour gained for deeper analysis, community engagement, or emergency response. In a time of hiring freezes and attrition, smart automation is like an extra team member that doesn’t draw a salary.
Budget cuts often manifest as hiring freezes or lost positions – making it crucial to work differently with the people you still have. Rather than assuming “do more with less” means each person simply works harder, focus on capacity building and partnerships. Invest in your team’s data skills through cross-training, mentoring, and peer learning. A nurse or program manager with basic training in data analysis can help fill gaps when the one epidemiologist is swamped. Many health departments are embracing a team-based approach where duties are shared and staff are upskilled on the job. As Intersect’s model highlights, a peer-oriented, team-based approach built on co-creation ensures solutions remain sustainable even after external consultants leave. Building internal expertise is a long-term strategy to weather funding storms.
Free and low-cost online courses have democratized analytics training, making it possible to upskill your team without breaking the budget. Staff can leverage platforms like Coursera (which offers Johns Hopkins’ “Data Science Specialization”) and DataCamp (with interactive R and Python courses specifically for health data). Many of these platforms offer financial aid or free audit options. For public health-specific training, look to CDC TRAIN, which hosts hundreds of free analytics modules, or CSTE’s webinar archives covering epidemiologic methods and tools at no cost. The Public Health Informatics Institute also provides free resources tailored to health departments’ needs. For team members interested in visualization, Tableau Public is free to use and comes with comprehensive tutorials. Even commercial platforms like LinkedIn Learning are often available through public library partnerships at zero cost to users. The key is establishing a learning culture where staff dedicate a few hours weekly to skill development – perhaps through “Lunch and Learn” sessions where colleagues share new techniques they’ve mastered through these courses.
At the same time, no health department is an island. Forge partnerships to extend your capacity. This can include academic collaborations (e.g. a local university public health program might supply interns or capstone students to assist with data projects), or regional data-sharing collaboratives where multiple small departments share an analyst or a data system. The CDC’s Data Modernization Initiative explicitly prioritizes “developing a state-of-the-art workforce” and “supporting and extending partnerships” across agencies. In practice, that could mean joining a regional consortium for disease surveillance data or partnering with a neighboring county to jointly fund a data manager who serves both. It could also mean tapping into volunteer networks and fellowships. (Unfortunately, NACCHO reports that few local health departments currently host fellows or trainees, putting future workforce development at risk – but it is also an opportunity worth exploring. Programs like the CDC Public Health Associate Program or PHI/CDC Global Health Fellowship can place early-career talent in your agency at minimal cost.)
Peer mentorship is another powerful model. In 2025, NACCHO launched a Wastewater Monitoring Mentorship Program, pairing experienced health departments with those starting out, to ensure everyone can implement new surveillance methods. This kind of peer support costs little but accelerates learning and problem-solving. Consider creating or joining communities of practice for analytics – whether it’s a Slack group of epidemiologists or a standing call among data leads in your state. Share scripts, share lessons, even share failures so others can avoid them. By pooling knowledge and resources, you collectively multiply capacity.
In short, invest in humans and relationships as much as in hardware and software. A small, well-trained, well-networked analytics team can punch above its weight. When budgets rebound (someday), you’ll then have a stronger foundation to build on.
Doing more with less doesn’t mean doing less for the most vulnerable. In fact, under constrained budgets it’s even more critical to align analytics with equity goals and the core health needs of your community. Why? First, health equity is a priority in many funding streams – demonstrating that your data efforts help identify and address disparities can attract support (or at least protect against cuts). Second, focusing on equity ensures you’re deploying limited resources where they make the greatest impact, closing gaps for those who might otherwise be left behind.
Practically, this means using your data to highlight inequities and drive action to reduce them. For example, the Northern Kentucky Health Department created a series of Health Equity Reports examining social determinants of health across their region. By assembling data on income, education, housing, and other factors, they provided insight into which neighborhoods face the greatest barriers to good health – information that guides policymakers and community organizations in targeting interventions. Even if you lack a fancy report, you can start with what you have: maps of disease rates by census tract, analyses of vaccination coverage by demographic group, etc. Identify the gaps and communicate them clearly. This not only helps the community strategize solutions, it also builds public support for your work when people see their needs are being addressed.
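As a minimal sketch, a few lines of pandas can turn a registry extract into a coverage gap you can communicate in one number. The file and column names are placeholders, and the vaccinated column is assumed to be a 0/1 indicator.

```python
# Sketch: vaccination coverage by demographic group from a registry extract.
# File and column names are placeholders; "vaccinated" is assumed to be 0/1.
import pandas as pd

people = pd.read_csv("immunization_registry_extract.csv")

coverage = (
    people.groupby("race_ethnicity")["vaccinated"]
          .mean()       # proportion vaccinated within each group
          .mul(100)
          .round(1)
          .sort_values()
)
print("Vaccination coverage (%) by group:")
print(coverage)

# One communicable number for policymakers: the gap between the lowest- and
# highest-coverage groups.
gap = coverage.iloc[-1] - coverage.iloc[0]
print(f"Coverage gap: {gap:.1f} percentage points")
```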
An equity focus also means designing analytics projects that benefit frontline public health services. Under budget cuts, it’s tempting to prioritize only efficiency and “keep the lights on” tasks. But consider low-cost analytics that can directly bolster programs for underserved groups – for instance, an automated outreach tool to remind low-income clinic patients of appointments (preventing costly no-shows), or analyzing transportation data to plan mobile clinic stops in rural areas. These efforts can prevent worse outcomes that would be even more expensive to manage later. In other words, equity and prevention are cost-effective in the long run.
Finally, engage the community in your data efforts. Make your dashboards and reports accessible to non-experts. Northern Kentucky’s public-facing Atlas with an AI chatbot is a great example of lowering the barrier to data – anyone can ask a question like “what’s the cigarette smoking prevalence in my county?” and get an answer. By democratizing data, you empower local partners (NGOs, residents, businesses) to collaborate on solutions, effectively extending your reach without straining your budget. Transparency builds trust, and trust brings more allies to advocate for public health funding when it’s on the chopping block.
In summary, equity isn’t a “nice-to-have” even in lean times – it’s a strategic imperative. It guides you to invest your limited resources where they matter most, aligns with federal and philanthropic priorities, and fosters community support. By keeping an eye on who’s being left behind and using data to illuminate those needs, you ensure your analytics drive meaningful, inclusive improvements in health.
Public health analytics under budget constraints is challenging, but it also can be a catalyst for innovation. When resources are limited, we are forced to be creative, agile, and focused on what truly matters. Lean times reward the bold – those willing to adopt new tools, form new partnerships, and rethink business-as-usual. Whether you’re in a big city health department or a small rural district, I hope these tips spark ideas for maximizing impact with minimal resources.
What strategies have you found effective in doing more with less? I invite you to share your experiences and tips in the comments. Let’s learn from each other – after all, we’re all working toward healthier communities with whatever we’ve got. Feel free to connect with me here on LinkedIn to continue the conversation. Together, even under tight budgets, we can modernize public health data in ways that save lives and prove our value.
– André van Zyl, MPH, Founder & Principal Consultant at Intersect Collaborations LLC (former CDC Health Scientist and passionate advocate for data-driven public health in low-resource settings)