
Bolstering agreement with scarce resource allocation policy using education: a post hoc analysis of a randomized controlled trial

Abstract

Background

The COVID-19 pandemic prompted rapid development of scarce resource allocation policies (SRAP) in case demand for critical health services eclipsed capacity. We sought to test whether a brief educational video could improve alignment of participant values and preferences with the tenets of the University of California Health’s SRAP in a post hoc analysis of a randomized controlled trial (RCT) conducted during the pandemic.

Methods

We conducted an RCT of an educational video intervention embedded in a longitudinal web-based survey from May to December 2020; this post hoc analysis was performed in August 2024. The “explainer” video intervention was approximately 6 min long, provided an overview of the mechanics and ethical principles underpinning the UC Health SRAP, and was subtitled in six languages. California residents were randomized to view the intervention or not, stratified by age, sex, education, racial identity, and self-reported health care worker status. Non-California residents were assigned to the control group. A total of 1,971 adult participants were enrolled at baseline, and 939 completed follow-up; 770 participants with matched baseline and follow-up responses were analyzed. Self-reported survey assessments of values regarding components of SRAP were scored as the percentage of agreement with the UC Health SRAP as written. Participants responded to items at baseline and follow-up (approximately 10 weeks after baseline), with randomization occurring between administrations.

Results

After the intervention, overall agreement improved by 5.2% (95% CI 3.8% to 6.6%, P < .001) more in the intervention group than in the control group. Significant changes in agreement with SRAP logistics and health factors were also observed in the intervention group relative to the control group, while no significant changes were noted for social factors. Differential intervention effects were observed for certain demographic subgroups.

Conclusions

A brief educational video can effectively explain the complex ethical principles and mechanics of an SRAP and improve the alignment of participant values with the foundational principles of the UC Health SRAP. This directly informs practice by providing a framework for educating individuals about the use of these policies during future situations that require crisis standards of care, which can, in turn, enhance agreement and buy-in from affected parties.

Trial registration

ClinicalTrials.gov registration NCT04373135 (registered 4 May 2020).

Key points

Question

Can a brief educational video improve agreement with the ethical principles underlying scarce resource allocation policies (SRAP)?

Findings

In this post hoc analysis of a randomized controlled trial, we observed that overall agreement with SRAP improved by 5.2% (95% CI 3.8% to 6.6%, P < .001) more in the intervention group than in the control group. Effect heterogeneity was seen for some demographic subgroups.

Meaning

Educational interventions are effective for nudging alignment of personal values with ethical principles, while the observed effect heterogeneity highlights the need for additional research to tailor and target messaging to maximize buy-in.


Introduction

Scarce resource allocation policies (SRAP) outline processes by which limited resources, such as mechanical ventilators, are allocated during critical care shortages [1, 2]. During the COVID-19 pandemic, many policies were designed with limited community engagement due to the emergent nature of the rapidly worsening crisis and the difficulty of recruiting and convening advisors outside of the working group during active stay-at-home orders [3,4,5,6]. Having designed such a policy in 2020, the University of California (UC) [7] chartered the Understanding Community Considerations, Opinions, Values, Impacts, and Decisions (UC-COVID) study to rapidly seek public opinion on the draft UC SRAP [8]. We previously demonstrated moderately high (67% to 83% by domain) community agreement with SRAP tenets across domains of logistical concerns (how SRAP would be implemented), health factors (how a patient’s current and historic health status would affect allocation decisions), social factors (how factors unrelated to health, such as age, would be incorporated), and exceptions (situations where SRAP may be temporarily deferred or exempted) [9].

We also embedded a clinical trial within the UC-COVID study to test the impact of a video intervention on knowledge of and trust in SRAP, finding improved community-level SRAP understanding [10]. However, less is known about whether such interventions can enhance agreement with SRAP. This is of critical importance during emergencies such as the pandemic, because health policy decisions and the authorities that promulgate them can be met with distrust or disagreement, exacerbated by poor communication of the rationale underlying what can be viewed as a heavy-handed or top-down decision [11,12,13]. While knowledge about the policy was the principal outcome of our previous study, during the course of our analyses we formed a secondary research question: whether such an intervention could improve agreement and buy-in with such policies and, if so, among which groups we could effect the most improvement. Here, we present an additional post hoc analysis of the UC-COVID trial to evaluate the impact of an educational video on agreement with SRAP tenets and frameworks.

This analysis offers an empirical test of the efficacy of an explanatory video intervention for this use case, but it also has implications that extend beyond SRAP to other policy issues where improving key interested parties’ knowledge of, trust in, or agreement with a potentially controversial policy element is a goal. Such tools would be useful when key informants are not readily available for the timely collection of input in the design of a complex health service intervention. Additionally, they offer an opportunity to provide rapid, easily disseminable explanations to interested parties who would be affected by such an intervention. As such, the knowledge gained in this study applies not only to SRAPs in preparation for a public health emergency but also to public health practitioners and policymakers wherever an intervention, policy, or program may not be readily accepted without public education.

Methods

Eligibility and recruitment

As this analysis draws upon our earlier work, eligibility and recruitment have been previously described and are further detailed in Supplemental Table 2 [8]. Briefly, we enrolled adults aged 18 or older between May and September 2020 using internet-based snowball sampling in partnership with community patient and health care professional organizations. Additionally, social media sites, including Twitter (currently known as X), Facebook, LinkedIn, and Doximity, were used to recruit participants. Upon enrollment, participants provided informed consent and completed a baseline survey covering health care disruptions during the pandemic and their opinions and values surrounding scarce resource allocation policies, with the framing that such policies might be implemented if pandemic-related surges in hospital utilization reached a breaking point. Surveys were translated (International Contact, Berkeley, CA) and available in English, Spanish, simplified Chinese, Korean, Tagalog, and Vietnamese, the six most commonly spoken languages in California. Participants were also informed that they would be invited to subsequent surveys and that some would be randomized to receive an educational intervention between survey administrations. As such, participants who provided informed consent and enrolled in our baseline assessment were invited to participate in follow-up. Follow-up was collected between October and December 2020.

Randomization

California participants were randomized to watch a brief educational video explaining UC SRAP [7] tenets. Non-California respondents were allocated to the control arm. California resident respondents underwent stratified randomization at a 1:1 allocation using the native randomization algorithm in QualtricsXM (Qualtrics, Inc., Provo, UT), stratified by self-reported age group (< 35, 35–55, > 55), gender (female vs. all others), race (white vs. all others), ethnicity (Hispanic/Latin vs. non-Hispanic/Latin), educational attainment (less than bachelor’s degree vs. all others), and health care professional (HCP) employment status (yes vs. no). Study staff were blinded to allocation until the time of analysis; participants were not blinded to their randomization group, as they were informed during the consent process that they might be randomized to receive an intervention.
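For illustration only, the sketch below shows one way such a stratified 1:1 allocation could be implemented in Python; it is a simplified stand-in for, not a reproduction of, the QualtricsXM randomization algorithm, and the file and column names are hypothetical.

```python
# Illustrative only: a simple within-stratum 1:1 allocation, approximating the
# kind of stratified randomization that the study delegated to QualtricsXM.
# File and column names are hypothetical.
import numpy as np
import pandas as pd

rng = np.random.default_rng(2020)

def stratified_allocate(df, strata_cols):
    """Assign roughly half of each stratum to 'intervention' and half to 'control'."""
    arm = pd.Series("control", index=df.index)
    for _, stratum in df.groupby(strata_cols):
        shuffled = rng.permutation(stratum.index.to_numpy())
        arm.loc[shuffled[: len(shuffled) // 2]] = "intervention"
    return arm

# Example usage with the stratification factors described above
respondents = pd.read_csv("california_respondents.csv")   # hypothetical file
respondents["arm"] = stratified_allocate(
    respondents,
    ["age_group", "gender", "race", "ethnicity", "education_lt_bachelors", "hcp"],
)
```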

The rationale for this schema was that the UC SRAP would only affect California residents. Therefore, it would be problematic and even potentially unethical (by causing unnecessary emotional distress in considering a potentially moot policy) to randomize non-California residents to learn about a policy that would not necessarily apply to them. However, since we did not restrict the study and received responses from non-California participants, we included them as an additional control group for transparency, allowing us to consider cultural effects within California compared to other locations. Participants answered items regarding their values and preferences related to SRAP principles and implementation at baseline and follow-up, with those randomized to the intervention receiving the video immediately before the second assessment. Control participants did not receive the intervention and proceeded directly to the follow-up survey assessment. Power was calculated post hoc, as this is a secondary analysis. When comparing mean agreement between the California control and intervention groups, we calculated that our sample of 578, with an allocation ratio of 1.09:1 treatment to control, had 85% power to detect a 0.5% difference-in-differences with a standard deviation of 2%.
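As a rough check, the post hoc power figure quoted above can be approximated with a standard two-sample power calculation. The sketch below uses statsmodels with the parameters stated in the text (a 0.5% difference-in-differences, a 2% standard deviation, and 578 participants allocated 1.09:1); it is illustrative only, not the study’s own computation.

```python
# Approximate reproduction of the reported post hoc power, treating the
# difference-in-differences as a two-sample comparison of means.
from statsmodels.stats.power import TTestIndPower

delta_pp = 0.5                              # target difference-in-differences, percentage points
sd_pp = 2.0                                 # assumed standard deviation, percentage points
n_total, ratio = 578, 1.09                  # total N and treatment:control allocation ratio

n_control = round(n_total / (1 + ratio))    # ~277 control participants
effect_size = delta_pp / sd_pp              # Cohen's d = 0.25

power = TTestIndPower().power(
    effect_size=effect_size,
    nobs1=n_control,
    ratio=ratio,                            # nobs2 = nobs1 * ratio (treatment arm)
    alpha=0.05,
    alternative="two-sided",
)
print(f"Post hoc power ≈ {power:.2f}")      # ≈ 0.85, in line with the reported 85%
```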

Intervention design

The intervention was an animated 6-min video, which explained the ethical frameworks (e.g., saving the most lives possible), the logistics underpinning how SRAP would function and their rationale (e.g., blinding of patient identity from decision-makers to reduce bias and promote equity, temporary exemptions for health workers), and how health and sociodemographic factors would influence allocation priority [10]. The authors (RGB and LEW) drafted the video script with input from the UC Critical Care Bioethics Working Group and targeted a sixth-grade reading level. The animation was designed by a professional video production studio (WorldWise Production, Los Angeles, CA) with voiceover in English and subtitles available in Spanish, simplified Chinese, Korean, Tagalog, and Vietnamese. The video was housed on a private server and embedded directly into the follow-up survey to prevent instrumentation of the control group. Further details on its design are provided in Supplemental Table 2.

Endpoint & measurements

For this post hoc analysis, our endpoint was a change in participants’ agreement scores, which denoted alignment with UC SRAP as drafted. Survey instruments developed for this study have been previously published, including their psychometric properties and validation [8]. Agreement scores were defined as the arithmetic distance between the response for each item on a Likert scale and the point on the scale that matched the concept from UC SRAP. For example, for the tenet “Policies should try to save the most lives possible,” a response of 10 on the 10-point scale would denote 100% agreement, and a score of 1 would be 0% agreement.

Items that evaluated how patient factors would influence allocation were operationalized on a 9-point Likert scale, with 1 being “Should be much less likely” versus 9, “Should be much more likely to get life support,” and 5 as “Should not matter one way or the other.” The 9-point Likert scale was chosen to allow maximum variability in responses. Correspondingly, for items on prioritization factors where a factor would not influence resource allocation, a response of 5 would be 100% agreement, while 1 or 9 would each be 0% agreement. These scores were calculated by item and aggregated into four domain scales by taking the mean participant score per domain at each time point. Overall agreement was tabulated by taking the aggregate mean of the four domains at each time point.
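To make the scoring rule concrete, the short sketch below implements it as described above; the helper name and signature are illustrative rather than taken from the study’s code.

```python
def item_agreement(response, target, scale_min, scale_max):
    """Percent agreement for one item: 100% at the scale point that matches the
    UC SRAP position, scaled linearly down to 0% at the farthest scale point.
    Illustrative helper, not the study's code."""
    max_distance = max(target - scale_min, scale_max - target)
    return 100.0 * (1.0 - abs(response - target) / max_distance)

# "Policies should try to save the most lives possible" on the 10-point scale (target = 10)
print(item_agreement(10, target=10, scale_min=1, scale_max=10))  # 100.0
print(item_agreement(1,  target=10, scale_min=1, scale_max=10))  # 0.0

# A factor the policy says should not influence allocation, on the 9-point scale (target = 5)
print(item_agreement(5, target=5, scale_min=1, scale_max=9))     # 100.0
print(item_agreement(9, target=5, scale_min=1, scale_max=9))     # 0.0

# Domain scores are the mean of a participant's item scores within each domain at each
# time point, and overall agreement is the mean of the four domain scores.
```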

Missing data

There was a small amount of missing data per item used in this analysis, ranging from 2.8% to 5.9%. Based on prior work where we found no substantive differences in imputed versus complete case analyses [9, 10], we did not fit additional models using imputed data for this secondary analysis.

Statistical analyses

We conducted an intention-to-treat analysis, collating and pairing responses within participants. To determine whether the intervention changed agreement, we employed a difference-in-differences approach to compare the change in score from baseline to follow-up between randomization groups. This approach allowed the calculation of the average treatment effect (ATE) of the intervention while also accounting for secular changes related to different levels of media attention on the possibility of SRAP implementation during the intensifying COVID-19 crisis [14,15,16]. To determine the ATE, we employed a fractional probit regression with standard errors clustered at the participant level to model percent agreement. We then fit marginal estimates with 95% confidence intervals for the California intervention vs. control groups. Statistical significance was assessed by the Wald Z test, with Bonferroni-corrected 95% confidence intervals and p-values for multiple comparisons. To explore potential heterogeneity of the intervention effect by sociodemographic variables, we fit stratified models by self-reported race/ethnicity, age, education, health care professional employment status, and political affiliation in the same manner. All analyses were completed in Stata 18.0 (StataCorp, College Station, TX) with a two-tailed alpha of 0.05.
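Because the analysis itself was performed in Stata, the sketch below is offered only to show the structure of the model: an analogous difference-in-differences fractional probit with participant-clustered standard errors in Python (statsmodels). The file and variable names are hypothetical, and details such as the Bonferroni adjustment and the full marginal-estimation machinery are omitted.

```python
# A minimal analogue (not the authors' Stata code) of the difference-in-differences
# fractional probit described above. `agreement` is a proportion in [0, 1], with two
# rows per participant (baseline and follow-up).
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

df = pd.read_csv("uc_covid_agreement_long.csv")   # hypothetical long-format file

# Fractional probit: binomial family with a probit link on the fractional outcome;
# the treatment effect comes from the intervention x follow-up interaction.
model = smf.glm(
    "agreement ~ intervention * followup",
    data=df,
    family=sm.families.Binomial(link=sm.families.links.Probit()),  # links.probit in statsmodels < 0.14
)
# Cluster-robust standard errors at the participant level
result = model.fit(cov_type="cluster", cov_kwds={"groups": df["participant_id"]})
print(result.summary())

# Difference-in-differences of fitted probabilities, a simple analogue of the
# marginal estimates reported in the paper
grid = pd.DataFrame({"intervention": [0, 0, 1, 1], "followup": [0, 1, 0, 1]})
p = np.asarray(result.predict(grid))
ate = (p[3] - p[2]) - (p[1] - p[0])
print(f"Estimated ATE ≈ {100 * ate:.1f} percentage points")
```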

Results

A total of 1,971 adult participants provided informed consent and completed the baseline assessment between May and September 2020. Nine hundred thirty-nine participants completed follow-up assessments between September and December 2020. The time distributions of responses are detailed in Supplemental Figs. 1 and 2. Among these participants, 770 individuals had complete baseline and follow-up data and, therefore, were eligible for this analysis (Fig. 1). Participant demographics are shown in Table 1. There were no significant differences in demographics between the randomization groups. Similar to our previously published analyses, we noted high overall baseline agreement across each domain [9]. When we evaluated the effect of viewing the intervention video, we found significant improvements overall and across the domains of policy implementation logistics, health factors considered for allocation, and how exemptions to such policies would apply, as shown in Fig. 2.

Fig. 1 CONSORT diagram demonstrating participant flow and randomization

Table 1 Participant characteristics by group allocation
Fig. 2 Marginal estimated change in agreement of respondents with UC SRAP tenets by randomization group, overall and by domain. *P < 0.05, **P < 0.01, ***P < 0.001, N.S. = not significant

At baseline, the California control group exhibited 78.7% (95% CI 77.5% to 79.8%) overall agreement, while the intervention group reported 78.1% (76.9% to 79.3%, P = 0.36). After the intervention, overall agreement improved by a statistically significantly greater margin of 5.2% (3.8% to 6.6%, P < 0.001) in the intervention group relative to the control group. Non-California control participants reported consistently lower agreement than California control participants: 3.4% lower at baseline (95% CI 1.5% to 5.3%, P < 0.001) and 2.7% lower at follow-up (0.8% to 4.5%, P = 0.005).

Agreement with policy logistics improved by 6.5% more in the intervention group compared to controls (3.9% to 9.1%, P < 0.001), as did agreement with health factors, with an average treatment effect of 5.3% (3.0% to 7.6%, P < 0.001). Analyses determined a 7.3% (4.9% to 9.5%, P < 0.001) greater improvement in agreement with exemptions to SRAP. However, no significant difference was noted for social factors after the intervention, for which agreement was already high (> 90%) at baseline in both the intervention and control groups. Agreement per individual item is found in Supplemental Table 1.

In stratified analyses (Table 2), those who self-identified as Asian/Pacific Islander, Hispanic/Latin, or white significantly improved in agreement after the intervention, while those who reported Black, other, or multiracial identity did not. The largest effect was seen in Hispanic/Latin respondents (Supplemental Fig. 3), whose agreement increased by 9.0% (2.9% to 15.1%, P = 0.004). There were otherwise no substantive differences in the intervention effects by levels of educational attainment (Supplemental Fig. 4), health care professional employment status (Supplemental Fig. 5), or education level (Supplemental Fig. 6). Those reporting conservative political ideology reported a non-significant decrease in agreement of −0.2% (−5.9% to 5.5%, P = 0.94) post-intervention. Those who reported moderate or liberal political ideology both reported similarly significant increases in agreement of 5.8% (2.5% to 9.3%, P = 0.001) and 5.7% (4.0% to 7.4%, P < 0.001), respectively (Supplemental Fig. 7).

Table 2 Average treatment effect of intervention on change in agreement with UC SRAP, stratified by demographics

Discussion

In this post hoc analysis of the UC-COVID trial, we found that our educational intervention, which previously improved knowledge of the UC SRAP, also improved community-level agreement with its tenets and principles. Notably, we observed improvements both overall and in most stratified subgroup analyses.

Our findings highlight the potential of concise educational interventions to influence public understanding and acceptance of SRAP, a finding extendable to other complex or controversial health care interventions, programs, and policies. These results have important implications for communication with the public, even in the post-pandemic period [17]. While local considerations should be incorporated whenever feasible, the tenets of the SRAP are rooted in widely accepted medical ethics, which emphasize principles such as fairness, autonomy, beneficence, and justice [2, 3]. As such, not all allocation considerations are negotiable, particularly when they conflict with these foundational principles. Thus, strategies to improve community agreement with these principles are crucial to ensure their successful implementation and adherence. Prior research has explored strategies to foster community agreement, highlighting the importance of transparency, apparent procedural fairness, and adherence to ethical and moral frameworks [18,19,20,21]. Consensus building has also been noted to be critical for policy development [19, 22, 23]. However, crises, such as intensive care unit overload during a public health emergency, may limit the ability to perform rigorous community-based participatory research in real time. This, coupled with the complexity of engaging in partnered research amidst a viral respiratory pandemic with the need for physical distancing, was the rationale for this study. While the use of asynchronous learning or advertising campaigns to improve buy-in to public health campaigns has previously shown mixed effectiveness [24, 25], our findings suggest that combining these strategies with transparent communication has the potential to address barriers to community engagement during times of crisis.

Of note, our analysis identified important demographic differences. For example, we found numerically, although not statistically significantly, higher levels of agreement with the SRAP among Californians compared with those residing elsewhere, even at baseline. Though the 95% confidence intervals for this comparison overlapped, this suggests potential unmeasured cultural differences across states and regions that would require consideration if these interventions were deployed more broadly [26, 27]. We also found heterogeneity in how effective our intervention was at improving agreement across various demographic groups. This highlights the need for additional research to tailor and target messaging to maximize buy-in and ameliorate distrust [28, 29]. For example, while our intervention was available with subtitles in multiple languages, we did not tailor our video to reflect potential cultural differences in key constituent groups. Our findings highlight an area for ongoing research, both in how best to customize messaging for maximal effect and how to balance this with scalability to rapidly deploy an intervention such as this across multiple settings and contexts [29].

Another ongoing question in promoting buy-in of affected parties to policy decisions is the ability to overcome potential entrenchment. Fixed beliefs related to health decisions and public policy can be challenging to move even in the most ideal circumstances. The pandemic has underscored a climate of extreme polarization, with multiple competing interests clashing over how best to balance concerns about mitigating the risk of infection, economic stability, and social cohesion, among other factors. Though prior research has demonstrated the feasibility of many methods for engendering trust in and agreement with health policy decisions [22], these methods are often labor- and time-intensive [30]. Our findings suggest that though an asynchronous video learning module cannot wholly overcome a perceived lack of legitimacy or credibility per se, it can assuage some concerns about a lack of transparency or the power imbalance associated with a “top-down” decision. While our study represents the first published intervention of its type in this policy area, replication studies, including multi-site trials or meta-analyses, would help validate the long-term effectiveness of such interventions.

Further research should continue to explore various communication strategies for disseminating the tenets of an SRAP and rigorously study and disseminate the results of impact analyses of how SRAPs might affect interested parties [31]. Though there are logistical and ethical challenges to empirically studying SRAPs, simulation methods are promising and, for example, can be used to study the impact of the significant state-level variability in how SRAPs are operationalized [6]. There are, however, potential pitfalls and unintended consequences of using simulation methods, including exacerbating health disparities [32,33,34,35]. Nevertheless, the results of any empirical evaluation can be used by policymakers to refine SRAPs further and determine how additional data might influence public agreement.

Ultimately, these findings have practical implications for policymakers and healthcare leaders who are tasked with developing and implementing SRAPs during periods of resource scarcity, such as pandemics or public health emergencies. Transparent and accessible communication strategies, such as the educational video used in our intervention, can potentially improve public agreement with policy tenets, thus reducing conflicts and improving policy outcomes [1, 20].

Limitations

Our sampling strategy yielded a sample that was less diverse (i.e., predominantly female and White) and more highly educated than the overall population of California [8], limiting generalizability, particularly among historically marginalized groups that may have different perspectives on resource allocation policies. However, such sampling strategies are commonly accepted in social science research [36, 37]. Agreement may have been bolstered by social desirability and instrumentation biases, although the use of control groups should have reduced these to the degree possible. While we did not collect data on reasons for survey non-completion or loss to follow-up, attrition bias should be considered a potential source of bias, as in any longitudinal survey-based study.

Conclusion

Brief educational interventions provide a robust, transparent tool for improving knowledge about complex health policies and agreement with their key ethical principles. We show that even for a policy as ethically complex and politically volatile as determining who should receive the last life-saving critical care resources in a shortage, an educational intervention to improve agreement is feasible, acceptable, and effective. This is particularly salient when consensus-building approaches may not be feasible during the promulgation of new policy due to extenuating circumstances. Further research into optimal messaging and dissemination across various constituent groups is needed to build upon this research and improve effectiveness.

Data availability

Data are available from the authors upon reasonable request and an executed data use agreement with the University of California.

Abbreviations

RCT:

Randomized Controlled Trial

SRAP:

Scarce Resource Allocation Policy

UC:

University of California

UC-COVID:

Understanding Community Considerations, Opinions, Values, Impacts, and Decisions Study

References

  1. Daugherty Biddison EL, Faden R, Gwon HS, Mareiniss DP, Regenberg AC, Schoch-Spana M, Schwartz J, Toner ES. Too many patients…a framework to guide statewide allocation of scarce mechanical ventilation during disasters. Chest. 2019;155(4):848–54.
  2. White DB, Katz MH, Luce JM, Lo B. Who should receive life support during a public health emergency? Using ethical principles to improve allocation decisions. Ann Intern Med. 2009;150(2):132–8.
  3. Emanuel EJ, Persad G, Upshur R, Thome B, Parker M, Glickman A, Zhang C, Boyle C, Smith M, Phillips JP. Fair allocation of scarce medical resources in the time of Covid-19. N Engl J Med. 2020;382(21):2049–55.
  4. Cleveland Manchanda EC, Sanky C, Appel JM. Crisis standards of care in the USA: a systematic review and implications for equity amidst COVID-19. J Racial Ethn Health Disparities. 2021;8(4):824–36.
  5. Hempel S, Burke R, Hochman M, Thompson G, Brothers A, Shin J, Motala A, Larkin J, Bolshakova M, Fu N, et al. Allocation of scarce resources in a pandemic: rapid systematic review update of strategies for policymakers. J Clin Epidemiol. 2021;139:255–63.
  6. Piscitello GM, Kapania EM, Miller WD, Rojas JC, Siegler M, Parker WF. Variation in ventilator allocation guidelines by US state during the coronavirus disease 2019 pandemic: a systematic review. JAMA Netw Open. 2020;3(6):e2012606.
  7. University of California Critical Care Bioethics Working Group. Allocation of Scarce Critical Resources under Crisis Standards of Care. 2nd ed. Oakland: University of California; 2020.
  8. Wisk LE, Buhr RG. Rapid deployment of a community engagement study and educational trial via social media: implementation of the UC-COVID study. Trials. 2021;22(1):513.
  9. Buhr RG, Huynh A, Lee C, Nair VP, Romero R, Wisk LE. Health professional vs layperson values and preferences on scarce resource allocation. JAMA Netw Open. 2024;7(3):e241958.
  10. Buhr RG, Romero R, Wisk LE. Promoting knowledge and trust surrounding scarce resource allocation policy: a randomized controlled educational intervention trial. JAMA Health Forum. 2024;5(10):e243509.
  11. Albarracin D, Oyserman D, Schwarz N. Health communication and behavioral change during the COVID-19 pandemic. Perspect Psychol Sci. 2024;19(4):612–23.
  12. Basch CH, Basch CE, Hillyer GC, Meleo-Erwin ZC. Social media, public health, and community mitigation of COVID-19: challenges, risks, and benefits. J Med Internet Res. 2022;24(4):e36804.
  13. Ubel PA, Scherr KA, Fagerlin A. Empowerment failure: how shortcomings in physician communication unwittingly undermine patient autonomy. Am J Bioeth. 2017;17(11):31–9.
  14. Knowles H. Hospitals overwhelmed by COVID are turning to ‘crisis standards of care.’ What does that mean? Washington Post; 2021. Retrieved 21 October 2021, from https://www.washingtonpost.com/health/2021/09/22/crisis-standards-of-care/.
  15. Baker M. In Alaska’s Covid Crisis, Doctors Must Decide Who Lives and Who Dies. The New York Times; 2021. Retrieved 8 September 2023, from https://www.nytimes.com/2021/10/03/us/coronavirus-crisis-alaska.html.
  16. Golstein J, Rothfeld M, Weiser B. Doctors Facing Brutal Choices As Supplies Lag. New York: The New York Times; 2020. p. 1.
  17. Riggan KA, Nguyen NV, Ennis JS, DeBruin DA, Sharp RR, Tilburt JC, Wolf SM, DeMartino ES. Behind the scenes: facilitators and barriers to developing state scarce resource allocation plans for the COVID-19 pandemic. Chest. 2024;166(3):561–71. https://doi.org/10.1016/j.chest.2024.04.006.
  18. Emanuel EJ, Persad G. The shared ethical framework to allocate scarce medical resources: a lesson from COVID-19. Lancet. 2023;401(10391):1892–902.
  19. Biddison ELD, Gwon HS, Schoch-Spana M, Regenberg AC, Juliano C, Faden RR, Toner ES. Scarce resource allocation during disasters: a mixed-method community engagement study. Chest. 2018;153(1):187–95.
  20. Lansing AE, Romero NJ, Siantz E, Silva V, Center K, Casteel D, Gilmer T. Building trust: leadership reflections on community empowerment and engagement in a large urban initiative. BMC Public Health. 2023;23(1):1252.
  21. Mohammadi MMD, Sheikhasadi H, Mahani SA, Taheri A, Sheikhbardsiri H, Abdi K. The effect of bio ethical principles education on ethical attitude of prehospital paramedic personnel. J Educ Health Promot. 2021;10:289.
  22. Boivin A, Lehoux P, Burgers J, Grol R. What are the key ingredients for effective public involvement in health care improvement and policy decisions? A randomized trial process evaluation. Milbank Q. 2014;92(2):319–50.
  23. Sarkies MN, Bowles K-A, Skinner EH, Haas R, Lane H, Haines TP. The effectiveness of research implementation strategies for promoting evidence-informed policy and management decisions in healthcare: a systematic review. Implement Sci. 2017;12(1):132.
  24. Dunlap AF, Ciari A, Islam N, Thorpe LE, Khan MR, Huang TTK. Using digital storytelling and social media to combat COVID-19 vaccine hesitancy: a public service social marketing campaign. J Prev (2022). 2024;45(6):947–55. https://doi.org/10.1007/s10935-024-00799-7.
  25. Kite J, Chan L, MacKay K, Corbett L, Reyes-Marcelino G, Nguyen B, Bellew W, Freeman B. A model of social media effects in public health communication campaigns: systematic review. J Med Internet Res. 2023;25:e46345.
  26. Evans TW, Nava S, Mata GV, Guidet B, Estenssoro E, Fowler R, Scheunemann LP, White D, Manthous CA. Critical care rationing: international comparisons. Chest. 2011;140(6):1618–24.
  27. Jecker NS, Berg AO. Allocating medical resources in rural America: alternative perceptions of justice. Soc Sci Med (1982). 1992;34(5):467–74.
  28. Cheung ATM, Parent B. Mistrust and inconsistency during COVID-19: considerations for resource allocation guidelines that prioritise healthcare workers. J Med Ethics. 2021;47(2):73–7.
  29. Nan X, Iles IA, Yang B, Ma Z. Public health messaging during the COVID-19 pandemic and beyond: lessons from communication science. Health Commun. 2022;37(1):1–19.
  30. Keeney RL, von Winterfeldt D, Eppel T. Eliciting public values for complex policy decisions. Manage Sci. 1990;36(9):1011–30.
  31. National Research Council, Division of Behavioral and Social Sciences and Education, Commission on Behavioral and Social Sciences and Education, Committee on National Statistics, Panel to Evaluate Microsimulation Models for Social Welfare Programs. Improving Information for Social Policy Decisions – The Uses of Microsimulation Modeling: Volume I, Review and Recommendations, vol. 1. National Academies Press; 1991.
  32. Ashana DC, Anesi GL, Liu VX, Escobar GJ, Chesley C, Eneanya ND, Weissman GE, Miller WD, Harhay MO, Halpern SD. Equitably allocating resources during crises: racial differences in mortality prediction models. Am J Respir Crit Care Med. 2021;204(2):178–86.
  33. Miller WD, Peek ME, Parker WF. Scarce resource allocation scores threaten to exacerbate racial disparities in health care. Chest. 2020;158(4):1332–4.
  34. Gershengorn HB, Holt GE, Rezk A, Delgado S, Shah N, Arora A, Colucci LB, Mora B, Iyengar RS, Lopez A, et al. Assessment of disparities associated with a crisis standards of care resource allocation algorithm for patients in 2 US hospitals during the COVID-19 pandemic. JAMA Netw Open. 2021;4(3):e214149.
  35. Gershengorn HB, Patel S, Shukla B, Warde PR, Soorus SM, Holt GE, Kett DH, Parekh DJ, Ferreira T. Predictive value of sequential organ failure assessment score across patients with and without COVID-19 infection. Ann Am Thorac Soc. 2022;19(5):790–8.
  36. Wisk LE, Nelson EB, Magane KM, Weitzman ER. Clinical trial recruitment and retention of college students with Type 1 diabetes via social media: an implementation case study. J Diab Sci Technol. 2019;13(3):445–56.
  37. Wisk LE, Levy S, Weitzman ER. Parental views on state cannabis laws and marijuana use for their medically vulnerable children. Drug Alcohol Depend. 2019;199:59–67.


Acknowledgements

The authors acknowledge the invaluable contributions of the University of California Critical Care Bioethics Working Group and the University of California Office of the President. The authors also thank the UC-COVID Study’s professional society and community partner organizations for helping to disseminate survey invitations.

Funding

This project was supported by a contract from the University of California Office of the President (62165-RB) and the UCLA Clinical and Translational Science Institute (NIH/NCATS UL1TR001881). Dr. Buhr was additionally supported by a career development award from the UCLA CTSI (NIH/NCATS KL2 TR001882) and a loan repayment program award (NIH/NHLBI L30HL130125). Dr. Huang is additionally supported by a training grant from the National Institutes of Health (NIH/NHLBI T32072752). Dr. Wisk was additionally supported by a career development award from the National Institutes of Health (NIH/NIDDK K01DK116932). The funding bodies had no role in the determination to publish nor the presentation of the findings in this manuscript.

Author information

Authors and Affiliations

Authors

Contributions

RGB, RR, and LEW collected data. RGB and LEW oversaw the project and obtained funding. RGB and LEW had full access to all the data in the study and take responsibility for the integrity of the data and the accuracy of the data analysis. All authors drafted the manuscript and approved the final draft for publication.

Corresponding author

Correspondence to Russell G. Buhr.

Ethics declarations

Ethics approvals and consent to participate

The UCLA Institutional Review Board (20–000683) reviewed and approved this study in accordance with the provisions in the Declaration of Helsinki. All participants provided electronic informed consent.

Consent for publication

Not applicable.

Competing interests

Dr. Buhr reports personal consulting fees for DynaMed/American College of Physicians and Optum and serving on an advisory board for Chiesi, unrelated to this work and is an employee of the Veterans Health Administration. The views and positions in this manuscript do not necessarily reflect those of the Department of Veterans Affairs. Drs. Huang, Wisk, and Ms. Romero have nothing to report.

Additional information

Publisher’s Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Supplementary Information

Rights and permissions

Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article’s Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article’s Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/.

Reprints and permissions

About this article


Cite this article

Buhr, R.G., Huang, C.X., Romero, R. et al. Bolstering agreement with scarce resource allocation policy using education: a post hoc analysis of a randomized controlled trial. BMC Health Serv Res 25, 540 (2025). https://doi.org/10.1186/s12913-025-12712-x


Keywords