From an idea to the marketplace: identifying and addressing ethical and regulatory considerations across the digital health product-development lifecycle

Abstract

Widespread adoption of digital health tools has the potential to improve health and health care for individuals and their communities, but realizing this potential requires anticipating and addressing numerous ethical and regulatory challenges. Here, we help digital health tool developers identify ethical and regulatory considerations – and opportunities to advance desirable outcomes – by organizing them within a general product-development lifecycle that spans generation of ideas to commercialization of a product.

Introduction

“Digital health” is a broad term that encompasses a heterogeneous set of scientific concepts and technologies. Indeed, it can be difficult to define because it includes tools intended for use as a medical product, in a medical product, as companion diagnostics, or as an adjunct to another medical product. Digital health embraces mobile health apps, electronic medical records, telemedicine, wearable devices, and more. Further, digital health is “an interdisciplinary field, drawing together stakeholders with expertise in engineering, manufacturing, clinical science, data science, biostatistics, regulatory science, ethics, patient advocacy, and healthcare policy, to name a few” [1].

This heterogeneity should not, however, obscure two simple points. First, digital health tools have the potential to contribute to better health and health care for individuals and communities. Second, to realize this potential, developers in the digital health space must acknowledge that each stage of the product-development lifecycle requires them to make decisions or to take actions that – implicitly or explicitly – have ethical or regulatory dimensions. These points of decision and action are rightly understood as opportunities to promote desirable outcomes.

Our primary goal in this review article is to identify and characterize a core set of ethical and regulatory considerations that cut across many types of digital health tools and to highlight points within the product-development lifecycle at which they arise or should be anticipated and addressed. For an overview, see Fig. 1. Because digital health tools are heterogenous, the ethical and regulatory considerations highlighted herein are not and cannot be exhaustive, and not all considerations will be relevant to all tools. Therefore, a secondary goal of this article is to provide real world examples that inspire developers to reflect deeply on their particular products and to consider whether these or other ethical or regulatory issues demand their attention.

Fig. 1

This general product-development lifecycle is meant to be broadly applicable to digital health products, including both software and devices. At each step of the product-development lifecycle, developers must make decisions or take actions that – implicitly or explicitly – have ethical or regulatory dimensions. If they are made or taken mindfully, these decisions and actions are opportunities for advancing desirable outcomes. If they are not approached mindfully, developers risk engaging in ethically problematic behavior or running afoul of regulators.

Generating ideas

The first step in developing a digital health tool is the generation of an idea. Ideas may be the result of, for example, identifying individuals’ unmet needs, understanding how individuals use currently available products, or leveraging technological innovations. At this step of product development, as in each step that follows, developers should anticipate and address ethical and regulatory considerations.

Identify user needs

Design a digital health tool that people need. This advice makes good business sense, but it is also ethically sound advice. To translate this truism into action, developers should understand user needs and leverage this understanding throughout product development. (Herein, the term “developer” is used to describe both individuals and entities involved in the creation and dissemination of digital health tools.)

Because digital health is by nature interdisciplinary, there is value in having a diverse development team that can offer a range of disciplinary perspectives. Additionally, it may be helpful to consult with a digital health tool’s likely users. They might, depending on where and how the tool will be used, be clinicians or community health workers. They might be patients or informal caregivers. They might be healthy individuals or members of still another group. And within any of these broader groups, as the examples in this section and throughout the article illustrate, there can be important axes of diversity – including but not limited to age, gender, race and ethnicity, socioeconomic status, disability status, or comfort and familiarity with digital health tools. Fostering diversity in viewpoints – on your team and amongst potential users – can enhance innovation and increase applicability to a wide user base.

All ideas have limitations, and consultation may help identify them; in some cases, these limitations are ethical in nature because they implicate individuals’ values or sense of what is right or wrong. Consider, for example, products that allow a caregiver to passively surveil an older adult, remotely monitoring their location and activities. The purported benefits of such products include promoting older adults’ safety and independence. Yet, many were developed without older adults’ input or consideration of their values, including the importance they place on personal privacy. This has likely limited the efficacy of these surveillance products and negatively affected their adoption [2]. Underscoring the importance of talking to many stakeholders, subsequent research has revealed that older adults view remote monitoring less favorably than their adult children do because the two groups weigh the benefits and risks differently [3].

Consultation with diverse stakeholders can serve other ethical ends, such as suggesting ways to promote justice and fairness by enhancing access to health care or reducing health disparities. For example, persons with disabilities might offer insights that help developers see which features would make their digital health tools more accessible. Deaf users might benefit from having visual or vibrotactile alarms in addition to auditory alarms [4], whereas visually impaired users might need high-contrast software or audio recordings [5]. Similarly, transgender individuals could speak to the effects of discrimination in the health care system and help developers think about ways to improve access to providers with expertise in transgender medicine [6].

Designing and prototyping

In this stage of the product-development lifecycle, promising ideas generated in the prior stage are refined and turned into prototypes. The goal of this stage is to design and build a prototype that embodies the key attributes of the animating idea, performs safely under normal use conditions, and can be produced within budget. Testing prototypes allows developers to respond to new information and, as needed, to explore alternatives. As we will show, ethical considerations should be front of mind.

Assess and address bias

Developers should be mindful of the potential for bias. Failure to assess and address bias problematically undermines fairness by creating pockets of disadvantage, often encompassing vulnerable and historically marginalized groups. Diversity of viewpoints is one means of combatting this. A team that is professionally or personally homogenous may lack awareness or understanding of biases in medicine and in society more broadly that could adversely affect product development [7, 8]. As the following three examples illustrate, bias can manifest in many different ways, and responses must be tailored accordingly.

First, devices that utilize photoplethysmographic (PPG) green light signaling, including pulse oximeters and heart rate monitors, have been shown to be less accurate for individuals with darker skin tones, likely because darker skin contains more melanin and therefore absorbs more green light than lighter skin [9, 10]. One solution is for developers to be transparent about when their devices may be less accurate, and some companies recommend only using their devices for individuals with light skin tones [10]. A preferable strategy, however, is to design digital health tools that function well for all users and advance health equity [11].

Second, stigma (a negative set of ideas or beliefs about a group) can infect design, undermining a digital health tool’s efficacy. One effect of stigma is stereotype threat, which occurs when an individual in a particular social group is concerned about confirming negative ideas or beliefs that others hold about their group; this concern can lead the individual to underperform within the threatened domain. For instance, the intense stigmatization of overweight and obese individuals often leads outsiders to think they are lazy, weak willed, or self-indulgent; individuals who are aware (i) that others see them as overweight and (ii) of negative weight-related stereotypes may in turn feel less capable [12]. Consistent with stereotype threat, one study found that exercise-based videogames were more effective for overweight children when their avatars had a normal body size than when the avatars were overweight [13]. When designers are aware of stereotype threat, they can assess how their own biases or others' biases might problematically shape design.

Of course, stigma is hardly limited to obesity. A study of digital depression screening found that such tests can be subject to stereotype threat, which can lead to statistically significant changes in scores for women and non-binary participants [14]. Because these changes could adversely affect diagnosis and management, researchers recommended that such tests be thoughtfully designed to avoid biasing scores.

Third, there is growing concern that, due to systemic biases reflected in the datasets used to train them, algorithms may reproduce or amplify racial, gender, economic, and other disparities [7]. Adding to the unfairness, people who identify with more than one underserved group often experience compounded biases. For instance, academic researchers looking at AI-based chest X-ray prediction models found that patients belonging to two under-served subgroups (e.g., both Hispanic and female) were more likely to be misdiagnosed than those belonging to one or no under-served groups [15]. This is an example of intersectionality – that is, how the interconnected nature of social categorizations creates overlapping and interdependent systems of disadvantage.

Independent evaluation of algorithms can document disparities in outcomes. But independent evaluation requires a developer to negotiate an algorithm’s proprietary status and establish trust with the evaluator. Without this, the evaluator will struggle to ascertain how and why these disparities occur and, by extension, how to address them [16]. While it is possible to employ post-hoc technical solutions to promote fairness, these are not necessarily straightforward, nor are they unambiguously good [15]. Thus, it is preferable for designers to address this issue proactively.
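
For illustration, the sketch below shows one simple way a team might audit a model’s validation results for subgroup and intersectional disparities in missed diagnoses. It is a minimal Python example assuming hypothetical column names (y_true, y_pred, sex, ethnicity) and fabricated data; a real audit would use the developer’s own validation set, metrics, and demographic categories.

```python
# A minimal sketch of a subgroup bias audit; column names and data are
# hypothetical, and this is not a substitute for a full fairness evaluation.
import pandas as pd

def underdiagnosis_rate(results: pd.DataFrame, group_cols: list) -> pd.Series:
    """False-negative rate (share of true cases the model missed) per subgroup."""
    positives = results[results["y_true"] == 1]        # people who truly have the condition
    missed = positives["y_pred"] == 0                   # cases the model failed to flag
    return missed.groupby([positives[c] for c in group_cols]).mean()

# Fabricated, illustrative validation results.
results = pd.DataFrame({
    "y_true":    [1, 1, 1, 1, 0, 1, 1, 0],
    "y_pred":    [1, 0, 1, 0, 0, 1, 0, 1],
    "sex":       ["F", "F", "M", "F", "M", "M", "F", "F"],
    "ethnicity": ["Hispanic", "Hispanic", "White", "White",
                  "Hispanic", "White", "Hispanic", "White"],
})

print(underdiagnosis_rate(results, ["sex"]))               # single-axis audit
print(underdiagnosis_rate(results, ["sex", "ethnicity"]))  # intersectional audit
```

Comparing the single-axis and intersectional groupings can surface the kind of compounded disadvantage described above.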

Consider risk–benefit tradeoffs

An important aspect of product development is risk–benefit analysis, which is also a requirement of beneficence, the ethical principle of making efforts to secure individuals’ wellbeing [17]. As part of this analysis, a developer identifies the potential risks and benefits of using their digital health tool and determines whether the benefits outweigh the risks. In the Designing and Prototyping phase, it may be possible to maximize benefits, minimize risks, or both.

As benefits will depend heavily on the particulars of the tool, we will not linger on them here. The nature of risks, including their probability and magnitude, will also depend on the precise nature of the digital health device. Still, it is worth noting that risks associated with a digital health tool can be multifaceted—for example, physical, emotional, economic, or social.

Further, it is important not just to consider obvious risks, like skin irritation due to prolonged contact with an adhesive, but also to consider downstream or distal risks like GPS-data enabled stalking or other intimate partner violence [18,19,20]. The following two examples highlight salient, though perhaps less obvious, risks posed by digital health tools.

In 2022, period-tracking apps were in the spotlight after the U.S. Supreme Court held in Dobbs v. Jackson Women’s Health Organization that the Constitution does not confer a right to abortion. There was substantial worry that data from these apps – which help people track their menstrual cycles and sometimes offer predictive information, for example, about windows of fertility – could be used to determine when a pregnancy had been aborted, thereby exposing users to criminal liability [21].

Risks may also arise from how the digital health tool is used (or misused) by others. Developers who assume their product will always be used in the manner intended will likely be surprised. Even in the face of warnings that device safety was not well established, early prototypes of simple brain stimulation devices led to the rise of a do-it-yourself community [22]. Thus, developers should try to anticipate from the beginning how their products may be used in unintended ways. (As an aside, when users innovate without the consent or even the knowledge of developers, they can become a valuable source of input, e.g., on how to improve products or appeal to a wider user base.)

A clear understanding of risks will inform selection of appropriate risk-mitigation strategies. Reducing risks to users can reduce developers’ exposure to liability (discussed further below), but it is also the right thing to do from an ethical perspective. Obviously, many of the risks associated with digital health tools are privacy risks. Because privacy protections are of particular importance in the digital health space, these are discussed at greater length next.

Protect user privacy

Digital health tools can sense, process, and transmit data about their users. These data can take many forms, including but not limited to text, videos, audio, or pictures inputted by the user or collected by a digital health product, as well as any associated metadata, such as the date and location of a photograph. There is a general sense that the privacy of sensitive information about an individual’s body or health ought to be protected. Indeed, this sentiment animates the laws that govern how health care providers handle individuals’ medical records and personal health information. Many people expect that their privacy will be similarly protected when using digital health tools, but this is not always the case; whether it is depends on context and jurisdiction.

In the United States (US), a substantial, even ironic, disparity exists in the protections afforded to personal health data depending on whether it is collected in clinical or research contexts or outside of these contexts. The former receives a high level of protection, while the latter receives a low level of protection, even if the data are substantially the same [23]. Many digital health tools for use outside the clinical or research context are developed by private firms. These developers are not subject to the same regulatory oversight as health care providers; rather, they are subject to a patchwork of laws and regulations governing commerce. As a result, they generally do not offer users the level of privacy protections that individuals expect for their health data [24,25,26].

By contrast, European law provides a more comprehensive framework for safeguarding users’ personal information. The European General Data Protection Regulation (GDPR) mandates that organizations disclose the specific types of data being gathered, obtain users’ explicit consent for data collection, and grant users the right to access information about collected data as well as request the deletion of their data [27]. Because the GDPR applies to all organizations operating within the European Union (EU) or collecting data about individuals within the EU, developers operating outside the EU should become familiar with its provisions if they want to market their products to EU residents.

At minimum, developers must be aware of and comply with any relevant privacy laws. They should also be aware, however, that their ethical obligations to protect privacy and confidentiality will – due to considerations of beneficence – often exceed the obligations imposed on them by law.

It is generally insufficient from an ethical perspective to say that if users are unhappy with the level of privacy protections offered, they can simply opt out [28]. Given society’s increasing, seemingly inescapable reliance on digital health technologies, opting out may not be a truly viable option. During COVID lockdowns, for instance, some governments and companies mandated use of COVID-tracking apps [29]. Similarly, patients will rarely have a say in technologies used by their health care provider (and may not even have a choice of health care provider).

Privacy protections can take various forms. In some cases, a developer might consider low-tech physical safeguards like a slide-cover for a camera built into a digital health tool. More often, developers will need to be aware of a long and evolving list of cyber-security threats [30]. Means of addressing such threats could include limiting what data about users is stored and processed (or making it possible for users to do this via privacy settings) or incorporating data security measures such as encryption or authentication to prevent unauthorized access to data.
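
As a small illustration of one such measure, the following sketch encrypts a record before it is stored, using the open-source Python cryptography package. The record contents are invented, and secure key management, which this sketch glosses over, matters as much as the encryption itself.

```python
# A minimal sketch of encrypting user data at rest with the third-party
# `cryptography` package (pip install cryptography). The record is invented.
from cryptography.fernet import Fernet

key = Fernet.generate_key()          # in practice, load the key from a secrets manager
cipher = Fernet(key)

record = b'{"heart_rate": 72, "timestamp": "2024-05-01T08:30:00Z"}'
token = cipher.encrypt(record)       # store only the ciphertext
restored = cipher.decrypt(token)     # decryption requires the key
assert restored == record
```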

Promises of anonymization or deidentification may be insufficient to protect user privacy, as studies have shown that, even when datasets are anonymized, users can often be reidentified [31, 32].
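
The sketch below illustrates why. It checks how many records in a supposedly deidentified dataset are unique with respect to a few quasi-identifiers (a basic k-anonymity check); the column names and values are hypothetical. Unique combinations can often be linked back to individuals using outside data sources.

```python
# A minimal sketch of a re-identification risk check on a "deidentified"
# dataset; quasi-identifier columns and values are hypothetical.
import pandas as pd

deidentified = pd.DataFrame({
    "zip3":       ["191", "191", "604", "604", "191"],
    "birth_year": [1954, 1954, 1987, 1987, 1990],
    "sex":        ["F", "F", "M", "F", "M"],
    "step_count": [4200, 5100, 9800, 7600, 11200],
})

quasi_identifiers = ["zip3", "birth_year", "sex"]
group_sizes = deidentified.groupby(quasi_identifiers).size()

k = int(group_sizes.min())                    # dataset satisfies k-anonymity for this k
unique_rows = int((group_sizes == 1).sum())   # combinations matching exactly one record
print(f"k = {k}; {unique_rows} quasi-identifier combination(s) are unique")
```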

Additional points are worth considering. If a digital health tool allows users to grant others access to their data, there should be a straightforward process for revoking such access. Individuals using digital pills for the remote monitoring of medication intake can, for instance, choose to share their information with others, including family members or friends [33]. While it’s easy to imagine how a family member might support medication adherence (e.g., by checking an app to make sure a digital pill was taken and offering a reminder if not), it is also easy to imagine relationships changing (e.g., by divorce) in ways that would make revoking access to information desirable.
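
A minimal sketch of what a straightforward revocation mechanism could look like appears below. The in-memory model and names are hypothetical; a real product would persist grants and enforce them at every point where data are accessed.

```python
# A hypothetical, in-memory model of revocable data-sharing grants.
from dataclasses import dataclass, field

@dataclass
class SharingGrants:
    """Tracks which other accounts a user has authorized to view their data."""
    authorized: set = field(default_factory=set)

    def grant(self, viewer_id: str) -> None:
        self.authorized.add(viewer_id)

    def revoke(self, viewer_id: str) -> None:
        # Revocation should be as easy as granting and take effect immediately.
        self.authorized.discard(viewer_id)

    def can_view(self, viewer_id: str) -> bool:
        return viewer_id in self.authorized

grants = SharingGrants()
grants.grant("family-member-01")
assert grants.can_view("family-member-01")
grants.revoke("family-member-01")        # e.g., after a relationship changes
assert not grants.can_view("family-member-01")
```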

Furthermore, developers should be aware of the potential for aggregated data to inadvertently reveal details about individuals or groups that have opted not to disclose their data. A striking instance is the use of DNA data uploaded by individuals onto public databases. These data can yield genetic insights about users' families, even when those family members have refrained from uploading their own genetic profiles. Criminal investigators have taken advantage of this approach to resolve cold cases, such as that of the Golden State Killer, who was identified on the basis of DNA his relatives had shared in non-forensic public databases [34].

Consider environmental impacts of design

Given the substantial environmental impact of the health care system – and the negative externalities for public health – there have been calls for the health sector to green itself and reduce its ecological footprint [35]. In 2021, “60 countries committed to creating climate-resilient, low-carbon, sustainable health systems, with 20 countries committing to net-zero health care system emissions by 2050” as part of the United Nations Climate Change Conference (COP26) [36].

Developers are not exempt from such calls to action. It has been observed that the “actual material used to build and distribute the devices through which humans interact with digital health technologies are often ignored in ethical analyses, but are highly relevant for a comprehensive perspective on the ethics of digital health” [37]. For instance, the raw materials required to produce digital health products can cause environmental degradation through mining, producing toxic waste, and changing land-use, while storing the copious data generated by digital health tools can require large servers that consume vast amounts of energy [38].

The Designing and Prototyping phase affords an opportunity to improve the ratio between a digital health tool’s usefulness and its environmental impact. Developers might, for instance, consider whether it is possible to rely on renewable materials or whether the tool can be made more energy efficient. Additionally, developers might design for repairability, which increases a device’s lifespan, and for recyclability, which increases the components that can be recycled while reducing the use of raw materials.

As discussed below, environmental stewardship continues into the Commercializing phase, as it is part of supply chain management. Environmental stewardship is ethically important but may also be useful for marketing. In surveys, many consumers indicate that they care about buying environmentally sustainable products; however, there are questions about the extent to which this informs purchasing [39].

Validating and certifying

The validation process demonstrates that the final product satisfies user expectations, as well as other stakeholders’ expectations; this can promote trust and transparency, which are ethical ends. In some instances, it may be necessary to get approval or clearance from a regulator before a digital health tool can be launched in the marketplace.

Fulfill your value proposition

The variety of digital health tools makes it important for a developer to discern the value of their product [1, 18]. It has been suggested that digital health tools could, depending on their functionality, be evaluated along various dimensions, including technical, clinical, usability, and cost dimensions [40].

Technical validation attempts to answer questions like: does the tool function with accuracy and precision? Clinical validation assesses how the tool performs along measures of clinical quality, including whether it will result in improved health outcomes or provide useful information about diagnosis, management, treatment, or prevention. Usability validation seeks to ensure the technology aligns with users’ needs and preferences. And an assessment of cost can include not just what a consumer pays for the tool but also, for instance, the costs of integrating the technology into the clinical workflow. As these brief descriptions suggest, evaluation is a multi-step process that requires relevant expertise as well as interdisciplinary collaboration [1].

It is an ethical imperative that validation studies prospectively evaluate digital health tools with diverse populations. This is a matter of justice, as experience tells us that failure to do so can negatively affect users, leading to problems up to and including excess morbidity and mortality. For example, pulse oximetry is widely used to inform diagnostic and treatment decisions, yet, “validation studies were done in homogenous samples with inadequate external validation in representative populations” [9]. It has subsequently been shown that there is measurement bias when, as noted above, patients have darker skin tones [41]. During the COVID-19 pandemic, this bias contributed to delays in recognizing patients’ eligibility for COVID-19 treatment, with racially and ethnically minoritized groups disproportionately affected; in turn, this contributed to disparities in health outcomes [42]. Representation is not, of course, confined to race and ethnicity. A clinical validation trial of the Apple Watch’s ability to detect atrial fibrillation, an abnormal heart rhythm, found that the sensitivity was much lower in older adults than in younger adults [9].
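
To make the idea concrete, the sketch below stratifies device error against a reference standard by a demographic attribute, the kind of check that can reveal measurement bias like that seen with pulse oximetry. The readings, column names, and grouping are fabricated for illustration; a real validation study would follow a prespecified statistical analysis plan.

```python
# A minimal sketch of checking for measurement bias during clinical validation,
# using fabricated paired readings (device value vs. reference standard).
import pandas as pd

validation = pd.DataFrame({
    "device_spo2":    [97, 95, 96, 98, 93, 94],
    "reference_sao2": [96, 94, 92, 97, 89, 90],
    "skin_tone":      ["light", "light", "dark", "light", "dark", "dark"],
})

validation["error"] = validation["device_spo2"] - validation["reference_sao2"]
bias_by_group = validation.groupby("skin_tone")["error"].agg(["mean", "count"])
print(bias_by_group)   # a positive mean error means the device overestimates saturation
```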

It is also essential that validation studies account for how the digital health tool itself might change what is being measured, given the potential to alter health outcomes; this could occur in different ways. For example, it has been noted that new clinical norms may be needed for digital neuropsychological assessment as “normative data that exists for paper-and-pencil tests cannot simply be applied to digital tests, as performances … are not directly comparable” [43]. Another example comes from a study of smartphone-based testing for monitoring the disease trajectories of patients with multiple sclerosis; researchers found practice effects (improvements in test performance due to repeated exposure to test materials) on some cognitive and dexterity tests that might exaggerate treatment effects or mask deterioration [44]. A third example is stereotype threat, which was introduced above; choices made in the Designing and Prototyping phase might, albeit unintentionally, trigger stereotype threat with unfortunate consequences for users [14].

Some aspects of validation are duplicative of regulatory requirements, discussed below. Yet, because not all digital health tools are regulated, it is important to emphasize that there is an independent ethical obligation for developers to determine their products’ value.

Conduct ethical human subjects research

If a developer interacts with or collects information from living individuals to validate their digital health device – for example, to better understand performance or function – they are conducting “human subjects research.” Multiple national and international consensus documents set forth ethical guidelines for conducting such research. In their influential article “What Makes Clinical Research Ethical?,” Emanuel, Wendler, and Grady looked across these documents and identified seven key requirements that are relevant to human subjects research with digital health tools [45]. We briefly outline them here.

First, the research must have value; such value might, for instance, stem from generating important knowledge or evaluating an intervention that could improve health or well-being. Second, the research should be conducted in a methodologically rigorous way to ensure scientific validity. Third, there must be fair subject selection; this requires both that inclusion and exclusion criteria are driven by the study’s scientific goals and also that the risks and benefits of research are fairly distributed. Fourth, the research should offer a favorable risk–benefit ratio. Fifth, a study must undergo independent review by a body such as an institutional review board (IRB) or research ethics committee (REC). Sixth, in most cases, participants must give valid informed consent for their research participation; seeking consent demonstrates respect for individuals by allowing them to decide if participation is consistent with their values and interests. Note that there are special considerations when prospective participants—such as children or adults with substantial cognitive impairment—lack capacity to make their own decisions about research participation. Finally, the seventh requirement is that researchers demonstrate respect for potential and enrolled subjects. This is a broad requirement that encompasses activities including respecting privacy and confidentiality, monitoring the well-being of those who enroll, providing material information, and sharing study results.

Importantly, many of these ethical requirements are also regulatory requirements [46]. For example, regulators often require that research supporting a request for approval to market a device has been conducted in accordance with ethical guidelines; pre-market approvals are discussed next. Even absent a regulatory requirement, however, other downstream gatekeepers, such as academic journals or app stores, might require developers to show that they have conducted or are conducting ethical human subjects research [46, 47]. Therefore, developers conducting human subjects research should be aware of and comply with relevant regulations as well as gatekeepers’ policies.

Seek necessary pre-market approvals

Some but not all digital health tools will need to have regulatory approval before they can be marketed to consumers [48, 49]. Regulatory approval has legal, evidentiary, and normative dimensions, as approval rests on determinations of safety and efficacy [50, 51]. Developers preparing to market a digital health tool should proactively consider whether regulatory approval is necessary. As the following examples suggest, these are highly fact-specific determinations, and the decision to seek regulatory approval or not will depend on both features of the digital health tool and relevant laws.

In the US, a “medical device” is defined in Section 201(h) of the Food, Drug, and Cosmetic Act as a product “intended for use in the diagnosis of disease or other conditions, or in the cure, mitigation, treatment, or prevention of disease, … or intended to affect the structure or any function of the body of man.” The U.S. Food and Drug Administration (FDA) does regulate medical devices (i.e., products that meet the Section 201(h) definition). By contrast, the FDA does not plan to regulate low risk “general wellness products” (though these products might colloquially be referred to as devices) – that is, it exercises enforcement discretion. Paraphrasing FDA guidance, a general wellness product, such as an exercise bicycle or wearable activity tracker, has either an intended use that relates to maintaining or encouraging a general state of health or an intended use that relates the role of a healthy lifestyle to helping reduce the risk or impact of certain diseases or conditions [52]. To illustrate the difference, it is a general wellness claim to say a product such as a video game that teaches dance routines promotes a healthy weight, but it is not a general wellness claim to say that same video game will treat obesity.

Notably, software may be considered a regulated medical device (even if this would not colloquially be referred to as a device) [53]. For example, in 2020, Fitbit – the company that makes the eponymous wearable activity trackers – announced it had FDA clearance for an algorithm designed to passively and continuously check for atrial fibrillation [54]. While the FDA does not consider products for monitoring pulse rates during exercise to be medical devices (they are considered general wellness products [52]), the added functionality of atrial fibrillation monitoring necessitated obtaining prior clearance from the FDA.

European laws governing medical devices also focus primarily on products with an intended medical use. However, a new regulation introduced in 2021, known as the Medical Device Regulation (MDR), now includes some devices “without an intended medical purpose” within its scope [55]. Examples include devices such as noninvasive methods of brain stimulation, contact lenses, and high-intensity electromagnetic radiation equipment intended for use on the human body.

An example illustrates the difference in approaches between the US and EU. A noninvasive neural stimulation device marketed for cognitive enhancement would be classified as a Class III medical device (i.e., the highest risk category) in the EU, and its manufacturers would be required to comply with the same stringent regulations that govern implantable neural stimulation devices [56]. By contrast, in the US, the regulatory pathway for this same product is less clear: the FDA has stated that such products are not considered “low risk general wellness products” given their safety profile [52], yet the agency has not indicated exactly how they should be regulated, nor has it taken regulatory action against manufacturers of these devices [57].

Commercializing

The decision to commercialize launches a digital health tool into the marketplace with great hope that consumers will adopt it. Even at this stage of the product-development lifecycle, there are important ethical and legal considerations.

Educate users

Educating users is an important aspect of respecting their autonomy and enabling them to make informed decisions about whether any given digital health tool is right for them, their families, their patients, or others whose interests may be implicated. Users need to understand the nature of the digital health tool they are using; this includes what the product does and does not do, as well as any limitations. The Fitbit example provided above – in which the wearable itself is not FDA-regulated but the atrial fibrillation algorithm on it has FDA clearance – illustrates the blurring of the boundaries between medical and wellness products. This may be a source of confusion for consumers [58].

It is also important for users to understand the potential risks of using a digital health tool, as well as the probability and magnitude of potential harms; risks were discussed at length above. Users should, for instance, be aware that wearable electroencephalogram (EEG) devices, which are intended to enable at-home seizure detection and monitoring for people with epilepsy, may capture data regarding subclinical seizures that could limit their driving privileges [59].

Users should be informed about what kind of data a digital health product collects and whether data may be shared, with whom (e.g., any third parties), and for what purposes (e.g., research or advertising). If users have the ability to request the deletion of their data or to limit others’ access to it, the process for doing so should be clearly articulated.
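
One way to keep such disclosures consistent across an app, its website, and its documentation is to maintain them in a single machine-readable form and render plain-language text from it. The schema and field names below are purely hypothetical.

```python
# A hypothetical, machine-readable summary of data practices; field names and
# values are illustrative only.
DATA_PRACTICES = {
    "data_collected": ["heart_rate", "step_count", "approximate_location"],
    "shared_with": [
        {"recipient": "cloud analytics vendor", "purpose": "service improvement"},
        {"recipient": "research partners", "purpose": "research", "requires_opt_in": True},
    ],
    "retention_days": 365,
    "how_to_delete_data": "Settings > Privacy > Delete my data",
}

def plain_language_summary(practices: dict) -> str:
    """Render the disclosure as short, readable text for in-app display."""
    lines = [f"We collect: {', '.join(practices['data_collected'])}."]
    for party in practices["shared_with"]:
        opt_in = " (only if you opt in)" if party.get("requires_opt_in") else ""
        lines.append(f"We share data with {party['recipient']} for {party['purpose']}{opt_in}.")
    lines.append(f"Data are kept for {practices['retention_days']} days.")
    lines.append(f"To delete your data: {practices['how_to_delete_data']}.")
    return "\n".join(lines)

print(plain_language_summary(DATA_PRACTICES))
```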

It is not enough simply to provide information about the product, its benefits, its risks, and data collection. This information must be shared in a way that promotes understanding by means such as using plain language, ensuring general readability, and minimizing length [60]. Unfortunately, this does not always happen. Academic researchers have shown that the terms of service and privacy policies for period-tracking apps, mentioned above, are often neither easily accessible nor understandable [61, 62]. This is problematic, particularly in light of the legal risks these apps may pose to users.

Finally, education should be ongoing. For products that continuously capture data from naturalistic settings, such as wearable sensors or devices that record audio or video streams, efforts should be made to periodically remind users about the nature of continuous data collection. It is also important to share material information with consumers as it becomes available —that is, information that might reasonably be predicted to change whether or how a consumer uses a particular digital health tool. Material information would include but not be limited to alterations to the terms of use (e.g., if one company is acquired by another).

Promote equitable access

Although a potential use of digital health tools is to advance equitable healthcare, equity is not ensured. Many digital health tools require users to have internet or smartphone access; yet, access to these technologies cannot be assumed, and those who lack access may be left out. In the US, lower-income households lag their middle- and upper-income counterparts in terms of Internet access [63], and over a quarter of the population lacks broadband connectivity [64]. There are also significant racial disparities in access to high-speed Internet [65] and digital health devices [66]. Internationally, researchers have found important differences in access to mobile phones by gender [67].

Obstacles to uptake may also be financial or informational in nature. A study of patients at Federally Qualified Health Centers, clinics that serve medically underserved areas and populations in the US, found a majority of respondents expressed interest in having a wearable activity tracker, but less than a quarter had such a tracker; reported barriers included tracker cost and lack of relevant information [66].

Users may also encounter accessibility challenges that result from how a digital health tool is designed. People often think of individuals with disabilities when talking about accessible design, and above we discussed how tools might be made accessible to users with auditory or visual impairments [68]. But it’s important to think about other aspects of a product, such as size, that might also affect usability or accessibility. Blood pressure cuffs are often linked to a digital health device; if a cuff is too large or too small, it can provide inaccurate readings [69]. Gender-aware design may also play a role in fostering engagement. Companies have, for instance, designed wearables that measure wearers’ UV exposure, but research suggests that men are less accepting of these and other sun protection interventions [70].

Though not all of these barriers to equitable access are within developers’ control [71], many – like information, product cost, and design – are within their control and should be addressed to promote equity and justice.

Manage your supply chain

Ethical supply chain management complements the opportunity in the Designing and Prototyping phase to reduce environmental impact, as it requires developers to consider diverse issues including stewardship of the natural resources that go into a product and the effects of extraction and shipping practices on climate [37]. Additionally, it requires consideration of the economic inequalities perpetuated when low-wage workers labor to supply corporations in high-income countries, as well as labor practices that can violate human rights.

This is an opportune time to point out that, although sound business decisions are often ethically sound, this is not necessarily the case. Therefore, developers should be attuned to potential tensions between what might be advantageous from a business perspective and what is acceptable from an ethical perspective. Recall the bad press Apple received when it was accused of using forced labor in its supply chain [72]. Using low-cost labor may be good for shareholders, but paying poverty wages or relying on forced labor is wrongful exploitation.

Be aware of liability questions

Liability laws exist to advance both practical and ethical ends: to make sure that users are aware of products’ risks, protected from faulty or dangerous products, and compensated if they are harmed. While measures taken above, such as assessing and minimizing risks, can reduce exposure to liability, they cannot eliminate it. Developers should therefore be aware of their legal risk and also that there are unsettled liability questions in the digital health space [73]. Consider, for example, a scenario where a digital health product claims to be able to detect atrial fibrillation or seizure, yet fails to do so. It is unclear who may be held responsible in this case, or in analogous ones, such as a user experiencing harm following the provision of erroneous health information or the failure to disclose incidental findings. The situation becomes even more complex if errors are related to improper device application or the user inputting inaccurate data.

Conclusion

The field of digital health continues to evolve, and as it does, its potential to enhance health and health care grows. Developers of digital health products must, however, be mindful that the product-development lifecycle brings with it numerous ethical and regulatory considerations. Even seemingly straightforward decisions can have important ethical and regulatory implications. Here, we have identified common issues for developers to consider and address throughout the product-development lifecycle and provided numerous examples to spark further reflection.

Availability of data and materials

Not applicable.

References

  1. Goldsack JC, Coravos A, Bakker JP, et al. Verification, analytical validation, and clinical validation (V3): the foundation of determining fit-for-purpose for Biometric Monitoring Technologies (BioMeTs). npj Digit Med. 2020;3(1):55. https://doi.org/10.1038/s41746-020-0260-4.

  2. Ienca M, Wangmo T, Jotterand F, Kressig RW, Elger B. Ethical Design of Intelligent Assistive Technologies for Dementia: A Descriptive Review. Sci Eng Ethics. 2018;24(4):1035–55. https://doi.org/10.1007/s11948-017-9976-1.

  3. Berridge C, Wetle TF. Why Older Adults and Their Children Disagree About In-Home Surveillance Technology, Sensors, and Tracking. Gerontologist. Published online 2020. https://doi.org/10.1093/geront/gnz068.

  4. Domingo MC. An overview of the Internet of Things for people with disabilities. J Netw Comput Appl. 2012;35(2):584–96. https://doi.org/10.1016/j.jnca.2011.10.015.

  5. Adepoju OE, Chavez A, Duong K. Telemedicine During The Pandemic: Leaving The Visually Impaired And Others With Disabilities Behind? Health Affairs Forefront. Published online September 6, 2022. https://doi.org/10.1377/forefront.20220902.944304

  6. Radix AE, Bond K, Carneiro PB, Restar A. Transgender Individuals and Digital Health. Curr HIV/AIDS Rep. 2022;19(6):592–9. https://doi.org/10.1007/s11904-022-00629-7.

  7. Cho MK. Rising to the challenge of bias in health care AI. Nat Med. 2021;27(12):2079–81. https://doi.org/10.1038/s41591-021-01577-2.

  8. Nichol AA, Batten JN, Halley MC, Axelrod JK, Sankar PL, Cho MK. A Typology of Existing Machine Learning-Based Predictive Analytic Tools Focused on Reducing Costs and Improving Quality in Health Care: Systematic Search and Content Analysis. J Med Internet Res. 2021;23(6):e26391. https://doi.org/10.2196/26391.

  9. Zinzuwadia A, Singh JP. Wearable devices—addressing bias and inequity. The Lancet Digital Health. 2022;4(12):e856–7. https://doi.org/10.1016/S2589-7500(22)00194-7.

  10. Bent B, Goldstein BA, Kibbe WA, Dunn JP. Investigating sources of inaccuracy in wearable optical heart rate sensors. npj Digit Med. 2020;3(1):18. https://doi.org/10.1038/s41746-020-0226-6.

  11. Raza MM, Venkatesh KP, Kvedar JC. Promoting racial equity in digital health: applying a cross-disciplinary equity framework. npj Digit Med. 2023;6(1):3. https://doi.org/10.1038/s41746-023-00747-5.

  12. Major B, Hunger JM, Bunyan DP, Miller CT. The ironic effects of weight stigma. J Exp Soc Psychol. 2014;51:74–80. https://doi.org/10.1016/j.jesp.2013.11.009.

  13. Li BJ, Lwin MO, Jung Y. Wii, Myself, and Size: The Influence of Proteus Effect and Stereotype Threat on Overweight Children’s Exercise Motivation and Behavior in Exergames. Games for Health Journal. 2014;3(1):40–8. https://doi.org/10.1089/g4h.2013.0081.

  14. Tlachac ML, Reisch M, Lewis B, Flores R, Harrison L, Rundensteiner E. Impact assessment of stereotype threat on mobile depression screening using Bayesian estimation. Healthcare Analytics. 2022;2:100088. https://doi.org/10.1016/j.health.2022.100088.

  15. Seyyed-Kalantari L, Zhang H, McDermott MBA, Chen IY, Ghassemi M. Underdiagnosis bias of artificial intelligence algorithms applied to chest radiographs in under-served patient populations. Nat Med. 2021;27(12):2176–82. https://doi.org/10.1038/s41591-021-01595-0.

  16. Obermeyer Z, Powers B, Vogeli C, Mullainathan S. Dissecting racial bias in an algorithm used to manage the health of populations. Science. 2019;366(6464):447–53. https://doi.org/10.1126/science.aax2342.

  17. The National Commission for the Protection of Human Subjects of Biomedical and Behavioral Research. The Belmont Report. Published online April 18, 1979:10.

  18. Perakslis E, Ginsburg GS. Digital Health—The Need to Assess Benefits, Risks, and Value. JAMA. Published online December 28, 2020. https://doi.org/10.1001/jama.2020.22919

  19. Sun N, Esom K, Dhaliwal M, Amon JJ. Human Rights and Digital Health Technologies. Health Hum Rights. 2020;22(2):21–32.

  20. Figueroa CA, Luo T, Aguilera A, Lyles CR. The need for feminist intersectionality in digital health. The Lancet Digital Health. 2021;3(8):e526–33. https://doi.org/10.1016/S2589-7500(21)00118-7.

  21. Torchinsky R. How period tracking apps and data privacy fit into a post-Roe v. Wade climate. NPR. https://www.npr.org/2022/05/10/1097482967/roe-v-wade-supreme-court-abortion-period-apps. Published June 24, 2022. Accessed 3 Aug 2023.

  22. Wexler A. The practices of do-it-yourself brain stimulation: implications for ethical considerations and regulatory proposals. J Med Ethics. 2016;42(4):211–5. https://doi.org/10.1136/medethics-2015-102704.

  23. Gross MS, Miller RC, Pascalev A. Ethical Implementation of Wearables in Pandemic Response: A Call for a Paradigm Shift. Edmond J. Safra Center for Ethics; 2020. https://ethics.harvard.edu/sites/hwpi.harvard.edu/files/center-for-ethics/files/18ethicalwearables.pdf?m=1590163395.

  24. Gostin LO, Halabi SF, Wilson K. Health Data and Privacy in the Digital Era. JAMA. 2018;320(3):233. https://doi.org/10.1001/jama.2018.8374.

  25. Bari L, O’Neill DP. Rethinking Patient Data Privacy In The Era Of Digital Health. Health Affairs Blog. Published December 12, 2019. Accessed 4 Aug 2023. https://www.healthaffairs.org/content/forefront/rethinking-patient-data-privacy-era-digital-health

  26. Kalokairinou L, Cho R, Wei N, Wexler A. Policies of U.S. Companies Offering Direct-to-Consumer Laboratory Tests. JAMA Internal Medicine. Published online 2023.

  27. Wolford B. What is GDPR, the EU’s new data protection law? Accessed 17 Aug 2023. https://gdpr.eu/what-is-gdpr/

  28. Véliz C. Privacy and digital ethics after the pandemic. Nat Electron. 2021;4(1):10–1. https://doi.org/10.1038/s41928-020-00536-y.

  29. Gebhart G, Hoffman-Andrews J, Crocker A. University App Mandates Are The Wrong Call. Electronic Frontier Foundation. Published July 30, 2020. Accessed 30 Oct 2023. https://www.eff.org/deeplinks/2020/07/university-app-mandates-are-wrong-call

  30. Filkins BL, Kim JY, Roberts B, et al. Privacy and security in the era of digital health: what should translational researchers know and do about it? Am J Transl Res. 2016;8(3):1560–80.

  31. Sweeney L. Simple Demographics Often Identify People Uniquely. 2000. https://dataprivacylab.org/projects/identifiability/.

  32. Rocher L, Hendrickx JM, De Montjoye YA. Estimating the success of re-identifications in incomplete datasets using generative models. Nat Commun. 2019;10(1):3069. https://doi.org/10.1038/s41467-019-10933-3.

  33. Sideri K, Cockbain J, Van Biesen W, De Hert M, Decruyenaere J, Sterckx S. Digital pills for the remote monitoring of medication intake: a stakeholder analysis and assessment of marketing approval and patent granting policies. J Law Biosci. 2022;9(2):lsac029. https://doi.org/10.1093/jlb/lsac029.

  34. Ram N, Guerrini CJ, McGuire AL. Genealogy databases and the future of criminal investigation. Science. 2018;360(6393):1078–9. https://doi.org/10.1126/science.aau1083.

  35. Eckelman MJ, Sherman J. Environmental Impacts of the U.S. Health Care System and Effects on Public Health. PLoS ONE. 2016;11(6):e0157014. https://doi.org/10.1371/journal.pone.0157014.

  36. Hough E, Gumas ED, Seervai S. Action to Decarbonize the U.S. Health Care System: Lessons from the U.K.’s National Health Service. Published July 26, 2022. Accessed 29 Aug 2023. https://www.commonwealthfund.org/publications/issue-briefs/2022/jul/action-decarbonize-us-health-care-system-lessons-uk-nhs

  37. Shaw JA, Donia J. The Sociotechnical Ethics of Digital Health: A Critique and Extension of Approaches From Bioethics. Front Digit Health. 2021;3:725088. https://doi.org/10.3389/fdgth.2021.725088.

  38. Thompson M. The Environmental Impacts of Digital Health. DIGITAL HEALTH. 2021;7:205520762110334. https://doi.org/10.1177/20552076211033421.

  39. White K, Hardisty DJ, Habib R. The Elusive Green Consumer. Harvard Business Review. Published online August 2019. Accessed 30 Oct 2023. https://hbr.org/2019/07/the-elusive-green-consumer

  40. Mathews SC, McShea MJ, Hanley CL, Ravitz A, Labrique AB, Cohen AB. Digital health: a path to validation. npj Digit Med. 2019;2(1):38. https://doi.org/10.1038/s41746-019-0111-3.

  41. Sjoding MW, Dickson RP, Iwashyna TJ, Gay SE, Valley TS. Racial Bias in Pulse Oximetry Measurement. N Engl J Med. 2020;383(25):2477–8. https://doi.org/10.1056/NEJMc2029240.

  42. Valbuena VSM, Merchant RM, Hough CL. Racial and Ethnic Bias in Pulse Oximetry and Clinical Outcomes. JAMA Intern Med. 2022;182(7):699. https://doi.org/10.1001/jamainternmed.2022.1903.

  43. Spreij LA, Gosselt IK, Visser-Meily JMA, Nijboer TCW. Digital neuropsychological assessment: Feasibility and applicability in patients with acquired brain injury. J Clin Exp Neuropsychol. 2020;42(8):781–93. https://doi.org/10.1080/13803395.2020.1808595.

  44. Woelfle T, Pless S, Wiencierz A, Kappos L, Naegelin Y, Lorscheider J. Practice Effects of Mobile Tests of Cognition, Dexterity, and Mobility on Patients With Multiple Sclerosis: Data Analysis of a Smartphone-Based Observational Study. J Med Internet Res. 2021;23(11):e30394. https://doi.org/10.2196/30394.

  45. Emanuel EJ, Wendler D, Grady C. What Makes Clinical Research Ethical? JAMA. 2000;283(20):2701–11.

  46. Wexler A, Largent E. Ethical considerations for researchers developing and testing minimal-risk devices. Nat Commun. 2023;14(1):2325. https://doi.org/10.1038/s41467-023-38068-6.

  47. Meyer MN. There Oughta Be a Law: When Does(n’t) the U.S. Common Rule Apply? J Law Med Ethics. 2020;48(S1):60–73. https://doi.org/10.1177/1073110520917030.

  48. FDA. The Device Development Process. Accessed 30 Oct 2023. https://www.fda.gov/patients/learn-about-drug-and-device-approvals/device-development-process

  49. IMDRF. Personalized Medical Devices – Production Verification and Validation. Published February 2023. Accessed 30 Oct 2023. https://www.imdrf.org/sites/default/files/2023-04/IMDRF%20Personalised%20Medical%20Devices%20WG%20N74%20FINAL%20%202023.pdf

  50. Largent EA, Peterson A, Lynch HF. FDA Drug Approval and the Ethics of Desperation. JAMA Intern Med. 2021;181(12):1555–6. https://doi.org/10.1001/jamainternmed.2021.6045.

  51. Lynch HF, Largent EA. Considering tomorrow’s patients in today’s drug approvals. BMJ. Published online June 8, 2023:e075000. https://doi.org/10.1136/bmj-2023-075000

  52. FDA. General Wellness: Policy for Low Risk Devices Guidance for Industry and Food and Drug Administration Staff. Published September 27, 2019. Accessed 3 Aug 2023. https://www.fda.gov/media/90652/download?attachment

  53. FDA. Policy for Device Software Functions and Mobile Medical Applications Guidance for Industry and Food and Drug Administration Staff. Published September 28, 2022. Accessed 3 Aug 2023. https://www.fda.gov/media/80958/download

  54. Park A. Fitbit tails Apple Watch with FDA-cleared algorithm to passively check for afib. Fierce Biotech. https://www.fiercebiotech.com/medtech/fitbit-tails-apple-watch-fda-cleared-algorithm-passively-check-afib. Published April 11, 2022. Accessed 3 Aug 2023.

  55. European Commission. Manufacturers of devices without an intended medical purpose. Accessed 17 Aug 2023. https://health.ec.europa.eu/medical-devices-topics-interest/reprocessing-medical-devices/manufacturers-devices-without-intended-medical-purpose_en

  56. Baeken C, Arns M, Brunelin J, et al. European reclassification of non-invasive brain stimulation as class III medical devices: A call to action. Brain Stimul. 2023;16(2):564–6. https://doi.org/10.1016/j.brs.2023.02.012.

  57. Wexler A. A pragmatic analysis of the regulation of consumer transcranial direct current stimulation (TDCS) devices in the United States. J Law Biosci. Published online October 12, 2015:lsv039. https://doi.org/10.1093/jlb/lsv039

  58. Eadicicco L. Fitbit and Apple know their smartwatches aren’t medical devices. But do you? CNET. Published January 14, 2022. Accessed 4 Aug 2023. https://www.cnet.com/tech/mobile/features/fitbit-apple-know-smartwatches-arent-medical-devices-but-do-you/

  59. Antwi P, Atac E, Ryu JH, et al. Driving status of patients with generalized spike–wave on EEG but no clinical seizures. Epilepsy Behav. 2019;92:5–13. https://doi.org/10.1016/j.yebeh.2018.11.031.

  60. Miron-Shatz T, Yaniv H. Digital consent: engaging patients with plain language and better communication. BMJ. Published online October 5, 2022:o2378. https://doi.org/10.1136/bmj.o2378

  61. Fowler LR, Gillard C, Morain SR. Readability and Accessibility of Terms of Service and Privacy Policies for Menstruation-Tracking Smartphone Applications. Health Promot Pract. 2020;21(5):679–83. https://doi.org/10.1177/1524839919899924.

  62. Fowler LR, Gillard C, Morain S. Teenage Use of Smartphone Applications for Menstrual Cycle Tracking. Pediatrics. 2020;145(5):e20192954. https://doi.org/10.1542/peds.2019-2954.

  63. Vogels EA. Digital divide persists even as Americans with lower incomes make gains in tech adoption. Pew Research Center. Published June 22, 2021. Accessed 4 Aug 2023. https://www.pewresearch.org/short-reads/2021/06/22/digital-divide-persists-even-as-americans-with-lower-incomes-make-gains-in-tech-adoption/

  64. Early J, Hernandez A. Digital Disenfranchisement and COVID-19: Broadband Internet Access as a Social Determinant of Health. Health Promot Pract. 2021;22(5):605–10. https://doi.org/10.1177/15248399211014490.

  65. Wang HL. Native Americans On Tribal Land Are “The Least Connected” To High-Speed Internet. NPR. https://www.npr.org/2018/12/06/673364305/native-americans-on-tribal-land-are-the-least-connected-to-high-speed-internet. Published December 6, 2018.

  66. Holko M, Litwin TR, Munoz F, et al. Wearable fitness tracker use in federally qualified health center patients: strategies to improve the health of all of us using digital health devices. npj Digit Med. 2022;5(1):53. https://doi.org/10.1038/s41746-022-00593-x.

  67. Blake A, Hazel A, Jakurama J, Matundu J, Bharti N. Disparities in mobile phone ownership reflect inequities in access to healthcare. PLOS Digit Health. 2023;2(7):e0000270. https://doi.org/10.1371/journal.pdig.0000270.

  68. Henni SH, Maurud S, Fuglerud KS, Moen A. The experiences, needs and barriers of people with impairments related to usability and accessibility of digital health solutions, levels of involvement in the design process and strategies for participatory and universal design: a scoping review. BMC Public Health. 2022;22(1):35. https://doi.org/10.1186/s12889-021-12393-1.

  69. Ramsey M. Blood pressure monitoring: Automated oscillometric devices. J Clin Monitor Comput. 1991;7(1):56–67. https://doi.org/10.1007/BF01617900.

  70. Khayamian EB. The importance of gender-aware design in digital health wearables: a co-design study fostering sun protection behaviour in young men. Proc Des Soc. 2021;1:3031–40. https://doi.org/10.1017/pds.2021.564.

  71. U.S. Department of Commerce. FACT SHEET: Biden-Harris Administration’s “Internet for All” Initiative: Bringing Affordable, Reliable High-Speed Internet to Everyone in America. Published May 13, 2022. Accessed 17 Aug 2023. https://www.commerce.gov/news/fact-sheets/2022/05/fact-sheet-biden-harris-administrations-internet-all-initiative-bringing

  72. Albergotti R. Apple’s longtime supplier accused of using forced labor in China. The Washington Post. https://www.washingtonpost.com/technology/2020/12/29/lens-technology-apple-uighur/. Published December 29, 2020. Accessed 17 Aug 2023.

  73. Simon DA, Shachar C, Cohen IG. Unsettled Liability Issues for “Prediagnostic” Wearables and Health-Related Products. JAMA. 2022;328(14):1391. https://doi.org/10.1001/jama.2022.16317.

Acknowledgements

Not applicable.

Funding

Drs. Largent, Karlawish, and Wexler acknowledge funding from the NIH National Institute on Aging (NIA) under award number P30-AG-073105.

Author information

Authors and Affiliations

Authors

Contributions

All authors participated in the conception of the manuscript. Dr. Largent wrote the first draft; Drs. Karlawish and Wexler made substantive revisions. All authors gave final review and approval.  Dr. Largent created Fig. 1.

Corresponding author

Correspondence to Emily A. Largent.

Ethics declarations

Ethics approval and consent to participate

Not applicable.

Consent for publication

Not applicable.

Competing interests

The authors declare no competing interests.

Additional information

Publisher’s Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Rights and permissions

Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/.

About this article

Cite this article

Largent, E.A., Karlawish, J. & Wexler, A. From an idea to the marketplace: identifying and addressing ethical and regulatory considerations across the digital health product-development lifecycle. BMC Digit Health 2, 41 (2024). https://doi.org/10.1186/s44247-024-00098-5
