Digital technologies have accelerated globalization, transformed labor and education markets, and propelled shifts in lifestyles, health management, social interactions, and civic engagement. They have expanded opportunities for people in all their diversity to express themselves in online spaces, share and gather knowledge, access education and economic opportunities, engage in democratic discussion, build community, power movements and resistance, exercise their rights, and share their voices and interests. Digital access to information, services, and commodities increases capacities and skills, thereby fostering empowerment and agency. Safe and ethical technology, deployed equitably, can promote and protect the dignity and rights of all people, principles at the heart of sustainable development.

Currently, however, the design and deployment of technology serves its developers first and is geared mostly toward maximizing profit and revenue. The prevailing model of digital technology shapes our daily choices through how options are presented to us. Social media platforms have notoriously applied an addiction model to product design, building applications that keep users engaged regardless of whether the experience benefits or harms them. Technology is often created and deployed without consideration of its impact on deepening socioeconomic, gender, geographic, and age-based inequalities, thereby causing both individual and systemic harm. Because most developers are based in the Global North, design reflects the biases and assumptions of a group composed predominantly of men in high-income countries.

While more women in low- and middle-income countries are using the Internet, their rate of adoption has slowed, and women remain 19 percent less likely than men to use the Internet. Of the 900 million women who are still not connected, almost two-thirds live in South Asia and sub-Saharan Africa, where gender gaps are widest.
Women, Peace and Security efforts to increase women’s participation in the personal and political spheres are curbed by the threats of technology-facilitated gender-based violence (TFGBV) that women face as they step into leadership and advocacy roles. Similarly, although data are scant, older people’s use and uptake of technology are lower than those of younger cohorts. While the digitization of society and the widespread deployment of digital technologies have the potential to enhance healthy ageing, they can also negatively affect older people’s ability to manage their affairs, place them at risk of exploitation, and undermine their autonomy, dignity, and self-determination. Issues of affordability, availability, safety, digital literacy, and harmful social and gendered norms dictate uneven access.

The online world both facilitates and amplifies human interaction. While technology does not drive behavior, it is a medium through which behaviors manifest, and the design of digital environments plays a vital role in shaping people’s experiences and safety online. Over the past decade, online platforms and technologies have provided untold benefits and unforeseen opportunities. Yet the same digital technologies that connect and enable us can also be weaponized to perpetrate abuse. Online risks and harms are multi-layered and multi-dimensional, arising from access, exposure, and interactions with others. Types of online harm are not mutually exclusive, and a single incident may involve several. Online harm can violate the rights to personal safety, health and well-being, dignity, privacy, participation, and free speech. It can perpetuate discrimination and abuse, and it can enable deception and manipulation.
Upholding human rights in the digital world means advancing digital safety in a rights-respecting way: driving multi-stakeholder alignment, encouraging positive behaviors and actions across the digital ecosystem, and informing and enabling regulatory, industry, and societal efforts and innovations. Safety, security, and privacy by design are pillars of the processes that ensure digital technologies operate in line with human rights, and effective regulation and systems of accountability are critical to protect users’ rights and to provide redress when those rights are denied or violated.

Applying safety-by-design principles to social platforms would require that products be designed in the best interest of their users, with safety and security defaults set to the strongest option and with transparency and control over recommendation and communication features. Under such a scheme, users could adjust what they see, have personal information hidden by default, and have addictive features such as autoplay videos turned off by default or be given reminders to take a break. Users would be prompted to review who sees what they share and who can contact them, and to decide what data can be used for the ads and content that appear in their feeds and notifications, choices that could fundamentally change our online experiences.

But safety-by-design approaches also require careful balancing to preserve civil liberties and to ensure protections for all online users, not just the children whose safety concerns have come to dominate debates about how to regulate online life. Developing rules that protect only children would be a missed opportunity to empower all consumers to make individual decisions about their well-being online. It would be better to extend stronger protections to all users than to offer heightened safety features only for children, who may attempt to sidestep those controls.
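The default-first posture described above can be illustrated with a minimal sketch. The class and field names below are hypothetical, invented for illustration only; they are not drawn from any real platform’s settings or API. The point is structural: every setting starts at its most protective value, and any relaxation requires an explicit, informed choice by the user.

```python
from dataclasses import dataclass

@dataclass
class AccountSafetyDefaults:
    """Hypothetical safety-by-design defaults for a newly created social account.

    Each field begins at the most protective setting; users may relax
    settings later through explicit, informed choices.
    """
    profile_visibility: str = "private"      # personal information hidden by default
    who_can_contact: str = "contacts_only"   # strangers cannot message the user
    autoplay_videos: bool = False            # addictive features off by default
    break_reminders: bool = True             # periodic prompts to take a break
    ads_use_personal_data: bool = False      # data use for ads is opt-in, not opt-out

# A new account inherits the strongest defaults without any user action.
new_user = AccountSafetyDefaults()
print(new_user.profile_visibility)  # → private
print(new_user.autoplay_videos)     # → False
```

Under this scheme, the burden of discovering and tightening settings is lifted from the user: opting into weaker protections is a deliberate act, rather than the default experience.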
The burden of safety should never fall solely on the user. Every attempt must be made to ensure that online harms are understood, assessed, and addressed in the design and provision of online platforms and services. Safety-by-design standards would hold online platforms and services responsible for the safety of their users by requiring them to assess, address, and mitigate potential harms before they occur. Rules likewise need to be crafted to provide consistent guidance for industry while remaining broad enough to apply to future online social spaces, from live chat and video applications to the metaverse and beyond, placing accountability, user empowerment, and transparency at the heart of the rules for online life.

Through an equity lens, harnessing the power of digital technology for all requires understanding the impacts of its design, deployment, and underlying business model. Accelerated investment in innovative digital technologies around the world has the potential to drive change at an unprecedented pace, including in the norms and assumptions that underpin technology tools. Digital media must therefore serve as platforms that actively change social reality rather than mirror social biases and prejudices. To reverse the bias built into the technological infrastructure upon which the world runs today, it is urgent to ensure that, moving forward, digital technologies reflect the diversity of people around the globe and incorporate safety by design so that no one is left behind.