Safety Net Project
https://www.techsafety.org/
Managed by the Safety Net Project at the National Network to End Domestic Violence (NNEDV), this blog explores the intersection between technology, intimate partner violence, sexual assault, and violence against women.

Leading in Living Color: The Legacy and Impact of the Women of Color in Advocacy and Tech Reception at Tech Summit
By Audace Garnett | Fri, 23 Aug 2024
https://www.techsafety.org/blog/2024/8/22/leading-in-living-color-the-legacy-and-impact-of-the-women-of-color-in-advocacy-and-tech-reception-at-tech-summit

The Women of Color in Advocacy and Tech Reception was born from a deep need for community and connection among Black women and Women of Color. It was because of this need that, nine years ago, Kristelyn Berry and Rachel Gibson created this space at the National Network to End Domestic Violence’s (NNEDV) 4th annual Tech Summit, after experiencing numerous environments where, as Black women, they rarely saw other Women of Color or women who looked like them.

The space was initially launched with panel discussions, recognizing that the voices of Women of Color in the tech and advocacy communities are often overlooked. The goal was to create a space for Women of Color to network, spark conversations that facilitate change, and unite the advocacy and technology communities in their efforts to work together.

[Photo: WOC Reception 2018 – Audace, Shalini, Rachel, and Kristelyn]

Although Kristelyn and Rachel have journeyed on from NNEDV, they have remained supportive, often returning to participate, share their wisdom, and connect with attendees.

The Women of Color in Advocacy and Tech Reception has grown from an event into a strong, supportive community. Each year, more participants come together to share their experiences, discuss important issues, and support one another in their personal and professional lives.

This reception is now a key part of Tech Summit, known for its welcoming environment where attendees can openly share their ideas and concerns about being a Woman of Color in the mainstream workplace.

A big part of the reception’s success is its commitment to inclusivity and its focus on the diverse experiences of Women of Color. The topics covered are carefully chosen to reflect how technology and advocacy intersect with important issues like racial justice, gender equity, and violence prevention. This space is not just about sharing information but also about laughing, networking, and inspiring attendees to take action in their own communities and workplaces.

This year’s theme, “Leading in Living Color,” explored the evolving nature of leadership in today's technology landscape, particularly in virtual environments where navigating cultural dynamics is increasingly significant. Panelists Tonjie Reese, Kristelyn Berry, Amanda Takes War Bonnett, Tonya King, and Esmeralda Pena shared their perspectives on topics such as creating a sense of belonging in digital and in-person spaces and balancing cultural authenticity with professional expectations. Attendees left inspired and encouraged to take the many lessons learned back with them.

Looking ahead, we hope to carry this reception forward and inspire other conference events to incorporate spaces like this into their gatherings. The legacy of the event’s founders, Kristelyn and Rachel, lives on in our conversations, connections, and efforts to uplift and empower Women of Color in advocacy and technology. We are grateful they planted the seed for the space to continue to evolve and grow.

We left the space holding onto the powerful Harriet Tubman quote shared by NNEDV CEO, Stephanie Love-Patterson: 'If you hear the dogs, keep going. If you see the torches in the woods, keep going. If there's shouting after you, keep going. Don't ever stop. Keep going. If you want a taste of freedom, keep going.' We hope those words resonated with all attendees, inspiring them to continue their journey with unwavering determination.

Harriet Tubman was a Black woman who led boldly in living color. Born into the chains of slavery in Maryland in March 1822, she broke free and became a fearless abolitionist, masterminding the Underground Railroad and liberating around 70 enslaved people. Her unwavering courage and fierce determination made her an enduring symbol of resistance and freedom. Tubman’s legacy, which continued until her passing on March 10, 1913, is a powerful testament to the unyielding strength of resilience and the profound impact one extraordinary person can have on the world.

Tech Summit 2024
By Jessie L | Thu, 22 Aug 2024
https://www.techsafety.org/blog/2024/8/22/tech-summit-2024

Last week, we welcomed almost 300 people to Safety Net’s Tech Summit 2024 in Washington, DC. Registrants overcame weather-related travel difficulties to attend, and we are grateful for everyone’s commitment to being in this space and doing this work. For three days (four for state and territorial coalition staff who attended Coalition Day), we laughed, strategized, and built new ways to think and talk about how privacy and tech safety impact the lives of survivors of abuse and harassment.

We had over 30 brilliant and passionate presenters from around the world, including representatives from local and national domestic and sexual violence programs, academia, policy NGOs, the White House Gender Policy Council, and numerous tech companies. Presenters covered a vast range of content, including immigrant survivors and tech, what AI means for survivors and advocates, dating violence and tech, the importance of encryption in tech safety, and much more.

In addition to our expert presenters, we also had a large and impressive group of attendees. Attendees came from numerous US states as well as Australia, Belgium, Guam, Puerto Rico, and the Northern Mariana Islands. Advocates, technologists, attorneys, coalition staff, and others enriched the conversation as participants and presenters. Many attendees came from programs providing emergency shelter, transitional housing, non-residential services, and crucial legal assistance. Others teach coding and tech skills to survivors to help them gain financial freedom. All of them make a difference every day.

Throughout the conference, attendees discussed all aspects of technology safety for survivors, including:

  • How abusers misuse tech;

  • How survivors can strategically use tech to maintain their safety and privacy;

  • How tech public policy is made;

  • How agencies can use tech to increase accessibility and ensure privacy; and

  • The importance of designing technology with survivors in mind.

At the National Network to End Domestic Violence (NNEDV), we work and play hard, and Tech Summit 2024 was no different. Receptions, dinners, snacks, networking opportunities, and informal discussions provided a chance for participants and presenters to connect and collaborate. The tech rotations included a walkthrough of Apple’s Safety Check feature for iPhone as well as detailed Q&As with Uber and ReloShare. Coalition Day featured facilitated, in-depth discussions about hot topics on confidentiality and tech safety with coalition staff.

We are thrilled that the conference was a success, and it wouldn’t have been possible without the participation of everyone involved and the support of our generous funders. We appreciate the support of Presenting Sponsors Google, Meta, and Norton; Platinum Sponsors Amazon and Uber; Silver Sponsor Airbnb; and Bronze Sponsors Kaspersky, Match Group, and ReloShare.

We look forward to taking back many great ideas on how to make next year’s conference even better. The conference was filled with ideas on how technology safety can improve the lives of survivors of abuse and harassment. We are excited to provide that information in the coming year through technical assistance, new written materials, new e-learning content, and our ever-expanding training catalogue.

We are already gearing up for Tech Summit 2025, so send along ideas for what you want to see in 2025! If you were unable to join us for Tech Summit this year, you can see a little of the fun by looking at the agenda, checking out pictures, or searching for #TechSummit on Twitter/X, Instagram, Facebook, and LinkedIn.

NNEDV Joins New National Partnership to Address Image-Based Abuse
By Chad Sniffen | Mon, 08 Jul 2024
https://www.techsafety.org/blog/2024/7/8/nnedv-joins-new-national-partnership-to-address-image-based-abuse

The National Network to End Domestic Violence (NNEDV) is excited to be a partner in a new working group that will focus on addressing image-based abuse. The partnership, led by NNEDV’s Safety Net Project, the Cyber Civil Rights Initiative (CCRI), and the Center for Democracy and Technology (CDT), will bring together victim service advocates, privacy experts, and technology companies to curb the nonconsensual distribution of intimate images and the rapid growth of AI-generated content (also known as synthetic images, digital sexual forgeries, or deepfakes). This partnership comes in response to a call to action by the White House Gender Policy Council and the Office of Science and Technology Policy.

AI-generated content has been a rapidly growing concern. These fake images are incredibly realistic and can have the same negative impacts as real ones when the technology is used to create fake intimate or sexual content with someone’s likeness. According to Deeptrace, an organization that detects and monitors this type of content, over 90% of AI-generated or manipulated videos online are sexually explicit, and many of these may have been created without the consent or knowledge of those depicted.

The nonconsensual distribution of intimate images (NDII) has been a significant concern for survivors for many years, and this tactic of abuse can present in different ways. Images taken and shared in the context of a trusting relationship are often later weaponized as a tool for harassment, intimidation, extortion, and control. In other situations, images may have been originally taken without consent—either without the victim’s knowledge, or by coercion. Research shows that this tactic of abuse has significant and long-lasting effects, including the loss of community and employment, as well as substantial mental health impacts. CCRI has been operating a national Image Abuse Helpline for survivors (844-878-2274) and a Safety Center for many years and continues to provide survivors with resources and guidance around image-based abuse.

To learn more about the new partnership, see the CDT press release. For more information about these forms of abuse, see our resources on Image-Based Abuse and CCRI’s Safety Center.

New Report Highlights Trend in Sharing our Digital Lives
By Audace Garnett | Thu, 27 Jun 2024
https://www.techsafety.org/blog/2024/6/27/new-report-highlights-trend-in-sharing-our-digital-lives-2

Technology has allowed relationships to extend far beyond the physical realm, creating a complex web of shared digital spaces. For some, sharing digital access with intimate partners, such as passcodes, location data, and social media accounts, can feel helpful and can foster a sense of trust. Many couples don’t share constantly but may decide to share when one person is traveling or going for a run, for example. For survivors of abuse, however, a partner having access to their digital accounts can create serious risks. Unfortunately, many abusive individuals misuse this access to stalk, harass, threaten, and monitor their partners in order to maintain power and control.

A recent report from Malwarebytes titled WHAT’S MINE IS YOURS: How couples share an all-access pass to their digital lives highlights how digital sharing is common among couples, with 85% of people in committed relationships granting their partners access to personal accounts. The report explains just how commonplace this has become: “Sharing digital access with your partner appears to be inescapable: Whether it’s sharing accounts for household smart devices or granting your partner access to your social media accounts and messages, every single couple shares access in some way.”

This form of sharing is even more common among Gen Z, where 95% share access. This form of access is often seen as a major “key to building trust,” but it can also lead to undue pressure and significant regret. Forty-three percent of all respondents admitted feeling pressured to share their digital lives, with Gen Z and Millennials experiencing this pressure at higher rates than older generations.

The report goes on to share that nearly three in four partners acknowledge that there is much to learn about navigating a shared digital footprint. Half of those in a committed relationship admit that digitally disentangling their location from their partner would be difficult, given how much access they share, and 56% state that they could use some guidance on how to handle shared digital access.

The Safety Net Project at the National Network to End Domestic Violence (NNEDV) advised on the survey and report. We believe that privacy is deeply connected to safety, especially for survivors of abuse. This report highlights the critical need for all of us to examine norms around privacy as they exist within intimate relationships, as well as the need for tools that allow us to easily disentangle ourselves from shared accounts. While sharing account access or information can be useful, consent should remain at the core of these decisions. It should be a person’s decision to share, and no one should feel pressured into allowing access to their digital lives.

Malwarebytes has also developed an online resource hub to help people learn how to increase privacy and disconnect from shared accounts if they want to. To learn more about Securing Devices, take a look at our Securing Devices & Accounts and Assessing for Technology Abuse and Privacy Concerns resources in our Survivor Toolkit.

Proactive Location Tracking Alerts are Coming to Android
By Shalini Batra | Wed, 15 May 2024
https://www.techsafety.org/blog/2024/5/15/proactive-location-tracking-alerts-are-coming-to-android-1

On May 13th, 2024, Apple and Google announced an expanded partnership to address unwanted location tracking. Users of Android devices running version 6.0 and above will soon be able to receive alerts if an item that allows for location tracking is moving with them but not paired with their phone. This means that if someone misuses a location-enabled device to monitor another person’s movement and location without their consent or knowledge, that person will get an alert, giving them an opportunity to interrupt and stop the monitoring if it’s safe for them to do so.

Stalking and the monitoring of location without consent are common tactics of abuse. Small Bluetooth tracking devices that are intended to help us find our lost and missing items can easily be misused as surveillance devices without proper safeguards in place. It’s critical that the companies behind these devices continue working together to identify ways to improve their design and implementation to minimize the possibilities for abuse. We are happy that Apple and Google have committed to a long-term partnership for this purpose, and we welcome this news.

The alerts will launch on Android 6.0+ and iOS 17.5. If you’re unsure which version you have, go to your phone’s settings, look for “About Phone,” and check the software version. If you have an older version, you won’t get alerts for possible tracking. If you are concerned about stalking, you can download a Bluetooth tracker detection app to scan your surroundings for a tracker. These apps aren’t always accurate, however, so it’s always important to trust your instincts. For more information on technology safety planning, see the resources in the Survivor Technology & Privacy Toolkit and check out our recent blog on The New Anti-Stalking Features of iOS 17.5.

Enhanced Security: The New Anti-Stalking Features of iOS 17.5
By Chad Sniffen | Fri, 19 Apr 2024
https://www.techsafety.org/blog/2024/4/19/enhanced-security-the-new-anti-stalking-features-of-ios-175

The upcoming release of iOS 17.5 introduces significant enhancements to security and privacy, particularly in combating unwanted tracking. These updates are important advancements for personal safety, including for victims and survivors of tech-facilitated abuse.

Broadening Device Detection

Currently, Apple’s Find My network can send alerts to users with Apple devices when an unauthorized tracking accessory (one not connected to your device), like an AirTag or AirPods, is detected. This means that if someone placed an AirTag in another person’s belongings to track them without their knowledge, that person’s iPhone would notify them of the tracker. However, AirTags are not the only type of location-tracking device that can be misused for this purpose.

With the iOS 17.5 update, proactive notifications will also cover non-Apple and non-Find My-certified tracking devices, offering a more comprehensive safety net for iPhone users. This enhancement is part of a collaborative effort between Apple and Google to create a universal system that addresses unwanted tracking across both iOS and Android platforms. The system aims to identify and disable unauthorized tracking devices, with the goal of enhancing user privacy across a broader range of potential threats.

This collaboration and update should make it easier to identify and disable unapproved tracking devices, helping to interrupt stalking and unwanted tracking and providing victims with helpful documentation. When an unwanted tracker is detected, the user will receive a notification and can then follow instructions to disable the device and stop it from sharing location information.

These updates are set to roll out with the public release of iOS 17.5, expected in mid-to-late May. For more information about location trackers, including addressing possible stalking, visit Apple’s Help Center page on what to do if you receive a tracking notification, EFF’s Self-Defense How-to Guide to Detect Bluetooth Trackers, and our Survivor’s Guide to Location Tracking.

Black History Month Tech Spotlight
By Audace Garnett | Wed, 21 Feb 2024
https://www.techsafety.org/blog/2024/2/21/black-history-month-tech-spotlight

If you have ever used Zoom, Google Voice, or Slack Huddles, or made any type of call over the internet, you have trailblazer and powerhouse engineer Dr. Marian Rogers Croak to thank. Dr. Croak developed Voice over Internet Protocol (VoIP) technology, which allows us to connect with our colleagues, work remotely, join meetings, provide virtual video support to survivors, and chat with friends and family members.

In recent years, global use of this technology has increased, particularly during the pandemic lockdowns. For some survivors, this transformative technology provided access to advocacy and support from a local program; access to the justice system, allowing them to join remote court hearings; and access to virtual support groups, telehealth appointments, and more. VoIP has reshaped how we connect in our personal and professional lives and has given some survivors an alternative way to receive support.

VoIP technology converts your voice into a digital signal, enabling you to make calls using the internet from devices such as computers, cellphones, and tablets.

Born in New York City in 1955, Dr. Croak had a fascination with fixing things. A graduate of Princeton University and the University of Southern California, she holds a Ph.D. in social psychology and quantitative analysis.

She began her career in the tech world in the 1980s at AT&T Bell Laboratories, where she designed text-to-donate fundraising technology, which enables people to raise funds and make donations by texting a single word to a specific number. We first saw this technology during Hurricane Katrina, the devastating 2005 storm in Louisiana.

Dr. Croak holds over 200 US patents and continues to support women while advocating for diversity in the tech industry. She currently serves as the Vice President of Responsible AI and Human-Centered Technologies at Google. Formerly a VP for Site Reliability Engineering for Ads, Corporate Engineering, and YouTube, she is recognized as an innovator and powerhouse.

Dr. Croak is a mentor and an advocate who encourages Black women and girls to pursue careers in technology, a field where they remain underrepresented.

Dr. Croak was inducted into the National Inventors Hall of Fame in 2022 for her groundbreaking and innovative VoIP technology, making her one of the first two Black women to receive this honor. She has played a pivotal role in changing the way we use technology, providing us with the ability to connect like never before.

Learn more about Dr. Croak’s current work on Artificial Intelligence (AI) and more here.

Safety Incident Reporting Options Matter for Survivors
By Safety Net | Fri, 02 Feb 2024
https://www.techsafety.org/blog/2024/2/2/safety-incident-reporting-options-matter-for-survivors

Recently, some friends had travel experiences that left them wanting to report concerning behavior to the transportation provider. These experiences highlighted just how different reporting options can be across companies and platforms. One was with a ride-sharing app, where the reporting functionality was within the app itself, directly to the company. Another was using public transportation, and the reporting went through a webform and required communication with a prosecutor. This led us to look a bit closer at the way reporting flows are designed, and across industries they vary greatly.

Disclosing abuse, including domestic violence, sexual assault, harassment, and stalking, can be incredibly difficult, even if that disclosure is made through more accessible means like an app or an online platform. Victims have to navigate a new or unfamiliar system while also grappling with the impacts of the abuse itself.

People are often unaware of how to report abuse through a given tool or platform unless they have gone through the process themselves. When someone experiences violence while traveling in the world, the options for safety incident reporting matter.

Reporting design can have a significant impact not just on a survivor's willingness and ability to report abuse, but also on the survivor's overall well-being. Design can also directly influence the number of reports a company receives and its capacity to take meaningful action.

Advocates and researchers have long highlighted that sexual violence, domestic violence, and other forms of gender-based violence are all vastly underreported. People are often unsure of where or how to report, may not know whether what they have experienced is reportable, may fear not being believed or taken seriously, may hesitate to involve the legal system, and may distrust the ability of platforms and systems to handle their cases well.

What if investments in reporting technology could be leveraged to help put choice and confidence into the hands of victims and survivors, helping to facilitate reporting, making it easier for individuals to come forward, and potentially increasing accountability?  

Certain factors will leave victims, survivors, and all users feeling empowered and more comfortable reporting, whereas other factors might lead to underreporting or to users not feeling comfortable sharing their experience.

As we think more about the reporting experience, we must consider the following factors:

Are multiple reporting channels provided?

Creating various, accessible ways for victims and survivors to report any type of victimization could lead to an increase in the reports that companies and organizations receive. Providing multiple avenues for reporting helps to provide options for those who would like to report an incident. For some, reporting discreetly is important, while others may want to speak directly to a person. Whether it is through an in-app button, a safety crisis line, a website, or an email, companies creating products, platforms, and devices for consumers should prioritize diverse reporting methods for all of their users. For example, if the only clear reporting options are within an app available to the consumer, how does a bystander report an incident? Or, what are the options for a survivor of domestic violence whose abuser is monitoring their phone activity?

How visible are reporting options for survivors? 

Following an incident, platform users deserve to have information on reporting options at their fingertips. When reporting options are not visible, this can deter reporting; when they are visible, it is more likely that victims and survivors will use them. Companies and organizations should prioritize making reporting channels visible and the options easy to understand, and they should also focus on socializing their technology with the public.

Are survivors informed on how their report will be used?

For many survivors, the experience may have been traumatizing and the impacts may be long-lasting. It’s on all of us to minimize the challenges people can face in attempting to report abuse and find resources. When a survivor makes a decision to report to a company or organization, they deserve clarity on how the personal information provided in a report will be used. For example, some survivors want to report to a company or an organization, but they are not interested in reporting what they have experienced to law enforcement for a variety of reasons. Providing users with information about the technology they are using, and building policies that provide survivors with control over how, when, and with whom information gathered in reports is shared, creates opportunities to build trust with consumers, creates a system of accountability, and offers a compassionate and survivor-centered response to a traumatic experience.

Are survivors and victims able to access support in a timely manner?

When users are ready to report an incident, it is critical to provide access to timely support. Whether through a 24/7 reporting line or in-app reporting mechanism, how quickly someone receives a response from companies is integral to maintaining their agency and healing. 

Conclusion

While these technology and design questions may seem like simple product decisions, they have a direct impact on consumers’ trust in the platform as well as a victim or survivor's willingness and ability to come forward and report their experience. The reporting functionality is just the beginning. This is where survivors can disclose and companies can respond. It’s critical that any reporting options get a thorough review to ensure that they meet the needs of users and encourage communication rather than hinder it. After a report is received, there are many other things to consider to ensure that companies are communicating with sensitivity and with safety in mind, deleting or retaining information appropriately, and following strict policies on how information is kept confidential so as to keep survivors in control of their story.

Data Privacy Day 2024
By Jessie L | Mon, 29 Jan 2024
https://www.techsafety.org/blog/2024/1/29/data-privacy-day-2024

[Image via the Stalking Prevention and Awareness Resource Center (SPARC)]

January is National Stalking Awareness Month, and January 28 is Data Privacy Day. In observance of these, we recognize four pillars for ensuring both survivors’ and the general population’s well-being: Safety, Privacy, Security, and Access. These guide our work every day in helping to achieve technology safety for all. While they go hand-in-hand, we’ll look at each one below.

Safety: Safety is the result that we are working for when we implement the other pillars. Protecting location, online activity, home and work addresses, or children’s whereabouts may be critical for a survivor’s safety. Survivors have a right to technology and shouldn’t have to choose between staying safe and using a device or platform. Many people rely on the internet to shop, look for jobs, search for resources, and maybe even conduct business as part of their livelihood. Abusers often isolate survivors as a tactic of abuse, and survivors may use the internet to decrease isolation. Strong privacy and security policies and settings, along with access to technology, help keep all of us safe.

Privacy: January 28 is Data Privacy Day, but we – survivors and non-survivors both – always need data privacy. Whether we are talking about everyone’s needs to keep their medical and financial data private, stalking survivors’ needs to know that the platforms they use are not sharing their or their support networks’ information, or the needs of LGBTQ+ youth to seek support and community without being outed to non-supportive family, privacy benefits everyone. Strong privacy policies, settings, and protections give survivors and everyone else one way to take back control over their digital presence.

Security: Having a secure way to communicate with trusted individuals, seek online resources or help, or have a place to store legal, health, or other personal documents is incredibly important. We share our information when using online spaces, services, and apps and hope that it remains secure. Strong security measures help ensure that personal information does not get into the wrong hands.

Access: Building a platform that centers the needs of its users, including survivors, by design, means considering the accessibility needs of those who have disabilities, speak languages other than the ones spoken by the designers, or have context-specific or culturally specific privacy and safety needs. Accessibility barriers that keep survivors from getting assistance can be a significant safety risk; so can accessibility features and services that require users to make large privacy and security compromises. Accessibility without loss of safety, in products, platforms, and technologies, should be a core tenet of our work.

Building and using technology with all this in mind can be challenging. For survivors, it can be exhausting and terrifying. One encouraging development is that more and more online platforms and services are building in End-to-End (E2E) Encryption to protect the privacy and security of users and their data. We have always been happy to see these announcements and even more thrilled when the platform has clearly also considered safety and accessibility! You can learn more about E2E Encryption by reading the resource that Safety Net developed with the Internet Society to help survivors and service providers understand encryption more.

]]>
Data Privacy Day 2024
Meta enables end-to-end encryption by defaultChad SniffenTue, 12 Dec 2023 18:59:23 +0000https://www.techsafety.org/blog/2023/12/12/meta-enables-end-to-end-encryption-by-default51dc541ce4b03ebab8c5c88c:51dc541de4b03ebab8c5c890:6578abf7ae728e5868849adcStrong encryption is an important element of technology safety for survivors of violence, and for the advocates who support them. Privacy technologies, like encryption, can help survivors seek support and resources with less risk of abusers becoming aware of those help-seeking activities. Encryption can also limit online data-collection processes that may identify a survivor’s help-seeking though targeted advertisements, tailored search results, or recommended content.

Last week, Meta announced that it would enable end-to-end (E2E) encryption for personal messages and calls on Messenger and Facebook, options that were previously available but not turned on by default. This change means that communications between two people using Messenger or Facebook are unlikely to be read or otherwise intercepted by a 3rd party, including employees of Meta and Meta’s platforms.

Safety Net supports and encourages the availability of encryption options for both communications and data storage. Survivors will benefit from this change, even though E2E encryption will not significantly change the way that messaging is commonly misused against them (E2E will not protect against a person being forced to unlock their device or against spyware monitoring a device’s activity). E2E encryption means that survivors can actively participate in, and make decisions about, the use of their communications during court proceedings, since Meta cannot itself provide that information.

Survivors will also benefit from this change because Meta’s platforms will offer more security by default in the event that they are used for communications with advocates. Most programs for survivors of domestic violence, sexual assault, and stalking in the United States receive funding through the Violence Against Women Act (VAWA). VAWA-funded programs have strict confidentiality obligations to protect the personal information of survivors from any non-consensual disclosure, including disclosure to the companies that own messaging apps and other communications platforms. Although social media platforms are not designed for service provision and are not best for sensitive conversations, they can still be the platform that some survivors prefer to use. By enabling E2E encryption by default, Meta is helping to ensure confidentiality and privacy in instances where survivors are communicating with programs through Messenger and Facebook. That means advocacy can be more accessible to survivors, and survivors have more options for finding the help and support they deserve.

]]>
Meta enables end-to-end encryption by default
Ring Donation Program Renewed for Another YearChad SniffenThu, 07 Dec 2023 13:00:00 +0000https://www.techsafety.org/blog/2023/12/6/ring-donation-program-renewed-for-another-year51dc541ce4b03ebab8c5c88c:51dc541de4b03ebab8c5c890:6570b768e31a23521e30bab4NNEDV is pleased to announce that the Ring Donation Program for Domestic Violence Survivors has been extended for another year! The first round of the program, which ran from November 2022 to October 2023, saw 10,000 devices distributed to domestic violence advocacy organizations throughout the United States and U.S. Territories. There was significant positive feedback and a clear demand for a continuation.

 

This next round will again provide 10,000 devices. Organizations who are providing direct services to survivors are eligible to access this program, including those that received devices during the first round. As before, Ring will be donating Video Doorbells, Security Cameras, and a free Ring Protect Plan subscription for the life of each device.

The donation program will continue to include the sharing of privacy and security best practices with organizations to ensure advocates can effectively support survivors who want to use these devices. In line with prioritizing privacy and safety, Ring will never have information about the individual survivors who receive the donated devices. Interested domestic violence organizations can review eligibility requirements and request devices here.

Safety Net has created additional resources for survivors and advocates to assist in enhancing survivor safety. You can access our FAQs and other resources here. We will also communicate product and feature feedback from the field to Ring to ensure that survivor safety is kept at the forefront of Ring’s work. If you have any questions about the program or applying for the devices, please contact us.

]]>
Ring Donation Program Renewed for Another Year
Norton Donates 5000 Free Product Licenses for Domestic Violence SurvivorsChad SniffenTue, 14 Nov 2023 19:34:12 +0000https://www.techsafety.org/blog/2023/11/14/norton-donates-5000-free-product-licenses-for-domestic-violence-survivors51dc541ce4b03ebab8c5c88c:51dc541de4b03ebab8c5c890:6553c8c7a8d2362e6ff6faf2Through a unique partnership with TechSoup, and in collaboration with the National Network to End Domestic Violence, Norton is donating up to 5,000 free product licenses to survivors to help them recover from financial or technological abuse. Licenses are available for Norton 360 Deluxe and Norton Secure VPN. From the Gen Digital blog:

“Norton 360 Deluxe, one of the products offered through the new donation program, can prevent activity or location tracking, information theft, installation of malicious programs and uninvited changes to devices. We are also offering Norton Secure VPN as a standalone product, which protects a person’s online privacy by hiding the computer’s address from websites visited from any device. Eligible organizations may request up to 100 licenses of each product, which then must be distributed to survivors of domestic abuse.”

Domestic violence survivors face unique challenges to securing their technology and information online. While tools like these can make it easier for survivors to increase their security online, often the cost of these products prevents survivors from accessing them. Security tools like those offered in this partnership can be a valuable piece of a survivor’s tech safety planning. Domestic violence service providers who are members of the Safe Shelter Collaborative can apply online through TechSoup to help connect survivors with these donations.

]]>
Norton Donates 5000 Free Product Licenses for Domestic Violence Survivors
Global Encryption Day 2023Current EventsJessie LFri, 20 Oct 2023 20:18:10 +0000https://www.techsafety.org/blog/2023/10/20/global-encryption-day-202351dc541ce4b03ebab8c5c88c:51dc541de4b03ebab8c5c890:6532e086acfe264c1f0d15b4On October 21, NNEDV’s Safety Net Team marks Global Encryption Day 2023. Internet-connected technology is now part of people’s everyday lives. This makes it important for survivors to have access to secure online communication channels. Survivors strategically use these channels for many reasons. They make plans, store and send evidence, and seek help. For safety reasons, all of these require strong encryption. Encryption is the process of turning readable data into unreadable data. It can be used to protect survivors’, service providers’, and consumer companies’ data. When using an encrypted method for communicating, disclosures and requests for assistance that survivors share with others cannot be intercepted and read by anyone else. Encryption is critical to survivors’ privacy, safety, and self-determination.
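As a toy illustration of that definition (this is NOT a real cipher and should never be used to protect actual data — real systems use vetted algorithms such as AES), encryption turns readable plaintext into unreadable ciphertext, and only someone holding the key can reverse it:

```python
import secrets

def xor_bytes(data: bytes, key: bytes) -> bytes:
    # XOR each byte of the data with the corresponding key byte.
    # Applying the same key a second time undoes the operation.
    return bytes(b ^ k for b, k in zip(data, key))

message = b"readable data"
key = secrets.token_bytes(len(message))   # one random key byte per message byte

ciphertext = xor_bytes(message, key)      # unreadable without the key
recovered = xor_bytes(ciphertext, key)    # readable again with the key

assert recovered == message
```

Anyone who intercepts `ciphertext` without `key` sees only random-looking bytes; that is the property strong encryption provides at scale.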

Some encryption methods are stronger than others. End-to-end (E2EE), zero-knowledge encryption is uniquely strong. It means that data is secure on each step of its journey. When used with third-party platforms, it means that even the companies providing our tech – such as email providers, Internet Service Providers, and database companies – cannot access it. The safety and privacy of survivors is vital. Without secure encryption measures, they can be placed at further risk. In addition, isolation and monitoring of communication are common tactics of abuse. Survivors need secure methods to communicate with their friends and providers to navigate the abuse safely and strategically.
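To make the end-to-end idea concrete, here is a minimal hypothetical sketch (reusing the same toy XOR cipher purely for illustration — real E2EE platforms use vetted protocols): the key exists only on the two endpoint devices, so a relay server in the middle stores and forwards nothing but ciphertext it cannot read.

```python
import secrets

def xor_bytes(data: bytes, key: bytes) -> bytes:
    # Toy reversible cipher: XOR with a shared key (illustration only).
    return bytes(b ^ k for b, k in zip(data, key))

# The key is shared only between the two endpoints; the server never sees it.
shared_key = secrets.token_bytes(64)

def send(plaintext: bytes) -> bytes:
    # Encryption happens on the sender's device, before the server is involved.
    return xor_bytes(plaintext, shared_key)

def receive(ciphertext: bytes) -> bytes:
    # Decryption happens on the recipient's device, after the server hands it over.
    return xor_bytes(ciphertext, shared_key)

# The server only ever holds ciphertext — the "zero-knowledge" property.
server_inbox = [send(b"message for a trusted advocate")]

assert receive(server_inbox[0]) == b"message for a trusted advocate"
```

In a real end-to-end system the key exchange itself is also protected, so even the company operating the server cannot recover `shared_key`.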

The COVID-19 pandemic led to a significant increase in remote communications tools being created and used within victim services. These can greatly increase accessibility and help survivors of gender-based violence who want to connect with services. But to be safe and helpful, they must be secure. This security requires strong encryption.

The Internet Society has more information on encryption and Global Encryption Day. Service providers who want to learn more about secure communication technology when working with survivors can check out our Digital Services Toolkit.

]]>
Emergency Alerts and Hidden DevicesAudace GarnettThu, 28 Sep 2023 21:06:54 +0000https://www.techsafety.org/blog/2023/9/28/emergency-alerts-and-hidden-devices51dc541ce4b03ebab8c5c88c:51dc541de4b03ebab8c5c890:6515e98bc596b82e304370f1Survivors of domestic violence are creative, resourceful, resilient, and tech-savvy. They often devise unique strategies to protect themselves and maintain their privacy and safety. One of the strategies that survivors may use is having a hidden phone, which often serves as a lifeline for connecting to support and assistance.

On Wednesday, October 4, 2023 at 2:20pm ET (1:20pm CT/12:20pm MT/11:20am PT), every cell phone user nationwide will receive a Wireless Emergency Alert (WEA). WEA tests are sent out by authorized government agencies to ensure that every cellphone can receive warnings about national emergencies such as natural disasters, public safety threats, and more.

While these alerts are invaluable for public safety, they can pose risks to survivors who have hidden phones by alerting the abuser that the phone exists. Alerts such as Amber Alerts are a common occurrence for anyone with a cellphone, and they can be turned off through your phone settings for year-round protection from unexpected alerts. However, this upcoming national test cannot be disabled or turned off within the device. Therefore, survivors should power off their hidden devices during the test and avoid scheduling phone calls on them during that time.

The decision to have a hidden device is one that a survivor should make with careful consideration based on the potential risks involved. In some situations, it may not be safe or advisable to have a hidden device, as it could escalate violence or lead to further harm if the device is discovered. Survivors should always trust their instincts and prioritize their safety above everything.

Read more about how survivors who own hidden devices can protect themselves ahead of the upcoming WEA test alert and learn more about securing devices and accounts in our recently created resource in partnership with Norton.

If it is safe to do so, survivors can always contact one of the national hotlines to get connected with a local advocate who can provide them with safety planning support, and information about local resources.

]]>
Emergency Alerts and Hidden Devices
Leave the Meeting: A Recommendation for Advocates Concerned About AI Meeting AssistantsChad SniffenTue, 05 Sep 2023 13:00:00 +0000https://www.techsafety.org/blog/2023/9/1/leave-the-meeting-a-recommendation-for-advocates-concerned-about-ai-meeting-assistants51dc541ce4b03ebab8c5c88c:51dc541de4b03ebab8c5c890:64f24ee3924ada05e241c910Several organizations have contacted Safety Net about recent changes in Zoom’s privacy policies, and we have followed much of the reporting about these changes and updates from Zoom. Achieving clarity on Zoom’s privacy policies is crucial since advocates and service providers often discuss confidential information that they are obliged to protect. Disclosure of that information could put survivors at risk, and the nature of AI models and training processes implies that confidential information could be exposed in unpredictable ways.

Of concern to most people is the new language of Zoom's policy sections 10.2 and 10.4, which imply that Zoom has broad rights to collect user content (video, audio, and text) for use in training AI models. See here and here for more detailed discussions about these concerns. Zoom’s response, given here, has been that “we do not use audio, video, or chat content for training our models without customer consent.”

“Consent” from Zoom’s perspective seems to be obtained when the administrator of a Zoom account has enabled Zoom IQ Chat Compose or Meeting Summary. Once enabled, participants joining a meeting will see a consent notice screen that may look something like this:

A meeting participant’s view of the Meeting Summary notification once the host enables the feature. (Zoom)

From a victim advocate confidentiality perspective, the most obvious problem with AI training is the human review component that is part of the training process. As Zoom states here (emphasis mine):

Zoom’s use of your Zoom IQ feature data, when shared by your account owner or admin, will include automated processing methods and may include manual review. Automated processing applies methods that utilize artificial intelligence technologies to provide predictions and improve the accuracy of automated responses. Zoom may retain your data for up to 18 months for automated processing. Manual review occurs when Zoom employees, or contractors working on Zoom’s behalf, manually review these inputs and automated responses to provide human feedback to improve accuracy and quality. Zoom may retain data for manual review for a longer period.

The manual process that is a requirement of AI training means that a Zoom employee could have access to any information shared in a Zoom meeting where Zoom IQ features are enabled, and any confidential information shared in that meeting would be disclosed. Although Zoom is very clear that meeting participants retain their "ownership" of shared content, that ownership is irrelevant to an advocate's confidentiality obligations and the safety of survivors.

As such, we strongly recommend that advocates refuse to participate in any meetings where "Meeting Summary" or other Zoom IQ features are enabled, if it is possible that any confidential or otherwise sensitive information may be discussed. If organizations want to use the service for meetings that do not involve survivor information or sensitive content in any way, that is the decision of the organization based on the comfort level of the staff involved.

Regarding Microsoft Teams, Microsoft has had "Intelligent Recap" and other AI-based tools available on its Teams Premium tiers since February. Those other tools include 365 Copilot (Copilot for Teams) and Live Captions, and Microsoft also makes an Azure Cognitive Services (their AI platform) software development kit available to developers who make 3rd party products for the Teams platform. Like all AI products, these tools require human review of their outputs to train them to be more effective, and it is not clear how Microsoft will make use of user-generated content in that process. While Microsoft's Privacy Policy was not updated when these products were released (though it has been updated recently), substantial data security and privacy concerns at Microsoft have been reported on this year.

We also recommend avoiding meetings using Teams where Intelligent Recap or Live Captions are enabled for conversations that can involve confidential or sensitive information. Teams Premium products are not used as widely among non-profits as Zoom, however, so these products may not come up for many service providers.

As artificial intelligence continues to advance and become integrated into numerous products and services we rely on, it becomes increasingly important for us as service providers and advocates to be well-informed about how it may impact survivors. It's imperative that we remain vigilant to ensure that technologies do not inadvertently harm those who are most vulnerable.

In light of this, we suggest that you continue to stay informed and engage in a conversation with the administrator of your organization's Zoom account about disabling this feature, or keeping the feature disabled if it has not been enabled. Express the concerns highlighted above and emphasize how these features might compromise the safety and well-being of survivors who depend on these services.

]]>
Leave the Meeting: A Recommendation for Advocates Concerned About AI Meeting Assistants
Google Brings Unknown Tracker Alerts to AndroidLaisa SchweigertWed, 30 Aug 2023 21:17:06 +0000https://www.techsafety.org/blog/2023/8/30/google-brings-unknown-tracker-alerts-to-android51dc541ce4b03ebab8c5c88c:51dc541de4b03ebab8c5c890:64efaff0de917815ff4715c8

Example of an unknown tracker alert

Google has announced that unknown Bluetooth tracker alerts are coming to Android devices. This update is intended to prevent stalking facilitated by Bluetooth trackers by identifying when a tracker is following a device it is not paired with and then notifying the owner of the device. Once notified, users are given information on locating the device, how long the device has been traveling with them, and ways to disable it. Google has made a helpful video (see below) including a visual walk through of how unknown tracker alerts will function on Android devices. For more information on how unknown tracker alerts work, visit Google’s blog and their unknown trackers support site.

Bluetooth trackers are typically marketed as a convenient way to locate lost keys, pets, or valuables. As they become more affordable and easier to use, however, these trackers have been repeatedly misused to stalk unknowing victims. Bluetooth trackers are small, often make no noise, and can operate for long periods of time on little power – all features which make them convenient to use for their intended purpose AND silent surveillance. These features make it easy for an abusive person to put a Bluetooth tracker inside a purse, car, or stuffed animal undetected where it can remain unnoticed until they choose to retrieve it. This can turn an ordinary object into a tracking device.

Safeguarding location is an absolute necessity for survivors of domestic violence, sexual assault, stalking, and other forms of abuse, particularly when they are fleeing violence. These trackers are easily misused by abusers as instruments to surveil, exert control, and foster an environment of intimidation. The ability of Bluetooth trackers to monitor a survivor’s location and movement without being discovered creates a potential additional vulnerability for survivors of violence, and one that can be uniquely difficult to detect. This vulnerability is lessened with the built-in ability of devices to detect unknown trackers.

While this update will only detect Apple AirTags at the moment, Google is continuing to work with other tracking device manufacturers through a joint industry specification to expand detection to other Bluetooth trackers. It’s imperative that tech companies continue to work towards the universal ability to detect and disable unknown trackers. NNEDV’s Safety Net team partnered with the Center for Democracy and Technology to advise on these standards and released a joint op-ed highlighting recommendations for safer Bluetooth tracker technology. We will be working to gather stories and examples from the field of misuse and how a universal process for proactive notification could be helpful for the safety and well-being of survivors.

For more information regarding efforts to address unauthorized location tracking, check out our May 2nd blog post.

]]>
Google Brings Unknown Tracker Alerts to Android
New Guide to Securing Devices and AccountsJessie LWed, 12 Jul 2023 14:04:07 +0000https://www.techsafety.org/blog/2023/7/12/new-guide-to-securing-devices-and-accounts51dc541ce4b03ebab8c5c88c:51dc541de4b03ebab8c5c890:64aeae78d8098a0d81185d0eWe're thrilled to continue our partnership with Norton by launching "Securing Devices and Accounts," a NEW privacy- and security-focused resource for survivors of abuse, stalking, and other gender-based violence. You can read more about the partnership and the resource launch at their blog. Check it out and see our Survivor Resources Toolkit for more survivor resources!

]]>
New Guide to Securing Devices and Accounts
Partnering to Address Location TrackingShalini BatraTue, 02 May 2023 17:24:08 +0000https://www.techsafety.org/blog/2023/5/2/partnering-to-address-location-tracking51dc541ce4b03ebab8c5c88c:51dc541de4b03ebab8c5c890:6451471ac7af535f2352fcc9Today, Apple and Google announced they are partnering to address the misuse of Bluetooth location tracking devices. Unwanted tracking is an increasingly common tactic of abuse and concern for survivors. NNEDV and the Center for Democracy and Technology (CDT) have been advocating for universal standards to minimize opportunities for abuse and to decrease the burden for detecting possible trackers. Part of that advocacy effort was shared in a joint op-ed where we provided some recommendations for standards. Without universal solutions, there are limited proactive notifications to let someone know there could be a tracker following them and survivors have to download multiple apps to scan and detect a nearby tracker. This partnership is a significant step towards creating the universal standards necessary to address these concerns.

To learn more, check out Apple’s newsroom post, Google’s Security Blog, and our joint statement with CDT.

]]>
March 2023 - New & Updated Tech Safety ResourcesJessie LMon, 13 Mar 2023 20:38:42 +0000https://www.techsafety.org/blog/2023/3/13/march-2023-new-amp-updated-tech-safety-resources51dc541ce4b03ebab8c5c88c:51dc541de4b03ebab8c5c890:640f88a552aaca0ab353c7f5Safety Net is excited to announce our new and updated resources (below)! We hope these materials will help survivors and victim service providers who are trying to navigate technology, privacy, and safety. All of the materials are available in both English and Spanish.

Passwords: Increasing Your Security

This resource is an overhaul of our previous resource on this topic, containing new content. It explains the risks to survivor privacy posed by insecure passwords and different potential strategies to address this issue in privacy planning. It is aimed at survivors, but the content is also useful for victim service providers, both in terms of doing tech advocacy with survivors, and in protecting their own accounts.

Online Privacy and Safety Tips

This resource updates our previous content on online privacy and safety, as well as adding new content aimed at both survivors and providers.

Mobile Advocacy: Privacy & Safety

This resource, aimed at victim service providers, is an overhaul and combination of two previous closely-related resources – one on using mobile phones to communicate with survivors, the other on mobile advocacy – into a single document. It contains additional new content.

]]>
Teen Dating Violence Awareness Month (TDVAM)Audace GarnettFri, 24 Feb 2023 19:32:09 +0000https://www.techsafety.org/blog/2023/2/24/teen-dating-violence-awarenss-month-tdvam51dc541ce4b03ebab8c5c88c:51dc541de4b03ebab8c5c890:63f90ed0164a684657d15d2f

February is Teen Dating Violence Awareness Month (TDVAM). Although technology has advantages, it also introduces serious concerns. We should continually – not only in February – raise awareness about the misuse of technology in abusive teen relationships. Dating is a normal part of adolescent development, and teens are increasingly using technology for entertainment, connection, and communication. Teens can and do use technology in positive ways that help them form healthy relationships and develop their sense of identity. It is important that adults recognize this and encourage and promote positive usage of tech.

As we embrace these new ways of communication and connection, it is critical to comprehend the prevalence and trends of teen dating violence. As teens build relationships online, they can be deliberate in using key skills and practices to make and keep their relationships healthy. Setting healthy boundaries, respecting their partner's interests, and communicating are vital skills that young people can use in their dating relationships and beyond.

Check out our new TDVAM resource, Five Truths About Dating At A Distance, in partnership with our grant partner, Tonjie Reese of eleven24. This resource is also available in Spanish.

 

]]>
Teen Dating Violence Awareness Month (TDVAM)