Rave Mobile Safety Acquires AppArmor, Extends Market Reach in Safety and Incident Response Solutions

Rave Mobile Safety, maker of the Rave Panic Button app used by 10,000 K–12 schools across the nation, today announced the acquisition of AppArmor, a major provider of custom-branded mobile safety and emergency notification apps serving hundreds of higher education, public safety, corporate, and healthcare organizations.

The acquisition will merge the “scalability and robustness of the Rave platform with agile mobile app customization through AppArmor to deliver best in class technology for critical communication and collaboration,” a news release said.

“AppArmor shares our mission to disrupt and innovate critical communications and incident collaboration and has established a robust customer base that speaks to their ability to deliver technology that makes a difference,” said Rave CEO Todd Piett. “With the addition of AppArmor, the combined company can now offer our clients an even broader suite of market-leading solutions to enhance safety and response for higher education institutions, public safety agencies and private enterprises.”

Founded in 2011, AppArmor provides K–12 anonymous tipping solutions that are used statewide in Florida and Hawaii; the company’s newest product is a COVID-19 module used for vaccination validation.

The deal combines Rave’s multi-modal mass notifications, crisis management solutions, and deep integration with 911 systems and emergency-response processes with AppArmor’s configurable app capabilities and content management. With the addition of AppArmor’s higher education customer base to Rave’s existing clientele across the U.S. and Canada, Rave will now protect 75% of North American higher education institutions, according to the news release.

Learn more at RaveMobileSafety.com and AppArmor.com.

About the Author

Kristal Kuykendall is editor, 1105 Media Education Group. She can be reached at [email protected].

