Period tracking app Flo releases anonymous mode and more digital health briefs

Period tracking app Flo launched its previously announced anonymous mode, which the company said will allow users to access the app without associating their name, email address and technical identifiers with their health data.

Flo partnered with security company Cloudflare to build the new feature and released a white paper detailing its technical specifications. Anonymous mode has been localized into 20 languages, and it is currently available for iOS users. Flo said Android support will be added in October.

“Women’s health information should not be a liability,” Cath Everett, VP of product and content at Flo, said in a statement. “Every day, our users turn to Flo to gain personal insights about their bodies. Now, more than ever, women want to access, track and gain insight into their personal health information without fearing government prosecution. We hope this milestone will set an example for the industry and encourage companies to raise the bar when it comes to privacy and security principles.”

Flo first announced plans to add an anonymous mode shortly after the Supreme Court’s Dobbs decision that overturned Roe v. Wade. Privacy experts raised concerns that the data contained in women’s health apps could be used to build a case against users in states where abortion is now illegal. Others have argued other types of data are more likely to point to illegal abortions.

Still, reports and studies have noted that many popular period tracking apps have poor privacy and data sharing standards. The U.K.-based Organisation for the Review of Care and Health Apps found most popular apps share data with third parties, and many embed user consent information within the terms and conditions.


Brentwood, Tennessee-based LifePoint Health announced a partnership with Google Cloud to use its Healthcare Data Engine to aggregate and analyze patient records.

Google Cloud’s HDE pulls and organizes data from medical records, clinical trials and research data. The health system said using the tool will give providers a more holistic view of patients’ health data, including offering analytics and artificial intelligence capabilities. LifePoint will also use HDE to build new digital health programs and care models as well as integrate third-party tools.

“LifePoint Health is fundamentally changing how healthcare is delivered at the community level,” Thomas Kurian, CEO of Google Cloud, said in a statement. “Bringing data together from hundreds of sources, and applying AI and machine learning to it will unlock the power of data to make real-time decisions — whether it’s around resource utilization, identifying high-risk patients, reducing physician burnout, or other critical needs.”


The National Institutes of Health announced this week it will invest $130 million over four years, contingent on the availability of funds, to expand the use of artificial intelligence in biomedical and behavioral research.

The NIH Common Fund’s Bridge to Artificial Intelligence (Bridge2AI) program aims to build “flagship” datasets that are ethically sourced and trustworthy, as well as determine best practices for the emerging technology. It will also produce data types that researchers can use in their work, like voice and other markers that could signal potential health problems.

Though AI use has been increasing within the life science and healthcare areas, the NIH mentioned its adoption has been slowed as a result of biomedical and behavioral datasets are sometimes incomplete and do not comprise details about knowledge kind or assortment circumstances. The company notes this could result in bias, which specialists say can compound existing health inequities

“Generating high-quality ethically sourced datasets is crucial for enabling the use of next-generation AI technologies that transform how we do research,” Dr. Lawrence A. Tabak, who is currently performing the duties of the director of NIH, said in a statement. “The solutions to long-standing challenges in human health are at our fingertips, and now is the time to connect researchers and AI technologies to tackle our most difficult research questions and ultimately help improve human health.”
