No Gains, Just Pains As 1.6m Fitness Phone Call Recordings Exposed Online

Exclusive: Sensitive info from hundreds of thousands of gym customers and staff – including names, financial details, and potentially biometric data in the form of audio recordings – was left sitting in an unencrypted, non-password-protected database, according to a security researcher who got it shut down.

Leaky database hunter Jeremiah Fowler claims he discovered the wide-open AWS repository managed by HelloGym in late July and shared his findings with The Register.

The database remained open for a week, and Fowler said it took a bit of digging to determine who was responsible for the repository of audio calls.

“It was only after calling, asking individual gyms that mentioned their locations in the recording,” he told The Register. “I asked who they use to record their calls and one of the managers finally told me.”

HelloGym provides sales, marketing, phone-answering, and VoIP call services for top fitness brands including Anytime Fitness, Snap Fitness, and UFC Gym. The database contained 1.6 million audio files from franchise locations of some of the largest fitness chains in the US and Canada. Some of these calls were also shared with The Register.

HelloGym declined to comment for this story.

The audio recordings, all stored as MP3s, mentioned people’s names, phone numbers, and reasons for the call, such as to renew or cancel memberships. Based on the file dates and timestamps, these calls and voice recordings were collected between 2020 and 2025.

According to Fowler, the database was likely a storage repository for VoIP audio files intended for internal use only.

“A very large number of the recordings referenced payment and billing issues,” Fowler said. “Although I didn’t hear any credit card numbers in the audio, it shows that members were comfortable discussing payment information over the phone.”

He added that these calls could have been monitored by criminals looking to perform an adversary-in-the-middle attack: intercepting the call or voicemail, calling the gym member back in real time while posing as an employee, asking the member to confirm payment information or pay a phony cancellation fee – and then stealing their credit card or bank account details.

The audio files could be played in any web browser without requiring specialized software or a password to listen to them.

In some of the calls, gym employees called the corporate headquarters or client services department and provided their own names, gym number, and personal passwords to verify themselves before requesting account changes for members. This also presented an opportunity for crooks to steal valid credentials, which could have been used to impersonate gym staff in a Scattered Spider-style social engineering attack.

Voice cloning and deepfakes

And all of these potential scam scenarios arise before AI even enters the threat mix.

As Microsoft previously documented in 2023, AI tools such as VALL-E can clone a human voice with just three seconds of audio. Last year, the tech giant opted not to release its VALL-E 2 project to the public because of potential abuses “such as spoofing voice identification or impersonating a specific speaker.”

So it’s not outside the realm of possibility that digital thieves could use people’s voice recordings to steal their identities and/or create deepfake audio or video to, say, impersonate a company executive and ask a finance employee to transfer money out of a business account.

“It is a real potential risk that is no longer a hypothetical,” Fowler said. “Many social media accounts are filled with family, friends, employment, and other valuable information that could indicate if a person has a high net worth or is a valuable target. Cross-referencing PII in previous breaches and open source data would give criminals a real understanding of an individual.”

He added that if a criminal wanted to impersonate someone for a social engineering scam, “it adds an additional layer of risk to have their biometric voice data and their personal information gathered from other sources. The way social engineering works is building trust and hearing a familiar voice could easily trick someone into believing the call is from someone they know.”

In addition to warning individuals – specifically, be careful about what personal and financial information you leave in voicemail recordings – the database serves as a cautionary tale for organizations collecting customer information and biometric data.

Top of the list, according to Fowler, is to use encryption, which will ensure that if the data is accidentally exposed, the files aren’t immediately readable. He also suggests performing penetration testing, which can help identify misconfigured or open storage systems.

Finally, he advises businesses to segment data that they are no longer using. “Far too often I see organizations storing years’ worth of records in a single database and not deleting old files,” Fowler said. “As a general rule, it is a good strategy to securely back up old data to limit the exposure in the event of a data incident.” ®

