The Markets
If you’ve ever waited in traffic while the center section of a bridge lifts to allow ships and sailboats to pass underneath, you may have noticed the enormous counterweight that lowers as the bridge moves higher. When the boats have passed, the counterweight rises, and the bridge lowers back into place.
The Federal Reserve (Fed) often acts as a counterweight to the economy, raising and lowering interest rates to achieve its goals. Recently, the Fed has been raising rates to bring inflation down. Higher rates make borrowing more expensive, which slows economic growth and reduces demand for goods.
Over the past 18 months, the Fed has raised the effective federal funds rate from near zero to 5.33 percent. Last week, data suggested its efforts were working. The Personal Consumption Expenditures Price Index showed that headline inflation dropped from a peak of 6.8 percent in June 2022 to 3.3 percent in July 2023.
Don’t Trust Your Ears
There are pros and cons to artificial intelligence (AI). On the pro side, many people have found AI-powered digital assistants helpful. The assistants schedule events and offer reminders. They relay timely information about weather and traffic, and they help manage lights, thermostats, ovens, and other smart devices in homes.
On the con side, they’ve become a valuable tool for scammers. Recently, criminals have been using AI-generated voices to scam family members, friends, and financial institutions.
The potential for vocal deception was demonstrated at a recent Senate hearing, which featured “a faked voice recording that was written by ChatGPT and vocalized by an audio application trained on [a U.S. Senator’s] Senate floor speeches,” reported Matt Berg of Politico. “If you closed your eyes at the beginning of the hearing, you couldn’t have told that we were playing a voice clone of myself,” commented the Senator.
Deepfake audio also has been used to mimic the voices of friends and family members. In another hearing, a mother who was targeted shared the story of receiving a phone call from her terrified teenage daughter and her kidnapper, who demanded a ransom. Only, it wasn’t the daughter; “it was an AI-generated voice that sounded just like her,” reported Carter Evans and Analisa Novak of CBS News.
There are ways for families and friends to protect against voice scams. These include:
– Choose a code word. Then, if a suspicious call is received, ask the caller for the code word.
– Call or text the person who supposedly is making the emergency call (or someone who is with them) to verify the story.
In the example above, the mother called her husband, who confirmed their daughter was safe. Since voice cloning often relies on publicly available audio, it also is wise to make social media accounts private and accept only followers you know.