Insurers Want to Know How Many Steps You Took Today
The cutting edge of the insurance industry involves adjusting premiums and policies based on new forms of surveillance.
A smartphone app that measures when you brake and accelerate in your car. The algorithm that analyzes your social media accounts for risky behavior. The program that calculates your life expectancy using your Fitbit.
This isn’t speculative fiction — these are real technologies being deployed by insurance companies right now. Last year, the life insurance company John Hancock began to offer its customers the option to wear a fitness tracker — a wearable device that can collect information about how active you are, how many calories you burn, and how much you sleep. The idea is that your Fitbit or Apple Watch can tell whether or not you’re living the good, healthy life — and if you are, your insurance premium will go down.
This is the cutting edge of the insurance industry, adjusting premiums and policies based on new forms of surveillance. It will affect your life insurance, your car insurance and your homeowner’s insurance — if it hasn’t already. If the Affordable Care Act’s protections for people with pre-existing conditions should vanish, it will no doubt penetrate the health insurance industry as well.
Consumers buy insurance from companies to protect against possible losses. But this contractual relationship is increasingly asymmetrical. Insurance companies once relied on a mix of self-reported information, public records and credit scores to calculate risk and assess how much to charge. But thanks to advances in technology, the capacity to collect, store and analyze information is greater than ever before.
A 2018 report from the consulting firm McKinsey notes that “smart” devices — fitness trackers, home assistants like Alexa, connected cars and smart refrigerators — are proliferating in homes. The “avalanche of new data” they can provide will change the face of insurance.
In 2014, the insurance company State Farm filed a patent application for a system that “aggregates and correlates” data for “life management purposes.” The application lists a wide range of information, such as “home data, vehicle data and personal health data associated with the individual.”
Some of the changes heralded by these new technologies will be better for everyone, like faster claims processing. But the use of data collection and artificial intelligence also raises serious questions about what McKinsey calls “personalized pricing” and what the State Farm patent application calls “personalized recommendations” and “insurance discounts.”
Before the A.C.A., data brokers bought data from pharmacies and sold it to insurance companies, which would then deny coverage based on prescription histories. Future uses of data in insurance will not be so straightforward.
As machine learning works its way into more and more decisions about who gets coverage and what it costs, discrimination becomes harder to spot.
Part of the problem is the automatic deference that society has so often given to technology, as though artificial intelligence is unerring. But the other problem is that artificial intelligence is known to reproduce biases that aren’t explicitly coded into it. In the field of insurance, this turns into “proxy discrimination.”
For example, an algorithm might (correctly) conclude that joining a Facebook group for a BRCA1 mutation is an indicator of high risk for a health insurance company. Even though actual genetic information — which is illegal to use — is never put into the system, the algorithmic black box ends up reproducing genetic discrimination.
A ZIP code might become a proxy for race; a choice of wording in a résumé might become a proxy for gender; a credit card purchase history can become a proxy for pregnancy status.
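To make the proxy effect concrete, here is a minimal, purely illustrative sketch in Python. The data is synthetic and the feature names (such as zip_risk_score) are hypothetical; this is not any insurer's actual system. The model is never shown the protected attribute, yet a correlated stand-in for a ZIP code lets its predictions split along group lines anyway.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 10_000

# Protected attribute the insurer is barred from using (0 or 1).
group = rng.integers(0, 2, size=n)

# A "neutral" feature that happens to track group membership closely,
# the way a ZIP code can track race.
zip_risk_score = group + rng.normal(0, 0.3, size=n)

# A genuinely relevant feature, independent of group.
claims_history = rng.normal(0, 1, size=n)

# Simulated outcome that depends on claims history and, unfairly, on group.
outcome = (0.8 * claims_history + 0.8 * group + rng.normal(0, 1, size=n)) > 1.0

# Train WITHOUT the protected attribute...
X = np.column_stack([zip_risk_score, claims_history])
model = LogisticRegression().fit(X, outcome)

# ...yet predicted risk still differs by group: the ZIP-like feature is a proxy.
risk = model.predict_proba(X)[:, 1]
print("mean predicted risk, group 0:", round(risk[group == 0].mean(), 3))
print("mean predicted risk, group 1:", round(risk[group == 1].mean(), 3))
```

In this toy setup, dropping the protected column changes nothing, because the correlated feature carries the same information — which is the point of the examples above.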
Legal oversight of insurance companies, which are typically regulated by states, mostly looks at discrimination deemed to be irrational: bias based on race, sex, poverty or genetics. It’s not so clear what can be done about rational indicators that are little more than proxies for factors that would be illegal to consider.
Placing those biases inside a secret algorithm can prevent critical examination of inequality. ProPublica found that people in minority neighborhoods paid higher car insurance premiums than residents of majority-white neighborhoods with similar risk, but its reporters could not determine exactly why, since the insurance companies would not disclose their proprietary algorithms or data sets.
A handful of lawsuits in other arenas have challenged this practice. After Idaho’s Medicaid program started using an automated system to calculate benefits, recipients suddenly saw their benefits cut by as much as 30 percent. When the state refused to disclose its algorithm, claiming it was a trade secret, the A.C.L.U. of Idaho sued to gain access to the code, and ultimately discovered that the formula was riddled with flaws.
Artificial intelligence, in all its variations, holds great promise. The automated processing of car accident photos or machine reading of medical scans can help cut down costs, and even save lives. But the opacity around many applications of automation and artificial intelligence is reason for pause. Not only do people have limited access to the code that determines key facets of their lives, but the bar to understanding the “reasoning” of algorithms and data sets is high. It will get higher as more industries begin to use sophisticated technologies like deep learning.
A.I. research should march on. But when it comes to insurance in particular, there are unanswered questions about the kinds of bias that are acceptable. Discrimination based on genetics has already been deemed repugnant, even if it’s perfectly rational. Poverty might be a rational indicator of risk, but should society allow companies to penalize the poor? Perhaps for now, A.I.’s more dubious consumer applications are better left in a laboratory.