Algorithms are everywhere. Here’s why you should care



An algorithm is a set of rules or steps followed, often by a computer, to produce a result. And algorithms aren’t just on our phones – they’re used in all kinds of processes, online and offline, from upgrading your home to teaching your robot vacuum to avoid your dog’s poop. Over the years, they have increasingly been entrusted with life-changing decisions, such as helping decide who to arrest, who should be released from jail before a court date, and who is approved for a home loan.
In recent weeks, algorithms and the way tech companies use them have come under intense scrutiny, driven both by concerns raised during hearings featuring Facebook whistleblower Frances Haugen and by bipartisan legislation introduced in the House (a companion bill had already been reintroduced in the Senate). The legislation would require big tech companies to allow users to access a version of their platforms where what they see is not shaped by algorithms. These developments highlight a growing awareness of the central role that algorithms play in our society.

“At this point, they’re responsible for making decisions about just about every aspect of our life,” said Chris Gilliard, visiting scholar at the Shorenstein Center on Media, Politics and Public Policy at Harvard Kennedy School.

However, how algorithms work and the conclusions they reach can be mysterious, especially as the use of artificial intelligence techniques makes them increasingly complex. Their results are not always understood or accurate – and the consequences can be dire. And the impact of potential new legislation to limit the influence of algorithms on our lives remains uncertain.

Algorithms explained

Basically, an algorithm is a series of instructions. As Sasha Luccioni, a researcher on the AI ethics team at AI model builder Hugging Face, pointed out, an algorithm can be hard-coded, with fixed instructions for a computer to follow – for example, to put a list of names in alphabetical order. Simple algorithms have been used for computerized decision making for decades.
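
To make that concrete, here is a minimal sketch in Python of the kind of fixed, hard-coded algorithm Luccioni describes – the names are invented for illustration:

    # A hard-coded algorithm: the same fixed steps always produce the same
    # result for the same input. The names here are made up.
    names = ["Zoe", "Amir", "Luca", "Beth"]

    # Python's built-in sorted() carries out the fixed steps of comparing
    # names and ordering them alphabetically.
    alphabetized = sorted(names)

    print(alphabetized)  # ['Amir', 'Beth', 'Luca', 'Zoe']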

Today, algorithms help simplify otherwise complicated processes all the time, whether we know it or not. When you ask a clothing website to filter pajamas to see the most popular or cheapest options, you’re basically using an algorithm to say, “Hey, Old Navy, follow the steps to show me the cheapest pajamas.”
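
Under the hood, that kind of request boils down to a filter-and-sort routine. A rough sketch, using an invented product list rather than any retailer’s real data:

    # Hypothetical catalog; a real store would pull this from its database.
    products = [
        {"name": "Flannel pajamas", "category": "pajamas", "price": 34.99, "rating": 4.1},
        {"name": "Plaid pajama set", "category": "pajamas", "price": 24.99, "rating": 4.8},
        {"name": "Wool socks", "category": "socks", "price": 9.99, "rating": 4.5},
    ]

    # Step 1: keep only pajamas. Step 2: order them by whichever option
    # the shopper picked – cheapest first, or most popular first.
    pajamas = [p for p in products if p["category"] == "pajamas"]
    cheapest = sorted(pajamas, key=lambda p: p["price"])
    most_popular = sorted(pajamas, key=lambda p: p["rating"], reverse=True)

    print([p["name"] for p in cheapest])  # ['Plaid pajama set', 'Flannel pajamas']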

All kinds of things can be algorithms, and they’re not limited to computers: a recipe, for example, is a kind of algorithm, just like the weekday morning routine you stumble through half-asleep before leaving the house.

“We use our own personal algorithms every day,” said Jevan Hutson, a privacy and data security attorney at Seattle-based Hintze Law who has studied AI and surveillance.

But while we can question our own decisions, those made by machines have become more and more enigmatic. That’s because of the rise of a form of AI known as deep learning, which is modeled on how neurons in the brain work and rose to prominence about a decade ago.
A deep learning algorithm can instruct a computer to watch thousands of cat videos, for example, to learn how to identify what a cat looks like. (That was a big deal when Google figured out how to do this reliably in 2012.) The result of this process of bingeing on data and improving over time is, in essence, a computer-generated procedure for how the computer will identify whether there is a cat in any new photo it sees. This is often known as a model (though it is also sometimes referred to as an algorithm itself).
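
That paragraph compresses a lot of machinery. Below is a very rough sketch of such a training loop, written in Python with the PyTorch library and with random tensors standing in for real cat photos – an illustration of the general technique, not Google’s 2012 system:

    import torch
    import torch.nn as nn

    # A tiny convolutional network, loosely inspired by neurons in the brain.
    model = nn.Sequential(
        nn.Conv2d(3, 8, kernel_size=3, padding=1),
        nn.ReLU(),
        nn.AdaptiveAvgPool2d(1),
        nn.Flatten(),
        nn.Linear(8, 1),  # one output: how strongly the model leans "cat"
    )
    loss_fn = nn.BCEWithLogitsLoss()
    optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)

    # Stand-in training data: 16 fake 32x32 color images and cat/not-cat labels.
    images = torch.randn(16, 3, 32, 32)
    labels = torch.randint(0, 2, (16, 1)).float()

    # "Binge" on the data: repeatedly measure the mistakes and nudge the
    # model's internal parameters so it makes fewer of them next time.
    for step in range(100):
        optimizer.zero_grad()
        loss = loss_fn(model(images), labels)
        loss.backward()
        optimizer.step()

    # The trained model can now score a brand-new image it has never seen.
    new_image = torch.randn(1, 3, 32, 32)
    print("Estimated probability of cat:", torch.sigmoid(model(new_image)).item())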

These models can be incredibly complex. Facebook, Instagram, and Twitter use them to personalize users’ feeds based on each person’s interests and past activity. Models can also be based on mounds of data collected over many years that no human could sort through. Zillow, for example, has been using its trademarked, machine-learning-assisted “Zestimate” to estimate home values since 2006, taking into account tax and property records, owner-submitted details such as the addition of a bathroom, and photos of the home.
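
Zillow has not published the Zestimate’s internals, so purely as a hypothetical illustration of how such a model weighs features like tax records and home details, a toy version in Python might look like this:

    import numpy as np
    from sklearn.ensemble import GradientBoostingRegressor

    # Invented training data: [square feet, bedrooms, bathrooms, assessed tax value]
    features = np.array([
        [1200, 2, 1, 250_000],
        [1800, 3, 2, 340_000],
        [2400, 4, 3, 500_000],
        [950,  1, 1, 180_000],
    ])
    sale_prices = np.array([265_000, 355_000, 520_000, 190_000])

    # Fit a model that learns how the features relate to the prices.
    model = GradientBoostingRegressor().fit(features, sale_prices)

    # Estimate a home the model has never seen before.
    print(model.predict(np.array([[1600, 3, 2, 310_000]])))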

The risks of relying on algorithms

As Zillow’s case shows, however, offloading decision-making to algorithmic systems can also go horribly wrong, and it’s not always clear why.

Zillow recently decided to shut down its home-flipping business, Zillow Offers, showing how difficult it is to use AI to assess real estate. In February, the company said its “Zestimate” would represent an initial cash offer from the company to buy the property through its home-flipping business; in November, the company wrote down $304 million on its inventory, which it blamed on having recently bought homes at prices higher than it thinks it can sell them for.

Elsewhere online, Meta, the company formerly known as Facebook, has come under scrutiny for tweaking its algorithms in a way that helped spur more negative content on the world’s biggest social network.

Algorithms have also had life-changing consequences, especially in the hands of the police. We know, for example, that at least several Black men have been wrongfully arrested due to the use of facial recognition systems.

There is often little more than a basic explanation from tech companies about how their algorithmic systems work and what they are used for. Beyond that, experts in tech and tech law have told CNN Business that even those who build these systems don’t always know why they come to their conclusions – which is why they’re often referred to as “black boxes.”

“Computer scientists, data scientists, at this present stage, they seem like wizards to a lot of people because we don’t understand what they’re doing,” said Gilliard. “And we think they always do, and they don’t always.”

Popping filter bubbles

The United States does not have federal rules for how companies can or cannot use algorithms in general, or those that rely on AI in particular. (Some states and cities have adopted their own rules, which tend to deal with facial recognition software or biometrics more generally.)

But Congress is currently considering legislation dubbed the Filter Bubble Transparency Act, which, if passed, would require major internet companies such as Google, Meta, TikTok and others to “give users the option to engage with a platform without being manipulated by algorithms driven by user-specific data.”
In a recent CNN opinion piece, Republican Senator John Thune described the legislation he co-sponsored as “a bill that would essentially create a light switch for the secret algorithms of big tech – artificial intelligence (AI) designed to shape and manipulate user experiences – and give consumers the choice to turn it on or off.”

Facebook, for example, already offers such a switch, though users are effectively discouraged from flipping it permanently. A fairly well-hidden “Most Recent” button will show you posts in reverse chronological order, but your Facebook feed will revert to its default, heavily curated state once you leave the website or close the app. Meta stopped offering such an option on Instagram, which it also owns, in 2016.
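
None of the platforms publish their ranking code, but as a purely hypothetical sketch in Python, the “switch” amounts to choosing between two orderings of the same posts – a made-up engagement score versus a plain timestamp:

    from datetime import datetime

    # Invented posts; real feeds draw on far richer signals than these.
    posts = [
        {"text": "Post A", "posted_at": datetime(2021, 11, 1, 9, 0), "predicted_engagement": 0.2},
        {"text": "Post B", "posted_at": datetime(2021, 11, 1, 12, 0), "predicted_engagement": 0.9},
        {"text": "Post C", "posted_at": datetime(2021, 11, 1, 15, 0), "predicted_engagement": 0.5},
    ]

    def build_feed(posts, algorithmic=True):
        if algorithmic:
            # Personalized ranking: highest predicted engagement first.
            return sorted(posts, key=lambda p: p["predicted_engagement"], reverse=True)
        # "Most Recent": reverse chronological order, no engagement scoring.
        return sorted(posts, key=lambda p: p["posted_at"], reverse=True)

    print([p["text"] for p in build_feed(posts, algorithmic=True)])   # ['Post B', 'Post C', 'Post A']
    print([p["text"] for p in build_feed(posts, algorithmic=False)])  # ['Post C', 'Post B', 'Post A']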

Hutson noted that while the Filter Bubble Transparency Act clearly focuses on major social platforms, it will inevitably affect others, such as Spotify and Netflix, which rely heavily on algorithm-based curation. If that passes, he said, it will “fundamentally change” the business model of companies that are built entirely around algorithmic curation – a feature he suspects many users appreciate in some contexts.

“It’s going to have an impact on organizations far beyond those in the limelight,” he said.

AI experts argue that more transparency is critical from the companies that create and use algorithms. Luccioni believes laws mandating algorithmic transparency are needed before specific uses and applications of AI can be regulated.

“I see things changing, for sure, but there is a really frustrating mismatch between what AI is capable of and what it’s legislated for,” Luccioni said.

