
The algorithm that used private data from poor Argentine girls to predict their pregnancies


Collaborations between Big Tech and governments do not always serve good ends, even when they appear to. One of those grim episodes that seems straight out of Black Mirror has its origin in Argentina, and its protagonists are Microsoft and the Ministry of Early Childhood of the northern province of Salta.

An investigation by Wired has uncovered how the technology company and the provincial government developed and put into operation an algorithm to predict teenage pregnancy using invasive data from thousands of girls.

The outlet explains how, under the name Technological Platform for Social Intervention, the algorithm developed by Microsoft could supposedly predict, five or six years in advance and with first name, surname and address, which girls were 86 percent “predestined” to have a teenage pregnancy.

A kind of Minority Report for predicting the chances that a girl will become pregnant when she reaches adolescence. The stated goal was to use the algorithm to predict which girls living in low-income areas would become pregnant within the next five years. The idea might be framed as family planning, but it was never revealed what happened once the algorithm flagged a given girl as a potential teenage pregnancy. Nor was there any transparency about the data or about the degree of intrusion into the minors' privacy.

An opaque algorithm that, in reality, only served to violate the rights of girls and women

All of this under ironclad opacity. According to Wired, the girls' private data fed to the algorithm included age, ethnicity, country of origin, disability, and other details such as whether the girl's house had hot water in the bathroom. In total, data was collected on 200,000 people, including 12,000 women and girls between the ages of 10 and 19.

And the seriousness does not end there. According to Wired, “territorial agents” visited the homes of the girls and women in question, conducted surveys, took photos, and recorded GPS locations, all to feed the algorithm. The vast majority of those surveyed were poor, immigrants, or members of indigenous communities.

In addition, according to Wired, the algorithm was not subject to any special controls due to the total absence of national regulation, and no assessment was made of its impact on the girls and women whose data fed it. In fact, it was only when activists and the community pressured politicians that anyone examined how a completely opaque technology was being used, a technology that, according to Wired, essentially served to violate the rights of girls and women. What is more, it did not even work as its managers claimed:

“The Applied Artificial Intelligence Laboratory of the University of Buenos Aires highlighted the platform's serious technical and design errors and questioned the developers' claims that the model made ‘correct predictions 98.2 percent of the time’. Methodological problems, such as the unreliability of the data, pose the risk that policy makers will take the wrong measures.”

Applied Artificial Intelligence Laboratory of the University of Buenos Aires
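To see why the laboratory treated the headline figure with suspicion, here is a minimal, purely illustrative sketch in Python. The cohort size and prevalence below are invented, and this is not the Salta model or its data; it simply shows the well-known “accuracy paradox”: on imbalanced data, a model that predicts “no pregnancy” for everyone can report very high accuracy while identifying no actual cases at all.

```python
# Illustrative only: made-up numbers, not the actual Salta data or model.
# Demonstrates the "accuracy paradox": on a heavily imbalanced dataset,
# always predicting the majority class looks highly accurate while
# learning nothing about the individuals it labels.

import random

random.seed(0)

# Hypothetical cohort: 12,000 records with ~5% positive cases (invented rate).
labels = [1 if random.random() < 0.05 else 0 for _ in range(12_000)]

# A "model" that always predicts the negative (majority) class.
predictions = [0] * len(labels)

correct = sum(p == y for p, y in zip(predictions, labels))
accuracy = correct / len(labels)

true_positives = sum(p == 1 and y == 1 for p, y in zip(predictions, labels))
recall = true_positives / max(sum(labels), 1)

print(f"Accuracy: {accuracy:.1%}")  # roughly 95%, despite flagging no one
print(f"Recall:   {recall:.1%}")    # 0% of actual cases identified
```

In other words, a single accuracy figure says very little on its own; metrics that account for class imbalance, and above all the reliability of the underlying data, matter far more.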

A kind of eugenic control at the hands of those technology-and-government agreements whose ultimate result, even if it does not seem so, ends up being state control without oversight, and with a purpose far removed from the original idea.
