A vibrating sneaker to guide the blind. Is it helpful?

Smart shoes
Technology for social inclusion

Close your eyes and imagine what the life of a blind person is like. Who has never played this game to better understand the challenges people with visual impairments face? How do they know where they are going? How do they know when to turn left or right? Lechal was born as a smart shoe that helps blind people by using geo-sensitive technology to vibrate and signal the path. Is it helpful?

Thinking about a few of these challenges, the Indian company Ducere Technologies, founded by Krispian Lawrence, developed footwear that shows the way to those in need of directions – who may be blind, but may also be tourists in a foreign country. The smart shoes follow a pre-set route and vibrate to indicate the way you need to go. The product is branded Lechal, which means ‘take you there’ in Hindi [an official language of India].
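As a rough illustration of how such haptic navigation could work – a hypothetical sketch, not Ducere’s actual implementation – a companion phone app might compare the wearer’s current heading with the bearing to the next waypoint on the pre-set route and buzz the left or right shoe accordingly:

```python
import math

# Hypothetical sketch of haptic turn-by-turn logic -- NOT Ducere's actual
# implementation. Assumes the phone app knows the pre-set route and the
# shoes merely receive "vibrate left" / "vibrate right" commands.

def bearing(lat1, lon1, lat2, lon2):
    """Initial compass bearing in degrees from point 1 to point 2."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dlon = math.radians(lon2 - lon1)
    y = math.sin(dlon) * math.cos(phi2)
    x = math.cos(phi1) * math.sin(phi2) - math.sin(phi1) * math.cos(phi2) * math.cos(dlon)
    return math.degrees(math.atan2(y, x)) % 360

def haptic_cue(heading, target_bearing, tolerance=20):
    """Decide which shoe vibrates, given the wearer's current heading."""
    turn = (target_bearing - heading + 180) % 360 - 180  # signed angle in -180..180
    if abs(turn) <= tolerance:
        return "both shoes: keep going straight"
    return "right shoe: turn right" if turn > 0 else "left shoe: turn left"

# Wearer heading north (0 deg); next waypoint lies roughly to the east.
print(haptic_cue(0, bearing(-22.9711, -43.1822, -22.9712, -43.1750)))
```

The review complaints quoted later in this piece (delayed or missing vibrations) would correspond to failures in exactly this command path between phone and shoe.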

Beyond the initial inclusive goal, Krispian realized there were many other potential commercial uses for the Lechal technology and expanded its market strategy. Nevertheless, the core concept of this companion for the visually impaired inspires us to rethink the way we navigate geographically. An interesting gain, for example, if you are visiting a city with high levels of violence, where holding a cellphone in your hands is not recommended. However, instead of praising every new technological idea, even one adorned with the best intentions, as in this case, we should note that most innovators lack a sociological, human perspective on USABILITY. In other words, we must ask: is it useful?


Well, I have never tried the product, so I cannot offer a proper critique, but most of the blind people I have seen in India, South Africa and Brazil would not be able to afford a product costing around $60. Blind people have historically depended on external help and been excluded from the labor market. India is home to 20 per cent of the world’s visually impaired, around 40 million individuals in 2020. In Brazil there are around 6.5 million. Do you believe all these people walk on wide, well-paved, easily accessible streets? Well well… to be fair, the product never intended to fit everyone.

Then I checked the Amazon reviews (as the price link on the official website is broken – on Amazon the product is listed at $60) and read a few complaints about instability, bad connections, delayed vibrations and other issues (among many good reviews as well), and I thought about a blind person trusting the app to arrive somewhere, only to have it fail in the middle of the way. Well well again! I am pretty sure they would be safe and would ask for help verbally as usual, but why pay $60 for the trauma?

As one of the reviews on Amazon describes: “The basic thing they have to do is to vibrate the moment the Smartphone wants them to vibrate – and if they don’t do that they’re pretty useless, especially for 60 dollars. The concept is nice and has nearly no competitors – but the device has a long journey in front of it until it can be used – especially by visually impaired people Ducere or Lechal wants them to use.”

Prediction or policing? Can algorithms be racist?

By Elisa Maria Campos

The global fight against racism started several centuries ago, even before we were able to make a call inviting a colleague to join a protest. Although Black #hashtag movements assume a core role in our digital era, global resistance against racism began in the shared experiences of the fight against slavery, colonialism and racial oppression, in Black movements such as the one surrounding the Haitian Revolution, the anticolonial revolutions and the long Black 1960s (Martin, 2005).

The historical length of the fight against racism is relevant not only to demonstrate the urgency of combating it, but also to raise awareness of how entrenched racism is in our society. In other words, of how remarkably hard it is to combat it and to decolonize our perspectives, language, expressions, institutions and culture.

Will it be different with algorithms? Is technology a powerful tool for inclusion or for exclusion? Can algorithms be racist? Who develops them: white people, Black people or both? Unless we intend to keep repeating our mistakes and widening the gaps of inequality, we must answer these questions urgently.

Data for Black Lives has been working on this. Defined as “a movement of activists, organizers, and mathematicians committed to the mission of using data science to create concrete and measurable change in the lives of Black people,” it was founded by Yeshimabeit Milner and Lucas Mason-Brown in Cambridge, Massachusetts. Milner became engaged with data-based activism while documenting fellow students’ experiences of racist policing, when she noticed patterns of racist classification by police, which result in biased data held by police departments.

“There’s a long history of data being weaponized against Black communities,” says Milner. It starts with data collection, as predictive policing is built on biased records that reproduce the police’s historical prejudice against Black people. Predictive policing tools in the United States are mainly of two types: those using location-based algorithms to predict where and when crimes are more likely to happen, and tools based on data about individuals, such as gender, age and criminal record, to predict who has a high chance of being involved in future criminal activity.

An article by Will Douglas Heaven in the MIT Technology Review gives detailed information on this. “One of the most common predictive policing tools, called PredPol, which is used by dozens of cities in the US, breaks locations up into 500-by-500 foot blocks, and updates its predictions throughout the day—a kind of crime weather forecast. The person-based tools can be used either by police, to intervene before a crime takes place, or by courts, to determine during pretrial hearings or sentencing whether someone who has been arrested is likely to reoffend. For example, a tool called COMPAS, used in many jurisdictions to help make decisions about pretrial release and sentencing, issues a statistical score between 1 and 10 to quantify how likely a person is to be rearrested if released. According to US Department of Justice figures, you are more than twice as likely to be arrested if you are Black than if you are white. A Black person is five times as likely to be stopped without just cause as a white person. The mass arrest at Edison Senior High was just one example of a type of disproportionate police response that is not uncommon in Black communities” (Heaven, 2020).
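To make the feedback loop concrete, here is a minimal toy sketch – not PredPol’s actual algorithm, and the coordinates are invented – of how a location-based tool might score grid cells from historical incident records. If past policing concentrated on one neighborhood, that neighborhood dominates the records, so the “prediction” sends patrols right back there, which generates yet more records for the same cells:

```python
from collections import Counter

# Toy, hypothetical sketch of a location-based predictive tool.
# NOT PredPol's real algorithm; it only illustrates how biased
# historical records become biased patrol "predictions".

CELL_FT = 500  # PredPol-style 500-by-500-foot grid cells

def cell_of(x_ft, y_ft):
    """Map a coordinate (in feet) to its grid cell."""
    return (x_ft // CELL_FT, y_ft // CELL_FT)

def hotspot_scores(incident_log):
    """Count historical incidents per cell; higher count = 'riskier' cell."""
    return Counter(cell_of(x, y) for x, y in incident_log)

def patrol_plan(incident_log, top_k=3):
    """Send patrols to the top-k cells -- wherever past records point."""
    return [cell for cell, _ in hotspot_scores(incident_log).most_common(top_k)]

# Four of five records come from one over-policed neighborhood (made-up data),
# so its cell tops the plan and will accumulate even more records tomorrow.
log = [(120, 80), (300, 450), (350, 400), (310, 420), (2100, 1900)]
print(patrol_plan(log))
```

Nothing in the arithmetic is “racist” in itself; the bias enters entirely through which incidents were recorded in the first place, which is exactly the article’s point.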

Beyond predictive tools, racism can also be reproduced, for instance, in statistical modeling, risk-based sentencing, and the predatory lending that excludes Black communities from key financial services. Though algorithms were developed to make decision-making fairer and more objective than humans, they are ultimately based on data that reproduces the limited information available, because it is what police departments record. Moreover, the intersections between race and justice transcend any technical aspect. Police data carries generations of biased information and calls for more social research on how to diminish racism, a question math is incapable of answering.

Still, a few studies support the argument that predictive tools and algorithms are inevitable and, in the end, helpful in controlling crime rates despite the inherent racism they carry. An opposing group says that there is no reasonable justification for keeping any tool that is consciously recognized as racist. The discussion is more than relevant and highlights the urgency of insisting on multidisciplinary teams assessing data policies. Most software is developed by white developers, in profit-driven, white-managed firms with a technical focus, overlooking the social implications of how a simple location-based algorithm can, for instance, record someone as a criminal and steal a bright future from them.

Heaven, Will Douglas (2020). Predictive policing algorithms are racist. They need to be dismantled. MIT Technology Review. Available online: https://www.technologyreview.com/2020/07/17/1005396/predictive-policing-algorithms-racist-dismantled-machine-learning-bias-criminal-justice/

Martin, W. (2005). Global Movements before “Globalization”: Black Movements as World-Historical Movements. Review (Fernand Braudel Center), 28(1), 7-28. Retrieved January 2, 2021, from http://www.jstor.org/stable/40241617

Favelas Unified Dashboard COVID-19

Favela da Rocinha, Rio de Janeiro

Case study: FAVELAS UNIFIED DASHBOARD
http://www.favela.info/

A unique data initiative that trusts local knowledge and community monitoring to change realities through information and data analysis. This is the Favelas Unified Dashboard, a project led by the NGO Catalytic Communities (www.catcomm.org) in partnership with numerous collectives and favela leaders.

To overcome the lack of public policies and effective preventive measures against Covid-19, a growing coalition of favela-based and favela-supporting organizations came together to map, count and analyse data during the pandemic. In the favelas, global measures such as social distancing, testing and frequent hand hygiene cannot be applied homogeneously. The dashboard’s primary goal is to support prevention efforts, as the favelas became the epicenter of the infection in Rio.

A survey carried out by the project in August 2020 covering 151 favelas (41 favelas plus complexes) found 1,402 deaths in the mapped territories, out of a total of 14,080 deaths in the state of Rio de Janeiro, 8,612 of which were within the capital. That number surpasses the death tolls of entire states such as Mato Grosso do Sul (509 deaths), Amapá (602), Tocantins (547), Roraima (547) and Acre (561).

It is the greatest health tragedy in the history of Brazil, and Rio de Janeiro has the highest death rate in the country (MonitoraCovid/Fiocruz, 09/09/2020). The lack of reliable information about Rio de Janeiro’s favelas and communities makes it difficult to develop joint solutions and public policies to tackle local problems.

This innovative bottom-up data collection project invests in community engagement to produce data that informs the favela population about emergency issues.

The project mapped a 56% gap between official data and citizen-produced data, helping to inform residents, public policymakers and researchers.
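One plausible reading of that figure – purely illustrative, with invented numbers, assuming the gap means the share of community-counted deaths absent from official records – would be computed like this:

```python
# Hypothetical illustration of the under-reporting gap -- the figures
# below are invented, not the dashboard's real counts.
official_deaths = 440    # deaths officially attributed to a favela (made up)
community_deaths = 1000  # deaths counted by community monitors (made up)

missing = community_deaths - official_deaths
gap = missing / community_deaths  # share of community-counted deaths missing
print(f"Official records miss {gap:.0%} of community-reported deaths")
# -> Official records miss 56% of community-reported deaths
```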

All data and methodology are available at www.favela.info.

CONNECT AND HELP
https://catcomm.org/

Based in Rio de Janeiro and holding US 501(c)(3) charitable status, Catalytic Communities (CatComm) is an empowerment, communications, think tank, and advocacy NGO that has worked since 2000 in support of the development of Rio’s favelas.