A vibrating sneaker to guide the blind. Is it helpful?

Smart shoes
Technology for social inclusion

Close your eyes and imagine what the life of a blind person is like. Who has never played this game in order to better understand the challenges people with visual impairments face? How do they know where they are going? How do they know where to turn left or right? Lechal was born as a smart shoe to help the blind, using geo-sensitive technology that vibrates to show the path. Is it helpful?

Thinking about a few of these challenges, the Indian company Ducere Technologies, founded by Krispian Lawrence, developed footwear that shows the way for those in need of directions – not only blind people but also tourists in a foreign country. The smart shoes follow a pre-set route and vibrate to indicate the way you need to go. The product is branded Lechal, which means ‘take you there’ in Hindi [the official language of India].
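To make the idea concrete, here is a minimal sketch, in Python, of how turn-by-turn haptic guidance of this kind could work: the paired phone compares the wearer’s heading with the bearing to the next waypoint on the route and buzzes the left or right shoe. The waypoint format and the left/right vibration convention are illustrative assumptions, not Lechal’s documented behaviour.

```python
import math

# Hypothetical sketch of turn-by-turn haptic guidance.
# Coordinates are (latitude, longitude); the left/right convention
# is an assumption for illustration, not Lechal's actual protocol.

def bearing(a, b):
    """Approximate compass bearing in degrees from point a to point b."""
    d_lon = math.radians(b[1] - a[1])
    lat1, lat2 = math.radians(a[0]), math.radians(b[0])
    x = math.sin(d_lon) * math.cos(lat2)
    y = math.cos(lat1) * math.sin(lat2) - math.sin(lat1) * math.cos(lat2) * math.cos(d_lon)
    return math.degrees(math.atan2(x, y)) % 360

def which_shoe_vibrates(heading, position, next_waypoint):
    """Return 'left', 'right' or 'straight' for the turn towards the waypoint."""
    turn = (bearing(position, next_waypoint) - heading + 540) % 360 - 180
    if turn < -30:
        return "left"
    if turn > 30:
        return "right"
    return "straight"

# Walker facing north (0 degrees) with the next waypoint to the west:
# the left shoe buzzes.
print(which_shoe_vibrates(0, (12.9716, 77.5946), (12.9716, 77.5900)))
```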

Beyond the initial inclusive goal, Krispian realized there were many other potential commercial uses for Lechal’s technology and expanded its market strategy. Nevertheless, the core concept of this device conceived for the visually impaired inspires us to rethink the way we navigate geographically. An interesting gain, for example, if you are visiting a city with high levels of violence where holding a cellphone in your hands is not recommended. However, instead of praising every new technological idea, even one adorned with the best intentions, as in this case, we should note that most innovators lack the sociological-human perspective of USABILITY. In other words, we must ask: is it useful?


Well, I have never tried the product, so I cannot offer a proper critique, but most of the blind people I saw in India, South Africa and Brazil would not be able to afford a product costing around $60. Blind people historically depend on external help and are excluded from the labor market. India is home to 20 per cent of the world’s visually impaired, which accounted for around 40 million individuals in 2020. In Brazil, there are around 6.5 million. Do you believe all these people walk on wide, well-paved streets with easy access? Well, well… to be fair, the product never intended to fit everyone.

Then I checked the Amazon reviews (as the price link on the official website is broken – but on Amazon it says $60) and read a few complaints about instability, bad connections, delays in vibration and other issues (among many good reviews as well), and thought about a blind person trusting the app to arrive somewhere when it fails in the middle of the way. Well, well again! I am pretty sure they would be safe and would ask for help verbally as usual, but why pay $60 for the trauma?

As one of the reviews on Amazon describes: “The basic thing they have to do is to vibrate the moment the Smartphone wants them to vibrate – and if they don’t do that they’re pretty useless, especially for 60 dollars. The concept is nice and has nearly no competitors – but the device has a long journey in front of it until it can be used – especially by visually impaired people Ducere or Lechal wants them to use.”

How 3D printers can help India save oxygen

Uttar Pradesh, India. Photo: SANJAY KANOJIA / AFP, 20-04-2021

India is in an enormous crisis because of a shortage of oxygen for the treatment of COVID patients. Patients across the country are rushing to hospitals that have no beds, and many are under oxygen therapy at home. The oxygen shortage is the single biggest problem India faces at this very moment.

Manu Prakash, a professor of bioengineering at Stanford University, has developed a project based on 3D-printed engineering that makes it possible to save up to 50% of the oxygen in a tank. This is possible because during conventional oxygen therapy, oxygen keeps being supplied while the patient exhales, and that oxygen is wasted.

Prakash developed an open-source, rapidly manufacturable passive oxygen conservation device for the treatment of mild COVID-19 patients on high-flow nasal cannula, intended for places facing oxygen shortages, such as India. The idea is to deliver a bolus of oxygen only when the patient inhales, following the natural flow of respiration. How did he do it? With a 3D-printed device that conserves the oxygen during the exhalation cycle.
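A back-of-the-envelope calculation illustrates where the “up to 50%” figure can come from: with continuous flow, everything supplied while the patient is not inhaling is lost, so the potential saving is roughly the fraction of the breath cycle spent exhaling. The duty-cycle value below is an illustrative assumption, not a figure from the Prakash project.

```python
# Simplified estimate of oxygen saved by delivering oxygen only during
# inhalation. The inhale fraction is an illustrative assumption.

def oxygen_savings(inhale_fraction: float) -> float:
    """Fraction of a continuous supply wasted outside inhalation.

    inhale_fraction: share of each breath cycle spent inhaling (0..1).
    A device that delivers oxygen only while the patient inhales can
    save roughly (1 - inhale_fraction) of the tank.
    """
    return 1.0 - inhale_fraction

# A resting breath spends roughly half of each cycle inhaling,
# so synchronised delivery could save on the order of 50%.
print(f"Estimated savings: {oxygen_savings(0.5):.0%}")
```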

The project presentation emphasizes that it is a temporary, ‘non-approved’ solution only for emergency use. Moreover, the device is envisioned for settings where patients with mild conditions are being treated, not for severe COVID patients.

Therefore, they recommend the complete industrial process for better outcomes, since these valves were originally designed for use in ICU ventilators. Using a metal body will ensure reliability over a large number of cycles. However, given the emergency application here, it may indeed be worthwhile to test 3D-printed components.

The project is open source and can be accessed here.

Prediction or policing? Can algorithms be racist?

By Elisa Maria Campos

The global fight against racism started several centuries ago, even before we were able to make a phone call to invite a colleague to join a protest. Although Black #hashtag movements assume a core role in our digital era, global resistance against racism began in shared experiences of the fight against slavery, colonialism and racial oppression, in Black movements such as the one surrounding the Haitian Revolution, the anticolonial revolutions and the long Black 1960s (Martin, 2005).

The historical length of the fight against racism is relevant not only to demonstrate the urgency of combating it, but also to raise awareness of how entrenched racism is in our society. In other words, of how impressively hard it is to combat it and to decolonize our perspectives, language, expressions, institutions and culture.

Will it be different with algorithms? Is technology a powerful tool for inclusion or exclusion? Can algorithms be racist? Who develops them, white people, Black people or both? Unless we intend to keep repeating our mistakes and widening the gaps of inequality, we must answer these questions urgently.

Data for Black Lives has been working on this. Defined as “a movement of activists, organizers, and mathematicians committed to the mission of using data science to create concrete and measurable change in the lives of Black People”, it was founded by Yeshimabeit Milner and Lucas Mason-Brown in Cambridge, Massachusetts. Milner engaged with data-based activism documenting fellow students’ experiences of racist policing, when she noticed patterns of racist classification by police, which result in biased data inside police departments.

“There’s a long history of data being weaponized against Black communities”, says Milner. It starts with data collection, as predictive policing is built on biased records that reproduce the police’s historical prejudice against Black people. Predictive policing tools in the United States are mainly of two types: those using location-based algorithms to predict where and when crimes are more likely to happen, and tools based on data about people, such as gender, age and criminal record, to predict who has a high chance of being involved in future criminal activity.

An article by Will Douglas Heaven, in the MIT Technology Review magazine, gives detailed information on it. “One of the most common predictive policing tools, called PredPol, which is used by dozens of cities in the US, breaks locations up into 500-by-500 foot blocks, and updates its predictions throughout the day—a kind of crime weather forecast. The person-based tools can be used either by police, to intervene before a crime takes place, or by courts, to determine during pretrial hearings or sentencing whether someone who has been arrested is likely to reoffend. For example, a tool called COMPAS, used in many jurisdictions to help make decisions about pretrial release and sentencing, issues a statistical score between 1 and 10 to quantify how likely a person is to be rearrested if released. According to US Department of Justice figures, you are more than twice as likely to be arrested if you are Black than if you are white. A Black person is five times as likely to be stopped without just cause as a white person. The mass arrest at Edison Senior High was just one example of a type of disproportionate police response that is not uncommon in Black communities” (Heaven, 2020).

Beyond predictive tools, racism can also be reproduced, for instance, in statistical modeling, risk-based sentencing, and the predatory lending practices that exclude Black communities from key financial services. Though algorithms were developed to make decision-making fairer and more objective than humans, they are ultimately based on data that reproduce the limited information available, because that is what police departments record. Moreover, the intersections between race and justice transcend any technical aspect. Police data carries generations of biased information and calls for more social research on how to diminish racism, which is a question math is incapable of answering.
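A toy simulation makes this feedback loop tangible: if a location-based tool simply sends patrols to the block with the most recorded arrests, the already over-policed block accumulates ever more records, regardless of the true crime rate. This is a deliberately simplified sketch for illustration, not PredPol’s actual algorithm.

```python
# Toy sketch of a feedback loop in location-based predictive policing.
# Blocks with more *recorded* arrests receive more patrols, which in turn
# generate more recorded arrests there, even if the underlying crime rate
# is identical everywhere. Numbers are made up for illustration.

recorded_arrests = {"block_A": 50, "block_B": 5, "block_C": 5}

def predict_hotspot(arrests: dict) -> str:
    """Send patrols to the block with the most recorded arrests."""
    return max(arrests, key=arrests.get)

for day in range(10):
    hotspot = predict_hotspot(recorded_arrests)
    # Extra patrols in the predicted hotspot produce extra recorded arrests.
    recorded_arrests[hotspot] += 3

print(recorded_arrests)  # block_A pulls further ahead: the bias amplifies itself
```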

In any case, a few studies support the argument that predictive tools and algorithms are inevitable and, in the end, helpful for controlling crime rates despite the inherent racism they carry. An opposing group says there is no reasonable justification for keeping any tool that is consciously recognized as racist. The discussion is more than relevant and highlights the urgency of insisting on multidisciplinary teams assessing data policies. Generally, most software is developed by white developers, in profit-driven, white-managed firms with a technical focus, overlooking the social implications of how a simple location-based algorithm can, for instance, record someone as a criminal, stealing a bright future from them.

Heaven, Will Douglas (2020). Predictive policing algorithms are racist. They need to be dismantled. MIT Technology Review. Available online: https://www.technologyreview.com/2020/07/17/1005396/predictive-policing-algorithms-racist-dismantled-machine-learning-bias-criminal-justice/

Martin, W. (2005). Global Movements before “Globalization”: Black Movements as World-Historical Movements. Review (Fernand Braudel Center), 28(1), 7-28. Retrieved January 2, 2021, from http://www.jstor.org/stable/40241617