17 Sep
Adopting voice-to-text technologies for social good
Imagine a world where we could focus on the people who need us and automate the fiddly tasks that often get in the way.
A possible solution
The last two years have seen a huge increase in the development of apps and technologies aimed at helping NGOs and police officers streamline their support services and record data more effectively. Indeed, there are apps aimed at identifying victims, gathering evidence and facilitating communication with victims.
The Apprise app, for example, has been developed to screen workers for signs of human trafficking. The app consists of a list of questions that assess working conditions and calculate a vulnerability rating based on the worker's responses.
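The question-and-score approach described above can be sketched in a few lines. The questions, weights and threshold below are illustrative assumptions for the sake of the example, not the real Apprise content:

```python
# Hypothetical sketch of a screening score of the kind Apprise is
# described as computing: each yes/no risk indicator carries a weight,
# and the weighted sum becomes a vulnerability rating.
# Question names and weights are invented for illustration.
QUESTIONS = {
    "passport_withheld": 3,
    "wages_unpaid": 2,
    "restricted_movement": 3,
    "excessive_hours": 1,
}

def vulnerability_rating(answers: dict) -> int:
    """Sum the weights of every risk indicator answered 'yes'."""
    return sum(weight for question, weight in QUESTIONS.items()
               if answers.get(question))

print(vulnerability_rating({"passport_withheld": True,
                            "excessive_hours": True}))  # prints 4
```

A real screening tool would of course use validated questions and carefully calibrated weights; the point here is only the structure of the calculation.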
iWitness is another app, developed to gather evidence and to help witnesses and victims provide a detailed account of an event.
Many other apps have been developed to aid reporting (e.g. the Unseen app and the Stop the Traffik app) or to help victims locate the closest police station and support services while providing instructions on how to report a crime and claim damages (Brottsofferappen).
However, many tools could further help front-line practitioners streamline their services. Voice-to-text, which automatically transcribes speech into written text, saving time and energy, is one such tool.
Examples of this tool include Google Docs Voice Typing, Braina Pro and Speechnotes; even Siri can transcribe a text message as you speak into your phone. A key innovative feature of this technology is the prospect of combining these tools with machine-learning algorithms that process the transcribed text and automatically extract and structure data from it into a database, yielding rich, insightful data sets. Although these applications have not yet been explored in detail by law enforcement, NGOs or the charity sector, at Trilateral we believe such tools have huge potential to help law enforcement, NGOs and other practitioners who come into contact with victims of crime.
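The "extract and structure data" step can be illustrated with a minimal sketch. This is not any of the products named above: it assumes a transcript string is already available and uses naive, hypothetical regular-expression heuristics to pull a few fields into a record:

```python
import re

# Hypothetical sketch: pull simple structured fields out of a
# transcribed statement. Field names and patterns are illustrative
# assumptions; a production system would use trained NLP models.
def extract_fields(transcript: str) -> dict:
    fields = {}
    # A date in simple DD/MM/YYYY form
    date = re.search(r"\b(\d{1,2}/\d{1,2}/\d{4})\b", transcript)
    if date:
        fields["date"] = date.group(1)
    # A capitalised place name after "at" or "in" (very naive heuristic)
    loc = re.search(r"\b(?:at|in) ([A-Z][\w']+(?: [A-Z][\w']+)*)", transcript)
    if loc:
        fields["location"] = loc.group(1)
    # A phone number: at least 8 digits with optional spaces or dashes
    phone = re.search(r"\b(?:\d[\s-]?){7,}\d\b", transcript)
    if phone:
        fields["phone"] = phone.group(0).strip()
    return fields

statement = "It happened on 12/03/2019 in Chiang Mai, call me on 012-345-6789."
print(extract_fields(statement))
```

Running this on the sample statement yields a record with the date, location and phone number filled in, ready to be stored alongside the full transcript rather than replacing it.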
Deploy existing technologies to focus on the victims’ needs
This type of technology could help practitioners by transcribing a victim's narrative as a crime is reported. NGOs in particular, which often lack capacity and resources, could benefit greatly from such tools: they could efficiently transcribe interviews from telephone calls, allowing the practitioner to focus on the needs of victims and witnesses without missing key details from their testimony. This, in turn, would allow NGOs and law enforcement to show greater sensitivity towards victims and their needs. Such support and focus are essential: it is widely accepted that victims are more satisfied with the handling of their case when they believe the police are treating them with respect and fairness and are genuinely interested in their case.
Innovative voice-to-text tools could thus shift how crimes are prosecuted, placing the victim's needs and interests at the centre without losing sight of the importance of prosecution. This would in turn improve the efficiency, accuracy and usability of the evidence gathered by police and NGOs, which is essential to successful prosecutions. Analysing the collected data would also allow practitioners to identify patterns, trends and new insights. Machine-learning techniques could further assist police and NGO workers by prompting appropriate, unbiased follow-up questions after the initial report, so that the victim's statement becomes more complete and less shaped by the interviewer's intervention.
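The follow-up prompting idea can be sketched simply: once some fields have been extracted from a transcript, the system checks which key details are still missing and suggests neutrally worded questions. The field list and wording below are illustrative assumptions:

```python
# Hypothetical sketch: suggest neutral follow-up questions for details
# not yet captured in a statement. Fields and phrasing are invented
# for illustration, not drawn from any real interviewing protocol.
FOLLOW_UP_QUESTIONS = {
    "date": "When did this happen?",
    "location": "Where did this happen?",
    "people": "Was anyone else present?",
}

def missing_field_prompts(extracted: dict) -> list:
    """Return open, neutrally worded questions for missing fields."""
    return [question for field, question in FOLLOW_UP_QUESTIONS.items()
            if field not in extracted]

# If only the date was captured, the tool would prompt for the rest.
print(missing_field_prompts({"date": "12/03/2019"}))
```

Because the prompts are fixed, open questions rather than ones improvised mid-interview, the interviewer is less likely to lead the witness, which is the "less biased by intervention" point made above.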
In summary, voice-to-text technologies that are already widespread in everyday activities could be put to work by law enforcement and NGOs for social good. It is often not about reinventing the wheel, but about thinking how existing technologies could be applied in different ways, or to different sectors, to make a change.
For more information on this research area, contact our team: