Technology Solutions and Asylum Procedures

After the COVID-19 pandemic halted many asylum procedures across Europe, new technologies are now reviving these systems. From lie-detection tools tested at borders to programs that verify documents and transcribe selection interviews, a wide range of technologies is being deployed in asylum applications. This article explores how these technologies have reshaped the way asylum procedures are conducted. It shows how asylum seekers are turned into forced and hindered techno-users: they are asked to comply with a series of techno-bureaucratic steps and to keep up with unforeseen changes in criteria and deadlines. This obstructs their capacity to navigate these systems and to pursue their legal right to protection.

It also illustrates how these technologies are embedded in refugee governance: they contribute to the ‘circuits of financial-humanitarianism’ that operate through a flurry of dispersed technological requirements. These requirements increase asylum seekers’ socio-legal precarity by hindering their access to the channels of protection. The article further argues that analyses of securitization and victimization should be combined with insight into the disciplinary mechanisms of these technologies, through which migrants are turned into data-generating subjects who are disciplined by their reliance on technology.

Drawing on Foucault’s notions of power/knowledge and territorial expertise, the article argues that these technologies have an inherent obstructiveness. They have a double effect: although they help to expedite the asylum process, they also make it difficult for refugees to navigate these systems. Refugees are placed in a ‘knowledge deficit’ that leaves them vulnerable to false decisions made by non-governmental actors, and to ill-informed and unreliable narratives about their cases. Moreover, these technologies pose new risks of ‘machine mistakes’ that may result in erroneous or discriminatory outcomes.