Državljan D
094 Claudio Agosti and the algorithm economy

Update: 2023-12-15

Description


Claudio Agosti is a fellow researcher at the University of Amsterdam. He has a background as a hacker and security professional. He researches, implements, and promotes technology in the public interest. In the last decade, he has addressed issues such as communication protection for whistleblowers, analysis of the web-tracking phenomenon, and algorithm analysis.


In 2019 he started to work with the ETUI Foresight Unit on research and training courses aimed at understanding the ‘technicalities’ behind Artificial Intelligence. Five years later, the ETUI is releasing a technical report which meticulously outlines the techniques used by researchers to observe the internal logic of the app used by riders in Italy and documents its actual behavior in terms of harvesting their personal data.


We sat down with Claudio to discuss his investigation, the consequences of the unchecked sharing economy, and the way forward.


Transcript of the episode:





00:00:23 Domen Savič / Citizen D 


OK, so welcome everybody. It’s the 8th of November 2023, but you’re listening to this episode of Citizen D podcast on the 15th of December 2023. With us today is Claudio Agosti, algorithms explorer and digital rights evangelist at the AI forensics NGO and the topic of today’s discussion is worker rights in the digital economy. So first of all, hello. 


00:00:52 Claudio Agosti 


Hello and thank you people of the future. 


00:00:56 Domen Savič / Citizen D 


Let’s start with a quick recap of the report. It’s titled “Exercising workers’ rights in algorithmic management systems”. What does that mean? What was the topic of the report? What did you investigate, and what were some of the findings? 


00:01:15 Claudio Agosti


Thank you. The report tells a story, an investigation that began in 2019. I had founded a project called Tracking Exposed. That was a project meant to do algorithm analysis, and we were analyzing social media platforms, Amazon, and other web platforms. But I got in touch with Aida of the European Trade Union Institute, because I met her at Privacy Camp, the event organized every year in Brussels to bring together privacy folks and other people working on these rights. And she was fascinated by our approach to analyzing algorithms, because it is a black-box analysis.


And it was in 2019 that we met. I started to collaborate with that institution to teach trade unions a bit about why they should be skeptical of the apps that run on, let’s say, the riders’ phones to organize their work, because those apps may violate certain privacy rights or also some labour rights. 


Initially it was just an insight, an intuition, but only in 2020, or around that time, did we start to attempt an investigation. We approached it in two ways. The first was a survey, a set of questions meant for riders, to understand whether they felt that the technology organizing their work was discriminating against them. That’s because if you want to bring a company to court, you need evidence that this company did something bad, evidence of the violations.  


On the other path, instead, we were doing purely technical analysis: reverse-engineering the app. But to reverse-engineer a rider app you need the login and password of a rider, because the app only starts to execute, and to do all the potential privacy leakage or surveillance of the worker, once you log in properly. We needed to find a rider willing to share a login and password, and that was particularly difficult. We also tried to sign up as riders ourselves, but we were not accepted. I don’t know if it was because of the place we were living. 


But after 18 months, and that is a huge amount of time, one of the significant costs of this investigation, one that could have been and will be reduced in the future... after 18 months we found a person willing to share those logins and passwords. So we set up a mechanism and a methodology that initially consisted of some static analysis through Exodus Privacy, which is an online service I suggest you consult, because it shows, by doing static analysis, how many known trackers are present in every mobile app. 
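The static-analysis step Claudio describes can be sketched roughly as follows: scan the class names extracted from an app package against known tracker package prefixes, which is the core idea behind Exodus Privacy's detection. The prefixes and sample class names below are illustrative assumptions, not the real signature database or the actual rider app's contents.

```python
# Minimal sketch of Exodus-style tracker detection via static analysis:
# match class names found in an app against known tracker package prefixes.
# The signature list here is a small illustrative sample, not the full
# Exodus Privacy database.

KNOWN_TRACKER_PREFIXES = {
    "com.appsflyer": "AppsFlyer",
    "com.braze": "Braze",
    "com.google.firebase.analytics": "Google Firebase Analytics",
    "com.facebook.appevents": "Facebook Analytics",
}

def find_trackers(class_names):
    """Return the set of tracker names whose package prefix matches
    any class name extracted from the app's dex files."""
    found = set()
    for name in class_names:
        for prefix, tracker in KNOWN_TRACKER_PREFIXES.items():
            if name.startswith(prefix):
                found.add(tracker)
    return found

# Hypothetical class names, as they might be dumped from an APK:
classes = [
    "com.appsflyer.AppsFlyerLib",
    "com.braze.Braze",
    "it.example.riderapp.MainActivity",
]
print(sorted(find_trackers(classes)))  # ['AppsFlyer', 'Braze']
```

In practice the class-name extraction itself would be done with a dex-parsing tool (or simply by submitting the APK to the Exodus Privacy web service), but the matching logic is this simple.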


Then we used mitmproxy, a proxy being software that allows you to intercept the traffic that the app sends to the outside. And then with Frida, a system that allows you to run the application in a sort of special environment where the calls made to the system can be intercepted, recorded, or modified, we could observe when the app was actually accessing the GPS or other peripherals. So, with these three methods, we started to observe how the app was behaving, and we started to realize some things. 
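The traffic-interception step can be sketched as a small mitmproxy addon that flags every request the app sends to a host other than the platform's own backend, which is how undeclared third parties show up. The host names below are hypothetical placeholders, not the real endpoints from the investigation.

```python
# Sketch of a mitmproxy addon that flags requests leaving the app for
# third-party hosts. The first-party suffix is a hypothetical placeholder.
# Load with: mitmproxy -s flag_third_parties.py

FIRST_PARTY_SUFFIXES = ("api.example-rider-app.com",)

def is_third_party(host: str) -> bool:
    """True if the request host does not belong to the app's own backend."""
    return not any(
        host == suffix or host.endswith("." + suffix)
        for suffix in FIRST_PARTY_SUFFIXES
    )

class FlagThirdParties:
    """mitmproxy addon: log every request sent to a third-party host."""

    def request(self, flow):
        # flow is an mitmproxy HTTPFlow; pretty_host is the request's hostname.
        host = flow.request.pretty_host
        if is_third_party(host):
            print(f"[third party] {host} {flow.request.path}")

addons = [FlagThirdParties()]

print(is_third_party("analytics.example-tracker.net"))  # True
print(is_third_party("api.example-rider-app.com"))      # False
```

The addon class uses duck typing, so the file needs no mitmproxy import; mitmproxy calls the `request` hook for every intercepted HTTP request once the phone's traffic is routed through the proxy.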


The first thing was that the app was revealing the location of the worker, even outside of the working shift. The second concerns the communication about profiles: we were intercepting traffic, and inside this communication between the app and the Glovo infrastructure you could see that some requests were made by the app to identify which rider was using it, so you could see the profile and the information tied to this profile. And then there were other requests more focused on getting new orders. 


Inside the profile of the rider there was a score. And it was not the number you could have expected, in the sense that officially Glovo acknowledged the existence of an “excellence score”, and we realized that this was a different score. So therefore there was a hidden scoring mechanism, present at least in this communication; how it was used by the app or by the infrastructure, we don’t know.


But that was evidence, and even if it’s not that surprising for a labour unionist, it’s very important to have this evidence, because labour unionists in the past fought for, they demanded, that the worker should not be subjected to customer ratings, or, if they can be subject to ratings, that the rating they get from the customer should not impact their ability to work. 


And that is part of labour rights. It should not be that your work can stop because someone starts to rate you poorly, even if this seems to be a standard in the online market, because it’s normal for me, when I buy something, I don’t know, on eBay, to check the trustworthiness of the seller. But it is not OK if your life and your work depend on a system where someone can game that system, start to downvote you, and make you suffer a loss of business.


And last but not least, we saw that there were also third parties, not declared in the contracts or in the privacy policy, that were getting all this information about the user profile, their location, and everything they were doing in the app: where you clicked, which panel you moved, where you were when this happened. 


How much you were moving at that moment: all these kinds of detailed information were given to third parties, and that is another problem. But I don’t want to keep summarizing; there is the report, which is 60 pages long, and there is also a video of around 50 minutes with me and Joanna that talks about it, and you can find it on the website https://www.reversing.works. Tracking Exposed, the project I mentioned before, closed this year and became two different projects. One is AI Forensics, the one you mentioned, which carries on the algorithm analysis of influential algorithms.  


So we look at TikTok, Bing Chat, and large language models; that part of the algorithm analysis is carried on by AI Forensics. Reversing Works focuses more on the impact of algorithms and surveillance capitalism on workers’ rights. So these different efforts are carried on by these two groups, and the Reversing Works website also contains more references about this report. 


00:10:08 Domen Savič / Citizen D 


So in your investigation you looked at one app and one company, basically on one market, right? How fair would the assumption be that other providers of these types of services, and the same providers in different markets, are using the same tactics, let’s say social scoring, hidden grades, and so forth? Do you think this is limited to one market, or is it present everywhere? 


00:10:43 Claudio Agosti  


I believe there are insights that make us assume this is a frequent condition, because on one side the analytics third parties sit in between, offering analytics services to the platforms, showing how the app is actually used; but they are also tied to marketing campaigns, so they use the data collected this way to resell it for advertisement or, I don’t know, customer segmentation or brand analysis
