Seanad debates

Thursday, 8 February 2024

Digital Services Bill 2023: Committee and Remaining Stages

 

9:30 am

Lynn Ruane (Independent) | Oireachtas source

It is unusual to have reports ruled out of order on the grounds of posing a cost to the State. Over the years, we have tabled many amendments calling for reports, including ones that have actually come to fruition and been carried out by Departments, that were not ruled out of order. It has always been a positive addition to much of the legislation that has passed through the House. I note that as part of these risk assessments, platforms must examine the design of their recommender systems and other algorithmic systems, and all data-related practices, which include profiling and similar activities. This is welcome. The assessments must also consider any foreseeable risks to human rights, civic and electoral freedoms and the public health of children stemming from these practices. This level of scrutiny is welcome, in particular the requirement for independent audits under Article 37. Our amendment sought to improve this section by providing that the risk assessments and independent audit reports under Articles 34 and 37 of the regulation be collated by the Minister in order to form the basis of a summary analysis of algorithmic safety in the State. This would have enabled us as legislators to have not just a clear picture of the practices of each platform but also a more holistic picture of the practices of all online platforms active in the State. Despite the amendment on this report being ruled out of order, I would like to hear from the Minister of State on the whole area and on what we were trying to achieve in ensuring that we can actually map what those assessments look like, especially in terms of human rights and electoral safety.

I would like to note that it was precisely this kind of provision that we advocated for during the debate on the online safety legislation. At the time, we had amendments calling for a report on algorithm safety in the State. This would have served a purpose similar to that served by the risk assessments and reports required by the regulation. We also sought reports on the impact of the practices of online platforms on the public health of children. At that time, we were eager to see recognition of the notion of algorithmic harm. This is the idea that it is not just user-generated content that can create harm in online spaces but also the practices and policies of the platforms themselves. This idea of harm due to algorithms is recognised within the digital services regulation, and we acknowledge that. However, we would like to be able to continue to evaluate how these provisions are implemented in practice and to learn from them.
