Oireachtas Joint and Select Committees
Wednesday, 28 February 2024
Joint Oireachtas Committee on Jobs, Enterprise and Innovation
Operation and Resourcing of Coimisiún na Meán: Digital Services Commissioner
Maurice Quinlivan (Limerick City, Sinn Fein)
I remind members who are participating remotely that they need to do so from within the Leinster House complex only. I did not receive any apologies.
Today we will engage with Dr. John Evans, digital services commissioner, to discuss the operation and resourcing of the new Coimisiún na Meán. As members will be aware, Coimisiún na Meán was established further to the provisions of the Online Safety and Media Regulation Act 2022. It has been designated as Ireland's Digital Services Coordinator and is responsible for implementing and enforcing the EU regulation known as the Digital Services Act, DSA, in Ireland. I welcome the digital services commissioner, Dr. John Evans, to the committee. He is joined by Mr. Tiernan Kenny, director of communications and public affairs with Coimisiún na Meán.
Before we begin, as I always do, I will explain some limitations to parliamentary privilege and the practices of the Houses as regards references that may be made to another person in evidence. The evidence of witnesses physically present or who give evidence from within the parliamentary precincts is protected, pursuant to the Constitution and statute, by absolute privilege. Witnesses are reminded of the long-standing parliamentary practice that they should not criticise or make charges against any person or entity by name or in such a way as to make him, her or it identifiable or otherwise engage in speech that might be regarded as damaging to the good name of the person or entity. Therefore, if their statements are potentially defamatory in respect of an identifiable person or entity, they will be directed by me to discontinue their remarks. It is imperative they comply with any such direction.
The opening statement has been circulated to all members. To commence our consideration of this matter, I now invite Dr. Evans to make his opening remarks.
Dr. John Evans:
I thank the committee for the invitation to speak today. I am digital services commissioner at Coimisiún na Meán and I am joined by Mr. Tiernan Kenny, director of communications and public affairs. I will give a brief overview of our work and structure at Coimisiún na Meán and describe the EU Digital Services Act and our role under it, as well as what it means in concrete terms for people in Ireland.
Coimisiún na Meán was established almost a year ago, in March 2023, and took on the functions and staff of the Broadcasting Authority of Ireland. Our remit covers broadcasting and on-demand regulation, media development and online safety. Since our establishment, we have been working hard to grow the organisation. As of this week, our headcount is 102, and we have sanction to hire 160 people, a number we expect to reach in the middle of this year. We are arranged in a divisional structure that we believe is the best way to meet our objectives.
Coimisiún na Meán has five commissioners, who oversee the different divisions. I am the digital services commissioner and I work alongside Ms Celene Craig, our broadcasting and on-demand commissioner; Ms Niamh Hodnett, our online safety commissioner; Mr. Rónán Ó Domhnaill, our media development commissioner; and Mr. Jeremy Godfrey, our executive chair. Ms Craig will step down in mid-March and a recruitment process is under way to replace her. The organisation is structured into different divisions, each of which is overseen by a commissioner. There are four external-facing divisions. I look after platform supervision and investigations, which focuses mostly on compliance and enforcement of platform obligations in respect of harmful and illegal content under the DSA, the terrorist content online regulation and national law. The second division is regulatory policy, which focuses on consulting with civil society and industry on harms and issues and on making rules for industry to follow. It is overseen by the online safety commissioner, Ms Niamh Hodnett. The media landscape division focuses on audiovisual and media services, AVMS, providers, sound broadcasters and content producers and is overseen by the broadcasting and on-demand commissioner. The user support division focuses on audiences and users of online services and is overseen by the media development commissioner. In addition, we have a data and technology division, a legal services division and a corporate services division, which includes functions such as governance, communications and finance.
This year, we are putting in place an online safety framework in Ireland. This will apply to the online services used every day. This framework will end the era of self-regulation in the technology sector and make online platforms accountable for how they keep their users, especially children, safe online. Platforms must also uphold fundamental rights, including freedom of expression. This framework has three main parts: the EU terrorist content online regulation, for which we became a competent authority in November 2023; the EU Digital Services Act, which became fully applicable on 17 February 2024; and the draft online safety code, which is out for consultation at the moment. It is our intention to implement the different elements of this framework in a coherent way, to avoid unnecessary duplication and burdens for businesses and make it easy for citizens to know and enforce their rights. We have responsibility for regulating services which have their European headquarters in Ireland. I will explain our role under the DSA in more detail, as this sits within my role as digital services commissioner.
The Digital Services Act is an EU regulation that sets rules for online intermediary services. The definition of an online intermediary service is broad, covering almost any service which is provided online, although it does not include private messaging services. The DSA applies a baseline set of obligations to all intermediary service providers, including having clear terms and conditions for how they can be used. Further obligations are added depending on the functionality and size of the service. The DSA applies a particular set of obligations for online platforms, such as social media services, online marketplaces and app stores. These include providing a way for users to flag illegal content, publishing transparency reports on their activities, allowing users to appeal content removals and devoting sufficient resources to content moderation activities, without solely relying on automated decision-making.
The most stringent obligations apply to very large online platforms, VLOPs, and very large online search engines, VLOSEs. Any platform or search engine with 45 million or more monthly active users in the EU qualifies as a VLOP or VLOSE. These services have additional obligations to assess and mitigate the risks that arise from how their services are designed and used across four categories, including how they can facilitate the spread of illegal content or cause negative outcomes for public order, electoral integrity, and public health. These provisions can help to address issues such as misinformation or disinformation. The European Commission has designated 22 VLOPs or VLOSEs, 13 of which have established their EU headquarters in Ireland.
Each member state has to appoint a digital services coordinator, DSC, to enforce the DSA. In Ireland's case, the DSC is Coimisiún na Meán, with some responsibilities for online marketplaces allocated to the Competition and Consumer Protection Commission. Each DSC has responsibility for regulating the service providers whose EU headquarters are in its country, but the European Commission has primary responsibility for some of the obligations applying only to VLOPs and VLOSEs. As online services can be provided across borders, international co-operation will be important in the application and enforcement of the DSA. The legislation set up a group of national DSCs and the European Commission, called the European Board for Digital Services, which met for the first time last week. It is our intention to work closely with the European Commission and our EU counterparts to ensure that the DSA leads to improvements in online safety for people, especially children. This co-operation will be particularly important when dealing with the largest platforms, given the European Commission's role.
I might say a few words about users' rights under the DSA. A key aim of the DSA is to provide a more predictable and trustworthy online environment for users. The DSA gives users several rights when using online platforms. As well as the obligations to explain their terms of service in plain language and provide a mechanism for flagging illegal content, the DSA also obliges platforms to inform users when their content has been removed, down-ranked or demonetised or when their accounts have been suspended or terminated. Users also have the right to appeal content moderation decisions made by platforms to the platforms themselves and then to an out-of-court dispute settlement body where one exists.
Coimisiún na Meán can receive complaints from users who believe a provider of an online service has not complied with its obligations under the DSA. On Monday, 19 February, we opened a dedicated contact centre to give users advice and support on their rights under the Digital Services Act. It will also be used to gather real-world intelligence that will inform An Coimisiún's supervisory and enforcement activities and allow us to take action when the rules are broken. While it is within Coimisiún na Meán's remit to assess whether providers of online services are doing what they are obliged to do under the DSA in respect of illegal content, it is not our role to act as an online censor or to tell people what they can or cannot say.
We have published an application form and guidance on out-of-court dispute settlements. This process allows users of online platforms the opportunity to avail of an impartial and independent means of resolving disputes relating to content moderation decisions by online platforms without requiring users to enter costly or lengthy legal disputes. Separately, we have published an application form and guidance on the new trusted flaggers system. This system will create a fast lane for approved trusted flaggers and platforms will have to prioritise dealing with reports from them. These trusted flaggers will be independent and have particular expertise in detecting and notifying illegal content.
Internally, we are in the process of initiating our supervision strategy and considering our functions relating to the Digital Services Act and our responsibilities under the terrorist content online regulation and the draft online safety code. We have adopted an impact and risk approach to supervision. This assesses the risk of harm occurring, categorises regulated entities or services accordingly and differentiates the approach to supervision based on the impact category of the various regulated entities. The risk and harm framework recognises that the most value can be delivered by focusing on regulated entities that may pose the highest levels of online harm and risk and on the types of risk that pose the greatest threat of harm to people.
I acknowledge the work of members of this committee in ensuring that the DSA was swiftly implemented in Ireland and that we were one of just six EU member states to have their digital services coordinator in place for the DSA's first day of operation last week. We are at the early stages of the implementation of the DSA. We will need to work collaboratively with a range of partners to deliver a positive impact for people and ensure that they can take advantage of the benefits of being online while being protected from harm caused by illegal content.
Maurice Quinlivan (Limerick City, Sinn Fein)
I thank Dr. Evans. I will invite members to contribute. As they know, we have a rota in place. First is Deputy O’Reilly of Sinn Féin.
Louise O'Reilly (Dublin Fingal, Sinn Fein)
I thank the witnesses for attending and for their evidence. I apologise, as I will have to leave. I am due to speak in the Dáil, but I will stay for as long as I can. This is a large area and I have quite an interest in it. Hands up, I do not know a great deal about it, but I am trying to read myself into it and learn as much as I can. I will ask a couple of questions and, if I get a chance, I will contribute again.
I wish to ask about An Coimisiún's role and the role of the European Commission in tackling disinformation, that is, deliberate misinformation. Since the outbreak of the Israel-Hamas war in recent months, there has been growing apprehension about the potential unintended repercussions of the EU's Digital Services Act, particularly on the digital rights of Palestinians and other vulnerable minorities. Online information is taken down amid accusations that it constitutes harmful content or disinformation, a complaint is opened, the original takedown is proven illegitimate and invalid, and the content is put back up. This can happen in a one-sided way and can result in the unjust removal of legitimate content by hiding behind the claim that it breaches the DSA when that may not be the case. This happened recently. I am not sure whether Dr. Evans is familiar with the “Free State with Joe Brolly and Dion Fanning” podcast, but a social media post with a clip from one of its episodes regarding the war in Palestine was removed following an accusation of disinformation and harmful content. Mr. Brolly and Mr. Fanning contested this and it was proved that what they had posted was factual and fair. There was a rush to take the content down and the EU DSA was cited, but the content was factual and fair.
I expressed a concern when we were debating the legislation. It is not that I do not believe there is good intent behind the legislation – there is – but I am concerned that the Act could be used and applied with bias. The interpretation of what constitutes disinformation can have a political element to it. This is a real concern. What will Coimisiún na Meán do to protect against that type of bias and to advocate at European level against political bias? It is a tricky area, but we have seen cases. I cited that example because it was fairly high profile – one of the people involved is a barrister and someone of significant repute capable of taking the case – but there was an obvious political slant to the content being taken down. What can be done to tackle this issue and ensure that, where there is the potential for political bias to be applied, the law is used to ensure that does not happen?
Dr. John Evans:
There are probably two aspects to the answer, the first of which is the European Commission’s role in respect of disinformation and misinformation. Parts of the DSA are aimed at addressing misinformation and disinformation. The platforms need to do risk assessments and apply mitigation measures where there are patterns of content that may lead to four areas of harm, including risk to public health and to the political process. The area of harm that the Deputy is getting at is the risk to fundamental rights, including freedom of expression. If a platform does not have the appropriate mitigation measures in place to protect freedom of expression as one of the fundamental rights, that will get picked up by the European Commission. That is at a systemic level and it takes some time for that process to work.
The second part of the answer is that, in a way, what the Deputy described is the system working as it is supposed to.
First, content gets taken down. The user is alerted and a reason is given. That can then be disputed and another decision can be made. In this case, it sounds like the matter was resolved.
Louise O'Reilly (Dublin Fingal, Sinn Fein)
It was. I do not want to cut across Dr. Evans but I may have made my point badly. The issue is not that the system does not work but rather what triggers the system. I am going to offer an opinion and I want to be very careful in what I say. In some instances, politically motivated complaints are made. Such complaints are not made because the complainants believe the content is disinformation. They may know it is not disinformation and that it is real but they have a political opposition to what is being said, which is what motivates the complaint. The content is then taken down only to be reinstated. My question is not about whether the content gets taken down. It was taken down and Dr. Evans can argue that shows the system working. The content was taken down, the complaint was resolved and it was put back up. That is fair enough but, going back to the start of the process, is there anything that can be done to ensure that, when a person is reviewing a complaint, he or she can spot that political bias and name it for what it is?
Louise O'Reilly (Dublin Fingal, Sinn Fein)
Yes, that is it.
Louise O'Reilly (Dublin Fingal, Sinn Fein)
One person's disinformation is another person's political cause. The question is how to stop that.
Louise O'Reilly (Dublin Fingal, Sinn Fein)
The trick is to preserve the right to freedom of expression while also protecting people against misinformation. My point is that political bias can be injected into the system at any stage. It is not just when a complaint is made. It can come in during the moderation process. The commission needs to be really alert to it all of the time and not to just wait for a complaint. Sometimes, the handling of complaints can be politically biased and that is a grave concern of mine. I am sorry; I am conscious of time.
Page 2 of Dr. Evans's statement says that the "key aim of the DSA is to provide a more predictable and trustworthy online environment for users". And so say all of us. It also says "the DSA also obliges platforms to inform users when their content has been removed, downranked or de-monetised, or when their account has been suspended". Some concern has been raised about the prioritisation of speed over due diligence. In the rush to combat illegal content and disinformation at any cost, could this potentially exert pressure on large platforms like X, Facebook and TikTok to act swiftly and decisively, even if it means relying on imperfect or opaque algorithmic tools to avoid liability and public scrutiny? What is going to be done to ensure there is no disproportionate removal of lawful content produced in what we might call a nuanced situation? What can be done to ensure a swift resolution where decisions are contested by people and organisations?
If we go back to the example of the "Free State" podcast, where the issue appears to have been resolved relatively quickly, the content was taken down, that decision was disputed and the content was reinstated. To go back to the Covid pandemic, we should remember not to let the perfect be the enemy of the good. As a friend of mine often says, slow is smooth and smooth is fast. How can the commission act quickly while also ensuring that all processes are followed? If there is a rush to take action quickly on a hot-button topic that everyone is talking about, is Dr. Evans concerned that, because everyone wants something done, it may be a case of "somebody should do something, this is something so therefore we should do it" and that an action will be taken that, in the fullness of time, turns out to be the wrong action? How can it be ensured that action is swift but also fair?
Dr. John Evans:
Again, there are two aspects to the answer. They relate to the role of the European Commission and to our role. Our immediate role relates to Article 16, which concerns the notice and action mechanisms and whether those mechanisms are working well. If someone makes a complaint and, after an assessment, some content gets taken down and a statement of reasons is produced, following which the decision is appealed, a person who is unhappy with their experience of that process can complain to us that the notice and action mechanisms are not working. What the Deputy is describing may be a systemic issue. If the platforms are not investing sufficiently in their trust and safety structures, in moderation and so on, it may affect their ability to process information quickly but also effectively. Getting the balance right between speed and diligence is important. Underinvestment of that kind could get picked up by the European Commission.
Louise O'Reilly (Dublin Fingal, Sinn Fein)
On content moderation, in his submission, Dr. Evans mentions "resolving disputes relating to content moderation decisions by online platforms, without requiring users to enter a costly or lengthy legal dispute." The Chair will be aware that I have raised the issue of content moderators on a number of occasions. Many of these large platforms have outsourced the job of content moderation. In his submission, Dr. Evans highlights how important content moderators are. Does he believe it is acceptable that they are very often the lowest paid within these companies and that their roles have been outsourced? In some instances, they have been outsourced and then outsourced again. These people are essentially the front-line defenders. I have spoken about them many times. They see the things we do not want to see, things that we should never see and that nobody should ever do.
Does Dr. Evans have a view on the necessity of keeping the function of content moderation close to the company? I have a workers' rights concern in this regard but I will put it to one side for the minute because I know it is not why Dr. Evans is here. However, I also have a concern that these really important health and safety front-line defenders' function is not just being outsourced but outsourced again so that it can be kept at arm's length from these large platforms. I am not asking Dr. Evans to comment on the workers' rights aspect but on how it works to have that really essential function not just at arm's length from the company, but at arm's length from the company that is at arm's length from the company. These functions have been outsourced and then, in some instances, outsourced again so they are twice-removed from the platforms. Does Dr. Evans have a view as to the optimal way to do it or how it is best managed? Should they be close to or part of the set-up? Is he content with the outsourcing? Does he have a view on the matter?
Dr. John Evans:
To be honest, I do not really have a view on those outsourcing arrangements at the moment. We are at the earliest stages of this. A lot of the supervision plan we have talked about is about getting to know the internal workings of the platforms and how they organise their trust and safety infrastructure. I appreciate the Deputy is coming from a workers' rights perspective with regard to outsourcing, but it is also a concern from the perspective of the efficient functioning of that capacity. If outsourcing is leading to systematic biases or underperformance, that is something the platform needs to manage. It needs to ensure its trust and safety function works. However it organises-----
Louise O'Reilly (Dublin Fingal, Sinn Fein)
Let us say there is a really high turnover of staff. Experiential learning is really essential for this type of work. God love those workers but it really is. If there is that really high turnover of staff, the platform will have no control over that. It has no control over the personnel who are doing this work. I appreciate Dr. Evans's answer. I really would strongly encourage him to put some focus on this area not because of the workers' rights element, but for the sake of efficiency and the need to keep that kind of function close.
If I may, I will ask one very brief last question. It relates to small and medium enterprises and microbusinesses. Concern has been expressed to our office, and I am sure others have heard it as well, that compliance might be financially burdensome. Dr. Evans mentioned that he would ensure that duplication and other such burdens are avoided. Is there anything else that can be done to alleviate the burden on small and medium enterprises and microbusinesses? They really want to play their part but they are flat to the mat and stretched to the limit at the moment.
Dr. John Evans:
There are some exemptions for microenterprises. The DSA itself is structured so that the biggest regulatory burden and the most onerous obligations are on the players that have the most influence in the system, the very large platforms. Dropping down to the next level, there are the platforms. Some of these can be quite big but there is a wide range of platforms out there.
As one moves down the tiers, there are lesser obligations. There is a proportionality of regulation built into the DSA. In due course, as we become established, we will make sure to put out some guidance on how people can best go about complying.
Mr. Tiernan Kenny:
One of the directors in the platform supervision and investigations team is devoted to those below-threshold platforms. We are actively thinking about how to communicate with them most effectively and proportionately. We will follow up with the Deputy in order to better understand some of the concerns that have been expressed to her office, and we will make sure we address them as we issue further guidance.
Louise O'Reilly (Dublin Fingal, Sinn Fein)
That would be excellent. I thank the witnesses.
David Stanton (Cork East, Fine Gael)
I thank the witnesses. They are very welcome. I wish them well in the very important work they have to do. They have recognised the work of this committee and taken an interest, but it is a whole new world for us as well and for many other people.
Australia has been very active in this space for quite some time. There has been an e-safety commissioner there since 2018. Has the commission looked at that individual's work and engaged with them? Is the commission aware of what they have been doing and can we glean anything from that?
Dr. John Evans:
This falls more naturally into the area of the online safety commissioner, Ms Niamh Hodnett. I have been with the commission since the end of July and Ms Hodnett has been in position since March. Right from the beginning, Ms Hodnett was watching to see where best practice was being applied and brought Coimisiún na Meán into the Global Online Safety Regulators Network. One of the founding agencies of that network was the Australian eSafety Commissioner. We have actually had the eSafety Commissioner, Ms Julie Inman Grant, over to visit the offices. There is quite an amount we can learn from the Australian experience.
David Stanton (Cork East, Fine Gael)
I met Ms Inman Grant when she was here recently. She is quite impressive.
David Stanton (Cork East, Fine Gael)
I wish to ask about illegal content. Will Dr. Evans define how the commission decides what is illegal and what is not illegal? How does the commission come to the conclusion on whether something is illegal or harmful?
Dr. John Evans:
Yes, there is a distinction around illegal content. Under the DSA, content is illegal if it is illegal under EU law or national law. What is illegal in one member state might be different from what is illegal in another. In Ireland, there is a long list of illegal content. I have a list here of examples of such content, including: content involving a credible threat of violence, which comes under section 5 of the Non-Fatal Offences Against the Person Act 1997; threatening or grossly offensive communication, under section 4 of the Harassment, Harmful Communications and Related Offences Act 2020; offensive conduct of a sexual nature, which comes under section 45 of the Criminal Law (Sexual Offences) Act 2017; and anything to do with child sexual abuse. Material that breaches copyright could come into it as well. The list runs from crimes against the most vulnerable in society right through to infringement of copyright. As an agency, we do not do the initial assessment of whether content is illegal. The first line of defence is the platforms, which need to be able to determine what is illegal and what is not. Our role arises if somebody complains that their complaint has not been properly dealt with by the platform. Then we may have to come to a view ourselves on whether or not content is illegal. It is limited to that.
David Stanton (Cork East, Fine Gael)
Where does the commission stand regarding the role of trusted flaggers?
Dr. John Evans:
Trusted flaggers were one of the things we wanted to have up and running from day one. Last week, we published guidance that tells people what they need to do to apply to become a trusted flagger. Essentially, trusted flagger status gives organisations with special expertise in different areas of online harm a fast lane to approach the platforms and flag content to them. We will have a certification role with those trusted flaggers and also a supervision role to make sure they are doing the jobs they were intended to do.
David Stanton (Cork East, Fine Gael)
What is the situation with appointing those roles at the moment?
Dr. John Evans:
It is up to the bodies themselves to put in the applications. As we build capacity, we will very soon have the mechanisms to take the applications and assess them. We also have the guidance, as I said earlier. We will do a bit of a push to let people know. Prior to 17 February, we had been in contact with a few organisations that expressed an interest in becoming trusted flaggers. We have a reasonable idea of who the possible candidates are.
David Stanton (Cork East, Fine Gael)
When will the commission be in a position to appoint the first trusted flaggers?
David Stanton (Cork East, Fine Gael)
One of the areas we have been looking at here with interest is the emerging area of artificial intelligence. As the witnesses will be aware, it is developing at such a fast pace that some experts say it is very hard to regulate and keep up with. We could contend there is a possibility it could also be used to create illegal and harmful content. What is the commission's capacity in this regard, and what is Dr. Evans's view on this emerging technology?
Dr. John Evans:
There is the EU AI Act, but we do not have responsibility for that. In a sense, the DSA does get at the regulation of AI where it is used to provide the services of the platforms. If there is an AI behind an algorithm that is recommending something to the user as part of his or her feed, or if it is behind the search engine when one goes online, then it becomes part of the platform's systems and therefore a potential risk the platform needs to assess and then mitigate. For example, if AI is leading to a distortion in the presentation of content in such a way that it becomes a risk to public discourse, that is something the platform should address through its risk mitigation measures. That would then be an issue for the European Commission to pick up. Because AI is part of the machinery behind everything the platforms do, it does get caught to a degree by the DSA.
David Stanton (Cork East, Fine Gael)
Link to this: Individually | In context | Oireachtas source
Reference was made to the EU Commission having designated 22 VLOPs and very large online search engines. Of these, 13 can be found in Ireland. They must have 45 million or more monthly active EU users to qualify. That seems to be quite a high bar in some ways. Are there many others that are below that? Are some disputing that threshold of 45 million active EU users? Is there a grey area whereby some of them that should be captured have not been? Does Dr. Evans know what I am getting at?
Dr. John Evans:
Yes, I do. The metric of monthly active users has been used by the European Commission, and that is being challenged for a couple of reasons. The first is because it leads to designation and the second is because it has implications for how levy fees are calculated. It is a critical criterion for them. Is there a question of whether some are in and around the threshold? There are some that are approaching the threshold but this does not mean that they escape the DSA. If they are below the threshold, they still need to fulfil a range of regulatory obligations. In those situations and if they are based in Ireland, we are the regulator in the driving seat. The Department of Enterprise, Trade and Employment did a survey of online platforms and ISPs generally based in Ireland. In addition to the 13 VLOPs, we believe there are probably at least 400 other service providers of differing levels of significance. Some of them will be in and around that and will possibly be VLOPs one day, while some are much smaller.
That is why, as Mr. Kenny said, we have a director of supervision dedicated to that area. We need a different supervision strategy for that group because they are very heterogeneous. There is a long tail.
David Stanton (Cork East, Fine Gael)
Link to this: Individually | In context | Oireachtas source
In the presentation, it was stated that it is not Coimisiún na Meán's role to act as an online censor or tell people what they can or cannot say. If somebody goes online or on one of the platforms and advocates killing people and so on, how does one square that statement with threats of violence and so forth?
David Stanton (Cork East, Fine Gael)
Link to this: Individually | In context | Oireachtas source
In the statement, Coimisiún na Meán stated it does not tell people what they can or cannot say. That needs to be qualified by what Dr. Evans just said.
David Stanton (Cork East, Fine Gael)
Link to this: Individually | In context | Oireachtas source
It just jumped out at me because of what Deputy O'Reilly spoke about - the free speech issue. There are grey areas there too. How far does free speech go?
Mr. Tiernan Kenny:
To clarify, there has perhaps been some confusion about our role. We have been accused of effectively being an online censor or having the power to order platforms to remove content arbitrarily. We are trying to hammer home the point that this is a systemic form of regulation. As Dr. Evans said, first and foremost, it is the platforms that are accountable for dealing with online content on their services rather than us being some kind of Internet police force.
David Stanton (Cork East, Fine Gael)
Link to this: Individually | In context | Oireachtas source
Will the witnesses comment on the out-of-court dispute settlement body? Where is that at and how is it working?
Dr. John Evans:
This is an area we wanted to have up and running early. We published guidance and an application form for a small number of potential dispute settlement bodies. Relative to the number of trusted flagger applications we will eventually get, this will be a smaller feature of the system. We spoke to a small number of potential applicants prior to 17 February. We will probably have a decision on at least one or two by the summertime.
David Stanton (Cork East, Fine Gael)
Link to this: Individually | In context | Oireachtas source
Coimisiún na Meán has a dedicated contact centre. How does that work?
Dr. John Evans:
We have an outsourced contact centre. The company supplying the service to us is Fexco. We had some difficulty contracting for that because it was uncertain what the volume of complaints would be. We prepared a good number of agents at the centre who were ready to take calls as of last Monday. As it turns out, the volume of contacts we are getting is quite manageable at the moment. Mr. Kenny may comment in a second. As we turn up the volume, we will start getting more.
David Stanton (Cork East, Fine Gael)
Link to this: Individually | In context | Oireachtas source
What is the nature of the complaints at this stage?
Mr. Tiernan Kenny:
Since we opened the contact centre, we have had, I think, 108 contacts either by phone or email. That is 108 individuals. In some cases, there are multiple conversations. Some are about broadcasting complaints, which are then redirected to another part of the organisation. Some are general queries about our remit and some are from people who have come across content they do not like. It is not necessarily that the content is illegal or a breach of platforms' rules. They ask us to remove it. We try to explain that is not necessarily our role. Out of those 108 complaints, seven cases have been escalated to our complaints team at this point as potential breaches of obligations under the Digital Services Act. They will be considered and passed on to the platform supervision and investigation teams for any further action, as required.
David Stanton (Cork East, Fine Gael)
Link to this: Individually | In context | Oireachtas source
If it is deemed that this content needs to be taken down or removed, how does that work?
Mr. Tiernan Kenny:
We tell people that if they come across illegal content they should flag it to the platform on which they saw it, because the platform has the obligation to have a mechanism for somebody to report illegal content. It also has an obligation to remove it once it becomes aware of it and to inform the person who made the complaint of the outcome. If you make a complaint, the platform has to tell you what it has done. If the person does not like the outcome of that complaint, for example, if they report content as illegal and the platform says it is not, they can appeal that directly with the platform internally within six months. Once the out-of-court dispute settlement bodies are in place, complainants will be entitled to go to them. At that point, they can come to us. I think the complaint they would be making is that the platform has not followed its obligations under the Digital Services Act to remove illegal content once notified. That is when we look at whether the platform had all the systems and processes in place and, potentially, make a determination as to whether the content was illegal or not. That comes at the very end of the process. It is all about making sure the system works.
David Stanton (Cork East, Fine Gael)
Link to this: Individually | In context | Oireachtas source
It seems to be quite a long process. In all that time, that content can still potentially be online. Is that correct?
Mr. Tiernan Kenny:
This goes back to what Dr. Evans said about the tricky balance between freedom of expression and removing content that is illegal or harmful. We expect the platforms to be right in the vast majority of these decisions in the first place and to be right on appeal in the vast majority of the rest of them, with a very small number filtering through to us.
David Stanton (Cork East, Fine Gael)
Link to this: Individually | In context | Oireachtas source
Is it relatively easy to contact all of these platforms? As a layperson, a member of the public, is it relatively easy to do that?
Matt Shanahan (Waterford, Independent)
Link to this: Individually | In context | Oireachtas source
I was just reflecting; if it was 20 years ago, we would not be having this conversation. It is amazing how time has moved on and a whole industry has come about. I wish the gentlemen before us the best of luck with their duties. It is very significant to be appointed as the Digital Services Coordinator. The witnesses spoke about resourcing. Dr. Evans said there are 100 staff and I think Coimisiún na Meán is hoping to get to a headcount of 160. In previous committees, one of the first things raised was Coimisiún na Meán's ability to meet its full remit, which is very wide, across a number of different areas. On the work Coimisiún na Meán will do, what will a sanction be? My colleague spoke about platforms found to have been in breach of the Digital Services Act. Where do the sanctions start in that case and how quickly do they come into play? How can people see companies being sanctioned?
Dr. John Evans:
Our objective is not to raise fines; it is to change behaviours at the platforms. We will have a process internally of supervision and complaints escalation to investigations. When that formal step is taken, we have significant powers around gathering information, carrying out inspections and so on. At the end of that process, if it goes the full length, we have, under the DSA, the power to sanction the platforms up to 6% of turnover. It is significant money.
Matt Shanahan (Waterford, Independent)
Link to this: Individually | In context | Oireachtas source
I presume that would be after a lengthy process, possibly involving cases in the European courts.
Matt Shanahan (Waterford, Independent)
Link to this: Individually | In context | Oireachtas source
It is not straightforward. On social media monitoring, does Coimisiún na Meán have any input into the management of algorithms and how they are arrived at? I refer in particular to those that are negatively biased, which it seems they all are, amplifying negative bias in social media feeds. Is there any way to get at that? The witnesses are trying to point out that Coimisiún na Meán wants to support free speech but there is a point at which it becomes sedition, for instance.
Dr. John Evans:
One of the most vulnerable groups online is children. Our role in algorithms under the DSA is that platforms are not allowed to target children with their algorithms. The next most important intervention around algorithms is risk assessment, which I spoke about, where the algorithms may be producing what are called rabbit hole feeds, in which somebody is getting the same kind of content over and over.
Where that can become harmful to people or society might be in the public health sphere, for example, if somebody is subjected to a stream of toxic beauty and dieting content. Material to do with self-harm can become a stream too. We are aware that happens. People are at risk there. It can happen in other ways. There are rabbit hole effects with respect to ideas and political views. Everybody is familiar with the idea of bubbles where people are only exposed to views that reinforce those they already hold. Where that becomes an issue for public health or fundamental rights or a risk to the political process, then it becomes a systemic issue the European Commission can deal with.
Matt Shanahan (Waterford, Independent)
Link to this: Individually | In context | Oireachtas source
It will be difficult to deal with. A lot of the algorithms themselves are generated by AI. They are not necessarily something that somebody is sitting down to write a programme for. The programme is freewheeling and adapting all the time. It will be difficult. Another issue I point to is deepfakes, which will be a big part of political propaganda in the next six to 12 months. These are videos and content that look to be real. It will potentially be hard to ascertain what is real. A lot of damage will potentially be done by the time somebody realises that a political figure in a video somebody has put up has not said what they are purported to have said. I just point that out. I am not sure what Coimisiún na Meán can do about it because with those things the damage can be done in 24 hours.
Dr. John Evans:
Mr. Kenny might want to come in, but everybody working in this space is aware of the potential negative impact that an unregulated online space presents for elections in particular. That is why there was a certain amount of haste in Europe to get the Digital Services Act in place as quickly as possible for this year. I mentioned earlier that we had the first meeting of the European digital services board last week. At that meeting we discussed a draft set of guidelines that the European Commission has produced. They are guidelines directed at the platforms about how they should prepare themselves for the elections this year. That guidance includes a role for Digital Services Coordinators because a lot of the effects across the Union will be language or culture specific. You need to have the various locally active agents contributing. In Ireland that would be the Electoral Commission, fact-checking organisations and organisations like ours, which do work on media literacy. We have a role in co-ordinating that.
Mr. Tiernan Kenny:
I will come in on the media literacy point. We are talking about the platform side, but we also have a role in making sure users can critically engage with and understand the media they encounter, including recognising when something might be false or misleading. Some of the platforms are also making efforts to label AI-generated content so if it turns out to be a deepfake video, it can be clear to users that it has been artificially created. However, that only works with good faith actors.
Matt Shanahan (Waterford, Independent)
Link to this: Individually | In context | Oireachtas source
My final question is on the trusted flagger status. We know that will be an independent role, but the witnesses say they have a fast-track lane for approving trusted flaggers, and platforms will have to deal with them. What kind of teeth has the agency in terms of getting those processes up and ensuring platforms are engaging in a timely fashion?
Dr. John Evans:
If the trusted flaggers' experience of engagement with the platforms is that when they flag content a decision is subsequently made that they are not happy with, they can appeal that decision. They go through that process in an expedited way. If they are unhappy in such a way that, in their view, it represents a breach of the Digital Services Act, then we can intervene as an agency.
Richard Bruton (Dublin Bay North, Fine Gael)
Link to this: Individually | In context | Oireachtas source
I thank the witnesses for their interesting presentation. I will start on the political side. I am interested to know how they define protecting electoral integrity. It seems to me that deliberative democracy, which a lot of us are used to, is challenged by social media. Outrage is the most saleable product and it spreads virally through the systems they have in place. On top of that, some states believe digital technology is a tool to exercise power against democracies. How does Coimisiún na Meán see itself regulating bots and foreign agents seeking to subvert democracy? What standard is it applying? You look at the US and see freedom of expression à la US, where things certainly close to hate are not only said but reproduced, and not only on social media but on broadcasting channels. Where is it pitching this line on freedom of expression? The witnesses rightly say it is not easy to define, but there are certainly extremes that seem to rule on social media that are not acceptable. Because it is different from print media, they do not take any responsibility. It is designed to send viral things that are outrageous. That is basically its modus operandi. I would be interested to hear more about that.
I have a number of questions. Digital technology and social media platforms can be used for all sorts of things that can be described as high stakes, like job interviews and insurance quotations. We will see algorithms doing this selection process. How do the witnesses intend to prevent them having the inherent biases or learnt biases of big datasets that are already prejudiced? It would be interesting to hear how that will be done, because it is a concern. How do they propose to change these shallow consent and accept buttons that we hit? As I understand it, they are now changing. What happens to all of the pre-existing consents? Does everyone have to start all over again? Where is that? This consent is not really consent at all. People do not read any of the small print. It is not really relevant. When it comes to profiling people, not just in the political area, but for profiling and selling, what principles do the witnesses apply as to what is acceptable and unacceptable? Profiling is at the heart of what is done, and then bombarding and finding the capsules in which people circulate. Are principles applied? I know they say children cannot be profiled, which is okay. Are the rest of us sort of on our own? What falls to the EU and what falls to Coimisiún na Meán? Thirteen of the 22 platforms are headquartered here and we are overseeing them. How we are perceived will depend crucially on what falls to us. The final issue is that this seems extraordinarily complicated work. We are checking if software packages are inherently biased or checking how algorithms are put together. Are the witnesses equipped to undertake that level of investigation if it comes to it? If it becomes apparent that there are inbuilt biases, do they have the capacity to unpick that or how will it be handled?
Dr. John Evans:
In Europe we have charted a course around trying to balance freedom of expression with curtailments of what we can do online. The US is on one path. Other, totalitarian, regimes are on another path.
The whole idea of the DSA was to try to create a system with checks and balances that walks the line between those two polar opposites. There will be a degree of needing to work with this for a while to see how it works, in particular around the risk assessments and the commission's role in those four areas where we are trying to mitigate, at a systemic level, where harms can occur. On the rapid dissemination of illegal content and, probably more to the Deputy's point, the risk to fundamental rights and to political processes, we will be watching very closely to see how that goes this year. When EU officials are asked what "good" looks like, they will often say it is that the platforms did not play a negative part in European elections and national elections in Europe this year. That will be the litmus test for how well the system works this year. I do not know whether that fully answers the Deputy's question.
On the question of algorithmic biases, whether we have the capacity to address them and what our line on profiling within the algorithms is: with regard to profiling, apart from children, platforms are also not allowed to target individuals based on certain characteristics, for example, sexual orientation or religion. Mr. Kenny might come in with more examples. There are other categories for which someone cannot be targeted online.
On the capacity issue around the algorithms and developing the expertise, I spoke a bit earlier about the structure of the organisation. I spoke about the external-facing divisions and I mentioned in passing that Coimisiún na Meán will develop a data and technology division as well. We have hired one director so far, who is an expert in data analytics. In due course, we will hire two more directors there, one with responsibility for the technology side who will, therefore, have expertise in algorithms, and then a third director who will be focused on digital forensics. Those three directors and their staff will work together. We want to develop a centre of excellence inside the organisation. Ireland has a very outsized role with respect to this so it is very important we start thinking like a big regulator in this respect and develop that capacity ourselves. Of course we will be able to link in with other regulators across Europe. Some of the other regulators are doing important work in this area. There is the European Centre for Algorithmic Transparency, where we see a lot of the leading-edge experts in this area working and helping to inform policy. That is something we will want to link in with in due course as well.
On the question of shallow consents, I do not have a good answer for that one. It is more of a data protection issue, generally, but I can look into that more when I go back to the office, if that is okay.
Maurice Quinlivan (Limerick City, Sinn Fein)
Link to this: Individually | In context | Oireachtas source
Does anybody else want to come in?
Richard Bruton (Dublin Bay North, Fine Gael)
Link to this: Individually | In context | Oireachtas source
Can I come back to job interviews and those sorts of things, like insurance? Situations could be envisaged in which all sorts of data that would be regarded as extraneous to the casting of an insurance policy could be tracked from people's social media histories. Will Coimisiún na Meán police that?
Richard Bruton (Dublin Bay North, Fine Gael)
Link to this: Individually | In context | Oireachtas source
A lot of that is put up by people themselves and with their consent. It is in the public domain. Is Dr. Evans saying it cannot be used or can only be used for certain purposes?
Richard Bruton (Dublin Bay North, Fine Gael)
Link to this: Individually | In context | Oireachtas source
For job interviews I suspect employers look at people's social media history. Is that now unlawful?
Richard Bruton (Dublin Bay North, Fine Gael)
Link to this: Individually | In context | Oireachtas source
There are extremes out there. If dishing out insurance policies or sifting through CVs becomes machine learning, there could be an awful lot of very-----
Richard Bruton (Dublin Bay North, Fine Gael)
Link to this: Individually | In context | Oireachtas source
What I am getting at is that you feel there are certain principles that ought to be embedded in a job interview. Maybe this has to go into employment law or insurance law and is not Coimisiún na Meán's baby. However, it does seem to me that machine learning will challenge existing laws and it will not be good enough for Coimisiún na Meán to say there is a data law and there are other laws that must be complied with. Maybe we have to rewrite some of the laws. Will Dr. Evans come back to me on that?
As for the accept and consent buttons, I never read terms and conditions and I accept them ten times per day. Will they all have to come back to me again and ask, "Are you sure?", with some very simple explanation of what I am consenting to?
Mr. Tiernan Kenny:
It depends on the exact service in question. With regard to Coimisiún na Meán's role under the DSA, there is an obligation on the platforms to set out clearly the terms and conditions for using the service. That includes that, when the service is available to children, those terms and conditions are set out in a way that is understandable to children. Some of the consent matters the Deputy is talking about relate more typically to the processing of personal data, so that is the kind of thing that might be included in the terms of service. It would be a decision for the platforms, I think, whether they feel they have changed the wording of their terms and conditions sufficiently that someone would need to consent again. The general practice is that the service tends to ask people, when it makes changes to the terms and conditions, to confirm they still consent to using the service. That really is a question for the platforms as to when they make changes to their terms and conditions.
Richard Bruton (Dublin Bay North, Fine Gael)
Link to this: Individually | In context | Oireachtas source
Does Coimisiún na Meán, as our trusted partner who is looking out for our interests, invigilate their consents in some way to ensure they are not asking for unreasonable things from consumers, who we know will not read the small print? We just know that is not going to happen.
Mr. Tiernan Kenny:
There are different areas of legislation now. Some of that would be consumer protection legislation and some of that would be privacy legislation. Coimisiún na Meán's role would be to make sure the terms and conditions are clear and understandable. We have the contact centres so if we start to hear from users that they cannot understand what is being said to them, as the Deputy said, in these very lengthy documents and if a user feels their data is being used in a way that was not necessarily made clear to them at the point of signing up, that is something we could take up with the platforms if we felt there was a systemic issue.
Richard Bruton (Dublin Bay North, Fine Gael)
Link to this: Individually | In context | Oireachtas source
When someone makes a complaint and they come to Coimisiún na Meán with it, is it an acceptable defence to say the complainant pressed the button to accept four years ago and the provider is only doing what it said it would do? What status do those consent buttons have in Coimisiún na Meán's assessment of a subsequent investigation?
Mr. Tiernan Kenny:
I cannot say very much about how we deal with any particular investigation because it would depend on the merits of the case. The obligation is for the terms and conditions to be clearly understandable for the people who are reading them. That is what Coimisiún na Meán would be looking at. Our powers commenced on 17 February, so it is quite difficult for us to go back in time, as the Deputy will understand.
Richard Bruton (Dublin Bay North, Fine Gael)
Link to this: Individually | In context | Oireachtas source
That Act is now in place. The services presumably cannot rely on consents they received under the old legislation.
Mr. Tiernan Kenny:
It depends on what the services were looking for consent to do, as I have said. If it is to do with the processing of personal data, that is outside Coimisiún na Meán's wheelhouse. It is an open question as to whether they need to present the terms and conditions to the users again.
That is probably something we would have to look into. We might come back to the Deputy on that.
Matt Shanahan (Waterford, Independent)
Link to this: Individually | In context | Oireachtas source
I have one more question and it relates to Coimisiún na Meán's remit in monitoring for public order, etc. During the Dublin riots, a particular messaging system was used to co-ordinate people to go there. The messages were put up publicly. If people saw some of the communications, it would be easy enough to infer what was taking place. First, if that is going on, how quickly can someone like Coimisiún na Meán interdict it? Second, if it sees that happening - and it has happened - is Coimisiún na Meán in a position to approach the platform, state that it did not moderate its content or that it allowed X, Y and Z to take place, and sanction it? Only when people in the community and the public at large start to see companies paying fines will they know that moderation is happening on some of these things.
In respect of that, there are a number of those. They are like WhatsApp but they are not WhatsApp. It is the same type of platform. What is Coimisiún na Meán's control on something like that?
Dr. John Evans:
There are two situations. The first is where they are established in a member state of the EU. In that case, the local DSC, probably in collaboration with the European Commission, can act. There is another situation. The one the Deputy is thinking of is probably based outside the EU. Under the DSA, that does not matter. The law still applies. The only question is which Digital Services Coordinator would go after them. Normally, what is intended to happen is that the platform or service would elect to be regulated in a particular member state. That could be here. It could be anywhere. If they do not do that, it is fair game for any of the DSCs to approach that platform and begin asking questions. As of 17 February, DSCs that are established can do that now.
Matt Shanahan (Waterford, Independent)
Link to this: Individually | In context | Oireachtas source
In respect of what I have outlined, should the commission say to another DSC that there should be some legal sanction for what has occurred or been facilitated? Otherwise, how is it to stop this being repeated?
Dr. John Evans:
The DSA provides for very significant sanctions; ultimately, 6% of turnover. That would apply whether the entity is based inside or outside the EU.
I fully appreciate what the Deputy is saying, though. To have confidence in the system, people need to see the teeth being used. That is fully the intention of the community of DSCs and the European Commission.
Matt Shanahan (Waterford, Independent)
Link to this: Individually | In context | Oireachtas source
How does Ireland get that activity going? Let us say that we say we do not want to have a repeat of that. The best way not to have a repeat is to make sure that this has been flagged with the relevant platforms and, ideally, that there is some legal sanction. How is that initiated or how does the commission put that case across?
Dr. John Evans:
We had some experience of this in the autumn before we had our powers and the first step was to engage immediately with the platforms. Because we had been laying some groundwork, we immediately knew people at the platforms to call in and we had them in with us the following day, and we had the European Commission there as well.
Now that we have passed 17 February and we have our powers, where that might translate is, for example, where somebody had notified content to the platform that was not taken down and the platform in some sense, in the process that it was following, breached the DSA.
Another one goes back to this systemic issue. It is not an overnight fix but it can still lead, potentially, to significant sanctions down the road, where a platform's mitigation measures have been found not to be sufficiently robust to combat that kind of issue arising. There, it is probably a threat to public discourse and even to the political process.
David Stanton (Cork East, Fine Gael)
Link to this: Individually | In context | Oireachtas source
I thank Coimisiún na Meán. How does someone find its dedicated contact centre online? Where is it?
David Stanton (Cork East, Fine Gael)
Link to this: Individually | In context | Oireachtas source
What is CNAM? What is that?
David Stanton (Cork East, Fine Gael)
Link to this: Individually | In context | Oireachtas source
I was looking for it here and it was not jumping up for me easily.
David Stanton (Cork East, Fine Gael)
Link to this: Individually | In context | Oireachtas source
It might be an idea to do that.
Can the witnesses comment on influencers and advertising? One of the requirements is that if somebody is advertising something online, he or she has to show that it is advertising. This applies to influencers as well. There is an industry of influencers out there who get paid by various organisations, for example a bank, to comment online, and their comments can have a huge impact for the organisation, with the influencers making substantial income from that.
Do the witnesses agree that it would be important that the consumer would know if something is financed in that way and that it is an advertisement rather than someone's opinion, for instance? Has Coimisiún na Meán any role in or comment on that?
Mr. Tiernan Kenny:
The Advertising Standards Authority of Ireland is the primary regulator for advertising, but there are obligations in the Digital Services Act about advertising transparency which go exactly to the point the Deputy is making: if platforms choose to show advertisements to their users, it has to be made clear to the user that what he or she is seeing is, in fact, an advertisement, who has paid for the advertisement, and why the advertisement is being shown to the particular user. That goes back to some of the discussion about why some people might be targeted with advertisements or not. The ASAI has engaged in quite a bit of activity on influencer advertising and some of those other questions might best sit with that authority.
Dr. John Evans:
The Competition and Consumer Protection Commission, under general consumer law, also has a responsibility here around influencers. It has done a number of campaigns to try and address the kind of issue that the Deputy is thinking of around inappropriate placement of materials or lack of transparency around whether something is an advertisement or not.
David Stanton (Cork East, Fine Gael)
Between the trusted flaggers and dedicated contact centre, it seems a lot of Coimisiún na Meán's activity will be driven by complaints from the trusted flaggers or the public who go through the contact centre. On top of that, has Coimisiún na Meán itself an oversight role to investigate or keep an eye on what is going on out there, and rather than waiting for a complaint to come in, to initiate its own activity?
Dr. John Evans:
In the division that I am setting up, the platform supervision and investigations division, four of the teams are dedicated to supervision, one is a regulatory operations team and another is a major investigations team. The four supervision teams are organised as follows. We have one, at present, dedicated to below-threshold platforms, that is, those based here with fewer than 45 million monthly active EU users. The three other supervision teams are dedicated to the very large platforms. We have shared out the 13 VLOPs or VLOSEs with EU headquarters in Ireland between those three teams.
What they will do is engage with the platforms directly, look at the risk assessments they are required to undertake, manage day-to-day operations and inform us about the kind of risks that different platforms are raising across different kinds of headings, for example, whether there is a threat of harm here related to copyright infringement or whether there is a threat with a particular platform around the access that minors have to it. It depends. What those supervisors will do is help us flesh out that picture. Based on that intelligence, when we marry that with the intelligence coming through the contact centre, we will get a good idea of what our enforcement priorities should be so we can initiate own-initiative investigations as well.
David Stanton (Cork East, Fine Gael)
I thank Dr. Evans and Mr. Kenny.
Maurice Quinlivan (Limerick City, Sinn Féin)
Before we finish, I have a question on trusted flaggers and where we are at with that. Is it the intention that Coimisiún na Meán will invite people to apply for those positions?
Have people already applied for these positions? Has anyone been approved yet? It was mentioned that if a body applied now it would be approved by the summer, but have a number of organisations been approved already?
Dr. John Evans:
No one has been approved already. On their own initiative, some of the platforms ran similar schemes with fast-tracking of different kinds of organisation. Once we build our capacity a little more – let us say within another six weeks or so – we will probably put a push on this to bring to people’s attention that the trusted flagger system is up and running and we are taking applications. That is the case currently. It is just that we have not raised the volume on it yet.
Maurice Quinlivan (Limerick City, Sinn Fein)
Is it the intention to have a wish list of trusted flaggers?
Maurice Quinlivan (Limerick City, Sinn Féin)
Deputy Stanton has already asked my second question. It has to do with monitoring online content. The commission mentioned that it would not be acting in the role of online censor. That is understandable, but does Dr. Evans believe the commission will monitor some content or will it wait for complaints to come in? I believe he stated that there had been 108 contacts already, with only seven of those needing to be escalated. Will the commission wait for complaints or will it look for issues?
Dr. John Evans:
Through the supervision activity I described, we will get a sense of where the risks are. As part of that supervision, we will do a certain amount of intelligence gathering outside the contact centre. We are thinking of ways to do that as well. We cannot take on the role of general monitor of the Internet. The whole spirit of the DSA is that the platforms themselves will take primary responsibility for illegal and harmful content. However, we will invest in some kind of intelligence gathering.
Maurice Quinlivan (Limerick City, Sinn Féin)
As no one else has indicated, I thank Dr. Evans and Mr. Kenny for assisting the committee in our consideration of this important matter. That concludes the business of the committee in public session. I propose that we go into private session to continue other business. Is that agreed? Agreed.