In Issue 2/2023


Reliability is one key aim in parliamentary reporting. In this article, I discuss the official video report of the Dutch Parliament from different perspectives: image manipulation and deepfake videos in general; the introduction of cameras into the Parliament; the fundamental differences between traditional and video Hansard; and the challenges introduced by the new privacy legislation in the EU, the GDPR.

Deepfake and the Thomas theorem

Manipulating photos, so-called shallow fakes, has been commonplace for some time now. It is precisely in this area that major progress has been made in recent years due to developments in the field of artificial intelligence (AI). AI refers to computational systems, trained on large amounts of data, that aim to imitate or mimic human skills. More and more AI tools are now available, including in the field of content creation, so-called generative AI, which supplies both texts and images upon request. Manipulating photos and videos using AI or deep learning produces shockingly real results, as demonstrated, for example, by the well-known fabricated video of Morgan Freeman.

It is therefore no exaggeration to see such images, or deepfakes, as potential and real problems, for example for official video reporting by Parliaments.

To reinforce this consideration, reference must be made to the Thomas theorem, the principle developed by the American sociologists William Isaac Thomas and Dorothy Swaine Thomas (1928). Their principle states: if one defines situations as real, they are real in their consequences. In our case, although the picture or video is fake, the consequences are real. These consequences may be disinformation, polarisation or incitement to hatred and violence.

Problems, Procedures, Answers, Technique

Deepfake should therefore be seen as a problem or even a threat to official video reporting by Parliaments. What can Parliaments do about this? Roughly speaking, responses can be made at the level of protocol, of procedure and of technology. Some of these solutions are more realistic and effective than others.

A quick response could be to ban cameras in Parliament: no cameras, no broadcast or livestream and no video. In the Dutch Parliament, this rather radical proposal was put forward by some MPs, claiming that the presence of cameras, in combination with social media, places the focus outside the debate by default. Without cameras or a livestream, there would be no reason to send social media updates into the world in response to the parliamentary debate.

An extension of this radical proposal is another radical proposal, namely a total ban on the use of smartphones in the Parliament. A slightly less impractical, but nevertheless impractical, proposal would be to delay the broadcast or livestream by 24 hours. None of these proposals stands much chance, as they could only be adopted by changing the protocol, which would require a majority of votes in the Dutch House of Representatives.

Another, more practical and realistic response can be found in the field of technology. More generally, preventive measures can be put in place to protect Parliament's IT infrastructure and its contents, whether video or other types of data, against external threats. A completely different form of prevention would be to state on the website that certain guidelines apply, for example that, for reasons of transparency, no editing or similar processing is applied to the official video reports. Fingerprinting or watermarking could also be considered. With these techniques, the images in the original video are given a kind of fingerprint or an invisible serial number, which makes it possible at any later time to determine whether the original has been altered, manipulated or edited. A more satisfying approach is one in which AI is effectively used to counter AI. The Dutch House of Representatives, for example, has teamed up with a firm specialising in deepfake detection as a countermeasure.
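The fingerprinting idea can be illustrated with a minimal sketch. The variant below assumes the simplest possible form: a plain SHA-256 digest of the published video file, recorded at the moment of publication, standing in for the more sophisticated perceptual fingerprints or invisible watermarks used in practice. The function names are illustrative, not part of any real parliamentary system.

```python
# A minimal sketch of file-level fingerprinting: compute a digest of the
# video at publication time, then compare against it later to detect any
# alteration of the published file.
import hashlib
from pathlib import Path


def fingerprint(path: Path, chunk_size: int = 1 << 20) -> str:
    """Compute a SHA-256 digest of a file, read in chunks to handle large videos."""
    digest = hashlib.sha256()
    with path.open("rb") as f:
        while chunk := f.read(chunk_size):
            digest.update(chunk)
    return digest.hexdigest()


def is_unaltered(path: Path, published_fingerprint: str) -> bool:
    """True only if the file still matches the fingerprint taken at publication."""
    return fingerprint(path) == published_fingerprint
```

Note that a cryptographic digest flags any change to the file, including harmless re-encoding; real-world systems therefore tend to combine such hashes with perceptual fingerprints that survive compression while still exposing manipulated content.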

Lights, Camera, Action!

What impact did the introduction of cameras and television broadcasts have on Parliaments? Were there any concerns? Fortunately, the House of Commons of Canada published an extensive and clear report on this, entitled Television and the House of Commons (Robertson, 1998). The report shows that there are good reasons to bring in cameras to make a video Hansard: public expectations; desire for openness in the government; and the important role that television plays in modern political life. However, even then there was concern that bringing in the cameras could result in trivialising the debate. With reference to Marshall McLuhan and his famous phrase "The Medium is the Message", the argument goes like this: TV is entertainment, so televising politics makes politics look like entertainment too, which is a bad thing. According to the Canadian report, "Many Canadians are critical of politicians, and dislike what they see of them on television (…) criticise the childishness or irrelevance (…) and their behaviour (…)". The report continues, "Faced with the prospect of losing the parliamentary channel (…) many Canadians react strongly, saying that it is extremely important (…) to follow the proceedings."

The report also shows that there was no evidence of the so-called Prima Donna effect; speeches remained as they were and attendance did not increase or decrease. There were other effects, expected and unexpected, related to the increasing importance of the image: improved attire; more reading of pre-written speeches; and more awareness of inappropriate facial expressions.

True versus Real

With the introduction of video Hansard, it is worth placing it in comparative perspective alongside the traditional, written Hansard. In the comparison that I made earlier, the focus is on two essential aspects: construction and authorisation. With the traditional Hansard, the question is: is it true? With the video Hansard, or the video report, the question is: is it real?

The traditional Hansard is constructed in the sense that it is human work. It is made using the “universal editing principles of parliamentary reporting” (La Rocca, 2015). Reporters or editors perform work by removing, repairing or reorganising words. Video Hansard, on the other hand, often consists of automated recording by cameras and microphones. There is therefore no construction involved.

The same line of thought can be followed when it comes to authorisation. Standard practice in most parliamentary organisations is that official written parliamentary reports or minutes are authorised, ie formally established, before publication. Publication is subject to all kinds of formal rules and procedures. This simply does not apply to video Hansard, as video reports are mostly published on websites without any quality control whatever before or after publishing. This leads to the conclusion that, in terms of both construction and authorisation, the written Hansard is superior to the video Hansard. In other words, there is no best practice yet on the authorisation of video content.

GDPR, a new challenge

The Parliamentary Reporting Office (PRO) of the House of Representatives of the Netherlands has recently been confronted with various requests to retrospectively edit out names, or blur faces, in already published official video reports of public meetings. These requests were made on grounds of privacy (ie the General Data Protection Regulation), personal safety, business interests and the interest of the state. The PRO's initial response was to reject these requests: after all, these are public meetings, and the official verbatim report, the Hansard, is not censored after the fact either.

Because this response was questioned in some quarters, the PRO took the initiative to conduct an investigation via the European Centre for Parliamentary Research and Documentation. In this study, Parliaments were asked whether they have a practice or policy regarding blurring or editing official video reports. Some 39 Parliaments participated in the study, and the conclusions based on their responses were clear and unanimous: none of the Parliaments has an official practice or guideline in this area. Nonetheless, the PRO is currently awaiting advice from the official parliamentary lawyer of the House of Representatives as to whether a request made on the basis of the GDPR should in principle be honoured. That would be an unexpected and, from the perspective of transparency, unwanted effect of the laws and regulations that aim to guarantee our privacy.


Deepfake poses a potential and real threat to the principle of transparency and to the need to act on a common ground of what we believe to be true. Practical and realistic responses to this threat can be found in the field of technology. Another threat to transparency is overreach in the application of the GDPR, resulting in the editing and blurring of official video reports of parliamentary meetings.

Henk-Jan Eras is a Quality Officer in the House of Representatives of the Netherlands. He is also a member of Tiro’s editorial team.


  • Eras, H.J. (2019). Transparency in Dutch Parliament, a paper for the ECPR (European Consortium for Political Research) Standing Group on Parliaments, June 28th, Leiden.
  • McLuhan, M. (1964). Understanding Media: The Extensions of Man. New American Library.
  • Robertson, J.R. (1990/1998). Television and the House of Commons (BP-242E). Ottawa: Law and Government Division of the House of Commons.
  • La Rocca, R. (2015). Theory and praxis of parliamentary record construction. Towards a ‘universal grammar’ of parliamentary reporting? Presentation during Intersteno IPRS (Parliamentary and other professional Reporters’ Section) Budapest 2015.
  • Thomas, W.I. & D.S. Thomas (1928). The child in America: Behavior problems and programs. Alfred A. Knopf.