De Voorstad groeit

What about Google’s anti-social practices?

Published at 2bloggen.org
Author: Daniël Verhoeven

When Google announced its integrated phone service, Google Voice, on Thursday, it said something very loud. Google is saying it wants to be the world’s communication hub, and hundreds of companies – ranging from mobile phone operators to Skype to Microsoft – had better be listening. Google wants it all.

Well, let’s first congratulate Tim Berners-Lee and Robert Cailliau today on the 20th anniversary of the invention of the World Wide Web. Most people seem to have forgotten that it was originally designed as a collaborative tool at CERN in Switzerland. Tim Berners-Lee (see video) called it a grassroots movement at the time. Today, if we let Google and other usurpers have their way, it threatens to become a gigantic billboard along a deserted highway. We want the Internet to be more intelligible, and to re-conquer the public space on the Net to stop the privatization of communication. That’s why I want to spread a critical note on the 20th birthday of the Web: because the ad industry is blocking the Open Society on the Net.

For some, Google has become the Web. It is indeed omnipresent. The question is: isn’t Google holding back social development on the Net? I think it is. I pre-publish here the introduction of an article, to appear next week, about the anti-social practices of the Google octopus. Google is NOT an intelligent search engine, because it is not intelligible. Searching with Google is like a blindfolded search without a guide. It elicits trial-and-error methods, a strategy to keep you hooked on the Google machine. Google could provide a guide, but refuses to do so, because it needs your addiction to sell its ads.

In a previous article I intuitively described contextual search as finding information on the web WITHOUT using Google. I was a little surprised by the interest in that story, because contextual search was still an embryonic idea. In this article I will develop the idea further.

When looking at human search activities in the real world, the first thing that strikes me is that most of the time we do not search by trial and error. There is always an anticipation process, often also a conscious search strategy. For instance, when looking for blackberries in a wood, we do not look in trees but search for thorny bushes, because we know that blackberries do not grow on trees. The knowledge we use to anticipate our search is what I would like to define as the context of our search. But our knowledge is not static; it is a learning process. The first example is a fairly simple case: looking for things. Looking for information, and knowledge, is a bit more complicated. On the side of the searcher there are two steps, in which the second step is used as feedback for the first:

1) searching information

2) learning to find information as a process
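The blackberry example above can be sketched in code. The following toy script is my own illustration (the names and data are invented for the purpose): a blind trial-and-error search inspects every spot in the wood, while a context-anticipated search first prunes the implausible spots using what we already know, and so inspects far fewer.

```python
# Toy model of the wood: a list of spots, some thorny, one with berries.
wood = [
    {"spot": "oak tree", "thorny": False, "berries": None},
    {"spot": "pine tree", "thorny": False, "berries": None},
    {"spot": "bramble bush", "thorny": True, "berries": "blackberries"},
    {"spot": "rose bush", "thorny": True, "berries": None},
]

def blind_search(wood, target):
    """Trial and error: inspect every spot until the target turns up."""
    inspected = 0
    for place in wood:
        inspected += 1
        if place["berries"] == target:
            return place["spot"], inspected
    return None, inspected

def contextual_search(wood, target, anticipate):
    """Anticipation: only inspect spots the context deems plausible."""
    inspected = 0
    for place in wood:
        if not anticipate(place):   # context prunes implausible spots
            continue
        inspected += 1
        if place["berries"] == target:
            return place["spot"], inspected
    return None, inspected

# Context: blackberries do not grow on trees, so look only at thorny bushes.
print(blind_search(wood, "blackberries"))                                  # ('bramble bush', 3)
print(contextual_search(wood, "blackberries", lambda p: p["thorny"]))      # ('bramble bush', 1)
```

The second step of the scheme above – learning to find – would correspond to refining the `anticipate` predicate after each search.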

As children, driven by curiosity, we start by asking questions of our parents. At that moment it is still clear that a communication process is involved. When we learn that we can find information in other places, in libraries for example, we forget about the communication process, because we no longer have personal contact with the person who made the information available. But if nobody made information available, we would not find any. In addition, if an information provider thinks about who or what his information is for, his activity – be it broadcasting, publishing or content providing – should be coherent with communicating. I use the word coherent because I do not believe it is a model for communication; it is rather the other way around. Since we have basically learned to seek information through communication, an intelligent information provider will try to facilitate our search process by mirroring its own search activity.

I will further argue for a double paradigm shift – from a behaviourist to a communicational approach, and from a system-centred to a people-centred approach – and explain why contextual search is a more efficient search strategy, because it uses our basic communication skills. Both paradigm shifts already have a long history. Noam Chomsky may be called the father of the first, with his critique of Skinner in 1959. One of the main aspects of the human brain is that it is anticipative, and this doesn’t fit behaviourist theory. The people-centred approach had many fathers; Piaget is one of them, but in this article we will rely on the second wave of constructionism at the Biological Computer Lab, where Gordon Pask is our guide.

In the first chapter I will use data from linguistics, neurology, sociology and political economy to illuminate these paradigm shifts. In the second chapter I will contrast contextual search with actual digital search practices, taking Google as the most outspoken representative. It clearly wants to be the best in all fields, including in applying behaviourist techniques to spy on its users. Yet Google and its like are not idiots. Vint Cerf, Google’s evangelist, even admits that Google is a big shovel rather than an intelligent search engine:

“Look search today is messy. Think about one of those big construction shovels, you know, like a tractor with a big shovel on the front. And you have to operate it by pulling and pushing a series of levers. It’s big and imprecise. Using a search engine today feels like trying to move one of these Earth-moving shovels.” (Interview with Vint Cerf of Google, 13 August 2008, Siva Vaidhyanathan)

But it is above all a money shovel:

“The real brilliance of Google is the ability to monetize search through AdSense. This company uncovered the relationship between advertising and information. The old way of advertising had no direct interaction with the audience. But now the audience can click.” (same interview)


People-centred techniques, stripped of their context, are used in certain domains, also by Google, but I will try to show that its primary paradigm is behaviourist and system-centred. The following quote from Vint Cerf shows this:

“There are things that computers can do that six billion humans can’t do. Computers have the scale capacity to discover and analyze things,” (same interview)

This is a rather humiliating oracle for humankind, in my opinion. Recently bionics made important progress by realising a bionic eye. But we must also consider the relativity of this achievement. Our retina contains 126 million light-sensitive cells; the bionic interface consisted of only 60 electrodes. One estimate puts the human brain at about 100 billion (10^11) neurons and 100 trillion (10^14) synapses. Each neuron can be considered a small biological computer in its own right. Compared with the human brain, Google’s cloud of computers is merely a fart.

The very same techniques used for disambiguation, spamdexing, spam filtering, natural language processing, machine learning and contextual advertising could also be used to provide contextual preselections and anticipations, both in the quality domain and in the semantic domain, but they are not applied that way by the main search engines.
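To make the idea of a contextual preselection tangible, here is a minimal sketch of my own (not Google’s method, and the documents and terms are invented): the searcher declares a small context, and the engine uses it to anticipate which documents are plausible before ranking them, disambiguating an ambiguous query.

```python
# Invented toy corpus: the same word "jaguar" means different things.
documents = [
    {"title": "Jaguar XK repair manual", "terms": {"car", "engine", "repair"}},
    {"title": "Jaguar habitat and diet",  "terms": {"animal", "rainforest", "cat"}},
    {"title": "Jaguar speed records",     "terms": {"car", "racing"}},
]

def preselect(query, context, documents):
    """Keep only documents that match the query AND overlap the context."""
    return [d for d in documents
            if query in d["title"].lower() and context & d["terms"]]

# The same ambiguous query, anticipated by two different contexts.
for d in preselect("jaguar", {"animal", "zoo"}, documents):
    print(d["title"])    # Jaguar habitat and diet
for d in preselect("jaguar", {"car", "mechanic"}, documents):
    print(d["title"])    # Jaguar XK repair manual, Jaguar speed records
```

Real contextual advertising systems do something structurally similar with the page’s content; the point of the argument is that the same machinery is not offered to the searcher.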

The fact alone that Google uses some of these techniques for advertising while they aren’t available for information search is a paradox. One problem is demonstrating this, because Google hides behind the mask of Vint Cerf’s delusional evangelism. Of course, the best propaganda is the propaganda that isn’t recognised as such.

Google’s search system is a black box: all we know about it comes from the published patents, and the search algorithms actually used aren’t part of them.

Wiki Search acknowledges the demand for openness, as you can read on its About page:

“For hundreds of years, the most respected institutions have treated transparency as a requirement. Those who, of their own accord, promise openness find that with this pledge comes credibility, as, in the words of the late Justice Louis Brandeis of the United States Supreme Court, “Sunlight is the best disinfectant.” Indeed, those who avoid the light of scrutiny and instead opt for obfuscation are often assumed to be hiding something, and for good reason.”

Tim Berners-Lee, too, stressed the need for open standards, as you can see in this video from 2007.


The fundamental idea of Pask’s conversation theory is that learning occurs through conversations, which make knowledge explicit. Therefore we need an open-system approach, and that approach is blocked by Google.

The case of so-called ‘contextual ads’ is becoming problematic because it entails a privacy-invading culture, snooping on and spying on the Net, while we tend to consider the Net an information source. Yet it remains a chaotic and unreliable source, since progress in contextual search isn’t structural.

Besides, ‘contextual advertising’ is typical Orwellian newspeak. Why not call it what it is: “unsolicited advertising”, like “unsolicited mail”: SPAM. Is this the consequence of deregulating the Net in the nineties? In any case, it is clear that deregulation didn’t work as expected in Net business. Instead of competition and diversity, concentration went faster than in any other sector. A few big players control the Internet business, despite the long-tail proposal, which I’m afraid seems applicable only to niche markets. By treating information mainly as a commodity, all truth-value is gradually stripped away, leaving free play for all kinds of manipulation techniques to revalue information, to give status to a bunch of crap. This is a dangerous tendency.

As podcasters, broadcasters, bloggers, site builders – in one word, content providers – we leave our traces on the Net. While the Net is a huge space in which to seek an audience, the big players, telecom providers and service providers, want to use the Net primarily to make money, annihilating human presence, degrading communication into bits and bytes, into usable code for profit-making. A free meal is never entirely free. Most people are not aware of that. This is the shadow side of Web 2.0. We should become aware of how our communication practices are being reshaped by the Net. Why not make part of its infrastructure publicly or socially owned? After all, the Internet was developed in the public domain. If it hadn’t been, there would be no Internet, I’m afraid.

In the last chapter I’ll propose some operational definitions of contextual search and a framework for further investigation and experimentation. I will also relate contextual search to the Semantic Web.

One of the intriguing axioms of Watzlawick’s communication theory is: “You cannot not communicate.” About nine years ago I started to doubt that axiom. What I saw on the Net and around me was communication impoverishment and communication pollution; it felt like not communicating. I started an inquiry into the opposite of communication. I must admit now that I didn’t find the opposite, though I undertook an interesting voyage through a plethora of theories and research. Watzlawick was right: in real life we are always communicating, even if the message today is more often than before that we are not available to communicate – we communicate something, at least. But Watzlawick had no experience of using the Internet, of using Google… The new technologies are reshaping our communication skills dramatically. We must analyse this with rigour.

One of the critics of modern liberty, Tony Curzon Price, used the ‘homunculus’ metaphor in pointing to Google’s attention deficit disorder. I’m doing basically the same, expanding the metaphor he used into a paradigm. I want to compare our behaviour when using CMC-based systems like Google with natural communication, a communication between content providers and content users. This may sound odd and completely off the record, but in fact I’m only rejoining a tradition started in the sixties and seventies at the Biological Computer Lab in Urbana-Champaign by Gordon Pask[1]. I will dive into linguistic pragmatics and recent research in different fields, mainly neurology and sociology, for a short refreshing swim. The new cybernetics was quite aware of the shortcomings of Computer Mediated Communication. Pask’s conversation theory regards social systems as symbolic, language-oriented systems, where responses depend on one person’s interpretation of another person’s behaviour, and where meanings are agreed through conversations. Interactions of Actors Theory was another important research framework developed by Pask.

Conversation theory and pragmatics share important basic concepts on communication. Today both remain largely ignored in Net applications. As for pragmatics, and this is stressed in Relevance Theory (Dan Sperber, Gloria Origgi, 2006), the primary condition for the success of the human communication system is overtness. Overtness hasn’t been a success story on the Net, nor in society, as we will show in the second chapter. We are watched and spied upon more than ever. This is another argument for giving it some more thought, the way Pask and von Foerster did (Müller, 2000) before the age of the Internet.

This brings us to the political economy of information search. The urge for overtness in the political economy of communication is our guide. The complete commodification of knowledge, and thereby of communication, isn’t possible, because knowledge is not a commodity in the traditional sense of Adam Smith’s political economic theory. Making ‘le savoir’ a commodity is a reduction inflicted by those who want to earn money from knowledge and from the desire to know – curiosity, a basic human drive. Google mixes two concepts: knowledge as a basic human quality acquired in experience, communication and learning, and information as a commodity.

Gabriel Tarde noticed this market reduction more than 100 years ago in ‘Psychologie économique’. Knowledge is a value in itself, needing not a market to spread but an educating parent, a classroom teacher, a university professor, a librarian, a trainer, a friend. Tarde’s theory got lost in time, but in the information age it’s an eye-opener. Instead of taking material production – Adam Smith’s famous pin factory – as the starting point for his political economic analysis, he started with the analysis of ‘la production de connaissances’, ‘des valeurs vérités’ (truth values). Think of it as the production of a book, the production of a text, starting with the author having an idea to write about and ending with publication and acceptance (Lazzarato, Maurizio, 1999). On the Web, the text you are reading isn’t a commodity either, since it is published under a Creative Commons licence. Yet Google is going to use it to sell its clicks, to earn money with my work. Mixing the economic value of a book or text with its truth value creates ambiguity, resulting in the truth value being given up. This might not have been Google’s original purpose, but it is clearly a result. So Tarde’s view is quite relevant for our information society and the way it treats knowledge.

My choice of scientific resources is limited but consistent; others can add to it from other contexts. I hope they will. Moreover, I do not leave Pask’s tradition, since interdisciplinarity was a main approach of Pask’s Interactions of Actors Theory.


[1] In Urbana-Champaign, at the University of Illinois, the Biological Computer Lab of Heinz von Foerster was inquiring into man-machine interaction. A range of brilliant scientists developed the new cybernetics there from 1958 until 1974. The most important were: Heinz von Foerster (physics, biophysics, epistemology), von Glasersfeld (epistemology, radical constructivism), Maturana and Varela (biologists, radical constructivism), Gordon Pask (psychologist, neurologist, conversation theory, learning theories) and Ashby (cybernetics). Close by, in Palo Alto, Watzlawick and Bateson worked on developing communication theory and double bind theory. Both teams were connected (Müller, 2000).

More about Uberveillance:

More about Google Watch

More about the Social Web

Keywords also in Wikipedia: Open Society, Closed Society, Advertising, Surveillance, Uberveillance, World Wide Web
Wordpress tags: Google, Google Watch, Internet, Language, Privacy, Surveillance, Uberveillance, Verhoeven Daniël, Web 2.0. Controlemaatschappij, Cyberocracy, Google, Google Watch, Internet, Invasion of Privacy, Uberveillance, World Wide Web, WWW.



This post was published on 15/03/2009 in ICTI, Internet.
