The number of external services we use on our websites has grown rapidly over the past couple of years, and so has the effort needed to integrate them into a working solution. Website complexity and page weight are growing largely because of services the average visitor doesn't even notice.
While using a couple of services might be fine, some websites use far too many, sending the user's data in multiple directions across the web. This, of course, increases the likelihood that it will be misused. The most commonly used scripts fall in the areas of analytics and social networking. Big corporations do everything to convince us that these plugins are a necessary part of how we should design our websites. For example, Google will show an image next to an organic search result only if a person has a Google+ account linked to that website. It also offers Google Analytics, which allows the company to gather data on competing websites for its own analytics goals. In other words, corporations are serving their own purposes with the scripts we put on our websites. They want us to believe that these services are needed, and we fail to see through this because our need for growth dictates how much common sense we are ready to sacrifice. So we usually assume that more is better.
There are many similar services dedicated to web analytics and social networking, and putting them all on our website only means doing the same thing multiple times. In a sense, duplication exists not only in code, but also in where we direct the site owner's attention. The mark of a good design is that it doesn't duplicate script responsibilities. It's hard to see what anyone could learn from three different client-side analytics packages. Even if each has its own advantages and disadvantages, their results will be incomparable due to differences in implementation. Yet they will all report the number of unique visitors, visits, average time spent on the site and so on, which is an overlap of responsibilities. We might think that averaging the results gives a more accurate picture, but this isn't necessarily true. In the end, these packages were integrated because it wasn't clear from the start which few metrics the company most wanted to track. As a result, everything and everyone is tracked. But, as we know, good design is specific and does nothing more than needed.
Now we have many social networks, and all of them want to put likes, pluses or achievements on our websites. The common opinion is that these widgets are needed for better marketing of a website. The truth is that corporations need them to spread their services virally at the expense of site owners. By embedding them, we outsource part of our website's growth to these companies, and since they grow from many places at once (thanks to network effects), a small website will practically never reach a critical mass of users in the shadow of a giant network fed by millions of similar sites.
There is a way to see how many external services a website uses. The Ghostery plug-in for Firefox and Chrome even shows their names, so we can check for overlapping responsibilities. This post probably wouldn't exist if I hadn't seen the number 35 on a well-known, high-traffic site. It shows the extent to which every click and keystroke is being tracked today. At the same time, I'm not aware of a single company that publicly declares the external services it uses and its reasons for using them. Even small changes to such a list should be communicated to users, because they have the right to know who is actually serving them.
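As a rough sketch of what such a plug-in counts, here is one way to enumerate the distinct third-party hosts behind a page's scripts. In a browser the URLs would come from `document.scripts`; here they are passed in as a plain list so the logic stays self-contained, and all hostnames are invented for illustration.

```javascript
// Sketch: count the distinct third-party hosts among a page's script URLs.
// In a browser: thirdPartyHosts([...document.scripts].map(s => s.src), location.hostname)
function thirdPartyHosts(scriptUrls, siteHost) {
  const hosts = new Set();
  for (const src of scriptUrls) {
    if (!src) continue;                              // inline scripts have no src
    const host = new URL(src, `https://${siteHost}`).hostname;
    if (host !== siteHost) hosts.add(host);          // relative URLs resolve to our own host
  }
  return [...hosts];
}
```

A result of 35 here would mean 35 distinct outside parties receiving a request on every page load.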
Suppose that a user had a bad experience with company X, and company Y uses a service of company X. If the user knew that, they might reconsider their choice of Y. Ethics plays an important role in personal choice, but it's much more convenient for companies to hide what its effects would be. It's often the lack of transparency about what goes on under the hood that scares people and makes them reluctant to use websites or sign up for them. This then becomes a problem for the whole network, not just for the small website owner.
Suppose that a site is hacked and a small script is injected among these 35 external services. How is anyone going to find it? Can a single developer tell at a glance what is going on? How could a user identify this "add-on" if they were never informed of the third-party services the company uses? We shouldn't forget the self-correcting capabilities of the crowd. Informing users gives them the power to quickly bring things back to equilibrium, even when we ourselves are blind to the consequences.
Another problem is that with 35 services, the chance that at least a couple of them violate our principles rises. The combinations in which they interact are no longer trivial either, and may create future problems that are hard to predict. Every service is a dependency, and it becomes much harder to track which one has a new API and when, which increases the chance of relying on legacy code. We could ask ourselves: "What immediate value does this service provide to the user?" In many cases a service call is triggered only when a user does something specific, yet the code is downloaded every time, for every user, whether or not that ever happens. This means that scripts weighing hundreds of kilobytes might never be executed.
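One way to avoid shipping code that may never run is to load a third-party script only on the first user action that actually needs it. A minimal sketch, where the share widget URL and element id are hypothetical; `makeLazyLoader` caches the load so it happens at most once:

```javascript
// Sketch: wrap a one-time async load so it is triggered on demand,
// not on page load, and never runs more than once.
function makeLazyLoader(loadOnce) {
  let promise = null;                 // cached after the first call
  return () => (promise ??= loadOnce());
}

// Browser usage (illustrative, not run here):
// const loadShareWidget = makeLazyLoader(() => new Promise((resolve, reject) => {
//   const s = document.createElement('script');
//   s.src = 'https://widgets.example-social.com/share.js';   // hypothetical URL
//   s.onload = resolve;
//   s.onerror = reject;
//   document.head.appendChild(s);
// }));
// document.getElementById('share-button')                    // hypothetical id
//   .addEventListener('click', () => loadShareWidget().then(openShareDialog));
```

Visitors who never click the button never download the widget at all.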
This isn't to say that external services can't be hidden from our eyes, even from tools like Ghostery that show concrete names. They can, and this is often done through proxies on the server. But if people knew the dependencies of a website, they would be more understanding when something goes wrong with an external service (which can't be controlled), and they would know what to expect. (This isn't to say that the designer shouldn't actively work to provide internally built fallbacks for everything external that might stop working.) The mere fact that a service is invisible doesn't mean we don't need to declare that we are using it.
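Such a fallback can be as simple as a wrapper that tries the external service first and serves something produced internally when it fails. A sketch under the assumption that both callbacks are supplied by the site (their names here are placeholders):

```javascript
// Sketch: degrade gracefully when a third-party dependency fails.
// `external` is a function calling the outside service; `fallback`
// produces an internal substitute (cached data, a static widget, etc.).
async function withFallback(external, fallback) {
  try {
    return await external();
  } catch (err) {
    // The third-party service is outside our control; fall back to our own.
    return fallback();
  }
}
```

The user still gets a working page; only the declared dependency list tells them which path was taken.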
We should strive for zero ghosts on our websites. This requires putting our whole arsenal of knowledge, methodologies, approaches and gut feeling into the work, because keeping things simple is often much harder. Not only will the site save a lot of HTTP requests as a side effect, it will also raise far fewer questions. This is good design.