Secondly, there is fashion and clothing; teenagers think that their parents don’t understand it, and arguments often start from that. Finally, communication is the most important thing in the lives of teenagers and adults. Most conflicts are the consequence of not understanding each other, which is why they have to talk a lot. Moreover, we can learn to be more respectful towards people who are different from us. But first we have to begin with ourselves. We must be tolerant. A person always has to think about what he is saying or doing, because one day he might be in the other person’s place. As the saying goes: “Do not do to others what you would not want them to do to you.”

Plastic Surgery: For and Against
Plastic surgery is very popular around the world. A lot of people use it to make themselves look younger or prettier. New medical technology lets us make our bodies perfect. Plastic surgery has a lot of pros and cons. First of all, plastic surgery can be necessary for people who have had an accident in which they lost their good looks, and also for people who have burns. Plastic surgery can help them remove all the burns or injuries from an accident. After this they can get back to their normal life more easily. Then people feel better and more self-confident. So plastic surgery shows us some pluses. On the other hand, plastic surgery is not a medicine that can save our lives. Of course, there are people who simply want to make their bodies better and prettier. They decide on plastic surgery because they want to be more perfect. They also show off how much money they have, and that they are too lazy to do some sport to make their bodies look sporty. Around the world surgery has become popular and trendy, but it is unhealthy. In my opinion, some people have plastic surgery because it gives them pleasure, while for others it is a necessity, for example to remove scars. It can change a personality for better or for worse.

Dear Ruth, How are your days in Canada? I am writing to you because I would like to tell you about the celebration of the Last Bell at my school. First of all, the weather was amazing and everybody was really happy about that. The sun was shining and it wasn't very hot. Moreover, on that day my classmates looked gorgeous. Our ceremony was wonderful but, on the other hand, sad. What I liked most about it was our song, because it made us feel that it is the end of school and we will never all be together again. All in all, I would like to say that it is very hard to accept that I have to graduate from school, because I have a lot of things and people to remember. Also, the exams are coming and that will make me very stressed. Despite that, I will remember this school all my life. Write back soon.
Today I would like to talk about the International Day for Tolerance. First of all, the International Day for Tolerance is held on 16 November, with activities directed towards both educational establishments and the wider public. Its aim is to promote respect and mutual understanding between people of different race and colour, religion and belief, ethnicity and tradition. Heads of state and government encourage dialogue and cooperation among different cultures, civilizations and peoples. During the International Day for Tolerance our students decorate the school and make paper flowers with greetings, which are later handed out to the people we meet on the street. This is really a good way to express tolerance. Moreover, at school we are taught about tolerance and how it develops young people's ability to evaluate independently, think critically and reason about world religions, cultures and nations. All this knowledge enriches us. In Lithuania the International Day for Tolerance is commemorated. However, adolescence is a critical age period. It is a time of freedom and self-seeking, so there are many conflicts between teenagers and adults, but I will include only three. Firstly, I think it is often the parents who don’t like a friend of their child; then teenagers argue with their parents because they want to be with their friends, while their parents think they are going down a bad path. All this causes conflict between them.

Intellectual property rights allow people to own ideas and have rights concerning what happens to these ideas, including how often they are used, what they are associated with and whether there is permission for them to be copied. There are considered to be five different types of intellectual property.
All of these different types of communication are covered within copyright and patenting law, and have to be protected like any other copyrighted material.
Accumulated experience means experience gained over a number of years, when a person has come across lots of different issues to do with the job. If the Coca-Cola business can keep people in their jobs, the level of experience in the business can grow. Also, when Coca-Cola recruits, it looks for candidates who have the specific skills and accumulated experience needed for the role. Coca-Cola would also like them to be specialists, as that means they would need less training and hardly any teaching. But in some cases the Coca-Cola organisation has to be careful not to create a situation where, for example, only one person can operate a piece of machinery or fix a software program. If that person then leaves the organisation, it is difficult to maintain the resource and the business may not function effectively.
Coca-Cola invests a lot of money in software for day-to-day use, as Coca-Cola is a global business and needs the best software in order to remain one of the biggest companies in the world. If Coca-Cola hasn’t bought a licence, or is under-licensed, meaning it runs software on more computers than it has licences for, it is breaking the law. Sometimes a business may find itself in the opposite position, where it is over-licensed. This may be because a number of different departments have bought software individually.
Conclusion
                      Financial resources concern the ability of the business to "finance" its chosen strategy. For example, a strategy that requires significant investment in new products, distribution channels, production capacity and working capital will place great strain on the business's finances. Such a strategy needs to be very carefully managed from a finance point of view.

                      The category of physical resources covers a wide range of operational resources concerned with the physical capability to deliver a strategy. These include production facilities, marketing facilities and information technology.

All buildings, plant and machinery require regular maintenance and updating. Even factories that use flow production (where a product moves through successive processes in a single direction, just as Coca-Cola does when making Coke in bulk) 24 hours a day have to allow time to check that machines are working properly and to make minor adjustments if necessary. Coca-Cola needs to make sure the machinery is working perfectly so that Coca-Cola can continue to be made efficiently.
The Health and Safety at Work Act 1974 requires organisations to draw up policies and provisions for what should happen in an emergency in the Coca-Cola factory and warehouse. The Coca-Cola organisation must also provide other equipment within the building, including fire alarms to detect fires, fire escapes for use in high buildings, and fire extinguishers. This will motivate and reassure the staff at Coca-Cola, knowing that Coca-Cola is doing its best to keep them safe, and staff and employees will look to pay Coca-Cola back by working their socks off for the business. The employees also have a duty of care towards any customers or visitors who come into the workplace, so emergency provision is a high priority for any organisation.
All buildings owned or leased by businesses must have insurance. If the business owns the building, it will arrange this cover itself, but if the building is leased, it is often arranged with the landlord. A fee is paid each month, and protection is then given to the business in the event that something happens to or within the building: a flood, fire, earthquake or storm that damages the building and stock, damage to any equipment, theft from the building, and so on. Coca-Cola needs to look after the business in order for the business to remain consistent and organised.
Coca-Cola's buildings must be made secure and looked after, even when employees have gone home. Some businesses, like Coca-Cola, employ full-time security to look after the warehouses and factories, sometimes also using dogs to help them. Security must be tight around the warehouses and factories, as they contain the products and the machinery that makes the products. Coca-Cola's security must take into account the actions of people attempting to cause destruction.

Technological resources are more than just equipment. Computer hardware, such as a modem, router or monitor, is a physical resource and is treated as such. Technological resources are things like software, music or text. These resources are owned, like physical resources, and have to be managed in the same way. Technological resources can be considered in four areas, which I will be talking about.

Report
Any business will need to have premises from which it can operate. Coca-Cola has premises where it can make its products, producing the Coca-Cola bottles and the actual Coke itself. Coca-Cola uses factories in order to make the Coke. The factory will have big machines making Coca-Cola in bulk, so Coca-Cola needs a big factory in order to make it. Also, because Coca-Cola is a worldwide and very well-known business, lots of orders are placed by big retailers in the UK such as Sainsbury's and Tesco. So Coca-Cola will need warehouses in order to stock the Coke and load it into vans and lorries so it can be sent to the big retailers who have ordered it. In a Coca-Cola warehouse you see lots of forklifts and machinery to help load the products into the lorries and vans.
The materials that are needed by a business will very much depend on the type of operation it is running and the people working there. A Coca-Cola factory will need big machinery and forklifts in order for Coca-Cola to run efficiently and in an organised way. Every business especially needs to be careful about how much it uses and try to avoid wasting materials, so as to keep costs low and help the environment.

Each business has specific requirements for its plant and machinery. A business may spend thousands of pounds on its factory and machinery, buying everything Coca-Cola needs to make its products. This is ideal for Coca-Cola, as it usually makes Coke in bulk, so it needs machinery in order to do this more quickly and efficiently. If the product were made by hand, it would take much longer, which means retailers like Tesco and Sainsbury's would have to think twice, knowing it might take quite a long time to get what they want. Equipment is essential for a business to operate smoothly; it is critical for profit and not-for-profit organisations alike. Many organisations include information technology within their list of essential equipment, as both hardware and software are becoming essential pieces of kit.

Summary

Both physical and technological resources need to be managed carefully in an organisation. Physical resources include the building, and the maintenance and security of the premises. Technological resources include the physical equipment, designs and drawings. Physical resources are the resources the business needs to maintain in order to carry out its activities; they include things like buildings, facilities, plant and machinery. Management of physical resources involves planning maintenance and refurbishment, and includes organising insurance and security to keep those resources safe.
Introduction
Coca-Cola is a carbonated soft drink sold in stores, restaurants, and vending machines in more than 200 countries. Coca-Cola was invented in May 1886 by Doctor John Pemberton, a pharmacist from Atlanta, Georgia. John Pemberton concocted the Coca-Cola formula in a three-legged brass kettle in his backyard. The name was a suggestion given by John Pemberton's bookkeeper, Frank Robinson.
The Coca-Cola Company produces concentrate, which is then sold to licensed Coca-Cola bottlers (companies that bottle beverages as part of a manufacturing process) throughout the world. The bottlers, who hold territorially exclusive contracts with the company, produce the finished product in cans and bottles from the concentrate, in combination with filtered water and sweeteners. The bottlers then sell, distribute and merchandise Coca-Cola to retail stores and vending machines.
The Coca-Cola Company has introduced other cola drinks under the Coke brand name. The most common of these is Diet Coke, with others including Caffeine-Free Coca-Cola, Diet Coke Caffeine-Free, Coca-Cola Cherry, Coca-Cola Zero, Coca-Cola Vanilla, and special versions with lemon, lime or coffee. Based on Interbrand's Best Global Brands 2011 ranking, Coca-Cola was the world's most valuable brand.

Discrimination is a serious concern among recruiters. If discriminatory hiring practices can be proven, this could result in serious harm, both financially and in terms of reputation. The United States Department of Labor forbids discrimination based on race, colour, national origin, sex, disability, religion, political affiliation or age. Steps can be taken to avoid such complications. First, advertise only the essential requirements for the position. Provide an accurate job description, listing only the position name and the specific duties involved. Things such as language proficiency or physical capabilities should not be listed unless they are absolutely essential for the role. When conducting interviews, ensure that the location is accessible to people with disabilities and refrain from holding interviews on religious or cultural holidays. Use the same questions for every candidate and try to have more than one recruiter present during the interview. Careful notes must be taken so that recruiters can justify hiring or not hiring a particular candidate.
Conclusion
I have concluded that the recruitment process is very important for any company or organization. Recruiters play an important role in the success of an organization. They essentially act as a filter that -- when used properly -- only selects the best candidates. In a constantly changing business world, companies need to hire people who are adaptable, loyal, knowledgeable, dependable and confident, thereby creating a foundation for success.

dedicated to the organization will work hard to help it succeed. With this in mind, recruiters must ask questions that provide information about a candidate's strengths and weaknesses. Additionally, interviewers should inquire about a candidate's greatest achievements throughout her career. Generally, loyal employees will have a track record of striving for excellence, resulting in a more competitive, innovative and profitable business. According to the Chief Executive Officer of the Cummins Group, 'one of the most crucial decisions that a leader will make is the choice of those who will support them. In a highly technological, competitive market, Cummins requires people who are not only technically competent, well informed, loyal and committed but also capable of showing good judgement, often under pressure.' When submitting a written tender for a contract, for example, the team working on it has to decide just what the company can promise to deliver, when, and at what price. This can be particularly tricky when offering a new product for the first time, e.g. deciding what level of 'after-sales service' the tender should include, based on the company's assessment of the risks associated with its new venture.

Improper recruitment and selection practices can often result in high turnover or involuntary separations. If a recruiter is not careful when analysing resumes and conducting interviews, she may hire an employee with a weak work ethic or a tendency to move quickly from one job to the next -- "job hopping." Recruiters should pay close attention to the lengths of time at each previous job and carefully check references. Another so-called "red flag" is a gradual decrease in responsibility. For example, if the candidate starts out working as a senior manager and slowly shows progression to a less complex role, this could indicate that he is not as competent as he claims.

Report
                      Finding and recruiting the right people is critical to the success of any business and is the key to sustained growth and retaining a competitive edge.
                      Recruiting staff is a very costly exercise. Cost is a major reason why effective recruitment and selection is important. There are many ways in which poor recruitment practices can result in financial losses. For example, if a candidate's competency is not accurately assessed, he may make mistakes that can hinder productivity. If he needs to be retrained or replaced, this takes up more company time that could otherwise be invested toward remaining competitive. When organisations choose the right people for the job, train them well and treat them appropriately, these people not only produce good results but also tend to stay with the organisation longer. In such circumstances, the organisation's initial and ongoing investment in them is well rewarded. An organisation may have all of the latest technology and the best physical resources, but if it does not have the right people it will struggle to achieve the results it requires. This is true across the whole spectrum of business activity e.g. schools, hospitals, legal practices, restaurants, airlines, and diesel engine manufacturers.
Cummins is well aware of the importance of 'getting it right'. Poor choices at the recruitment stage can prove expensive. The company needs to be sure of a candidate's technical competence. For example, if an engineer designs a component that fails and has to be re-engineered, the company loses both time and money and may incur penalty charges on any delay in fulfilling particular contracts. Time and money spent in recruiting that particular employee will have proved expensive and wasteful whilst a better candidate may not only have 'got away' but also gone to a competitor. Cummins was recently first to market with a complete range of engines that met new stringent environmental legislation. Their technical solution to meeting this legislation was completely different to the competitor's approach. Had it turned out to be ineffective or not to be approved by government authorities, it could have led to the downfall of the company. The responsibility of making the correct decision was shared by relatively few individuals.
In addition to technical competence and appropriate experience, an organisation needs to be sure that it can rely on candidates' goodwill, loyalty and commitment towards the organisation and its aims. Loyalty and productivity are linked. Employees who feel 

Content
              In this report I will be talking about why companies should recruit effectively, about awareness of poor recruitment, and about where companies should stop and take a closer look. I will be using the Cummins Company as an example to make everything easier.
Summary
It is important to list the skills your new hire will need to fulfil his duties. You get much better results in your recruitment process if you advertise specific criteria that are relevant to the job. Include all necessary skills, and include a list of desired skills that are not necessary but that would enhance the candidate's chances. If you fail to do this, you might end up with a low-quality pool of candidates and wind up with limited choices to fill the open position.
Introduction

This report is made to help other companies understand how important good recruitment is. Companies usually only think about profit; they don’t give much attention to the recruitment process, and that is one of the biggest reasons why so many companies fail. Many businesses are investing millions of dollars daily in recruitment, when in fact they should be focusing on employee retention. An army of recruiters is being sent to the front line every day, while no one is manning the back door. If you were fighting a war, you’d be dead. I expect many companies will experience the same fate if they don’t start focusing on keeping the talent they have and choosing the right people that they need.

All attacks we have identified exploit the lack of endpoint anonymity and are aided by the effects of free riding. We have seen effective legal measures on all peer-to-peer technologies that are used to provide effectively global access to copyrighted material. Centralized web servers were effectively closed down. Napster was effectively closed down. Gnutella and Kazaa are under threat because of free rider weaknesses and lack of endpoint anonymity.
Lack of endpoint anonymity is a direct result of the globally accessible global object database, and it is the existence of the global database that most distinguishes the newer darknets from the earlier small worlds. At this point, it is hard to judge whether the darknet will be able to retain this global database in the long term, but it seems clear that legal setbacks to global-index peer-to-peer will continue to be severe.

However, should Gnutella-style systems become unviable as darknets, systems such as Freenet or Mnemosyne might take their place. Peer-to-peer networking and file sharing does seem to be entering the mainstream – both for illegal and legal uses.  If we couple this with the rapid build-out of consumer broadband, the dropping price of storage, and the fact that personal computers are effectively establishing themselves as centers of home entertainment, we suspect that peer-to-peer functionality will remain popular and become more widespread.

Users of gnutella who share objects they have stored are not anonymous. Current peer-to-peer networks permit the server endpoints to be determined, and if a peer-client can determine the IP address and affiliation of a peer, then so can a lawyer or government agency. This means that users who share copyrighted objects face some threat of legal action. This appears to be yet another explanation for free riding.
There are some possible technological workarounds to the absence of endpoint anonymity.  We could imagine anonymizing routers, overseas routers, object fragmentation, or some other means to complicate the effort required by law-enforcement to determine the original source of the copyrighted bits. For example, Freenet tries to hide the identity of the hosts storing any given object by means of a variety of heuristics, including routing the object through intermediate hosts and providing mechanisms for easy migration of objects to other hosts. Similarly, Mnemosyne [10] tries to organize object storage, such that individual hosts may not know what objects are stored on them. It is conjectured in [10] that this may amount to common-carrier status for the host. A detailed analysis of the legal or technical robustness of these systems is beyond the scope of this paper.
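As a toy illustration of the intermediate-host idea, the following sketch shows how routing a request through a chain of relays limits what the endpoint learns. The peer names and the single-hop "knowledge" model are invented for illustration; this is not Freenet's or Mnemosyne's actual protocol.

```python
def relay_request(path):
    """Forward a request along a relay chain.

    Returns, for each hop, the only address that hop observes: its
    immediate predecessor. The final server never sees the requester.
    """
    observed = {}
    previous = "requester"
    for hop in path:
        observed[hop] = previous  # each hop learns only the previous hop
        previous = hop
    return observed

seen = relay_request(["relay1", "relay2", "server"])
# The endpoint ("server") observes only "relay2", not the requester.
```

The point of the sketch is that law enforcement querying the server endpoint recovers only the last relay's address; unmasking the requester requires compromising every hop in the chain.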

2.4.3 Attacks

In light of these weaknesses, attacks on gnutella-style darknets focus on their object storage and search infrastructures. Because of the prevalence of super-peers, the gnutella darknet depends on a relatively small set of powerful hosts, and these hosts are promising targets for attackers.
Darknet hosts owned by corporations are typically easily removed. Often, these hosts are set up by individual employees without the knowledge of corporate management. Generally, corporations respect intellectual property laws. This, together with their reluctance to become targets of lawsuits and their centralized, hierarchical management structure, makes it relatively easy to remove darknet hosts in the corporate domain.
While the structures at universities are typically less hierarchical and strict than those of corporations, ultimately, similar rules apply. If the .com and .edu T1 and T3 lines were pulled from under a darknet, the usefulness of the network would suffer drastically.
This would leave DSL, ISDN, and cable-modem users as the high-bandwidth servers of objects. We believe limiting hosts to this class would present a far less effective piracy network today from the perspective of acquisition because of the relative rarity of high-bandwidth consumer connections, and hence users would abandon this darknet.  However, consumer broadband is becoming more popular, so in the long run it is probable that there will be adequate consumer bandwidth to support an effective consumer darknet.
The obvious next legal escalation is to bring direct or indirect (through the affiliation) challenges against users who share large libraries of copyrighted material.  This is already happening, and the legal threats or actions appear to be successful [7]. This requires the collaboration of ISPs in identifying their customers, which appears to be forthcoming due to steps that carriers must take to avoid liability [1] and, in some cases, because of corporate ties between ISPs and content providers. Once again, free riding makes this attack strategy far more tractable.
It is hard to predict further legal escalation, but we note that the DMCA (digital millennium copyright act) is a far-reaching (although not fully tested) example of a law that is potentially quite powerful.  We believe it probable that there will be a few more rounds of technical innovations to sidestep existing laws, followed by new laws, or new interpretations of old laws, in the next few years.



[1] The Church of Scientology has been aggressive in pursuing ISPs that host its copyrighted material on newsgroups.  The suit that appeared most likely to result in a clear finding, filed against Netcom, was settled out of court. Hence it is still not clear whether an ISP has a responsibility to police the users of its network.

Fully distributed peer-to-peer systems do not present the single points of failure that led to the demise of central MP3 servers and Napster. It is natural to ask how robust these systems are and what form potential attacks could take. We observe the following weaknesses in Gnutella-like systems:
·         Free riding
·         Lack of anonymity

2.4.1 Free Riding

Peer-to-peer systems are often thought of as fully decentralized networks with copies of objects uniformly distributed among the hosts. While this is possible in principle, in practice, it is not the case. Recent measurements of libraries shared by gnutella peers indicate that the majority of content is provided by a tiny fraction of the hosts [1].  In effect, although gnutella appears to be a peer-to-peer network of cooperating hosts, in actual fact it has evolved to effectively be another largely centralized system – see Fig. 2. Free riding (i.e. downloading objects without sharing them) by many gnutella users appears to be the main cause of this development. Widespread free riding removes much of the power of network dynamics and may reduce a peer-to-peer network to a simple unidirectional distribution system from a small number of sources to a large number of destinations. Of course, if this is the case, then the vulnerabilities that we observed in centralized systems (e.g. FTP servers) are present again. Free riding and the emergence of super-peers have several causes:
Peer-to-peer file sharing assumes that a significant fraction of users adhere to the somewhat post-capitalist idea of sacrificing their own resources for the “common good” of the network. Most free-riders do not seem to adopt this idea. For example, with 56 kbps modems still being the network connection for most users, allowing uploads constitutes a tangible bandwidth sacrifice. One approach is to make collaboration mandatory. For example, Freenet [6] clients are required to contribute some disk space. However, enforcing such requirements without a central infrastructure is difficult.
Existing infrastructure is another reason for the existence of super-peers. There are vast differences in the resources available to different types of hosts. For example, a T3 connection provides the combined bandwidth of about one thousand 56 kbps telephone connections.
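The skew described above can be made concrete with a small sketch. The peer counts and library sizes below are invented for illustration; they are not the measurements reported in [1].

```python
def top_share_fraction(files_per_peer, top_n):
    """Fraction of all shared files contributed by the top_n sharers."""
    total = sum(files_per_peer)
    if total == 0:
        return 0.0
    top = sum(sorted(files_per_peer, reverse=True)[:top_n])
    return top / total

# Hypothetical population of 100 peers: 90 free riders sharing nothing,
# 8 casual sharers, and 2 well-connected "super-peers".
peers = [0] * 90 + [10] * 8 + [500] * 2
share = top_share_fraction(peers, 2)  # ~0.93: two hosts serve most content
```

Under these assumed numbers, just 2% of the peers supply over 90% of the library, which is why removing or suing a handful of super-peers can cripple the whole network.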

The realization that centralized networks are not robust to attack (be it legal or technical) has spurred much of the innovation in peer-to-peer networking and file sharing technologies. In this section, we examine architectures that have evolved. Early systems were flawed because critical components remained centralized (Napster) or because of inefficiencies and lack of scalability of the protocol (gnutella) [17]. It should be noted that the problem of object location in a massively distributed, rapidly changing, heterogeneous system was new at the time peer-to-peer systems emerged. Efficient and highly scalable protocols have been proposed since then [9,23].

2.3.1. Napster
Napster was the service that ignited peer-to-peer file sharing in 1999 [14]. There should be little doubt that a major portion of the massive (for the time) traffic on Napster was of copyrighted objects being transferred in a peer-to-peer model in violation of copyright law. Napster succeeded where central servers had failed by relying on the distributed storage of objects not under the control of Napster. This moved the injection, storage, network distribution, and consumption of objects to users.
However, Napster retained a centralized database[1] with a searchable index on the file name. The centralized database itself became a legal target [15].  Napster was first enjoined to deny certain queries (e.g. “Metallica”) and then to police its network for all copyrighted content.  As the size of the darknet indexed by Napster shrank, so did the number of users.  This illustrates a general characteristic of darknets: there is positive feedback between the size of the object library and aggregate bandwidth and the appeal of the network for its users.

2.3.2. Gnutella

The next technology that sparked public interest in peer-to-peer file sharing was Gnutella. In addition to distributed object storage, Gnutella uses a fully distributed database described more fully in [13]. First, Gnutella does not rely upon any centralized server or service – a peer just needs the IP address of one or a few participating peers to (in principle) reach any host on the Gnutella darknet.  Second, Gnutella is not really “run” by anyone: it is an open protocol and anyone can write a Gnutella client application. Finally, Gnutella and its descendants go beyond sharing audio and have substantial non-infringing uses. This changes its legal standing markedly and puts it in a similar category to email. That is, email has substantial non-infringing use, and so email itself is not under legal threat even though it may be used to transfer copyrighted material unlawfully.
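A minimal sketch of this style of fully distributed search, flooding a query to neighbours with a time-to-live (TTL), might look as follows. The topology, TTL value and substring-matching rule are simplifying assumptions for illustration, not the real Gnutella wire protocol.

```python
from collections import deque

def flood_query(graph, origin, keyword, libraries, ttl=3):
    """Breadth-first query flood with a TTL, Gnutella-style.

    graph:     adjacency lists, peer -> list of neighbour peers
    libraries: peer -> list of shared file names
    Returns the peers that reported a hit for the keyword.
    """
    hits = []
    seen = {origin}
    queue = deque([(origin, ttl)])
    while queue:
        peer, remaining = queue.popleft()
        # Each peer checks its own shared library for a match.
        if any(keyword in name for name in libraries.get(peer, [])):
            hits.append(peer)
        if remaining == 0:
            continue  # TTL exhausted; do not forward further
        for neighbour in graph.get(peer, []):
            if neighbour not in seen:
                seen.add(neighbour)
                queue.append((neighbour, remaining - 1))
    return hits
```

The TTL bounds how far a query propagates, which is also why the protocol scales poorly: every query consumes bandwidth at every reachable peer within the TTL horizon, whether or not that peer holds a match.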


[1] Napster used a farm of weakly coupled databases with clients attaching to just one of the server hosts. 

By 1998, a new form of the darknet began to emerge from technological advances in several areas. The internet had become mainstream, and as such its protocols and infrastructure could now be relied upon by anyone seeking to connect users with a centralized service or with each other. The continuing fall in the price of storage together with advances in compression technology had also crossed the threshold at which storing large numbers of audio files was no longer an obstacle to mainstream users. Additionally, the power of computers had crossed the point at which they could be used as rendering devices for multimedia content. Finally, “CD ripping” became a trivial method for content injection.
The first embodiments of this new darknet were central internet servers with large collections of MP3 audio files. A fundamental change that came with these servers was the use of a new distribution network: The internet displaced the sneaker net – at least for audio content. This solved several problems of the old darknet.  First, latency was reduced drastically.
Second, and more importantly, discovery of objects became much easier because of simple and powerful search mechanisms – most importantly the general-purpose world-wide-web search engine. The local view of the small world was replaced by a global view of the entire collection accessible by all users. The main characteristic of this form of the darknet was centralized storage and search – a simple architecture that mirrored mainstream internet servers.
Centralized or quasi-centralized distribution and service networks make sense for legal online commerce.  Bandwidth and infrastructure costs tend to be low, and having customers visit a commerce site means the merchant can display adverts, collect profiles, and bill efficiently.  Additionally, management, auditing, and accountability are much easier in a centralized model.   
However, centralized schemes work poorly for illegal object distribution because large, central servers are single points of failure: If the distributor is breaking the law, it is relatively easy to force him to stop. Early MP3 Web and FTP sites were commonly “hosted” by universities, corporations, and ISPs. Copyright-holders or their representatives sent “cease and desist” letters to these web-site operators and web-owners citing copyright infringement and in a few cases followed up with legal action [15]. The threats of legal action were successful attacks on those centralized networks, and MP3 web and FTP sites disappeared from the mainstream shortly after they appeared.

Prior to the mid 1990s, copying was organized around groups of friends and acquaintances.  The copied objects were music on cassette tapes and computer programs. The rendering devices were widely-available tape players and the computers of the time – see Fig. 1. Content injection was trivial, since most objects were either not copy protected or, if they were equipped with copy protection mechanisms, the mechanisms were easily defeated. The distribution network was a “sneaker net” of floppy disks and tapes (storage), which were handed in person between members of a group or were sent by postal mail. The bandwidth of this network – albeit small by today’s standards – was sufficient for the objects of the time. The main limitation of the sneaker net with its mechanical transport layer was latency. It could take days or weeks to obtain a copy of an object. Another serious limitation of these networks was the lack of a sophisticated search engine.
There were limited attempts to prosecute individuals who were trying to sell copyrighted objects they had obtained from the darknet (commercial piracy). However, the darknet as a whole was never under significant legal threat. Reasons may have included its limited commercial impact and the protection from legal surveillance afforded by sharing amongst friends.
The sizes of object libraries available on such networks are strongly influenced by the interconnections between the networks.  For example, schoolchildren may copy content from their “family network” to their “school network” and thereby increase the size of the darknet object library available to each.  Such networks have been studied extensively and are classified as “interconnected small-worlds networks.” [24] There are several popular examples of the characteristics of such systems. For example, most people have a social group of a few score of people.  Each of these people has a group of friends that partly overlap with their friends’ friends, and also introduces more people.  It is estimated that, on average, each person is connected to every other person in the world by a chain of about six people from which arises the term “six degrees of separation”.
These findings are remarkably broadly applicable (e.g. [20,3]). The chains are on average so short because certain super-peers have many links. In our example, some people are gregarious and have lots of friends from different social or geographical circles.
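The effect of a few long-range links on average chain length can be demonstrated with a small simulation. The sketch below (our construction, loosely in the style of the Watts–Strogatz small-world model) builds a ring lattice of local acquaintances, adds a handful of random shortcuts standing in for the gregarious super-peers, and measures the average shortest-path length by breadth-first search:

```python
import random
from collections import deque

def small_world(n=200, k=4, shortcuts=20, seed=1):
    """Ring lattice (each node linked to its k nearest neighbours) plus a
    few random long-range shortcuts -- the gregarious 'super-peers'."""
    rng = random.Random(seed)
    adj = {i: set() for i in range(n)}
    for i in range(n):
        for d in range(1, k // 2 + 1):
            adj[i].add((i + d) % n)
            adj[(i + d) % n].add(i)
    for _ in range(shortcuts):
        u, v = rng.randrange(n), rng.randrange(n)
        if u != v:
            adj[u].add(v)
            adj[v].add(u)
    return adj

def avg_path_length(adj):
    """Average BFS distance over all ordered pairs of reachable nodes."""
    total, pairs = 0, 0
    for src in adj:
        dist, queue = {src: 0}, deque([src])
        while queue:
            u = queue.popleft()
            for v in adj[u]:
                if v not in dist:
                    dist[v] = dist[u] + 1
                    queue.append(v)
        total += sum(dist.values())
        pairs += len(dist) - 1
    return total / pairs

lattice = small_world(shortcuts=0)   # purely local links: long chains
rewired = small_world(shortcuts=20)  # a few shortcuts: much shorter chains
assert avg_path_length(rewired) < avg_path_length(lattice)
```

Even twenty shortcuts among two hundred nodes cut the average chain length sharply, which is the mechanism behind the short chains observed in real social networks.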
We suspect that these findings have implications for sharing on darknets, and we will return to this point when we discuss the darknets of the future later in this paper.
The small-worlds darknet continues to exist. However, a number of technological advances have given rise to new forms of the darknet that have superseded the small-worlds for some object types (e.g. audio).

We classify the different manifestations of the darknet that have come into existence in recent years with respect to the five infrastructure requirements described and analyze weaknesses and points of attack.
As a system, the darknet is subject to a variety of attacks. Legal action continues to be the most powerful challenge to the darknet. However, the darknet is also subject to a variety of other common threats (e.g. viruses, spamming) that, in the past, have led to minor disruptions of the darknet, but could be considerably more damaging.

In this section we consider the potential impact of legal developments on the darknet. Most of our analysis focuses on system robustness, rather than on detailed legal questions. We regard legal questions only with respect to their possible effect: the failure of certain nodes or links (vertices and edges of the graph defined above). In this sense, we are investigating a well-known problem in distributed systems.

Throughout this paper, we will call the shared items (e.g. software programs, songs, movies, books, etc.) objects. The persons who copy objects will be called users of the darknet, and the computers used to share objects will be called hosts.
The idea of the darknet is based upon three assumptions:
1.    Any widely distributed object will be available to a fraction of users in a form that permits copying.
2.    Users will copy objects if it is possible and interesting to do so.
3.    Users are connected by high-bandwidth channels.

The darknet is the distribution network that emerges from the injection of objects according to assumption 1 and the distribution of those objects according to assumptions 2 and 3.
One implication of the first assumption is that any content protection system will leak popular or interesting content into the darknet, because some fraction of users – possibly experts – will overcome any copy prevention mechanism or because the object will enter the darknet before copy protection occurs.
The term “widely distributed” is intended to capture the notion of mass market distribution of objects to thousands or millions of practically anonymous users. This is in contrast to the protection of military, industrial, or personal secrets, which are typically not widely distributed and are not the focus of this paper.
Like other networks, the darknet can be modeled as a directed graph with labeled edges. The graph has one vertex for each user/host. For any pair of vertices (u,v), there is a directed edge from u to v if objects can be copied from u to v. The edge labels can be used to model relevant information about the physical network and may include information such as bandwidth, delay, availability, etc.  The vertices are characterized by their object library, object requests made to other vertices, and object requests satisfied.
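The graph model can be made concrete with a minimal sketch. Everything here (the `Host` class, the edge-label dictionary, `copy_object`) is our illustrative construction of the model just described, not an artifact of any real network:

```python
from dataclasses import dataclass, field

@dataclass
class Host:
    """A vertex of the darknet graph: a user/host with an object library
    and counters for object requests made and satisfied."""
    library: set = field(default_factory=set)
    requests_made: int = 0
    requests_satisfied: int = 0

# Directed, labelled edges: (u, v) -> properties of the physical link.
hosts = {name: Host() for name in ("u", "v")}
edges = {("u", "v"): {"bandwidth_mbps": 10, "delay_ms": 40, "available": True}}

def copy_object(obj, src, dst):
    """Copy obj along the edge (src, dst) if the edge exists, is up,
    and the source actually holds the object."""
    hosts[dst].requests_made += 1
    link = edges.get((src, dst))
    if link and link["available"] and obj in hosts[src].library:
        hosts[dst].library.add(obj)
        hosts[src].requests_satisfied += 1
        return True
    return False

hosts["u"].library.add("song.mp3")
assert copy_object("song.mp3", "u", "v")
assert "song.mp3" in hosts["v"].library
assert not copy_object("song.mp3", "v", "u")  # no edge v -> u: graph is directed
```

Legal or technical attacks then correspond to deleting vertices (hosts) or flipping edge labels such as `available` to false, which is the failure model used in the robustness analysis.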
To operate effectively, the darknet has a small number of technological and infrastructure requirements, which are similar to those of legal content distribution networks. These infrastructure requirements are:
1.    facilities for injecting new objects into the darknet (input)
2.    a distribution network that carries copies of objects to users (transmission)
3.    ubiquitous rendering devices, which allow users to consume objects (output)
4.    a search mechanism to enable users to find objects (database)
5.    storage that allows the darknet to retain objects for extended periods of time. Functionally, this is mostly a caching mechanism that reduces the load and exposure of nodes that inject objects.

The dramatic rise in the efficiency of the darknet can be traced back to the general technological improvements in these infrastructure areas. At the same time, most attempts to fight the darknet can be viewed as efforts to deprive it of one or more of the infrastructure items. Legal action has traditionally targeted search engines and, to a lesser extent, the distribution network. As we will describe later in the paper, this has been partially successful.  The drive for legislation on mandatory watermarking aims to deprive the darknet of rendering devices. We will argue that watermarking approaches are technically flawed and unlikely to have any material impact on the darknet. Finally, most content protection systems are meant to prevent or delay the injection of new objects into the darknet. Based on our first assumption, no such system constitutes an impenetrable barrier, and we will discuss the merits of some popular systems.
  We see no technical impediments to the darknet becoming increasingly efficient (measured by aggregate library size and available bandwidth).  However, the darknet, in all its transport-layer embodiments, is under legal attack. In this paper, we speculate on the technical and legal future of the darknet, concentrating particularly, but not exclusively, on peer-to-peer networks.

The rest of this paper is structured as follows. Section 2 analyzes different manifestations of the darknet with respect to their robustness to attacks on the infrastructure requirements described above and speculates on the future development of the darknet.  Section 3 describes content protection mechanisms, their probable effect on the darknet, and the impact of the darknet upon them. In sections 4 and 5, we speculate on the scenarios in which the darknet will be effective, and how businesses may need to behave to compete effectively with it.

People have always copied things. In the past, most items of value were physical objects. Patent law and economies of scale meant that small scale copying of physical objects was usually uneconomic, and large-scale copying (if it infringed) was stoppable using policemen and courts. Today, things of value are increasingly less tangible: often they are just bits and bytes or can be accurately represented as bits and bytes. The widespread deployment of packet-switched networks and the huge advances in computers and codec-technologies have made it feasible (and indeed attractive) to deliver such digital works over the Internet. This presents great opportunities and great challenges. The opportunity is low-cost delivery of personalized, desirable high-quality content. The challenge is that such content can be distributed illegally. Copyright law governs the legality of copying and distribution of such valuable data, but copyright protection is increasingly strained in a world of programmable computers and high-speed networks.
For example, consider the staggering burst of creativity by authors of computer programs that are designed to share audio files.  This was first popularized by Napster, but today several popular applications and services offer similar capabilities.  CD-writers have become mainstream, and DVD-writers may well follow suit.  Hence, even in the absence of network connectivity, the opportunity for low-cost, large-scale file sharing exists.

We investigate the darknet – a collection of networks and technologies used to share digital content.  The darknet is not a separate physical network but an application and protocol layer riding on existing networks.  Examples of darknets are peer-to-peer file sharing, CD and DVD copying, and key or password sharing on email and newsgroups.  The last few years have seen vast increases in the darknet’s aggregate bandwidth, reliability, usability, size of shared library, and availability of search engines.  In this paper we categorize and analyze existing and future darknets, from both the technical and legal perspectives.  We speculate that there will be short-term impediments to the effectiveness of the darknet as a distribution mechanism, but ultimately the darknet-genie will not be put back into the bottle.  In view of this hypothesis, we examine the relevance of content protection and content distribution architectures.

On the other hand, what seems to be happening is that the "users" themselves have been gradually "modularising" culture. In other words, modularity has been coming into modern culture from the outside, so to speak, rather than being built in, as in industrial production. In the 1980s musicians started sampling already published music; TV fans started sampling their favorite TV series to produce their own “slash films,” and game fans started creating new game levels and all other kinds of game modifications. (Mods “can include new items, weapons, characters, enemies, models, modes, textures, levels, and story lines.”) And of course, from the very beginning of mass culture in the early twentieth century, artists immediately started sampling and remixing mass cultural products – think of Kurt Schwitters, collage, and particularly the photomontage practice which became popular right after WWI among artists in Russia and Germany. This continued with Pop Art, appropriation art, and video art.

Enter the computer. In The Language of New Media I named modularity as one of the principles of computerised media. Whereas earlier the modularity principle was applied to the packaging of cultural goods and raw media (photo stock, blank videotapes, etc.), computerisation modularises culture on a structural level. Images are broken into pixels; graphic designs, film and video are broken into layers. Hypertext modularises text. Markup languages such as HTML and media formats such as QuickTime and MPEG-7 modularise multimedia documents in general. We can talk about what this modularisation has already done to culture – think of the World Wide Web as just one example – but this is a whole new conversation.

In short: in culture, we have already been modular for a long time. But at the same time, “we have never been modular” – which I think is a very good thing.

The logic of culture often runs behind changes in the economy – so while modularity has been the basis of modern industrial society since the early twentieth century, we have only started seeing the modularity principle in cultural production and distribution on a large scale in the last few decades. While Adorno and Horkheimer were writing about the "culture industry" already in the 1940s, it was not then – and it is not today – a true modern industry.[7] In some areas, such as the production of Hollywood animated features or computer games, we see more of the factory logic at work, with extensive division of labor. In the case of software engineering (i.e. programming), software is put together to a large extent from already available software modules – but this is done by individual programmers or teams who often spend months or years on one project – quite different from a Ford production line assembling one identical car after another. In short, today cultural modularity has not reached the systematic character of industrial standardisation circa 1913.

But this does not mean that modularity in contemporary culture simply lags behind the industrial modularity responsible for mass production. Rather, cultural modularity seems to be governed by a different logic than industrial modularity. On the one hand, “mass culture” is made possible by a complete industrial-type modularity on the levels of packaging and distribution. In other words, all the material carriers of cultural content in the modern period have been standardised, just as was done in the production of all goods – from the first photo and film formats at the end of the nineteenth century to game cartridges, DVDs, memory cards, interchangeable camera lenses, etc. But the actual making of content was never standardised in the same way.[8] So while mass culture involves putting together new products – films, television programs, songs, games – from a limited repertoire of themes, narratives, and icons, using a limited number of conventions, this is done by teams of human authors on a one-by-one basis. And while more recently we see a trend toward the reuse of cultural assets in commercial culture, i.e. media franchising – characters, settings, and icons which appear not in one but a whole range of cultural products: film sequels, computer games, theme parks, toys, etc. – this does not seem to change the basic “pre-industrial” logic of the production process. For Adorno, this individual character of each product is part of the ideology of mass culture: “Each product affects an individual air; individuality itself serves to reinforce ideology, in so far as the illusion is conjured up that the completely reified and mediated is a sanctuary from immediacy and life.”