Read The Label BOF (RTL)

Reported by Anthony Rutkowski/Internet Society

The RTL BOF, chaired by Vint Cerf, met on 19 July at the 33rd IETF in Stockholm. The session was multicast on the MBone.

Introduction

Vint Cerf feels that ``read the label'' may not be the right solution. The IETF should deal with the issues that have arisen in the U.S. and elsewhere. The issue revolves in part around unlimited access by minors. Some believe this is too much freedom, and that there should be some limitations on access where appropriate or required.

Actually restricting content is impossible: the Internet is such an openly accessible medium that anyone can make information available, and restriction is also inimical to the freedom of expression that is the premise of the Internet. The objective of this BOF is to answer the following questions:

   o Does the IETF want to develop specifications for access control at the periphery of the network (at the point of access), rather than controlling content?

   o What kind of mechanisms might be suitable?

The group cannot get into specifics today; if it does, it will not get a charter organized. This BOF should not get into who can control or implement -- that activity is likely voluntary. Discussion of whether something should be done should be limited to a half hour. There may be other alternatives; these are just suggestions. The goal is to develop a consensus view.

Ted Hardie suggested that an alternative to developing specifications is to develop principles regarding netiquette. It is a weaker outcome, but a possibility.

Presentation by Tim Berners-Lee

Tim Berners-Lee gave a brief presentation. He said that the W3C will address this issue; members have demanded it, and meetings will be held to amass information and proposals. Tim feels that a three-party protocol should not be developed, because a two-party protocol can be produced more quickly. A two-party system scales; it works very nicely. Such a mechanism should be there to provide the tools to allow people to act responsibly.

A three-party system could put information in a hypertext link which says that some organization approves. This would make a web of approval which can get very complex. There is a problem -- you cannot find all the references. It is a big problem without an immediate solution. Tim asked who should take the responsibility when something goes wrong? This raises significant legal issues. There are also security issues: you have to be able to believe the rating. Policy is something that society decides. We need to develop mechanisms to allow people to express policy independently.

Discussion/Comments

Vint Cerf asked whether the IETF should deal with this.

Ted Hardie, NASA Applications Center, noted that this issue has been a concern. There are parallels to IP security: a major portion of the Internet is under attack from people, in a way not unlike a security attack. The problem cannot be wished out of existence. Why should the IETF do it? It has the expertise. It is an international body. It can make sure the work proceeds without the built-in prejudices of the U.S. The solution needs to accommodate different viewpoints and values. Industry groups have axes to grind or are focusing narrowly; the W3C, for example, will not cover other services.

Win Treese, Open Market, said that a problem is lurking: classifying content in ways that let browsers make decisions. One approach is to classify so that programs and people can make decisions. Book and magazine publishers are nervous about this. Classification is regarded as a film industry tool that they are reluctant to see applied more broadly.
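As a rough illustration of the two-party, self-labeling model Tim Berners-Lee described above, the sketch below shows a client applying a locally configured policy to a label the publisher attaches to its own content. The label syntax, the category vocabulary, and the numeric levels are all hypothetical; no label format was proposed at the BOF.

    # Minimal sketch of a two-party (self-labeling) check.  The label
    # syntax, categories, and levels are hypothetical examples only.

    # Local policy set by a parent or school: the highest level accepted
    # for each category the publisher might declare.
    local_policy = {"violence": 1, "nudity": 0, "language": 2}

    def parse_label(label_text):
        """Parse a hypothetical 'category=level' list, e.g. 'violence=2, nudity=0'."""
        label = {}
        for item in label_text.split(","):
            category, _, level = item.strip().partition("=")
            if category and level.isdigit():
                label[category] = int(level)
        return label

    def permitted(label, policy):
        """Show a document only if every declared category is within policy.
        Categories the policy does not mention are allowed here; a stricter
        client could block anything it does not recognize."""
        return all(policy.get(category, level) >= level
                   for category, level in label.items())

    if __name__ == "__main__":
        label = parse_label("violence=2, nudity=0, language=1")
        print(label, "->", "show" if permitted(label, local_policy) else "block")

The point of the two-party design is that no third party is consulted at browse time; the trade-off, raised in the discussion, is that the reader has to be able to believe the publisher's own label.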
Karen Sollins, MIT, feels that the system should not have cultural biases; it could become U.S.-centric. A three-party system need not only come in the future; indices can be built now.

Elliot Lear, Silicon Graphics, pointed out that he personally uses filters for his mail. The technology is not limited to censorship. 850 MB per day of netnews is presently distributed, and third-party habits are already used to some extent to filter it. You can also use a score system based on multiple experiences and reactions, and you can have different sets of semantics.

Erford Hudic, European Publishing, said that setting standards at the source is almost impossible, especially across diverse cultures. In European experience, ratings by third parties are used. You can also make some materials more difficult to access; it is like putting magazines on the top shelf.

An unknown participant asked what happens if we decide to do nothing. Legislation is being considered in the U.S.; views are mixed. This person does not know about developments elsewhere in the world. Existing work will probably continue. Legislative bodies could prosecute the wrong parties among those providing the services. If we do not supply mechanisms, someone else might.

Ali Batmore said that people are just trying to regulate others. Technical solutions are not necessary, and voluntary solutions will not work.

Vint Cerf pointed out that by doing nothing, the result might be to deny network access to youngsters.

Scott Bradner feels that if we fail to address this issue, not only would kids be denied material, but only materials suitable for six-year-olds would be made available to the rest of us.

An unknown participant said that information that is libelous or contrary to political views is also a problem. Singapore's actions in this area have been a problem. We need to assure attribution.

Vint Cerf noted that there is more than just pornography involved here. Metadata that describes the nature of the data may actually be useful for other purposes.

Chris Weider, Bunyip, feels that a two-party solution is not technically feasible; the metadata infrastructure is not available now. A technical solution should use limited browser and third-party rating mechanisms.

John Klensin, MCI, said that there are two motivations: one is legally driven, the other is consensual. Voluntary solutions are readily available. All ratings need to have some kind of ordinal scale. Even nudity is relative depending on country or culture. Many will want to see what third parties think. This may cause a thousand flowers to bloom; it may be a robust marketplace.

Charles Perkins, IBM, said that we have had experience in selecting material. This often has to do with children rejecting the values of parents. Most of the use of the Internet is appropriate, but lots of misinformation is being propagated. There is lots of expertise available in the IETF. How are we going to provide access control information on the mass of information now being moved?

Vint Cerf noted that problem servers tend to eliminate themselves; the problem seems self-solving. They either go away because of the traffic load or go behind a credit card barrier.

Gary Malkin, Xylogics, feels that the IETF should not be involved in this issue and should not create the mechanism. It will result in creating the ``ministry of truth''. If you tag information, you have given people the keys to what to look for.
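Several speakers describe the other alternative: score-based filtering (Elliot Lear), ordinal rating scales and a marketplace of third-party raters (John Klensin), and limited browser plus third-party rating mechanisms (Chris Weider). The sketch below is a minimal illustration of that idea; the service names, scales, and lookup-by-URL tables are invented stand-ins for what would presumably be network-accessible rating services.

    # Sketch of client-side filtering against multiple third-party rating
    # services, each with its own ordinal scale.  The services, URLs, and
    # thresholds are hypothetical examples, not a proposed protocol.

    # Two hypothetical rating services, modeled here as local tables.
    family_ratings = {            # scale 0 (all ages) .. 4 (adults only)
        "http://example.org/a": 4,
        "http://example.org/b": 1,
    }
    school_ratings = {            # scale 0 (approved) .. 2 (not approved)
        "http://example.org/b": 0,
    }

    # The user subscribes to the services they trust and sets a ceiling
    # on each service's own scale.
    subscriptions = [
        (family_ratings, 2),      # accept family ratings of 2 or below
        (school_ratings, 1),      # accept school ratings of 1 or below
    ]

    def allowed(url, subscriptions):
        """Allow a URL unless a subscribed service rates it above its ceiling.
        Unrated URLs are allowed here; blocking them instead is where the
        concerns about third-party coverage and delay would bite."""
        for ratings, ceiling in subscriptions:
            rating = ratings.get(url)
            if rating is not None and rating > ceiling:
                return False
        return True

    if __name__ == "__main__":
        for url in ("http://example.org/a",
                    "http://example.org/b",
                    "http://example.org/c"):
            print(url, "->", "allow" if allowed(url, subscriptions) else "block")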
A poll was taken to gauge the feelings of the audience. About 90 percent of the roughly 100 people present suggested that the IETF do something. There is also the implication of doing nothing.

Paul Mockapetris said that we should also ensure that access can be promoted.

Rick Petkie, CompuServe, suggested that the IETF pursue this from a specification standpoint. You will have numerous groups going off and doing it if you do not. He supports the idea of multiple rating services; there should be a rich amount of metadata provided.

Barry Greene, Singapore Telecom, pointed out that many legislatures in the Asia-Pacific region are looking at this problem. Rules would be imposed on Internet providers, and that would hurt growth in the ASEAN area. Providers will close up shop rather than risk prosecution if the imposed requirements cannot be met. Look at what happened in Hong Kong. It is a real issue. Singapore does not presently censor the Internet, although it is considering this for the future.

Vint Cerf observed that the amount of information is daunting; not all of it could ever be marked. This is an issue that needs to be kept in mind.

Christian Huitema, INRIA, reported that France has had some experience with Minitel. Once it started, there were problems with content. Legislative action was taken: most offenders were removed, and others were put in a more expensive category. This happened ten years ago, and it will happen again. We need to maintain liberties by allowing responsibilities to be asserted.

Larry Masinter, Xerox, is in favor of this being part of a solution to a larger problem. Most pornographic materials are copyrighted material.

Vint Cerf said that it would be satisfying if we wound up with an outcome that was otherwise very beneficial to the Internet.

Jim Conklin, CREN, feels that this could be very useful. Metadata can be useful for searching and in research.

Ron Daniel, LANL, has been working on URCs and on a URC service. A rating service has been developed; this is one instance of a metadata service. It will result in new metadata infrastructure and applications.

Steve Silverman, BT, reminded the group that third-party systems would introduce a delay. Voluntary self-labeling makes more sense; any third-party system will be quickly overloaded.

Vint Cerf pointed out that there is also the problem of legal liability with third-party systems.

Steve Moore, MooreWords, has similar concerns. Other bodies do not have the expertise, and the solutions they impose might not be technically feasible. The Internet community has its own values, and that community should be selecting its own solutions.

Frank Kastenholz, FTP Software, feels that we must assure that the mechanism is not a ``big brother''.

Vint Cerf commented that most people would not accept a censorship model.

Jonathan Pullen said that we may see different agencies doing it, so why not us? He is a system administrator for a high school and does not want to see the network pulled away from high schools.

Vint Cerf suggested that experience with 900-number telephone services may have some relevance. Once implemented, they caused blocking services to come into existence. Another model is one that forces parties to take people off of mailing lists.

John Tavs, IBM, indicated that some people want to censor.

Mark Knopper, Ameritech, feels that parental control should be exercised, but there are really a lot of ideas about content and access. The IETF should not evaluate content types. We need to develop the tools for different cultures.
An unknown participant said that third-party approaches are not scalable. Mechanisms should be used that allow parents to control access. Also, providers need to be more responsible.

Stu Weibel, OCLC, feels that the issue should be addressed in the IETF. It is a resource description problem, and it needs to be interoperable. He is against systems that are not neutral.

Ali Bahreman, Enterprise Integration, would like action to be taken. There should be two working groups: one to address the metadata problem and the other to deal with educating people.

Christian Huitema asked about handling spies on the line, who could build a profile of a user. We need to be sure that technical solutions will not allow that.

Chris Weider suggested that if we provide the technical solution, the IETF may be forced to provide the rating schemes. We need to make sure that boundaries exist so that the IETF will not be forced to rate.

Gary Malkin said that there is no information out there that someone will not object to. We need to assure that service providers are not responsible for content.

Elliot Lear pointed out that there is a set of people who will not be satisfied.

Larry Masinter commented that whether you have third-party ratings or not is independent of who supplies them. This is not necessarily true and has not been the case in the experimental systems provided. You could also use audit systems; access control is not the only mechanism.

J. Allard, Microsoft, feels that despite anything we do, things will happen. Technologies and legislation will occur in the home, at work, in government, and in institutions. The 900-number example raises a point: it was not designed with sufficient granularity, the blocking is unilateral, yet it is the only technical option available. Censorship will be made on the basis of gross criteria (e.g., domain names) if sufficient granularity is not available. Legislation could not be very granular in blocking out information.

Ted Hardie said that the IETF can define a content role. It can provide methods for people to take responsibility for themselves.

Tim Berners-Lee said that the W3C is not an Orwellian entity and would be happy to work with the IETF. Consortium work is proceeding because its members have said the work is necessary. The spread of the Internet is already being impeded because of this problem. As for Orwellian fears and fears of centralization, you can build something that is decentralized both technically and socially.

Summary of Actions

Vint Cerf presented the following summary of actions to be considered:

   1. A working group could be formed in the IETF to look at a narrow way of dealing with this problem: address the issue that if a browser or server were to be created, schools or parents could acquire and run these tools.

   2. Enough other activity is occurring that we should take advantage of it.

   3. There are longer-term possibilities that metadata could be used for other purposes.

   4. A short-term working group is being considered that would establish the means for ``kids places''. A metadata group exists already. A poll was taken and only two people objected. This will be a short-term effort. It should also have an educational component.

There were a few more comments from the floor:

Ron Daniel is a principal author on a URI metadata proposal. The next draft will be out in three months. Dirk will be rolling out something in October. Ron has something now.

John Klensin stated that that work is not acceptable to the relevant directors as presently organized.
Steve Moore feels that a general metadata solution is not likely for two to three years; a narrower, short-term solution is needed.

Guy Almes thinks that the IETF also needs to ask what perverse uses this could be put to.