Tuesday, December 30, 2008
Electrophoresis - Separation and purification of DNA fragments
The preparation is loaded into wells at one end of the gel. At least one well is filled with reference DNA (i.e., DNA fragments of known length) for comparison with the fragments of unknown length. An electric current is applied across the electrophoresis chamber: a negative electrode sits at the loading end of the gel and a positive electrode at the opposite end, so the fragments move through the pores of the gel. DNA molecules carry a negative electric charge because of the phosphate groups that alternate with sugar molecules along the backbone, and since opposite charges attract, the DNA migrates toward the positive electrode. Small DNA fragments move faster than larger ones. All DNA molecules of a given length migrate nearly the same distance into the gel and form bands; each band represents many copies of DNA fragments of about the same length. After electrophoresis is complete, the gel is removed from the chamber and stained with either ethidium bromide (EB) or methylene blue to make the bands visible. When the gel is illuminated with UV light, EB produces fluorescent orange bands; methylene blue produces blue bands visible under ordinary room light.
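Because migration distance is roughly log-linear in fragment length, the reference ladder lets you estimate the size of an unknown band by interpolation. Below is a minimal sketch of that idea in Python; the ladder distances and the unknown band distance are invented for illustration, and a real gel has to be calibrated run by run.

```python
import math

# Hypothetical ladder: migration distance in mm -> known fragment size in base pairs.
ladder = [(12.0, 10000), (18.0, 5000), (25.0, 2000), (32.0, 1000), (40.0, 500)]

def estimate_size(distance_mm):
    """Interpolate log10(size) linearly between the two nearest ladder bands."""
    pts = sorted(ladder)  # shortest fragments travel farthest, so sort by distance
    for (d1, s1), (d2, s2) in zip(pts, pts[1:]):
        if d1 <= distance_mm <= d2:
            frac = (distance_mm - d1) / (d2 - d1)
            log_size = math.log10(s1) + frac * (math.log10(s2) - math.log10(s1))
            return round(10 ** log_size)
    raise ValueError("distance outside the range covered by the ladder")

print(estimate_size(28.0))  # unknown band between the 2000 bp and 1000 bp markers
```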
Celera Genomics & HGP
Goals of the original Human Genome Project (HGP)
- identify all the approximately 20,000-25,000 genes in human DNA,
- determine the sequences of the 3 billion chemical base pairs that make up human DNA,
- store this information in databases,
- improve tools for data analysis,
- transfer related technologies to the private sector, and
- address the ethical, legal, and social issues (ELSI) that may arise from the project.
The goals of the original HGP were not only to determine all 3 billion base pairs in the human genome with a minimal error rate, but also to identify all the genes in this vast amount of data. This part of the project is still ongoing, although a preliminary count indicates about 30,000 genes in the human genome, which is far fewer than predicted by most scientists. Another goal of the HGP was to develop faster, more efficient methods for DNA sequencing and sequence analysis and to transfer these technologies to industry.
The sequence of the human DNA is stored in databases available to anyone on the Internet. The U.S. National Center for Biotechnology Information (and sister organizations in Europe and Japan) house the gene sequence in a database known as GenBank, along with sequences of known and hypothetical genes and proteins. Other organizations, such as the University of California, Santa Cruz, and ENSEMBL, present additional data and annotation and powerful tools for visualizing and searching it. Computer programs have been developed to analyze the data, because the data themselves are difficult to interpret without them.
The process of identifying the boundaries between genes and other features in raw DNA sequence is called genome annotation and is the domain of bioinformatics. While expert biologists make the best annotators, their work proceeds slowly, and computer programs are increasingly used to meet the high-throughput demands of genome sequencing projects. The best current technologies for annotation make use of statistical models that take advantage of parallels between DNA sequences and human language, using concepts from computer science such as formal grammars.
Another, often overlooked, goal of the HGP is the study of its ethical, legal, and social implications. It is important to research these issues and find the most appropriate solutions before they become large dilemmas whose effects manifest as major political concerns.
All humans have unique gene sequences; therefore the data published by the HGP does not represent the exact sequence of each and every individual's genome. It is the combined genome of a small number of anonymous donors. The HGP genome is a scaffold for future work in identifying differences among individuals. Most of the current effort in identifying differences among individuals involves single nucleotide polymorphisms and the HapMap.
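One of the simplest building blocks of genome annotation is locating open reading frames (ORFs): stretches that begin with an ATG start codon and run to a stop codon. The sketch below is only a toy illustration of that idea on the forward strand, not any annotation pipeline mentioned above, and the input sequence is made up.

```python
STOP_CODONS = {"TAA", "TAG", "TGA"}

def find_orfs(seq, min_codons=3):
    """Return (start, end) indices of simple ORFs on the forward strand."""
    seq = seq.upper()
    orfs = []
    for frame in range(3):                       # three reading frames
        i = frame
        while i + 3 <= len(seq):
            if seq[i:i + 3] == "ATG":            # potential start codon
                j = i + 3
                while j + 3 <= len(seq) and seq[j:j + 3] not in STOP_CODONS:
                    j += 3
                if j + 3 <= len(seq) and (j - i) // 3 >= min_codons:
                    orfs.append((i, j + 3))      # include the stop codon
                i = j
            i += 3
    return orfs

print(find_orfs("CCATGGCTGCTTAAGGATGAAACCCTGATTT"))
```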
How it was accomplished
The publicly funded groups (the NIH, the Sanger Institute in Great Britain, and numerous others from around the world) broke the genome into large pieces, approximately 150,000 base pairs in length. These pieces are called "bacterial artificial chromosomes", or BACs, because they can be inserted into bacteria, where they are copied by the bacterial replication machinery. Each of these pieces was then sequenced separately as a small "shotgun" project and then assembled. The larger, 150,000 base pair chunks were then stitched together to create chromosomes. This is known as the "hierarchical shotgun" approach, because the genome is first broken into relatively large chunks, which are then mapped to chromosomes before being selected for sequencing. The whole-genome shotgun (WGS) method is faster and cheaper, and by 2003, thanks to the availability of clever assembly algorithms, it had become the standard approach to sequencing most mammalian genomes.
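In either shotgun approach, the reads have to be stitched back together from their overlaps. The following toy sketch shows the core idea with a greedy overlap merge; the reads are invented, and real assemblers are far more sophisticated about errors, repeats and coverage.

```python
def overlap(a, b):
    """Length of the longest suffix of a that is a prefix of b."""
    for n in range(min(len(a), len(b)), 0, -1):
        if a.endswith(b[:n]):
            return n
    return 0

def greedy_assemble(reads):
    """Repeatedly merge the pair of reads with the largest overlap."""
    reads = list(reads)
    while len(reads) > 1:
        best = max(((overlap(a, b), i, j)
                    for i, a in enumerate(reads)
                    for j, b in enumerate(reads) if i != j),
                   key=lambda t: t[0])
        n, i, j = best
        merged = reads[i] + reads[j][n:]          # join the two reads on their overlap
        reads = [r for k, r in enumerate(reads) if k not in (i, j)] + [merged]
    return reads[0]

# Invented "shotgun reads" from the string ATGGCTTACGGA
print(greedy_assemble(["ATGGCTT", "GCTTACG", "TACGGA"]))
```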
Whose genome was sequenced?
In the international public-sector Human Genome Project (HGP), researchers collected blood (female) or sperm (male) samples from a large number of donors. Only a few of the many collected samples were processed as DNA resources, and the donor identities were protected so that neither donors nor scientists could know whose DNA was sequenced. DNA clones from many different libraries were used in the overall project, with most of those libraries being created by Dr. Pieter J. de Jong. It has been informally reported, and is well known in the genomics community, that much of the DNA for the public HGP came from a single anonymous male donor from the state of New York.
Technically, it is much easier to prepare DNA cleanly from sperm than from other cell types because of the much higher ratio of DNA to protein in sperm and the much smaller volume in which purifications can be done. Using sperm also provides all chromosomes for study, including equal numbers of sperm carrying the X or the Y sex chromosome. HGP scientists also used white cells from the blood of female donors so as to include female-originated samples. One minor technical issue is that sperm samples contain only half as much DNA from the X and Y chromosomes as from the other 22 chromosomes (the autosomes); this happens because each sperm cell contains only one X or one Y chromosome, but not both. Thus in 100 sperm cells, on average there will be 50 X and 50 Y chromosomes, as compared to 100 copies of each of the other chromosomes.
Although the main sequencing phase of the HGP has been completed, studies of DNA variation continue in the International HapMap Project, whose goal is to identify patterns of SNP groups (called haplotypes, or "haps").
The DNA samples for the HapMap came from a total of 270 individuals: Yoruba people in Ibadan, Nigeria; Japanese in Tokyo; Han Chinese in Beijing; and the French Centre d'Etude du Polymorphisme Humain (CEPH) resource, which consisted of residents of the United States having ancestry from Western and Northern Europe. In the Celera Genomics private-sector project, DNA from five different individuals was used for sequencing. The lead scientist of Celera Genomics at that time, Craig Venter, later acknowledged (in a public letter to the journal Science) that his DNA was one of those in the pool.
The Human Genome Projects - Benefits
What's Turning Genomics Vision Into Reality
In "A Vision for the Future of Genomics Research," published in the April 24, 2003 issue of the journal Nature, the National Human Genome Research Institute (NHGRI) details a myriad of research opportunities in the genome era. This backgrounder describes a few of the more visible, large-scale opportunities.
The International HapMap Project
Launched in October 2002 by NHGRI and its partners, the International HapMap Project has enlisted a worldwide consortium of scientists with the goal of producing the "next-generation" map of the human genome to speed the discovery of genes related to common illnesses such as asthma, cancer, diabetes and heart disease. Expected to take three years to complete, the "HapMap" will chart genetic variation within the human genome at an unprecedented level of precision. By comparing genetic differences among individuals and identifying those specifically associated with a condition, consortium members believe they can create a tool to help researchers detect the genetic contributions to many diseases. Whereas the Human Genome Project provided the foundation on which researchers are making dramatic genetic discoveries, the HapMap will begin building the framework to make the results of genomic research applicable to individuals.
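The kind of comparison the HapMap is meant to enable can be illustrated with a toy association test: check whether a SNP allele turns up more often in affected than in unaffected individuals. The counts below are invented and the shortcut chi-square formula is standard textbook material; nothing here reflects actual HapMap data or methods.

```python
# Invented allele counts for one SNP: rows = cases / controls, columns = allele A / allele G.
cases    = {"A": 60, "G": 40}
controls = {"A": 42, "G": 58}

def chi_square_2x2(row1, row2):
    """Pearson chi-square statistic for a 2x2 table of allele counts."""
    a, b = row1["A"], row1["G"]
    c, d = row2["A"], row2["G"]
    n = a + b + c + d
    # Standard shortcut formula for a 2x2 table.
    return n * (a * d - b * c) ** 2 / ((a + b) * (c + d) * (a + c) * (b + d))

stat = chi_square_2x2(cases, controls)
print(f"chi-square = {stat:.2f}")   # larger values suggest allele frequency differs between groups
```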
ENCyclopedia Of DNA Elements (ENCODE)
This NHGRI-led project is designed to develop efficient ways of identifying and precisely locating all of the protein-coding genes, non-protein-coding genes and other sequence-based, functional elements contained in the human DNA sequence. Creating this monumental reference work will help scientists mine and fully utilize the human sequence, gain a deeper understanding of human biology, predict potential disease risk, and develop new strategies for the prevention and treatment of disease. The ENCODE project will begin as a pilot, in which participating research teams will work cooperatively to develop efficient, high-throughput methods for rigorously and fully analyzing a defined set of target regions comprising approximately 1 percent of the human genome. Analysis of this first 30 megabases (Mb) of human genome sequence will allow the project participants to test and compare a variety of existing and new technologies to find the functional elements in human DNA.
Chemical Genomics
NHGRI is exploring the acquisition and/or creation of publicly available libraries of organic chemical compounds, also referred to as small molecules, for use by basic scientists in their efforts to chart biological pathways. Such compounds have a number of attractive features for genome analysis, including their wide structural diversity, which mirrors the diversity of the genome; their ability in many cases to enter cells readily; and the fact that they can often serve as starting points for drug development. The use of these chemical compounds to probe gene function will complement more conventional nucleic acid approaches. This initiative offers enormous potential. However, it is a fundamentally new approach to genomics, and largely new to basic biomedical research as a whole. As a result, substantial investments in physical and human capital will be needed. NHGRI is currently planning for these needs, which will include large libraries of chemical compounds (500,000 to 1,000,000 total); capacity for robotic-enabled, high-throughput screening; and medicinal chemistry to convert compounds identified through such screening into useful biological tools.
Genomes to Life
The Department of Energy's "Genomes to Life" program focuses on single-cell organisms, or microbes. The fundamental goal is to understand the intricate details of the life processes of microbes so well that computational models can be developed to accurately describe and predict their responses to changes in their environment. "Genomes to Life" aims to understand the activities of single-cell organisms on three levels: the proteins and multi-molecular machines that perform most of the cell's work; the gene regulatory networks that control these processes; and microbial associations or communities in which groups of different microbes carry out fundamental functions in nature. Once researchers understand how life functions at the microbial level, they hope to use the capabilities of these organisms to help meet many of our national challenges in energy and the environment.
Structural Genomics Consortium
Structural genomics is the systematic, high-throughput generation of the three-dimensional structure of proteins. The ultimate goal for studying the structural genomics of any organism is the complete structural description of all proteins encoded by the genome of that organism. Such three-dimensional structures will be crucial for rational drug design, for diagnosis and treatment of disease, and for advancing our understanding of basic biology. A broad collection of structures will provide valuable biological information beyond that which can be obtained from individual structures.
Saturday, December 27, 2008
Internet Overtakes Newspapers As a News Source In 2008
According to Pew Research, 40% say they get most of their news about national and international issues from the internet, up from just 24% in September 2007. Television continues to be cited most frequently as a main source for national and international news, at 70%.
The future looks dim for television and newspapers.
For young people, though, the internet now rivals television as a main source of national and international news. Nearly six-in-ten Americans younger than 30 (59%) say they get most of their national and international news online; an identical percentage cites television.
The percentage of people younger than 30 citing television as a main news source has declined from 68% in September 2007 to 59% currently. This mirrors a trend seen earlier this year in campaign news consumption. (See “Internet Now Major Source of Campaign News,” News Interest Index, Oct. 31, 2008.)
The survey by the Pew Research Center for the People & the Press, conducted Dec. 3-7 among 1,489 adults, finds there has been little change in the individual TV news outlets that people rely on for national and international news. Nearly a quarter of the public (23%) says they get most of their news from CNN, while 17% cite Fox News; smaller shares mention other cable and broadcast outlets.
In an interview with the British newspaper The Daily Telegraph, Andy Burnham, the UK Culture Secretary, said that the Internet could be given cinema-style age ratings as part of an international crackdown on offensive and harmful online activity.
Calling the Internet "quite a dangerous place," the Cabinet minister also said, "... I think we are having to revisit that stuff seriously now. It's true across the board in terms of content, harmful content, and copyright. Libel is [also] an emerging issue.... There is content that should just not be available to be viewed. That is my view. Absolutely categorical. This is not a campaign against free speech, far from it; it is simply there is a wider public interest at stake when it involves harm to other people. We have got to get better at defining where the public interest lies and being clear about it."
International cooperation is viewed as essential by the UK Culture Secretary, and the new Obama administration offers new opportunities. "The change of administration is a big moment. We have got a real opportunity to make common cause," he says. "The more we seek international solutions to this stuff - the UK and the US working together - the more that an international norm will set an industry norm."
My view is that, despite the very negative reaction by those commenting on the article, several of the proposals mentioned by the Culture Secretary will be coming soon - probably in 2009. This interview offers a glimpse into what the current thinking is regarding Internet decency. As with other aspects of the Internet, the international challenges are immense, but UK experts are obviously working closely with their US counterparts on specific next steps.
Web ratings would be a significant, and very controversial, development for the public sector and for society as a whole. All online content would need to be classified (similar to movies but in real-time at sites like YouTube). Opponents argue that any rating systems will be biased and flawed.
No doubt, the new technology and processes required for the masses would be overwhelming. There are great arguments against government intervention. Current laws around Internet piracy can't even be enforced. What new enforcement police will be put in place? What happens to rating violators? Who decides what's what? What about sites that cross into multiple categories (like newspapers)? Is this a "big brother" approach from government? How can we monitor real-time blogs, health sites, or other content that falls into various shades of gray?
I agree that the obstacles are huge, and yet I (reluctantly) support aspects of Andy Burnham's position. The negative attacks are unfair and don't offer workable solutions. We can't keep doing the same things and expect different results online. We must provide mechanisms for families to surf their values and not let a minority of "bad guys" control the Internet. While it would be best if the technology tools existed now to maintain one's integrity online without government involvement, our problems are getting worse - not better. A few weeks back, I wrote about ISAlliance's newly proposed cyber security social contract, which would also help if implemented.
What we need is easy-to-use technology to help move pragmatic proposals forward. No doubt, the big Internet players like Microsoft and Google are also involved in planning efforts. Perhaps proposals should start off with voluntary standards and extensive new training by ISPs? However, I agree with opponents that technology and legislation alone will not solve our Internet decency problems. We need to win the hearts and minds of the majority online. And yet, we also need to police the bad actors online. Setting appropriate standards (like speed limits on highways) is an important step.
Wednesday, December 24, 2008
Prey of the Carnivore
The FBI plans to use Carnivore for specific reasons. Particularly, the agency will request a court order to use Carnivore when a person is suspected of:
• Terrorism
• Child pornography/exploitation
• Espionage
• Information warfare
• Fraud
There are some key issues that are causing a great deal of concern from various sources:
• Privacy - Many folks think that Carnivore is a severe violation of privacy. While the potential for abuse is certainly there, the Electronic Communications Privacy Act (ECPA) provides legal protection of privacy for all types of electronic communication. Any type of electronic surveillance requires a court order and must show probable cause that the suspect is engaged in criminal activities. Therefore, use of Carnivore in any way that does not adhere to ECPA is illegal and can be considered unconstitutional.
• Regulation - There is a widespread belief that Carnivore is a huge system that can allow the U.S. government to seize control of the Internet and regulate its use. To do this would require an amazing infrastructure -- the FBI would need to place Carnivore systems at every ISP, including private, commercial and educational. While it is theoretically possible to do so for all of the ISPs operating in the United States, there is still no way to regulate those operating outside of U.S. jurisdiction. Any such move would also face serious opposition from every direction.
• Free speech - Some people think that Carnivore monitors all of the content flowing through an ISP, looking for certain keywords such as "bomb" or "assassination." Any packet sniffer can be set to look for certain patterns of characters or data. Without probable cause, though, the FBI has no justification to monitor your online activity and would be in severe violation of ECPA and your constitutional right to free speech if it did so.
• Echelon - This is a secret network rumored to be under development by the National Security Agency (NSA), supposedly designed to detect and capture packets crossing international borders that contain certain keywords, such as "bomb" or "assassination." There is no solid evidence to support the existence of Echelon. Many people have confused this rumored system with the very real Carnivore system.
All of these concerns have made implementation of Carnivore an uphill battle for the FBI. The FBI has refused to disclose the source code and certain other pieces of technical information about Carnivore, which has only added to people's concerns. But, as long as it is used within the constraints and guidelines of ECPA, Carnivore has the potential to be a useful weapon in the war on crime.
The Process
1. The FBI has a reasonable suspicion that someone is engaged in criminal activities and requests a court order to view the suspect's online activity.
2. A court grants the request for a full content-wiretap of e-mail traffic only and issues an order. A term used in telephone surveillance, "content-wiretap" means that everything in the packet can be captured and used. The other type of wiretap is a trap-and-trace, which means that the FBI can only capture the destination information, such as the e-mail account of a message being sent out or the Web-site address that the suspect is visiting. A reverse form of trap-and-trace, called pen-register, tracks where e-mail to the suspect is coming from or where visits to a suspect's Web site originate.
3. The FBI contacts the suspect's ISP and requests a copy of the back-up files of the suspect's activity.
4. The ISP does not maintain customer-activity data as part of its back-up.
5. The FBI sets up a Carnivore computer at the ISP to monitor the suspect's activity. The computer consists of:
• A Pentium III Windows NT/2000 system with 128 megabytes (MB) of RAM
• A commercial communications software application
• A custom C++ application that works in conjunction with the commercial program above to provide the packet sniffing and filtering
• A type of physical lockout system that requires a special passcode to access the computer (This keeps anyone but the FBI from physically accessing the Carnivore system.)
• A network isolation device that makes the Carnivore system invisible to anything else on the network (This prevents anyone from hacking into the system from another computer.)
• A 2-gigabyte (GB) Iomega Jaz drive for storing the captured data (The Jaz drive uses 2-GB removable cartridges that can be swapped out as easily as a floppy disk.)
6. The FBI configures the Carnivore software with the IP address of the suspect so that Carnivore will only capture packets from this particular location. It ignores all other packets (a minimal sketch of this filtering step follows the list).
7. Carnivore copies all of the packets from the suspect's system without impeding the flow of the network traffic.
8. Once the copies are made, they go through a filter that only keeps the e-mail packets. The program determines what the packets contain based on the protocol of the packet. For example, all e-mail packets use the Simple Mail Transfer Protocol (SMTP).
9. The e-mail packets are saved to the Jaz cartridge.
10. Once every day or two, an FBI agent visits the ISP and swaps out the Jaz cartridge. The agent takes the retrieved cartridge and puts it in a container that is dated and sealed. If the seal is broken, the person breaking it must sign, date and reseal it -- otherwise, the cartridge can be considered "compromised."
11. The surveillance cannot continue for more than a month without an extension from the court. Once complete, the FBI removes the system from the ISP.
12. The captured data is processed using Packeteer and Coolminer.
13. If the results provide enough evidence, the FBI can use them as part of a case against the suspect.
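As a rough illustration of steps 6 through 8, the sketch below filters a stream of already-captured packets, keeping only those that involve a given IP address and use the SMTP port. It is a toy model of the filtering idea, not Carnivore's actual software; the packet records and addresses are invented.

```python
# Invented packet records: (source IP, destination IP, destination port, payload)
captured = [
    ("10.0.0.5",   "192.0.2.10", 25, "MAIL FROM:<suspect@example.com>"),
    ("10.0.0.7",   "192.0.2.10", 80, "GET /index.html HTTP/1.0"),
    ("192.0.2.10", "10.0.0.5",   25, "250 OK"),
]

SUSPECT_IP = "10.0.0.5"   # hypothetical address named in the court order
SMTP_PORT = 25            # e-mail packets use the Simple Mail Transfer Protocol

def keep(packet):
    src, dst, port, _ = packet
    involves_suspect = SUSPECT_IP in (src, dst)   # step 6: only the suspect's traffic
    is_email = (port == SMTP_PORT)                # step 8: only e-mail packets
    return involves_suspect and is_email

for pkt in filter(keep, captured):
    print(pkt)   # step 9 would write these to removable storage
```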
Carnivorous Evolution
• Carnivore - A Windows NT/2000-based system that captures the information
• Packeteer - No official information released, but presumably an application for reassembling packets into cohesive messages or Web pages
• Coolminer - No official information released, but presumably an application for extrapolating and analyzing data found in the messages
As you can see, officials have not released much information about the DragonWare Suite, nothing about Packeteer and Coolminer, and very little detailed information about Carnivore. But we do know that Carnivore is basically a packet sniffer, a technology that is quite common and has been around for a while.
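Since no official details were released, the following is only a guess at the general kind of work a tool like Packeteer would do: sort captured packets by their sequence numbers and join the payloads back into a message. The packet numbers and payloads below are invented.

```python
# Invented captured packets: (packet number, payload fragment)
packets = [
    (3, "meet at noon."),
    (1, "Plan for "),
    (2, "tomorrow: "),
]

def reassemble(pkts):
    """Order packets by sequence number and concatenate their payloads."""
    return "".join(payload for _, payload in sorted(pkts))

print(reassemble(packets))   # -> "Plan for tomorrow: meet at noon."
```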
How Carnivore Works
I've heard that data travels in packets on a computer network. What is a packet, and why do networks use them? A packet is the unit of data that is routed between an origin and a destination on a network; depending on the type of network, a packet may also be referred to by another name:
• frame
• block
• cell
• segment
Most packets are split into three parts:
• header - The header contains instructions about the data carried by the packet. These instructions may include:
o Length of packet (some networks have fixed-length packets, while others rely on the header to contain this information)
o Synchronization (a few bits that help the packet match up to the network)
o Packet number (which packet this is in a sequence of packets)
o Protocol (on networks that carry multiple types of information, the protocol defines what type of packet is being transmitted: e-mail, Web page, streaming video)
o Destination address (where the packet is going)
o Originating address (where the packet came from)
• payload - Also called the body or data of a packet. This is the actual data that the packet is delivering to the destination. If a packet is fixed-length, then the payload may be padded with blank information to make it the right size.
• trailer - The trailer, sometimes called the footer, typically contains a couple of bits that tell the receiving device that it has reached the end of the packet. It may also have some type of error checking. The most common error checking used in packets is the cyclic redundancy check (CRC). In simplified terms, it works like this in certain computer networks: the sending device adds up the 1s in the payload and stores the result as a hexadecimal value in the trailer. The receiving device adds up the 1s in the payload and compares the result to the value stored in the trailer. If the values match, the packet is good; if they do not match, the receiving device sends a request to the originating device to resend the packet.
As an example, let's look at how an e-mail message might get broken into packets. Let's say that you send an e-mail to a friend. The e-mail is about 3,500 bits (3.5 kilobits) in size. The network you send it over uses fixed-length packets of 1,024 bits (1 kilobit). The header of each packet is 96 bits long and the trailer is 32 bits long, leaving 896 bits for the payload. To break the 3,500 bits of message into packets, you will need four packets (divide 3,500 by 896 and round up). Three packets will contain 896 bits of payload and the fourth will have 812 bits. Each packet's header will contain the proper protocols, the originating address (the IP address of your computer), the destination address (the IP address of the computer where you are sending the e-mail) and the packet number (1, 2, 3 or 4, since there are 4 packets). Routers in the network will look at the destination address in the header and compare it to their lookup table to find out where to send the packet. Once the packet arrives at its destination, your friend's computer will strip the header and trailer off each packet and reassemble the e-mail based on the numbered sequence of the packets.
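The packet arithmetic above is easy to reproduce. The short sketch below splits a message of a given size into fixed-length packets and attaches the simplified count-the-1s check described in the text (a real CRC is based on polynomial division and is more involved); the numbers mirror the e-mail example.

```python
import math

MESSAGE_BITS = 3500      # size of the e-mail in bits
PACKET_BITS = 1024       # fixed packet length
HEADER_BITS = 96
TRAILER_BITS = 32
PAYLOAD_BITS = PACKET_BITS - HEADER_BITS - TRAILER_BITS   # 896 bits of payload per packet

num_packets = math.ceil(MESSAGE_BITS / PAYLOAD_BITS)
print(f"packets needed: {num_packets}")                                          # 4
print(f"last payload: {MESSAGE_BITS - (num_packets - 1) * PAYLOAD_BITS} bits")   # 812

def ones_check(payload_bits):
    """Simplified trailer check from the text: count the 1 bits, report in hex."""
    return hex(sum(payload_bits))

payload = [1, 0, 1, 1, 0, 1]                  # a tiny made-up payload
trailer_value = ones_check(payload)           # sender stores this in the trailer
assert ones_check(payload) == trailer_value   # receiver recomputes and compares
print(trailer_value)                          # 0x4
```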
Telecommunications network
Data recognition and use
The application layer is difficult to generalize, since its content is specific to each user. For example, distributed databases used in the banking and airline industries require several access and security issues to be solved at this level. Network transparency (making the physical distribution of resources irrelevant to the human user) also is handled at this level. The presentation layer, on the other hand, performs functions that are requested sufficiently often that a general solution is warranted. These functions are often placed in a software library that is accessible by several users running different applications. Examples are text conversion, data compression, and data encryption. User interface with the network is performed by the session layer, which handles the process of connecting to another computer, verifying user authenticity, and establishing a reliable communication process. This layer also ensures that files which can be altered by several network users are kept in order. Data from the session layer are accepted by the transport layer, which separates the data stream into smaller units, if necessary, and ensures that all arrive correctly at the destination. If fast throughput is needed, the transport layer may establish several simultaneous paths in the network and send different parts of the data over each path. Conversely, if low cost is a requirement, then the layer may time-multiplex several users' data over one path through the network. Flow control is also regulated at this level, ensuring that data from a fast source will not overrun a slow destination.
Open systems interconnection
Different communication requirements necessitate different network solutions, and these different network protocols can create significant problems of compatibility when networks are interconnected with one another. In order to overcome some of these interconnection problems, the open systems interconnection (OSI) model was approved in 1983 as an international standard for communications architecture by the International Organization for Standardization (ISO) and the International Telegraph and Telephone Consultative Committee (CCITT). The OSI model consists of seven layers, each of which is selected to perform a well-defined function at a different level of abstraction. The bottom three layers provide for the timely and correct transfer of data, and the top four ensure that arriving data are recognizable and useful. While all seven layers are usually necessary at each user location, only the bottom three are normally employed at a network node, since nodes are concerned only with timely and correct data transfer from point to point.
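One way to picture the layered model is as successive wrapping: each layer adds its own header around whatever the layer above handed it, and the receiving end strips the headers in reverse order. The sketch below is a loose illustration of that encapsulation idea, not a real protocol stack; the layer names follow the OSI model but the header contents are invented.

```python
# The seven OSI layers, top to bottom.
LAYERS = ["application", "presentation", "session",
          "transport", "network", "data link", "physical"]

def encapsulate(message):
    """Wrap the message in one invented header per layer, top layer first."""
    unit = message
    for layer in LAYERS:
        unit = f"[{layer}-hdr|{unit}]"
    return unit

def decapsulate(unit):
    """Strip the headers in reverse order at the receiving end."""
    for layer in reversed(LAYERS):
        prefix = f"[{layer}-hdr|"
        assert unit.startswith(prefix) and unit.endswith("]")
        unit = unit[len(prefix):-1]
    return unit

wire = encapsulate("hello")
print(wire)
print(decapsulate(wire))   # -> "hello"
```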
Spread-spectrum multiple access
Since collisions are so detrimental to network performance, methods have been developed to allow multiple transmissions on a broadcast network without necessarily causing mutual packet destruction. One of the most successful is called spread-spectrum multiple access (SSMA). In SSMA simultaneous transmissions will cause only a slight increase in bit error probability for each user if the channel is not too heavily loaded. Error-free packets can be obtained by using an appropriate control code. Disadvantages of SSMA include wider signal bandwidth and greater equipment cost and complexity compared with conventional CSMA.
Scheduled access
In a scheduling method known as time-division multiple access (TDMA), a time slot is assigned in turn to each node, which uses the slot if it has something to transmit. If some nodes are much busier than others, then TDMA can be inefficient, since no data are passed during time slots allocated to silent nodes. In this case a reservation system may be implemented, in which there are fewer time slots than nodes and a node reserves a slot only when it is needed for transmission.
Since all nodes can hear each transmission in a broadcast network, a procedure must be established for allocating a communications channel to the node or nodes that have packets to transmit and at the same time preventing destructive interference from collisions (simultaneous transmissions). This type of communication, called multiple access, can be established either by scheduling (a technique in which nodes take turns transmitting in an orderly fashion) or by random access to the channel.
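A minimal way to see the TDMA scheduling described above is to walk through a fixed cycle of time slots and let each node transmit only in its own slot. The sketch below does exactly that with invented per-node queues; real systems also handle framing, guard times and reservations.

```python
from collections import deque

# Invented per-node transmit queues; node C is silent, which is where plain TDMA wastes slots.
queues = {
    "A": deque(["A1", "A2", "A3"]),
    "B": deque(["B1"]),
    "C": deque([]),
}

def tdma_round_robin(queues, rounds):
    """Give each node one time slot per cycle; a silent node's slot simply goes unused."""
    schedule = []
    for _ in range(rounds):
        for node in queues:                       # fixed slot order: A, B, C, A, B, C, ...
            if queues[node]:
                schedule.append((node, queues[node].popleft()))
            else:
                schedule.append((node, None))     # wasted slot
    return schedule

for slot, (node, packet) in enumerate(tdma_round_robin(queues, rounds=3)):
    print(f"slot {slot}: node {node} -> {packet}")
```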
Information and Communication Technology
VoIP Gateways:
Although they've been around for years, VoIP gateways remain something of a mystery. What, exactly, are these devices gateways to? Do they lead the way into a data network, a voice network, telephones, network management or outright confusion? In a way, they actually open the door to all of these areas. That's because VoIP gateways have become a central, yet complex, component in most state-of-the-art VoIP systems.
VoIP gateways act as VoIP network translators and mediators. Perhaps most importantly, they translate calls placed through the public switched telephone network (PSTN), the "regular" telephone system, into digital data packets that are compatible with an enterprise's VoIP system. VoIP gateways can also help direct VoIP calls to specific users with the assistance of built-in routing tables. Additionally, the units can translate between different VoIP protocols, such as H.323 and SIP, enabling compatibility between various VoIP systems and devices.
Given all of these benefits, it's easy to see why VoIP gateways are highly recommended for virtually any VoIP implementation. Yet this hasn't always been the case. In VoIP's early days, system designers often "VoIP-enabled" switches and routers to handle key gateway tasks. But as VoIP networks grew larger and more sophisticated, and as end users began demanding higher quality and more reliable service, most designers began specifying standalone VoIP gateways for their systems.
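The "built-in routing tables" mentioned above are essentially dial-plan lookups: the gateway matches the dialed number against configured prefixes and picks the destination trunk or VoIP peer. The sketch below shows a longest-prefix match over an invented dial plan; it illustrates the concept only and is not any vendor's gateway software (real gateways also handle signaling translation, codecs and failover).

```python
# Invented dial plan: dialed-number prefix -> where the gateway should send the call.
dial_plan = {
    "9":    "PSTN trunk",          # outside line
    "1800": "PSTN trunk (toll-free)",
    "4":    "SIP peer 10.1.1.20",  # internal extensions 4xxx
    "5":    "H.323 peer 10.1.1.30",
}

def route_call(dialed):
    """Pick the longest dial-plan prefix that matches the dialed digits."""
    matches = [p for p in dial_plan if dialed.startswith(p)]
    if not matches:
        return "reject: no route"
    return dial_plan[max(matches, key=len)]

for number in ("4231", "18005551212", "90445556666", "7000"):
    print(number, "->", route_call(number))
```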
Various Vendors
Getting Smarter
Building VoIP Gateways
Gateway VoIP Implementation
• The VoIP voice quality is indistinguishable from traditional phone calls.
• Rates for VoIP calls charged by Savytel represent large savings compared to the rates charged by traditional telephone service providers.