A firewall is a dedicated appliance, or software running on a host computer, that inspects network traffic passing through it and denies or permits passage based on a set of rules.
Function
A firewall's basic task is to regulate some of the flow of traffic between computer networks of different trust levels. Typical examples are the Internet, which is a zone with no trust, and an internal network, which is a zone of higher trust. A zone with an intermediate trust level, situated between the Internet and a trusted internal network, is often referred to as a "perimeter network" or demilitarized zone (DMZ).
A firewall's function within a network is similar to that of firewalls and fire doors in building construction. In the former case, it is used to prevent network intrusion into the private network. In the latter case, it is intended to contain and delay a structural fire, preventing it from spreading to adjacent structures.
Without proper configuration, a firewall can often become worthless. Standard security practices dictate a "default-deny" firewall ruleset, in which the only network connections permitted are those that have been explicitly allowed. Unfortunately, such a configuration requires a detailed understanding of the network applications and endpoints required for the organization's day-to-day operation. Many businesses lack such understanding and therefore implement a "default-allow" ruleset, in which all traffic is allowed unless it has been specifically blocked. This configuration makes inadvertent network connections and system compromise much more likely.
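To make the contrast concrete, here is a minimal Python sketch of default-deny evaluation. The rule entries and function names are purely illustrative and do not correspond to any real firewall's configuration syntax:

```python
# A minimal sketch of a "default-deny" policy (illustrative names only):
# every connection is refused unless an explicit rule allows it.

ALLOW_RULES = [
    # (protocol, destination port) pairs the organization has explicitly allowed
    ("tcp", 80),   # web browsing
    ("tcp", 443),  # secure web browsing
    ("tcp", 25),   # email transmission
]

def permit(protocol: str, dst_port: int) -> bool:
    """Permit a connection only if it matches an explicit allow rule."""
    return (protocol, dst_port) in ALLOW_RULES

print(permit("tcp", 443))    # True: explicitly allowed
print(permit("udp", 31337))  # False: everything else is denied by default
```

A "default-allow" ruleset would invert the final decision, refusing only what is explicitly blocked, which is why it fails open whenever an administrator forgets a rule.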
History
The term "firewall" originally meant a wall to confine a fire or potential fire within a building, c.f. firewall (construction). Later uses refer to similar structures, such as the metal sheet separating the engine compartment of a vehicle or aircraft from the passenger compartment.
Firewall technology emerged in the late 1980s, when the Internet was a fairly new technology in terms of its global use and connectivity. The original idea was formed in response to a number of major Internet security breaches that occurred in the late 1980s.
First generation - packet filters
The first paper on firewall technology was published in 1988, when engineers from Digital Equipment Corporation (DEC) developed filter systems known as packet filter firewalls. This fairly basic system was the first generation of what would become a highly evolved and technical Internet security feature. At AT&T Bell Labs, Bill Cheswick and Steve Bellovin continued this research in packet filtering and developed a working model for their own company based upon the original first-generation architecture.
Packet filters act by inspecting the "packets" which represent the basic unit of data transfer between computers on the Internet. If a packet matches the packet filter's set of rules, the packet filter will drop (silently discard) the packet, or reject it (discard it, and send "error responses" to the source).
This type of packet filtering pays no attention to whether a packet is part of an existing stream of traffic (it stores no information on connection "state"). Instead, it filters each packet based only on information contained in the packet itself (most commonly using a combination of the packet's source and destination address, its protocol, and, for TCP and UDP traffic, which comprise the bulk of Internet communication, the port number).
Because TCP and UDP traffic by convention uses well-known ports for particular types of traffic, a "stateless" packet filter can distinguish between, and thus control, those types of traffic (such as web browsing, remote printing, email transmission, and file transfer), unless the machines on each side of the packet filter are both using the same non-standard ports.
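As an illustration of the mechanism just described, the Python sketch below judges each packet in isolation against an ordered rule list, with no memory of earlier packets; the addresses, rules, and field names are hypothetical:

```python
# A minimal sketch of stateless packet filtering (hypothetical rules):
# each packet is matched against an ordered rule list, first match wins,
# and no connection state is kept between packets.

RULES = [
    # (src, dst, protocol, dst_port, action); "*" is a wildcard
    ("*", "192.0.2.10", "tcp", 25,   "accept"),  # mail to the mail server
    ("*", "*",          "tcp", 80,   "accept"),  # web browsing
    ("*", "*",          "*",   None, "drop"),    # default: silently discard
]

def filter_packet(src: str, dst: str, protocol: str, dst_port: int) -> str:
    for r_src, r_dst, r_proto, r_port, action in RULES:
        if ((r_src == "*" or r_src == src) and
                (r_dst == "*" or r_dst == dst) and
                (r_proto == "*" or r_proto == protocol) and
                (r_port is None or r_port == dst_port)):
            return action
    return "drop"

print(filter_packet("198.51.100.7", "192.0.2.10", "tcp", 25))  # accept
print(filter_packet("198.51.100.7", "192.0.2.10", "tcp", 23))  # drop
```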
Second generation - "stateful" filters
From 1989 to 1990, three colleagues from AT&T Bell Laboratories, Dave Presotto, Janardan Sharma, and Kshitij Nigam, developed the second generation of firewalls, calling them circuit-level firewalls.
Second-generation firewalls do not simply examine the contents of each packet in isolation, without regard to its place within the packet series, as their predecessors had done; rather, they compare certain key parts of the packet to a database of trusted information. This technology is generally referred to as a 'stateful firewall', as it maintains records of all connections passing through the firewall and is able to determine whether a packet is the start of a new connection or part of an existing connection. Though there is still a set of static rules in such a firewall, the state of a connection can in itself be one of the criteria which trigger specific rules. This type of firewall can help prevent attacks which exploit existing connections, as well as certain denial-of-service attacks.
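The connection-tracking idea can be sketched in a few lines of Python; the names are invented and the state table is deliberately simplified (a real stateful firewall also tracks TCP sequence numbers, timeouts, and connection teardown):

```python
# A simplified sketch of stateful filtering: reply packets are matched
# against a table of established connections rather than the static rules.

state_table = set()  # established connections, keyed by their 5-tuple

def handle(src, sport, dst, dport, proto, is_new):
    key = (src, sport, dst, dport, proto)
    reply_key = (dst, dport, src, sport, proto)
    if key in state_table or reply_key in state_table:
        return "accept"              # part of an existing connection
    if is_new and proto == "tcp" and dport == 80:
        state_table.add(key)         # static rule: allow new outbound web
        return "accept"
    return "drop"                    # unsolicited packet: no state, no rule

print(handle("10.0.0.5", 50000, "203.0.113.9", 80, "tcp", True))   # accept
print(handle("203.0.113.9", 80, "10.0.0.5", 50000, "tcp", False))  # accept (reply)
print(handle("203.0.113.9", 80, "10.0.0.6", 50001, "tcp", False))  # drop
```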
Third generation - application layer
Publications by Gene Spafford of Purdue University, Bill Cheswick at AT&T Laboratories, and Marcus Ranum described a third-generation firewall known as an application layer firewall, also known as a proxy-based firewall. Marcus Ranum's work on the technology spearheaded the creation of the first commercial product, released by DEC under the name DEC SEAL. DEC's first major sale was on June 13, 1991, to a chemical company based on the East Coast of the USA. The key benefit of application layer filtering is that it can "understand" certain applications and protocols (such as File Transfer Protocol, DNS, or web browsing), and it can detect whether an unwanted protocol is being sneaked through on a non-standard port or whether a protocol is being abused in a known harmful way.
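The payload-inspection idea can be sketched with a deliberately crude heuristic; the function names and the HTTP check below are illustrative assumptions, not how any particular product works:

```python
# A crude sketch of application layer inspection: instead of trusting the
# port number, look at the payload to see whether it resembles the
# protocol that the port implies.

HTTP_METHODS = (b"GET", b"POST", b"HEAD", b"PUT", b"DELETE")

def looks_like_http(payload: bytes) -> bool:
    # HTTP requests begin with a method name such as GET or POST
    return payload.split(b" ")[0] in HTTP_METHODS

def inspect(dst_port: int, payload: bytes) -> str:
    if dst_port == 80 and not looks_like_http(payload):
        return "drop"   # something else is being sneaked over the web port
    if dst_port != 80 and looks_like_http(payload):
        return "flag"   # HTTP on a non-standard port: worth scrutiny
    return "accept"

print(inspect(80, b"GET /index.html HTTP/1.1"))  # accept
print(inspect(80, b"\x16\x03\x01\x02\x00"))      # drop: not HTTP
print(inspect(8081, b"GET /admin HTTP/1.1"))     # flag
```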
Subsequent developments
In 1992, Bob Braden and Annette DeSchon at the University of Southern California (USC) were refining the concept of a firewall. Their product, known as "Visas", was the first system to have a visual integration interface with colours and icons, which could be easily implemented on, and accessed from, a computer operating system such as Microsoft's Windows or Apple's Mac OS. In 1994, an Israeli company called Check Point Software Technologies built this into readily available software known as FireWall-1.
The deep packet inspection functionality of modern firewalls can be shared by intrusion prevention systems (IPS).
Currently, the Middlebox Communication Working Group of the Internet Engineering Task Force (IETF) is working on standardizing protocols for managing firewalls and other middleboxes.
Types
There are several classifications of firewalls, depending on where the communication is taking place, where the communication is intercepted, and the state that is being tracked.
Network layer and packet filters
Network layer firewalls, also called packet filters, operate at a relatively low level of the TCP/IP protocol stack, not allowing packets to pass through the firewall unless they match the established rule set. The firewall administrator may define the rules; or default rules may apply. The term packet filter originated in the context of BSD operating systems.
Network layer firewalls generally fall into two sub-categories, stateful and stateless. Stateful firewalls maintain context about active sessions and use that "state information" to speed up packet processing. Any existing network connection can be described by several properties, including source and destination IP address, UDP or TCP ports, and the current stage of the connection's lifetime (including session initiation, handshaking, data transfer, or connection completion). If a packet does not match an existing connection, it will be evaluated according to the ruleset for new connections. If a packet matches an existing connection based on comparison with the firewall's state table, it will be allowed to pass without further processing.
Stateless firewalls have packet-filtering capabilities, but cannot make more complex decisions on what stage communications between hosts have reached.
Application-layer
Application-layer firewalls work on the application level of the TCP/IP stack (i.e., all browser traffic, or all telnet or ftp traffic), and may intercept all packets traveling to or from an application. They block other packets (usually dropping them without acknowledgement to the sender). In principle, application firewalls can prevent all unwanted outside traffic from reaching protected machines.
By inspecting all packets for improper content, firewalls can restrict or outright prevent the spread of networked computer worms and trojans. In practice, however, this becomes so complex and so difficult to attempt (given the variety of applications and the diversity of content each may allow in its packet traffic) that comprehensive firewall design does not generally attempt this approach.
The XML firewall exemplifies a more recent kind of application-layer firewall.
Companies such as Secure Computing (www.securecomputing.com) are among the major manufacturers of application layer firewalls.
Proxies
A proxy device (running either on dedicated hardware or as software on a general-purpose machine) may act as a firewall by responding to input packets (connection requests, for example) in the manner of an application, whilst blocking other packets.
Proxies make tampering with an internal system from the external network more difficult, and misuse of one internal system would not necessarily cause a security breach exploitable from outside the firewall (as long as the application proxy remains intact and properly configured). Conversely, intruders may hijack a publicly reachable system and use it as a proxy for their own purposes; the proxy then masquerades as that system to other internal machines. While use of internal address spaces enhances security, crackers may still employ methods such as IP spoofing to attempt to pass packets to a target network.
Network address translation
Firewalls often have network address translation (NAT) functionality, and the hosts protected behind a firewall commonly have addresses in the "private address range" defined in RFC 1918; this functionality hides the true addresses of protected hosts. Originally, the NAT function was developed to address the limited number of IPv4 routable addresses that could be used or assigned to companies or individuals, as well as to reduce both the amount and therefore the cost of obtaining enough public addresses for every computer in an organization. Hiding the addresses of protected devices has since become an increasingly important defense against network reconnaissance.
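A toy Python model of the translation mechanism may help; the addresses and the port-allocation scheme are invented for illustration:

```python
# A toy NAT sketch: outbound packets from private RFC 1918 addresses are
# rewritten to the firewall's single public address, and a translation
# table lets replies find their way back to the hidden internal host.

PUBLIC_IP = "203.0.113.1"
nat_table = {}       # public port -> (private ip, private port)
next_port = 40000    # naive port allocator, for demonstration only

def translate_outbound(private_ip: str, private_port: int):
    """Rewrite a private source address to the shared public one."""
    global next_port
    public_port = next_port
    next_port += 1
    nat_table[public_port] = (private_ip, private_port)
    return PUBLIC_IP, public_port

def translate_inbound(public_port: int):
    """Map a reply back to the internal host, if translation state exists."""
    return nat_table.get(public_port)   # None means unsolicited: dropped

print(translate_outbound("192.168.1.15", 51515))  # ('203.0.113.1', 40000)
print(translate_inbound(40000))                   # ('192.168.1.15', 51515)
print(translate_inbound(40001))                   # None
```

Because an inbound packet with no table entry simply has nowhere to go, NAT also incidentally blocks unsolicited inbound connections, an effect noted again in the Routers section below.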
Thursday, February 28, 2008
Internet security
In the computer industry, Internet security refers to techniques for ensuring that data stored in a computer cannot be read or compromised by any individuals without authorization. Most security measures involve data encryption and passwords. Data encryption is the translation of data into a form that is unintelligible without a deciphering mechanism. A password is a secret word or phrase that gives a user access to a particular program or system.
Routers
Network Address Translation (NAT) typically has the effect of preventing connections from being established inbound into a computer, whilst permitting connections out. For a small home network, software NAT can be used on the computer with the Internet connection, providing similar behaviour to a router and similar levels of security, but for a lower cost and lower complexity.
Firewalls
A firewall blocks all "roads and cars" except those through authorized ports on your computer, thus restricting unfettered access. A stateful firewall is a more secure form of firewall, and system administrators often combine a proxy firewall with a packet-filtering firewall to create a highly secure system. Most home users use a software firewall. These firewalls can create a log file recording all connection details, including connection attempts, involving the PC.
Anti-virus
Some people and companies with malicious intentions write programs such as computer viruses, worms, trojan horses, and spyware. These programs are all characterised as unwanted software that installs itself on your computer through deception.
Trojan horses are simply programs that conceal their true purpose or include a hidden functionality that a user would not want.
Worms are characterised by their ability to replicate themselves, and viruses are similar except that they achieve this by adding their code to third-party software. Once a virus or worm has infected a computer, it will typically infect other programs (in the case of viruses) and other computers.
Viruses also slow down system performance, cause strange system behavior, and in many cases do serious harm to computers, either as deliberate, malicious damage or as unintentional side effects.
In order to prevent damage by viruses and worms, users typically install antivirus software, which runs in the background on the computer, detecting any suspicious software and preventing it from running.
Some malware that can be classified as a trojan with a limited payload is not detected by most antivirus software and may require the use of other software designed to detect other classes of malware, including spyware.
Anti-spyware
Spyware is software that runs on a computer without the explicit permission of its user. It often gathers private information from a user's computer and sends this data over the Internet back to the software manufacturer. Adware is software that runs on a computer without the owner's consent, much like spyware. However, instead of taking information, it typically runs in the background and displays random or targeted pop-up advertisements. In many cases, this slows the computer down and may also cause software conflicts.
Browser choice
Internet Explorer is currently the most widely used web browser in the world, making it the prime target for phishing and many other possible attacks.
Wednesday, February 27, 2008
Call centre


A call centre or call center is a centralised office used for the purpose of receiving and transmitting a large volume of requests by telephone. A call centre is operated by a company to administer incoming product support or information inquiries from consumers. Outgoing calls for telemarketing, clientele, and debt collection are also made. In addition to a call centre, collective handling of letters, faxes, and e-mails at one location is known as a contact centre.

A call centre is often operated through an extensive open workspace for call centre agents, with work stations that include a computer for each agent, a telephone set/headset connected to a telecom switch, and one or more supervisor stations. It can be independently operated or networked with additional centres, often linked to a corporate computer network, including mainframes, microcomputers and LANs. Increasingly, the voice and data pathways into the centre are linked through a set of new technologies called computer telephony integration (CTI).

Most major businesses use call centres to interact with their customers. Examples include utility companies, mail order catalogue firms, and customer support for computer hardware and software. Some businesses even service internal functions through call centres. Examples of this include help desks and sales support.
Mathematical theory
A call centre can be seen from an operational point of view as a queueing network. The simplest call centre, consisting of a single type of customer and statistically identical servers, can be viewed as a single queue. Queueing theory is a branch of mathematics in which models of such queueing systems have been developed. These models, in turn, are used to support workforce planning and management, for example by helping answer a common staffing question: given a service level determined by management, what is the least number of telephone agents required to achieve it? (Prevalent examples of service levels are: at least 80% of callers are answered within 20 seconds; or, no more than 3% of customers hang up due to impatience before being served.)

Queueing models also provide qualitative insight, for example identifying the circumstances under which economies of scale prevail, namely that a single large call centre is more effective at answering calls than several (distributed) smaller ones; or that cross-selling is beneficial; or that a call centre should be quality-driven or efficiency-driven or, most likely, both quality and efficiency driven (abbreviated QED). Recently, queueing models have also been used for planning and operating skills-based routing of calls within a call centre, which entails the analysis of systems with multi-type customers and multi-skilled agents.

Call centre operations have been supported by mathematical models beyond queueing, with operations research, which considers a wide range of optimisation problems, being very relevant: for example, for forecasting of calls, for determining shift structures, and even for analysing customers' impatience while waiting to be served by an agent.
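As an illustration, the classical Erlang C model (a single queue with statistically identical agents, as described above) can answer the staffing question numerically. The sketch below is a textbook-style implementation; the traffic figures in the example are hypothetical:

```python
# Answering the staffing question with the Erlang C queueing model:
# find the least number of agents meeting a service-level target such as
# "80% of calls answered within 20 seconds". Figures are hypothetical.

from math import exp, factorial

def erlang_c(agents: int, load: float) -> float:
    """Probability that an arriving call must wait (Erlang C formula).
    load = arrival rate x mean handle time, in erlangs."""
    if agents <= load:
        return 1.0  # unstable: the queue grows without bound
    top = (load ** agents / factorial(agents)) * agents / (agents - load)
    bottom = sum(load ** k / factorial(k) for k in range(agents)) + top
    return top / bottom

def agents_needed(rate: float, handle_time: float,
                  target: float, wait_limit: float) -> int:
    """Least agent count so that P(wait <= wait_limit) >= target."""
    load = rate * handle_time
    n = int(load) + 1
    while True:
        p_wait = erlang_c(n, load)
        level = 1 - p_wait * exp(-(n - load) * wait_limit / handle_time)
        if level >= target:
            return n
        n += 1

# 100 calls per hour, 4-minute average handle time, 80%-in-20-seconds target
print(agents_needed(rate=100 / 3600, handle_time=240,
                    target=0.80, wait_limit=20))
```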
Administration of call centres
The centralisation of call management aims to improve a company's operations and reduce costs, while providing a standardised, streamlined, uniform service for consumers. To accommodate large customer bases, large warehouses are often converted to office space to host all call centre operations under one roof.

Call centre staff can be monitored for quality control, level of proficiency, and customer service by computer technology that manages, measures and monitors the performance and activities of the workers. Typical contact centre operations focus on the discipline areas of workforce management, queue management, quality monitoring, and reporting. Reporting in a call centre can be further broken down into real-time reporting and historical reporting. The types of information collected for a group of call centre agents can include: agents logged in, agents ready to take calls, agents available to take calls, agents in wrap-up mode, average call duration, average call duration including wrap-up time, longest duration agent available, longest duration call in queue, number of calls in queue, number of calls offered, number of calls abandoned, average speed to answer, average speed to abandon, and service level, calculated as the percentage of calls answered within a certain time period.

Many call centres use workforce management software, which uses historical information coupled with projected need to generate automated schedules that meet anticipated staffing-level needs. The relatively high cost of personnel and office space, the need for a large workforce, and challenges around attrition, hiring and managing that workforce all drive outsourcing in the call centre industry.
Technology
Call centres use a wide variety of different technologies to allow them to manage large volumes of work. These technologies facilitate queueing and processing of calls, maintaining consistent work flow for agents and creating other business cost savings.
Patents
There are a large number of patents covering various aspects of call centre operation, automation, and technology. One of the early inventors in this field, Ronald A. Katz, personally holds over 50 patents covering inventions related to toll free numbers, automated attendant, automated call distribution, voice response unit, computer telephone integration and speech recognition.
Varieties of call centres
Some variations of call centre models are listed below:
Remote Agents – An alternative to housing all agents in a central facility is to use remote agents. These agents work from home and use internet technologies to connect.
Temporary Agents – Temporary agents who are called upon if demand increases more rapidly than planned.
Virtual Call centres – Virtual Call centres are created using many smaller centres in different locations and connecting them to one another. There are two methods used to route traffic around call centres: pre-delivery and post-delivery. Pre-delivery involves using an external switch to route the calls to the appropriate centre and post-delivery enables call centres to route a call they've received to another call centre.
Contact centres – Deal with more media than telephony alone including Email, Web Callback and internet Chat.
Tuesday, February 26, 2008
Network card

A network card, network adapter, LAN Adapter or NIC (network interface card) is a piece of computer hardware designed to allow computers to communicate over a computer network. It is both an OSI layer 1 (physical layer) and layer 2 (data link layer) device, as it provides physical access to a networking medium and provides a low-level addressing system through the use of MAC addresses. It allows users to connect to each other either by using cables or wirelessly.
Although other network technologies exist, Ethernet has achieved near-ubiquity since the mid-1990s. Every Ethernet network card has a unique 48-bit serial number called a MAC address, which is stored in ROM carried on the card. Every computer on an Ethernet network must have a card with a unique MAC address; no two cards ever manufactured share the same address. This is accomplished by the Institute of Electrical and Electronics Engineers (IEEE), which is responsible for assigning unique MAC address prefixes to the vendors of network interface controllers.

Whereas network cards used to be expansion cards that plug into a computer bus, the low cost and ubiquity of the Ethernet standard mean that most newer computers have a network interface built into the motherboard. These either have Ethernet capabilities integrated into the motherboard chipset, or implemented via a low-cost dedicated Ethernet chip connected through the PCI (or the newer PCI Express) bus. A separate network card is not required unless multiple interfaces are needed or some other type of network is used. Newer motherboards may even have dual network (Ethernet) interfaces built in.

The card implements the electronic circuitry required to communicate using a specific physical layer and data link layer standard such as Ethernet or token ring. This provides a base for a full network protocol stack, allowing communication among small groups of computers on the same LAN and large-scale network communications through routable protocols, such as IP.
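The structure of a MAC address can be shown in a few lines of Python; the example address is made up:

```python
# How a 48-bit MAC address decomposes: the first three octets are the
# IEEE-assigned vendor prefix (the OUI), and the last three octets are
# the vendor's per-card serial number. The address below is fictitious.

def split_mac(mac: str):
    octets = mac.split(":")
    assert len(octets) == 6, "a MAC address is six octets (48 bits)"
    oui = ":".join(octets[:3])     # assigned to the vendor by the IEEE
    serial = ":".join(octets[3:])  # assigned per card by the vendor
    return oui, serial

print(split_mac("00:1a:2b:3c:4d:5e"))  # ('00:1a:2b', '3c:4d:5e')
```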
There are four techniques used to transfer data; a NIC may use one or more of them.
Polling is where the microprocessor examines the status of the peripheral under program control.
Programmed I/O is where the microprocessor alerts the designated peripheral by applying its address to the system's address bus.
Interrupt-driven I/O is where the peripheral alerts the microprocessor that it's ready to transfer data.
DMA is where the intelligent peripheral assumes control of the system bus to access memory directly. This removes load from the CPU but requires a separate processor on the card.
A network card typically has a twisted pair, BNC, or AUI socket where the network cable is connected, and a few LEDs to inform the user whether the network is active and whether or not data is being transmitted on it. Network cards are typically available in 10, 100, and 1000 Mbit/s varieties, meaning they can support a transfer rate of 10, 100, or 1000 megabits per second.
Monday, February 25, 2008
Microsoft


Microsoft Corporation, often known simply as MS, is an American multinational computer technology corporation with 79,000 employees in 102 countries and global annual revenue of US $51.12 billion as of 2007. It develops, manufactures, licenses, and supports a wide range of software products for computing devices. Headquartered in Redmond, Washington, USA, its best-selling products are the Microsoft Windows operating system and the Microsoft Office suite of productivity software. These products have prominent positions in the desktop computer market, with market share estimates as high as 90% or more as of 2003 for Microsoft Office and 2006 for Microsoft Windows. One of Bill Gates' key visions is "to get a workstation running our software onto every desk and eventually in every home".
Founded to develop and sell BASIC interpreters for the Altair 8800, Microsoft rose to dominate the home computer operating system market with MS-DOS in the mid-1980s. The company held an initial public offering (IPO) in the stock market which, owing to the ensuing rise of the stock price, has made four billionaires and an estimated 12,000 millionaires of Microsoft employees. Throughout its history the company has been the target of criticism for various reasons, including monopolistic business practices; both the U.S. Justice Department and the European Commission, among others, brought Microsoft to court for antitrust violations and software bundling.
Microsoft has footholds in other markets besides operating systems and office suites, with assets such as the MSNBC cable television network, the MSN Internet portal, and the Microsoft Encarta multimedia encyclopedia. The company also markets both computer hardware products such as the Microsoft mouse and home entertainment products such as the Xbox, Xbox 360, Zune and MSN TV. Known for what is generally described as a developer-centric business culture, Microsoft has historically given customer support over Usenet newsgroups and the World Wide Web, and awards Microsoft MVP status to volunteers who are deemed helpful in assisting the company's customers. The company's official website is one of the most visited on the Internet, receiving more than 2.4 million unique page views per day according to Alexa.com, which ranked the site 18th amongst all websites for traffic on September 12, 2007.
History
1975–1985: Founding
Following the launch of the Altair 8800, Bill Gates called the creators of the new microcomputer, Micro Instrumentation and Telemetry Systems (MITS), offering to demonstrate an implementation of the BASIC programming language for the system. After the demonstration, MITS agreed to distribute Altair BASIC. Gates left Harvard University, moved to Albuquerque, New Mexico where MITS was located, and founded Microsoft there. The company's first international office was founded on November 1, 1978, in Japan, entitled "ASCII Microsoft" (now called "Microsoft Japan"). On January 1, 1979, the company moved from Albuquerque to a new home in Bellevue, Washington. Steve Ballmer joined the company on June 11, 1980, and later succeeded Bill Gates as CEO.
DOS (Disk Operating System) was the operating system that brought the company its first real success. On August 12, 1981, after negotiations with Digital Research failed, IBM awarded Microsoft a contract to provide a version of the CP/M operating system for use in the upcoming IBM Personal Computer (PC). For this deal, Microsoft purchased a CP/M clone called 86-DOS from Seattle Computer Products, which IBM renamed PC-DOS. Later, the market saw a flood of IBM PC clones after Columbia Data Products successfully cloned the IBM BIOS, and by aggressively marketing MS-DOS to manufacturers of IBM PC clones, Microsoft rose from a small player to one of the major software vendors in the home computer industry. The company expanded into new markets with the release of the Microsoft Mouse in 1983, as well as a publishing division named Microsoft Press.
1985–1995: OS/2 and Windows
In August 1985, Microsoft and IBM partnered in the development of a different operating system called OS/2. On November 20, 1985, Microsoft released its first retail version of Microsoft Windows, originally a graphical extension for its MS-DOS operating system. On March 13, 1986, the company went public with an IPO at an initial offering price of US $21.00 per share, closing the first day of trading at US $28.00. In 1987, Microsoft eventually released its first version of OS/2 to OEMs.

In 1989, Microsoft introduced its flagship office suite, Microsoft Office. This was a bundle of separate office productivity applications, such as Microsoft Word and Microsoft Excel. On May 22, 1990, Microsoft launched Windows 3.0. The new version of Microsoft's operating system boasted new features such as streamlined user interface graphics and improved protected-mode capability for the Intel 386 processor; it sold over 100,000 copies in two weeks. Windows at the time generated more revenue for Microsoft than OS/2, and the company decided to move more resources from OS/2 to Windows. In the ensuing years, the popularity of OS/2 declined, and Windows quickly became the favored PC platform.

During the transition from MS-DOS to Windows, the success of Microsoft Office allowed the company to gain ground on application-software competitors such as WordPerfect and Lotus 1-2-3. According to The Register, Novell, an owner of WordPerfect for a time, alleged that Microsoft used its inside knowledge of the DOS and Windows kernels and of undocumented Application Programming Interface features to make Office perform better than its competitors. Eventually, Microsoft Office became the dominant business suite, with a market share far exceeding that of its competitors.
In 1993, Microsoft released Windows NT 3.1, a business operating system with the Windows 3.1 user interface but an entirely different kernel. In 1995, Microsoft released Windows 95, a new version of the company's flagship operating system which featured a completely new user interface, including a novel start button; more than a million copies of Microsoft Windows 95 were sold in the first four days after its release. The company also released its web browser, Internet Explorer, with the Windows 95 Plus! Pack in August 1995 and subsequent Windows versions.
1995–2005: Internet and legal issues
In the mid-1990s, Microsoft began to expand its product line into computer networking and the World Wide Web. On August 24, 1995, it launched a major online service, MSN (Microsoft Network), as a direct competitor to AOL. MSN became an umbrella service for Microsoft's online services. The company continued to branch out into new markets in 1996, starting with a joint venture with NBC to create a new 24/7 cable news station, MSNBC. Microsoft entered the personal digital assistant (PDA) market in November with Windows CE 1.0, a new built-from-scratch version of its flagship operating system, designed specifically to run on low-memory, low-performance machines such as handhelds and other small computers. Later, in 1997, Internet Explorer 4.0 was released for both Mac OS and Windows, marking the beginning of the takeover of the browser market from rival Netscape. In October, the Justice Department filed a motion in the Federal District Court stating that Microsoft had violated an agreement signed in 1994, and asked the court to stop the bundling of Internet Explorer with Windows.
The year 1998 was significant in Microsoft's history, with Bill Gates appointing Steve Ballmer as president of Microsoft while remaining Chairman and CEO himself. The company released Windows 98, an update to Windows 95 that incorporated a number of Internet-focused features and support for new types of devices. On April 3, 2000, a judgment was handed down in the case of United States v. Microsoft, calling the company an "abusive monopoly" and ordering it to split into two separate units. Part of this ruling was later overturned by a federal appeals court, and the case was eventually settled with the U.S. Department of Justice in 2001.
In 2001, Microsoft released Windows XP, the first version that encompassed the features of both its business and home product lines. XP introduced a new graphical user interface, the first such change since Windows 95. Later, with the release of the Xbox Microsoft entered the multi-billion-dollar game console market dominated by Sony and Nintendo. Microsoft encountered more turmoil in March 2004 when antitrust legal action was brought against it by the European Union for abusing its market dominance (see European Union Microsoft antitrust case), eventually resulting in a judgement to produce new versions of its Windows XP platform—called Windows XP Home Edition N and Windows XP Professional N—that did not include its Windows Media Player.
2006–present: Vista and other transitions
In 2006, Bill Gates announced a two-year transition out of his role as Chief Software Architect, which would be taken over by Ray Ozzie; Gates planned to remain the company's chairman and head of the Board of Directors, and to act as an adviser on key projects. As of December 2007, Windows Vista, released in January 2007, is Microsoft's latest operating system. Microsoft Office 2007 was released at the same time; its "Ribbon" user interface is a significant departure from its predecessors. On February 1, 2008, Microsoft made an unsolicited bid to purchase the fully diluted outstanding shares of Yahoo for up to $44.6 billion, though this offer was rejected on February 10. On February 19, 2008, Bill Gates said that Microsoft was not privately haggling with Yahoo over the rejected $31-per-share buyout offer. On February 21, 2008, Microsoft said it would share more information about its products and technology, to make it easier for developers to create software that works with its products.
Product divisions
To be more precise in tracking performance of each unit and delegating responsibility, Microsoft reorganized into seven core business groups—each an independent financial entity—in April 2002. Later, on September 20, 2005, Microsoft announced a rationalization of its original seven business groups into the three core divisions that exist today: the Windows Client, MSN and Server and Tool groups were merged into the Microsoft Platform Products and Services Division; the Information Worker and Microsoft Business Solutions groups were merged into the Microsoft Business Division; and the Mobile and Embedded Devices and Home and Entertainment groups were merged into the Microsoft Entertainment and Devices Division.
Platform Products and Services Division
This division produces Microsoft's flagship product, the Windows operating system. It has been produced in many versions, including Windows 3.1, Windows 95, Windows 98, Windows 2000, Windows Me, Windows Server 2003, Windows XP and Windows Vista. Almost all IBM compatible personal computers come with Windows preinstalled. The current desktop version of Windows is Windows Vista. The online service MSN, the cable television station MSNBC and the Microsoft online magazine Slate are all part of this division. (Slate was acquired by The Washington Post on December 21, 2004.) At the end of 1997, Microsoft acquired Hotmail, the most popular webmail service, which it rebranded as "MSN Hotmail". In 1999, Microsoft introduced MSN Messenger, an instant messaging client, to compete with the popular AOL Instant Messenger. Along with Windows Vista, MSN Messenger became Windows Live Messenger.
Microsoft Visual Studio is the company's set of programming tools and compilers. The software product is GUI-oriented and links easily with the Windows APIs, but must be specially configured if used with non-Microsoft libraries. The current version is Visual Studio 2008. The previous version, Visual Studio 2005, was a major improvement over its predecessor, Visual Studio .NET 2003, named after the .NET initiative, a Microsoft marketing initiative covering a number of technologies. Microsoft's definition of .NET continues to evolve. As of 2004, .NET aims to ease the development of Microsoft Windows-based applications that use the Internet, by deploying a new Microsoft communications system, Indigo (now renamed Windows Communication Foundation). This is intended to address some issues introduced by Microsoft's earlier DLL design, which made it difficult, and in some situations impossible, to install and manage multiple versions of complex software packages on the same system (see DLL hell), and to provide a more consistent development platform for all Windows applications (see Common Language Infrastructure).

In addition, the company established a set of certification programs to recognize individuals who have expertise in its software and solutions. Similar to offerings from Cisco, Sun Microsystems, Novell, IBM, and Oracle Corporation, these tests are designed to identify a minimal set of proficiencies in a specific role; this includes developers ("Microsoft Certified Solution Developer"), system/network analysts ("Microsoft Certified Systems Engineer"), trainers ("Microsoft Certified Trainers") and administrators ("Microsoft Certified Systems Administrator" and "Microsoft Certified Database Administrator").
Microsoft offers a suite of server software, entitled Windows Server System. Windows Server 2003, an operating system for network servers, is the core of the Windows Server System line. Another server product, Systems Management Server, is a collection of tools providing remote-control abilities, patch management, software distribution and a hardware/software inventory. Other server products include:
Microsoft SQL Server, a relational database management system;
Microsoft Exchange Server, for certain business-oriented e-mail features;
Small Business Server, for messaging and other small business-oriented features; and
Microsoft BizTalk Server, for business process and application integration, among other functions.
Business Division
The Microsoft Business Division produces Microsoft Office, the company's line of office software. The suite includes Word (a word processor), Access (a personal relational database application), Excel (a spreadsheet program), Outlook (Windows-only groupware, frequently used with Exchange Server), PowerPoint (presentation software), and Publisher (desktop publishing software). A number of other products were added later with the release of Office 2003, including Visio, Project, MapPoint, InfoPath and OneNote.

The division also focuses on developing financial and business management software for companies. These include products formerly produced by the Business Solutions Group, which was created in April 2001 with the acquisition of Great Plains. Subsequently, Navision was acquired to provide a similar entry into the European market, resulting in the planned release of Microsoft Dynamics NAV in 2006. The group also markets Axapta and Solomon, catering to similar markets; these lines are scheduled to be combined with the Navision and Great Plains lines into a common platform called Microsoft Dynamics.
Entertainment and Devices Division
Microsoft has attempted to expand the Windows brand into many other markets, with products such as Windows CE for PDAs and its "Windows-powered" Smartphone products. Microsoft initially entered the mobile market through Windows CE for handheld devices, which has since developed into Windows Mobile 6. The focus of the operating system is on devices where the OS may not be directly visible to the end user, in particular appliances and cars. The company produces MSN TV, formerly WebTV, a television-based Internet appliance. Microsoft used to sell a set-top digital video recorder (DVR) called UltimateTV, which allowed users to record up to 35 hours of television programming from the direct-to-home satellite television provider DirecTV. Its main competition in the UK was British Sky Broadcasting's (BSkyB) Sky+ service, owned by Rupert Murdoch. UltimateTV has since been discontinued, with DirecTV instead opting to market DVRs from TiVo Inc. before later switching to its own DVR brand.
Microsoft sells computer games that run on Windows PCs, including titles such as Age of Empires, Halo and the Microsoft Flight Simulator series. It produces a line of reference works, including encyclopedias and atlases, under the name Encarta. Microsoft Zone hosts free, premium, and retail games where players can compete against each other and in tournaments. Microsoft entered the multi-billion-dollar game console market dominated by Sony and Nintendo in late 2001, with the release of the Xbox. The company develops and publishes its own video games for this console with the help of its Microsoft Game Studios subsidiary, in addition to third-party Xbox video game publishers such as Electronic Arts and Activision, who pay a license fee to publish games for the system. The Xbox has a successor in the Xbox 360, released on November 22, 2005 in North America and other countries. With the Xbox 360, Microsoft hopes to compensate for the losses incurred with the original Xbox. However, Microsoft has made some decisions considered controversial in the video gaming community, such as shipping consoles with high failure rates, selling two different versions of the system (one without a hard disk drive), and providing limited backward compatibility with only particular Xbox titles. In addition to the Xbox line, Microsoft also markets a number of other computing-related hardware products, including mice, keyboards, joysticks, and gamepads, along with other game controllers, the production of which is outsourced in most cases. On 15 November 2007, Microsoft announced the purchase of Musiwave, Openwave's mobile phone music sales business.
Business culture
Microsoft has often been described as having a developer-centric business culture. A great deal of time and money is spent each year on recruiting young university-trained software developers and on keeping them in the company. For example, while many software companies often place an entry-level software developer in a cubicle desk within a large office space filled with other cubicles, Microsoft assigns a private or semiprivate closed office to every developer or pair of developers. In addition, key decision makers at every level are either developers or former developers. In a sense, the software developers at Microsoft are considered the "stars" of the company in the same way that the sales staff at IBM are considered the "stars" of their company.

Within Microsoft the expression "eating our own dog food" is used to describe the policy of using the latest Microsoft products inside the company in an effort to test them in "real-world" situations. Only prerelease and beta versions of products are considered dog food. This is usually shortened to just "dogfood" and is used as a noun, a verb, and an adjective. The company is also known for its hiring process, dubbed the "Microsoft interview", which is notorious for off-the-wall questions such as "Why is a manhole cover round?" and is often mimicked in other organizations, although these types of questions are rarer now than they were in the past. For fun, Microsoft also hosts the Microsoft Puzzle Hunt, an annual puzzle hunt (a live puzzle game where teams compete to solve a series of puzzles) held at the Redmond campus.

As of 2006, Microsoft employees, not including Bill Gates, have given over $2.5 billion to non-profit organizations worldwide, making Microsoft the worldwide top company in per-employee donations. In January 2007, the Harris Interactive/The Wall Street Journal Reputation Quotient survey concluded that Microsoft had the world's best corporate reputation, citing strong financial performance, vision and leadership, workplace environment rankings, and the charitable deeds of the Bill & Melinda Gates Foundation.
User culture
Technical reference for developers and articles for various Microsoft magazines such as Microsoft Systems Journal (or MSJ) are available through the Microsoft Developer Network, often called MSDN. MSDN also offers subscriptions for companies and individuals, and the more expensive subscriptions usually offer access to pre-release beta versions of Microsoft software. In recent years, Microsoft launched a community site for developers and users, entitled Channel9, which provides many modern features such as a wiki and an Internet forum. Another community site that provides daily videocasts and other services, On10.net, launched on March 3, 2006.
Most free technical support available through Microsoft is provided through online Usenet newsgroups (in the early days it was also provided on CompuServe). There are several of these newsgroups for nearly every product Microsoft provides, and often they are monitored by Microsoft employees. People who are helpful on the newsgroups can be elected by other peers or Microsoft employees for Microsoft Most Valuable Professional (MVP) status, which entitles people to a sort of special social status, in addition to possibilities for awards and other benefits.
By 2005, the city of Seattle in the state of Washington had 2,500 users who owned smartphone and desktop computer versions of the JamBayes Traffic Forecasting Service, developed by researchers at Microsoft and the University of Washington. Kleiner Perkins Caufield & Byers, Sequoia Capital, Skymoon Ventures, Crescendo Ventures, ZenShin Capital Partners, Artis Capital, Gold Hill Capital, and several individuals gave Dash USD $45 million for the Dash Express, which Wired News says "learns from its users": "If a Dash owner is moving 5 miles per hour in a 45 mph (72 km/h) zone, Dash servers will realize he's in traffic and warn other Dash drivers to choose faster routes".
Corporate structure
The company is run by a Board of Directors consisting of ten people, made up of mostly company outsiders (as is customary for publicly traded companies). Current members of the board of directors are: Steve Ballmer, James Cash, Jr., Dina Dublon, Bill Gates, Raymond Gilmartin, Reed Hastings, David Marquardt, Charles Noski, Helmut Panke, and Jon Shirley. The ten board members are elected every year at the annual shareholders' meeting, and those who do not get a majority of votes must submit a resignation to the board, which will subsequently choose whether or not to accept the resignation. There are five committees within the board which oversee more specific matters. These committees include the Audit Committee, which handles accounting issues with the company including auditing and reporting; the Compensation Committee, which approves compensation for the CEO and other employees of the company; the Finance Committee, which handles financial matters such as proposing mergers and acquisitions; the Governance and Nominating Committee, which handles various corporate matters including nomination of the board; and the Antitrust Compliance Committee, which attempts to prevent company practices from violating antitrust laws.
There are several other aspects to the corporate structure of Microsoft. For worldwide matters there is the Executive Team, made up of sixteen company officers across the globe, which is charged with various duties, including making sure employees understand Microsoft's culture of business. The sixteen officers of the Executive Team include the Chairman and Chief Software Architect; the CEO; the General Counsel and Secretary; the CFO; senior and group vice presidents from the business units; the CEO of the Europe, Middle East and Africa region; and the heads of Worldwide Sales, Marketing and Services; Human Resources; and Corporate Marketing. In addition to the Executive Team there is also the Corporate Staff Council, which handles all major staff functions of the company, including approving corporate policies. The Corporate Staff Council is made up of employees from the Law and Corporate Affairs, Finance, Human Resources, Corporate Marketing, and Advanced Strategy and Policy groups at Microsoft. Other executive officers include the presidents and vice presidents of the various product divisions, leaders of the marketing section, and the CTO, among others.
Stock
When the company held its IPO on March 13, 1986, the stock price was US $21. By the close of the first trading day, the stock stood at $28, equivalent to 9.7 cents when adjusted for the company's first nine splits. The initial close and the ensuing rise in subsequent years made millionaires of several Microsoft employees. The stock price peaked in 1999 at around US $119 (US $60.928, adjusting for splits). While the company has had nine stock splits, the first of which was on September 18, 1987, it did not start offering a dividend until January 16, 2003. The dividend for the 2003 fiscal year was eight cents per share, followed by a dividend of sixteen cents per share the subsequent year. The company switched from yearly to quarterly dividends in 2005, paying eight cents a share per quarter with a special one-time payout of three dollars per share for the second quarter of the fiscal year.
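That split-adjusted figure can be reproduced with simple arithmetic, assuming the commonly reported history of Microsoft's first nine splits (seven 2-for-1 and two 3-for-2):

```python
# Reproducing the split-adjusted first-day close, assuming the commonly
# reported split history: seven 2-for-1 splits and two 3-for-2 splits.

splits = [2, 2, 1.5, 1.5, 2, 2, 2, 2, 2]  # share multipliers, 1987-2003

factor = 1.0
for s in splits:
    factor *= s        # one original share became this many shares

print(factor)          # 288.0
print(28.00 / factor)  # ~0.097, i.e. about 9.7 cents, matching the text
```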
Around 2003 the stock price began a slow descent. Despite the company's ninth split on February 2, 2003 and subsequent increases in dividend payouts, the price of Microsoft's stock continued to fall for the next several years.
Diversity
In 2005, Microsoft received a 100% rating in the Corporate Equality Index from the Human Rights Campaign, a ranking of companies by how progressive the organization deems their policies concerning LGBT (lesbian, gay, bisexual and transsexual) employees. Partly through the work of the Gay and Lesbian Employees at Microsoft (GLEAM) group, Microsoft added gender expression to its anti-discrimination policies in April 2005, and the Human Rights Campaign upgraded Microsoft's Corporate Equality Index from its 86% rating in 2004 to its current 100% rating.
In April 2005, Microsoft received wide criticism for withdrawing support from Washington state's H.B. 1515 bill that would have extended the state's current anti-discrimination laws to people with alternate sexual orientations. Microsoft was accused of bowing to pressure from local evangelical pastor Ken Hutcherson who met with a senior Microsoft executive and threatened a national boycott of Microsoft's products. Microsoft also revealed they were paying evangelical conservative Ralph Reed's company Century Strategies a $20,000 monthly fee. Over 2,000 employees signed a petition asking Microsoft to reinstate support for the bill. Under harsh criticism from both outside and inside the company's walls, Microsoft decided to support the bill again in May 2005.
Microsoft hires many foreign workers as well as domestic ones, and is an outspoken opponent of the cap on H-1B visas, which allow companies in the United States to employ certain foreign workers. Bill Gates claims the cap on H-1B visas makes it difficult to hire employees for the company, stating, "I'd certainly get rid of the H1B cap."
Sunday, February 24, 2008
Compaq


Compaq Computer Corporation was an American personal computer company founded in 1982, and is now a brand name of Hewlett-Packard.
The company was formed by Rod Canion, Jim Harris and Bill Murto, all former Texas Instruments senior managers. The name "COMPAQ" was derived from "Compatibility and Quality", as at its formation Compaq produced some of the first IBM PC compatible computers.
Once the largest supplier of personal computing systems in the world, Compaq existed as an independent corporation until 2002, when it merged with Hewlett-Packard.
History
1980s
Compaq was founded in February 1982 by Rod Canion, Jim Harris and Bill Murto, three senior managers from semiconductor manufacturer Texas Instruments. Each invested $1,000 to form the company. Their first venture capital came from Ben Rosen and Sevin-Rosen partners. Like many small startups with unique beginnings, the original Compaq PC was first sketched out on a placemat by the founders while dining in a local Houston restaurant, House of Pies.
Two key marketing executives in Compaq's early years, Jim D'Arezzo and Sparky Sparks, had come from IBM's PC Group.
Compaq Portable
In November 1982 Compaq announced their first product, the Compaq Portable, a portable IBM PC compatible personal computer. It was released in March 1983 at $2995, considerably more affordable than competitors at the time. The Compaq Portable was one of the progenitors of today's laptop. It was the second IBM PC compatible, being capable of running all software that would run on an IBM PC. It was a commercial success, selling 53,000 units in its first year. The Compaq Portable was the first in the range of the Compaq Portable series. Compaq was able to market a legal IBM clone because IBM mostly used "off the shelf" parts for their PC. Furthermore, Microsoft had kept the right to license the operating system to other computer manufacturers. The only part which had to be duplicated was the BIOS, which Compaq did legally by using clean room reverse engineering for $1 million. Phoenix Technologies were the first to follow their lead, but soon "clone BIOSes" were available from several vendors.
Deskpro
On June 28, 1984, Compaq released the Compaq Deskpro, a 16-bit desktop computer using an Intel 8086 microprocessor running at 7.14 MHz. It was considerably faster than an IBM PC and, like the Compaq Portable, was also capable of running IBM software. This was the first of the Compaq Deskpro line of computers.
Deskpro 386
When in 1986 Compaq introduced the first PC based on Intel's new 80386 microprocessor, the Compaq Deskpro 386, they began a period of increasing performance leadership over IBM, who were not yet using this processor. An IBM machine eventually reached the market seven months later, but by that time Compaq was the 386 supplier of choice and IBM had lost its image of technical leadership.
Systempro
This technical leadership and the rivalry with IBM were emphasised when the Systempro server was launched in late 1989. This was a true server product with standard support for a second CPU and RAID, but also the first product to feature the EISA bus, designed in reaction to IBM's MCA (Micro Channel Architecture).
1990s
As it began to dominate the server market in the early 1990s, Compaq also entered the retail computer market with the Presario, and was one of the first manufacturers in the mid-1990s to market a sub-$1000 PC. In order to maintain the prices it wanted, Compaq became the first first-tier computer manufacturer to use CPUs from AMD and Cyrix. The price war resulting from Compaq's actions ultimately drove numerous competitors, most notably IBM and Packard Bell, from this market.
In 1997, Compaq bought Tandem Computers, known for their NonStop server line. This acquisition instantly gave Compaq a presence in the higher end business computing market. In 1998, Compaq acquired Digital Equipment Corporation, the leading company in the previous generation of computing during the 1970s and early 1980s. This acquisition made Compaq, at the time, the second largest computer maker in the world in terms of revenue. Unfortunately for the company, CEO Eckhard Pfeiffer, who engineered both mergers, had little vision for what the combined companies should do, or indeed how the three dramatically different cultures could work as a single entity, and Compaq struggled as a result. Pfeiffer was forced out as CEO in 1999 in a coup led by board chairman Ben Rosen and was succeeded by Michael Capellas, who had been serving as Compaq's CIO. Capellas was able to restore some of the luster lost in the latter part of the Pfeiffer era, but the company still struggled against lower-cost competitors such as Dell.
During November 1999, Compaq began to work with Microsoft to create the first in a line of small-scale, web-based computer systems called MSN Companions.
Merger with HP
In 2001, Compaq agreed to merge with Hewlett-Packard. Numerous large HP shareholders, including Walter Hewlett, publicly opposed the deal, which resulted in an impassioned public proxy battle between those for and against the deal.
The merger was approved by only the narrowest of margins, and allegations of vote buying (primarily involving an alleged last-second back-room deal with Deutsche Bank) haunted the new company.
It was subsequently disclosed that HP had retained Deutsche Bank's investment banking division in January 2002 to assist in the merger. HP had agreed to pay Deutsche Bank $1 million guaranteed, and another $1 million contingent upon approval of the merger. On August 19, 2003, the United States Securities and Exchange Commission charged Deutsche Bank with failing to disclose a material conflict of interest in its voting of client proxies for the merger and imposed a civil penalty of $750,000. Deutsche Bank consented without admitting or denying the findings.
Before the merger, Compaq's ticker symbol was CPQ. This was melded with Hewlett-Packard's previous symbol (HWP) to create the current symbol of HPQ.
Post merger
Capellas left the company after serving less than a year as President of HP to become CEO of MCI WorldCom, which he eventually led to its purchase by Verizon. Carly Fiorina, the Chairman and CEO of HP, added Capellas's responsibilities to her own.
Fiorina helmed HP for nearly three years after Capellas left. HP laid off thousands of former Compaq, DEC, HP, and Tandem employees, its stock price generally declined, and profits did not improve. Though the merger initially made HP the number one PC maker, it soon lost the lead, and further market share, to Dell. In addition, the merging of the stagnant Compaq business into HP was criticized for overshadowing the profitability of HP's lucrative printing and imaging division. In February 2005, the Board of Directors ousted Fiorina. Former Compaq CEO Capellas was mentioned by some as a potential successor, but several months afterwards, Mark Hurd was hired as CEO.
In late 2005, HP seemed to find its feet under the new leadership of Mark Hurd. At the same time Dell seemed to be faltering, and HP took back the #1 sales position. Hurd separated the PC division from the imaging and printing division. HP's PC segment has since been reinvigorated and now generates more revenue than the traditionally more profitable printing business.
Most Compaq products have been re-branded with the HP nameplate, such as the company's market-leading ProLiant server line, while the Compaq brand remains on only some consumer-orientated products, notably Compaq Presario PCs. HP's own business computer line was deprecated in favour of the Compaq Evo line, which was rebranded HP Compaq. HP's Jornada PDAs were replaced by Compaq iPAQ PDAs, which were renamed HP iPAQ.
In May 2007, HP announced in a press release a new logo for its Compaq division, to be placed on new-model Compaq Presarios.
Sponsorship
Compaq sponsored Queens Park Rangers Football Club from 1994 to 1996, during their most recent two seasons as a Premier League club.
Compaq also sponsored the Williams team in Formula One.
Two sports stadiums were named after the company:
The Compaq Center in Houston, Texas, formerly The Summit; it kept the Compaq name until its sports teams moved to the Toyota Center, after which the building became the new home of Lakewood Church, one of the largest Protestant congregations in the United States.
The Compaq Center at San Jose, later renamed the HP Pavilion when HP purchased Compaq.
Saturday, February 23, 2008
Parallel port

A parallel port is a type of interface found on computers (personal and otherwise) for connecting various peripherals. It is also known as a printer port or Centronics port. The IEEE 1284 standard defines the bi-directional version of the port.
History
The Centronics Model 101 printer was introduced in 1970 and included the first parallel interface for printers. The interface was developed by Dr. An Wang, Robert Howard and Prentice Robinson at Wang Laboratories. The now-familiar connector was selected because Wang had a surplus stock of 20,000 Amphenol 36-pin micro ribbon connectors that were originally used for one of their early calculators. The Centronics parallel interface quickly became a de facto industry standard; manufacturers of the time tended to use various connectors on the system side, so a variety of cables were required. For example, early VAX systems used a DC-37 connector, NCR used the 36-pin micro ribbon connector, Texas Instruments used a 25-pin card edge connector and Data General used a 50-pin micro ribbon connector.
Dataproducts introduced a very different implementation of the parallel interface for their printers. It used a DC-37 connector on the host side and a 50 pin connector on the printer side—either a DD-50 (sometimes incorrectly referred to as a "DB50") or the block shaped M-50 connector; the M-50 was also referred to as Winchester. Dataproducts parallel was available in a short-line for connections up to 50 feet (15 m) and a long-line version for connections from 50 feet (15 m) to 500 feet (150 m). The Dataproducts interface was found on many mainframe systems up through the 1990s, and many printer manufacturers offered the Dataproducts interface as an option.
IBM released the IBM Personal Computer in 1981 and included a variant of the Centronics interface; only IBM-logo printers (rebranded from Epson) could be used with the IBM PC. IBM standardized the parallel cable with a DB25F connector on the PC side and the Centronics connector on the printer side. Vendors soon released printers compatible with both standard Centronics and the IBM implementation.
IBM implemented an early form of bidirectional interface in 1987. HP introduced its version of a bidirectional interface, known as Bitronics, on the LaserJet 4 in 1992. The Bitronics and Centronics interfaces were superseded by the IEEE 1284 standard in 1994.
Uses
Before the advent of USB, the parallel interface was adapted to access a number of peripheral devices other than printers. Among the earliest devices to use the parallel port were dongles, used as a hardware-key form of software copy protection. Zip drives and scanners were early implementations, followed by external modems, sound cards, webcams, gamepads, joysticks, and external hard disk drives and CD-ROM drives. Adapters were available to run SCSI devices via the parallel port. Other devices such as EPROM programmers and hardware controllers could also be connected via the parallel port.
Current use
At the consumer level, the USB interface, and in some cases Ethernet, has effectively replaced the parallel printer port. Many manufacturers of personal computers and laptops consider parallel to be a legacy port and no longer include the parallel interface. USB-to-parallel adapters are available to use parallel-only printers with USB-only systems.
Program interface
In versions of Microsoft Windows that did not use the Windows NT kernel (as well as MS-DOS and some other operating systems), programs could access the parallel port with simple outportb() and inportb() subroutine calls. In operating systems such as Windows NT and Unix (NetBSD, FreeBSD, Solaris, 386BSD, etc.), user programs run in a less privileged protection ring, and direct access to the parallel port is inhibited unless the required driver is used. This improves security and the arbitration of device contention. On Linux, inb() and outb() can be used when a process runs as root.
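As a concrete illustration, here is a minimal sketch of this style of raw port access on Linux/x86, assuming the first parallel port sits at the conventional base address 0x378 and the process runs as root; ioperm(), outb(), and inb() are the glibc interfaces mentioned above.

/* Minimal sketch of raw parallel-port access on Linux (x86), assuming the
 * legacy port lives at the conventional base address 0x378; run as root. */
#include <stdio.h>
#include <stdlib.h>
#include <sys/io.h>   /* ioperm(), outb(), inb() on x86 glibc */

#define LPT1_BASE 0x378   /* conventional I/O base for the first parallel port */

int main(void)
{
    /* Ask the kernel for permission to touch the three port registers. */
    if (ioperm(LPT1_BASE, 3, 1) != 0) {
        perror("ioperm (are you root?)");
        return EXIT_FAILURE;
    }

    outb(0x55, LPT1_BASE);                      /* write a test pattern to the data lines */
    unsigned char status = inb(LPT1_BASE + 1);  /* read the status register */
    printf("status register: 0x%02x\n", status);

    ioperm(LPT1_BASE, 3, 0);                    /* drop the port permissions again */
    return EXIT_SUCCESS;
}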
Monodirectional parallel ports
In early parallel ports the data lines were monodirectional (data out only), so it was not easily possible to feed data into the computer. However, a workaround was possible by using 4 of the 5 status lines. A circuit could be constructed to split each 8-bit byte into two 4-bit nibbles, which were fed in sequentially through the status lines. Each pair of nibbles was then re-combined into an 8-bit byte.
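A rough sketch of the recombination step, assuming (hypothetically) that the interface circuit presents each nibble on status-register bits 3-6; the real pin assignment depended on the circuit used:

/* Illustrative sketch of nibble-mode input, assuming four input bits arrive
 * on status-register bits 3-6 (a common wiring, but circuit-dependent).
 * The low nibble is read first. */
#include <stdio.h>

/* Extract the 4 data bits from a raw status-register byte. */
static unsigned char nibble_from_status(unsigned char status)
{
    return (status >> 3) & 0x0F;   /* keep bits 3-6 */
}

/* Recombine two sequential status reads into one byte. */
static unsigned char recombine(unsigned char low_status, unsigned char high_status)
{
    unsigned char lo = nibble_from_status(low_status);
    unsigned char hi = nibble_from_status(high_status);
    return (unsigned char)((hi << 4) | lo);
}

int main(void)
{
    /* Pretend these two values were read from the status register. */
    unsigned char first = 0x28, second = 0x50;
    printf("reconstructed byte: 0x%02x\n", recombine(first, second));  /* 0xa5 */
    return 0;
}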
Friday, February 22, 2008
Image scanner


In computing, a scanner is a device that analyzes images, printed text, handwriting, or an object (such as an ornament) and converts it to a digital image. Most scanners today are variations of the desktop (or flatbed) scanner, which is the most common type in offices. Hand-held scanners, where the device is moved by hand, were briefly popular but are now little used due to the difficulty of obtaining a high-quality image. Both these types of scanners use a charge-coupled device (CCD) or a contact image sensor (CIS) as the image sensor, whereas older drum scanners use a photomultiplier tube as the image sensor.
Another category of scanner is a rotary scanner, used for high-speed document scanning. This is another kind of drum scanner, but it uses a CCD array instead of a photomultiplier.
Other types of scanners are planetary scanners, which take photographs of books and documents, and 3D scanners, for producing three-dimensional models of objects, but these types of scanner are considerably more expensive than other types of scanners.
Another category of scanner is the digital camera scanner, which is based on the concept of reprographic cameras. Due to increasing resolution and new features such as anti-shake, digital cameras have become an attractive alternative to regular scanners. While they still have disadvantages compared to traditional scanners, digital cameras offer unmatched advantages in speed and portability.
Types
There are different types of scanners, depending on the user's purposes. Described below are the most commonly used types found on the market:
Drum
Drum scanners capture image information with photomultiplier tubes (PMT), rather than the charge-coupled-device (CCD) arrays found in flatbed scanners and inexpensive film scanners. Reflective and transmissive originals are mounted on an acrylic cylinder, the scanner drum, which rotates at high speed while it passes the object being scanned in front of precision optics that deliver image information to the PMTs. Most modern color drum scanners use 3 matched PMTs, which read red, blue, and green light respectively. Light from the original artwork is split into separate red, blue, and green beams in the optical bench of the scanner.
The drum scanner gets its name from the large glass drum on which the original artwork is mounted for scanning: they usually take 11"x17" documents, but maximum size varies by manufacturer. One of the unique features of drum scanners is the ability to control sample area and aperture size independently. The sample size is the area that the scanner encoder reads to create an individual pixel. The aperture is the actual opening that allows light into the optical bench of the scanner. The ability to control aperture and sample size separately is particularly useful for smoothing film grain when scanning black-and-white and color negative originals.
While drum scanners are capable of scanning both reflective and transmissive artwork, a good-quality flatbed scanner can produce excellent scans from reflective artwork. As a result, drum scanners are rarely used to scan prints now that high quality inexpensive flatbed scanners are readily available. Film, however, is where drum scanners continue to be the tool of choice for high-end applications. Because film can be wet-mounted to the scanner drum and because of the exceptional sensitivity of the PMTs, drum scanners are capable of capturing very subtle details in film originals.
Currently only a few companies continue to manufacture drum scanners. While prices of both new and used units have come down over the last decade, they still require a considerable monetary investment when compared to CCD flatbed and film scanners. However, drum scanners remain in demand due to their capacity to produce scans that are superior in resolution, color gradation, and value structure. Also, since drum scanners are capable of resolutions up to 12,000 PPI, their use is generally recommended when a scanned image is going to be enlarged.
In most current graphic-arts operations, very-high-quality flatbed scanners have replaced drum scanners, being both less expensive and faster. However, drum scanners continue to be used in high-end applications, such as museum-quality archiving of photographs and print production of high-quality books and magazine advertisements. In addition, due to the greater availability of pre-owned units many fine-art photographers are acquiring drum scanners, which has created a new niche market for the machines.
The first image scanner ever developed was a drum scanner. It was built in 1957 at the US National Bureau of Standards by a team led by Russell Kirsch. The first image ever scanned on this machine was a 5 cm square photograph of Kirsch's then-three-month-old son, Walden. The black and white image had a resolution of 176 pixels on a side.
Flatbed
A flatbed scanner is usually composed of a glass pane (or platen), under which there is a bright light (often xenon or cold cathode fluorescent) which illuminates the pane, and a moving optical array, whether CCD or CIS. Color scanners typically contain three rows (arrays) of sensors with red, green, and blue filters. Images to be scanned are placed face down on the glass, an opaque cover is lowered over them to exclude ambient light, and the sensor array and light source move across the pane, reading the entire area. An image is therefore visible to the charge-coupled device only because of the light it reflects. Transparent images do not work in this way, and require special accessories that illuminate them from the upper side. Many scanners offer this as an option.
Film
Slide" (positive) or negative film can be scanned in equipment specially manufactured for this purpose. Usually, uncut film strips of up to six frames, or four mounted slides, are inserted in a carrier, which is moved by a stepper motor across a lens and CCD sensor inside the scanner. Some models even have adaptors for APS film cassettes. Dedicated film scanners often offer better resolution than flatbed scanners, partly because they do not need to scan large areas.
Hand
Hand scanners are manual devices that are dragged across the surface of the image to be scanned. Scanning documents in this manner requires a steady hand, as an uneven scanning rate produces distorted images; an indicator light on the scanner would warn if the motion was too fast. They typically have a "start" button, which is held by the user for the duration of the scan; some switches to set the optical resolution; and a roller, which generates a clock pulse for synchronisation with the computer. Most hand scanners were monochrome, and produced light from an array of green LEDs to illuminate the image. A typical hand scanner also had a small window through which the document being scanned could be viewed. They were popular during the early 1990s and usually had a proprietary interface module specific to a particular type of computer, such as an Atari ST or Commodore Amiga.
Quality
Scanners typically read red-green-blue (RGB) color data from the array. This data is then processed with some proprietary algorithm to correct for different exposure conditions and sent to the computer via the device's input/output interface (usually SCSI or LPT in machines pre-dating the USB standard). Color depth varies depending on the characteristics of the scanning array, but is usually at least 24 bits; high-quality models have a color depth of 48 bits or more. The other qualifying parameter for a scanner is its resolution, measured in pixels per inch (ppi), sometimes more accurately referred to as samples per inch (spi). Instead of using the scanner's true optical resolution, the only meaningful parameter, manufacturers like to quote the interpolated resolution, which is much higher thanks to software interpolation. As of 2004, a good flatbed scanner has an optical resolution of 1600–3200 ppi, high-end flatbed scanners can scan up to 5400 ppi, and a good drum scanner has an optical resolution of 8000–14,000 ppi.
Manufacturers often claim interpolated resolutions as high as 19,200 ppi; but such numbers carry little meaningful value, because the number of possible interpolated pixels is unlimited.
The higher the resolution, the larger the file. In most cases, there is a trade-off between manageable file size and level of detail.
The third important parameter for a scanner is its density range. A high density range means that the scanner is able to reproduce both shadow detail and highlight detail in one scan.
Computer connection
Scanning the document is only one part of the process. For the scanned image to be useful, it must be transferred from the scanner to an application running on the computer. There are two basic issues: (1) how the scanner is physically connected to the computer and (2) how the application retrieves the information from the scanner.
Physical Connection to the Computer
The amount of data generated by a scanner can be very large: a 9"x11" page (slightly larger than A4 paper) scanned at 600 DPI as an uncompressed 24-bit image consumes about 100 megabytes in transfer and storage on the host computer. Recent scanners can generate this volume of data in a matter of seconds, making a fast connection desirable.
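As a quick sanity check on that figure, the arithmetic works out as follows (a throwaway sketch; the 9"x11" page size and 600 DPI are the numbers from the paragraph above):

/* Bytes needed for an uncompressed 24-bit scan of a 9" x 11" page at 600 DPI. */
#include <stdio.h>

int main(void)
{
    const long dpi = 600;
    const double width_in = 9.0, height_in = 11.0;
    const long bytes_per_pixel = 3;   /* 24-bit RGB */

    double pixels = (dpi * width_in) * (dpi * height_in);
    double bytes  = pixels * bytes_per_pixel;

    printf("%.0f pixels, %.1f megabytes\n", pixels, bytes / 1e6);
    /* prints: 35640000 pixels, 106.9 megabytes */
    return 0;
}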
There are four common connections used by scanners:
Parallel - Connecting through a parallel port is the slowest common transfer method. Early scanners had parallel port connections that could not transfer data faster than 70 kilobytes/second. The primary advantage of the parallel port connection was economy: it avoided adding an interface card to the computer.
Small Computer System Interface (SCSI), which is supported by most computers only via an additional SCSI interface card. Some SCSI scanners are supplied together with a dedicated SCSI card for a PC, although any SCSI controller can be used. During the evolution of the SCSI standard, speeds increased with backwards compatibility; a SCSI connection can transfer data at the highest speed which both the controller and the device support. SCSI has been largely replaced by USB and FireWire, one or both of which are directly supported by most computers, and which are easier to set up than SCSI.
Universal Serial Bus (USB) scanners can transfer data quickly, and they are easier to use and cheaper than SCSI devices. The early USB 1.1 standard could transfer data at only 1.5 megabytes per second (slower than SCSI), but the later USB 2.0 standard can theoretically transfer up to 60 megabytes per second (although everyday rates are much lower), resulting in faster operation.
FireWire is an interface that is much faster than USB 1.1 and comparable to USB 2.0. FireWire speeds are 25, 50, and 100 megabytes per second (but a device may not support all speeds). A newer 400 megabyte per second speed also exists. A rough comparison of transfer times over these interfaces follows this list.
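To make the speed differences concrete, here is a throwaway sketch comparing how long the roughly 107-megabyte scan from the earlier example would take over each interface at the nominal rates quoted above (real-world throughput, especially for USB, is considerably lower):

/* Rough transfer-time comparison for a ~107 MB uncompressed scan, using the
 * nominal peak rates quoted in the list above. */
#include <stdio.h>

int main(void)
{
    const double scan_mb = 106.9;   /* uncompressed 600 DPI letter-size scan */
    const struct { const char *name; double mb_per_s; } links[] = {
        { "Parallel (70 KB/s)", 0.07 },
        { "USB 1.1",            1.5  },
        { "USB 2.0 (peak)",     60.0 },
        { "FireWire S400",      50.0 },
    };

    for (int i = 0; i < 4; i++)
        printf("%-20s %8.1f seconds\n", links[i].name, scan_mb / links[i].mb_per_s);
    return 0;
}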
Applications Programming Interface
An application such as Adobe Photoshop must communicate with the scanner. There are many different scanners, and many of them use different protocols. In order to simplify applications programming, several Applications Programming Interfaces (APIs) were developed. An API presents a uniform interface to the scanner, so that the application does not need to know the specific details of the scanner in order to access it. For example, Adobe Photoshop supports the TWAIN standard; consequently, (in an ideal world) Photoshop can acquire an image from any scanner that also supports TWAIN.
In practice, there are often problems with an application communicating with a scanner. Either the application or the scanner manufacturer (or both) may have faults in their implementation of the API.
Typically, the API is implemented as a dynamically linked library. Each scanner manufacturer provides software that translates the API procedure calls into primitive commands that are issued to a hardware controller (such as the SCSI, USB, or FireWire controller). The manufacturer's part of the API is commonly called a device driver, but that designation is not strictly accurate: the API does not run in kernel mode and does not directly access the device.
Some scanner manufacturers will offer more than one API.
Most scanners use the TWAIN API. The TWAIN API, originally used for low-end and home-use equipment, is now widely used for large-volume scanning.
Other scanner APIs are:
ISIS, created by Pixel Translations, which still uses SCSI-II for performance reasons, is used by large, departmental-scale machines.
SANE (Scanner Access Now Easy) is a free/open-source API for accessing scanners. Originally developed for Unix and Linux operating systems, it has been ported to OS/2, Mac OS X, and Microsoft Windows. Unlike TWAIN, SANE does not handle the user interface. This allows batch scans and transparent network access without any special support from the device driver; a minimal usage sketch follows this list.
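As an illustration of the SANE API, the following minimal C sketch lists the scanners SANE can see; it exercises only the discovery calls, and an actual scan would continue with sane_open(), sane_start(), and sane_read(). Compile with the SANE development headers installed, linking against -lsane.

/* Minimal SANE sketch: enumerate locally attached scanners. */
#include <stdio.h>
#include <sane/sane.h>

int main(void)
{
    SANE_Int version;
    if (sane_init(&version, NULL) != SANE_STATUS_GOOD) {
        fprintf(stderr, "sane_init failed\n");
        return 1;
    }

    const SANE_Device **devices;
    if (sane_get_devices(&devices, SANE_TRUE) == SANE_STATUS_GOOD) {
        for (int i = 0; devices[i] != NULL; i++)    /* list is NULL-terminated */
            printf("%s: %s %s (%s)\n", devices[i]->name,
                   devices[i]->vendor, devices[i]->model, devices[i]->type);
    }

    sane_exit();
    return 0;
}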
Output data
The scanned result is a non-compressed RGB image, which can be transferred to a computer's memory. Some scanners compress and clean up the image using embedded firmware. Once on the computer, the image can be processed with a raster graphics program (such as Photoshop or the GIMP) and saved on a storage device (such as a hard disk).
In common use, scanned pictures are stored on a computer's hard disk, normally in image formats such as JPEG, TIFF, Bitmap, and PNG. Some scanners can also be used to capture editable text, so long as the text can be read by the computer in a discernible font. This process is called Optical Character Recognition (OCR).
Document processing
The scanning or digitization of paper documents for storage places different requirements on the scanning equipment than scanning pictures for reproduction does. While documents can be scanned on general-purpose scanners, the job is more efficiently performed on dedicated document scanners manufactured by Atiz Innovation, Böwe Bell & Howell, Canon, Epson, Fujitsu, HP, Kodak and other companies.
When scanning large quantities of documents, speed and paper handling are very important, but the resolution of the scan will normally be much lower than for good reproduction of pictures.
Document scanners have document feeders, usually larger than those sometimes found on copiers or all-purpose scanners. Scans are made at high speed, perhaps 20 to 150 pages per minute, often in grayscale, although many scanners support color. Many scanners can scan both sides of double-sided originals (duplex operation). Sophisticated document scanners have firmware or software that cleans up scans of text as they are produced, eliminating accidental marks and sharpening type; this would be unacceptable for photographic work, where marks cannot reliably be distinguished from desired fine detail. Files are compressed as they are created.
The resolution used is usually from 150 to 300 dpi, although the hardware may be capable of somewhat higher resolution; this produces images of text good enough to read and for optical character recognition (OCR), without the higher demands on storage space required by higher-resolution images.
Document scans are often processed using OCR technology to create editable and searchable files. Most scanners use ISIS or TWAIN device drivers to scan documents into TIFF format so that the scanned pages can be fed into a document management system that handles the archiving and retrieval of the scanned pages. Lossy JPEG compression, which is very efficient for pictures, is undesirable for text documents: slanted straight edges take on a jagged appearance, and solid black (or other color) text on a light background compresses well with lossless compression formats.
While paper feeding and scanning can be done automatically and quickly, preparation and indexing are necessary and require much work by humans. Preparation involves manually inspecting the papers to be scanned and making sure that they are in order, unfolded, without staples or anything else that might jam the scanner.
Indexing involves associating keywords to files so that they can be retrieved by content. This process can sometimes be automated to some extent, but is likely to involve manual labour. One common practice is the use of barcode-recognition technology: during preparation, barcode sheets with folder names are inserted into the document files, folders, and document groups. Using automatic batch scanning, the documents are saved into the appropriate folders, and an index is created for integration into document-management software systems.
A specialized form of document scanning is book scanning. Technical difficulties arise from the books usually being bound and sometimes fragile and irreplaceable, but some manufacturers have developed specialized machinery to deal with this. For instance, the Atiz DIY scanner uses a V-shaped cradle and a V-shaped transparent platen to handle brittle books. Often special robotic mechanisms are used to automate the page-turning and scanning process.
Thursday, February 21, 2008
IBM


International Business Machines Corporation (abbreviated IBM, nicknamed "Big Blue"; NYSE: IBM) is a multinational computer technology and consulting corporation headquartered in Armonk, New York, USA. The company is one of the few information technology companies with a continuous history dating back to the 19th century. IBM manufactures and sells computer hardware and software, and offers infrastructure services, hosting services, and consulting services in areas ranging from mainframe computers to nanotechnology.
IBM has been known through most of its recent history as the world's largest computer company; with over 355,000 employees worldwide, IBM is the largest information technology employer in the world. Despite falling behind Hewlett-Packard in total revenue since 2006, it remains the more profitable of the two. Since 1990, IBM's annual sales growth has trailed US economic growth, owing to global deregulation and competition.
IBM holds more patents than any other U.S.-based technology company. It has engineers and consultants in over 170 countries, and IBM Research has eight laboratories worldwide. IBM employees have earned three Nobel Prizes, four Turing Awards, five National Medals of Technology, and five National Medals of Science. As a chip maker, IBM is among the Worldwide Top 20 Semiconductor Sales Leaders.
History
The company which became IBM was founded in 1888 as the Tabulating Machine Company by Herman Hollerith, in Broome County, New York. It was incorporated as the Computing Tabulating Recording Corporation (CTR) on June 16, 1911, and was listed on the New York Stock Exchange in 1916. The company adopted its current name in 1924.
The author Edwin Black has alleged that, during World War II, IBM CEO Thomas J. Watson used overseas subsidiaries to provide the Third Reich with punch card machines that could help the Nazis track down European Jews. IBM denies that it had control over these subsidiaries after the Nazis took them over. A lawsuit against IBM based on these allegations was dismissed.
In the 1950s, IBM became the dominant vendor in the emerging computer industry with the release of the IBM 701 and other models in the IBM 700/7000 series of mainframes. The company's dominance became even more pronounced in the 1960s and 1970s with the IBM System/360 and IBM System/370 mainframes. However, antitrust actions by the United States Department of Justice, the rise of minicomputer companies like Digital Equipment Corporation and Data General, and the introduction of the microprocessor all contributed to the dilution of IBM's position in the industry, eventually leading the company to diversify into other areas including personal computers, software, and services.
In 1981 IBM introduced the IBM Personal Computer which is the original version and progenitor of the IBM PC compatible hardware platform.
IBM's PC division was bought by the Chinese company Lenovo on May 1, 2005 for $655 million in cash and $600 million in Lenovo stock. On January 25, 2007, Ricoh announced the purchase of IBM's Printing Systems Division for $725 million and an investment in a three-year joint venture to form a new Ricoh subsidiary, InfoPrint Solutions Company; Ricoh will own a 51% share, and IBM will own a 49% share in InfoPrint.
Corporate culture of IBM
Big Blue
Big Blue is a nickname for IBM; several theories exist regarding its origin. One theory, substantiated by people who worked for IBM at the time, is that IBM field reps coined the term in the 1960s, referring to the color of the mainframes IBM installed in the 1960s and early 1970s. "All blue" was a term used to describe a loyal IBM customer, and business writers later picked up the term. Another theory suggests that Big Blue simply refers to the company's logo. A third theory suggests that Big Blue refers to a former company dress code that required many IBM employees to wear white shirts, with which many wore blue suits. In any event, IBM keyboards, typewriters, and some other manufactured devices have played on the "Big Blue" concept, using the color for enter keys and carriage returns.
Sales
IBM has often been described as having a sales-centric or a sales-oriented business culture. Traditionally, many IBM executives and general managers are chosen from the sales force. The current CEO, Sam Palmisano, for example, joined the company as a salesman and, unusually for CEOs of major corporations, has no MBA or postgraduate qualification. Middle and top management are often enlisted to give direct support to salesmen when pitching sales to important customers.
Uniform
A dark (or gray) suit, white shirt, and a "sincere" tie was the public uniform for IBM employees for most of the 20th century. During IBM's management transformation in the 1990s, CEO Lou Gerstner relaxed these codes, normalizing the dress and behavior of IBM employees to resemble their counterparts in other large technology companies.
IBM company values and "Jam"
In 2003, IBM embarked on an ambitious project to rewrite company values. Using its Jam technology, the company hosted Intranet-based online discussions on key business issues with 50,000 employees over 3 days. The discussions were analyzed by sophisticated text analysis software (eClassifier) to mine online comments for themes. As a result of the 2003 Jam, the company values were updated to reflect three modern business, marketplace and employee views: "Dedication to every client's success", "Innovation that matters - for our company and for the world", "Trust and personal responsibility in all relationships".
In 2004, another Jam was conducted during which 52,000 employees exchanged best practices for 72 hours. They focused on finding actionable ideas to support implementation of the values previously identified. A new post-Jam Ratings event was developed to allow IBMers to select key ideas that support the values. The board of directors cited this Jam when awarding Palmisano a pay rise in the spring of 2005.
In July and September 2006, Palmisano launched another jam called InnovationJam. InnovationJam was the largest online brainstorming session ever, with more than 150,000 participants from 104 countries. The participants were IBM employees, members of IBM employees' families, universities, partners, and customers. InnovationJam was divided into two sessions (one in July and one in September) of 72 hours each and generated more than 46,000 ideas. In November 2006, IBM announced that it would invest US$100 million in the 10 best ideas from InnovationJam.
Open source
IBM has been influenced by the Open Source Initiative, and began supporting Linux in 1998. The company invests billions of dollars in services and software based on Linux through the IBM Linux Technology Center, which includes over 300 Linux kernel developers. IBM has also released code under different open-source licenses, such as the platform-independent software framework Eclipse (worth approximately US$40 million at the time of the donation) and the Java-based relational database management system (RDBMS) Apache Derby. IBM's open source involvement has not been trouble-free, however.
Project Management Center of Excellence
The IBM Project Management Center of Excellence (PM COE) is a program dedicated to defining and executing the steps IBM must take to strengthen its project management capabilities. Functioning as IBM's think tank, the PM COE combines external industry trends and directions with IBM business, organizational, and geographic requirements and insight. Upon this foundation, deliverables (such as project management policy, practices, methods, and tools) are developed.
All IBM Project Managers (PMs) on the Project Management track (dimension) must complete either accreditation or IBM certification. Junior PMs (Associate PM and Advisory PM) are accredited after self-assessment and authorization from supervisors. Senior PMs (Senior PM and Executive PM) must go through a stringent IBM certification process. By validating project managers' expertise and skills against consistent worldwide standards, certification helps maintain customer confidence in the high quality of IBM professionals and it recognizes IBM professionals for their skills and experience.
Becoming certified is public recognition of achieving a significant career milestone and demonstrating expertise in the profession. Prior to applying for IBM certification each individual must have:
successfully passed the PMI exam (i.e. be a certified PMP).
verifiable documentation and approval for mastery/expertise in a well-defined set of PM skills.
several years of PM experience spanning at least 3 verifiable projects within the immediate 5 years (including specific role, team size, and budget requirements).
verifiable documentation and proof of at least one area of specialty.
demonstrated the use of IBM's Worldwide Project Management Method (WWPMM).
completed extensive classroom and online education and testing.
IBM PM Certification is a well-defined review and verification process with many intricate details. In its most simplified form, it broadly involves:
Candidate preparing a detailed package with proof of above requirements.
Package review, approval, and support by at least two levels of Senior Management.
Package review and re-verification by PM COE expert.
Personal interviews with the PM COE Certification board.
Candidates whose experience, skills, knowledge and education are deemed valid, verifiable and accurate, are certified by the board as either Certified Senior Project Manager (CSPM) or Certified Executive Project Manager (CEPM).
IBM PM Certification is a significant achievement for any IBMer. It is a deliberately long process with multiple checkpoints designed to ensure the integrity, fairness and validity of the certification.
Diversity and workforce issues
IBM's efforts to promote workforce diversity and equal opportunity date back at least to World War I, when the company hired disabled veterans. IBM was the only technology company ranked in Working Mother magazine's Top 10 for 2004, and one of two technology companies in 2005 (the other company being Hewlett-Packard).
The company has traditionally resisted labor union organizing, although unions represent some IBM workers outside the United States.
In the 1990s, two major pension program changes, including a conversion to a cash balance plan, resulted in an employee class action lawsuit alleging age discrimination. IBM employees won the lawsuit and arrived at a partial settlement, although appeals are still underway. IBM also settled a major overtime class-action lawsuit in 2006.
Historically IBM has had a good reputation for long-term staff retention, with few large-scale layoffs. In more recent years there have been a number of broad, sweeping cuts to the workforce as IBM attempts to adapt to changing market conditions and a declining profit base. After posting weaker-than-expected revenues in the first quarter of 2005, IBM eliminated 14,500 positions from its workforce, predominantly in Europe. In May 2005, IBM Ireland told staff that the Microelectronics Division facility would close by the end of 2005 and offered staff a settlement; however, all staff who wished to stay with the company were redeployed within IBM Ireland. Production moved to Amkor in Singapore, which purchased IBM's microelectronics business there, and it is widely believed that IBM promised Amkor full load capacity in return for the purchase of the facility. On June 8, 2005, IBM Canada Ltd. eliminated approximately 700 positions. IBM described these cuts as part of a strategy to "rebalance" its portfolio of professional skills and businesses. IBM India and other IBM offices in China, the Philippines and Costa Rica have seen a recruitment boom and steady growth in employee numbers due to lower wages.
On October 10, 2005, IBM became the first major company in the world to formally commit to not using genetic information in its employment decisions. This came just a few months after IBM announced its support of the National Geographic Society's Genographic Project.