Copyright and Fair Use...
Bad never looked so good
The International Intellectual Property Alliance unveiled the new report today in association with the Congressional International Anti-Piracy Caucus at an event in Washington, DC. The report doesn't even try to quantify losses to piracy anymore—last year, an official US government report concluded that such estimates were all deeply unreliable. Instead, it simply asserts without evidence that "piracy inhibits… growth in the US and around the world."
"Inhibits growth" doesn't quite equal "causes staggering job losses," the traditional anti-piracy rallying cry. Indeed, copyright industries are being "hard hit" by piracy in the way that plenty of other US industries are desperate to get "hit." (In this sense, the report is a bit like the MPAA's routine announcements of record-setting box office revenues even as the movie studios conjure visions of apocalypse.)
During the recession of the last few years, the report shows, copyright-based businesses have far outperformed the US economy as a whole.
Guns N' Roses Copyright Scandal
One of the most bizarre cases yet: the FBI has arrested a blogger for breaking copyright law by posting copies of songs from the soon-to-be-released Guns N' Roses album, Chinese Democracy. Kevin Cogill, known online as "Skwerl", has been charged with copyright violations after he uploaded nine of the album's songs for download. Cogill could spend up to five years in jail.
This is a case of complete overkill, and other bloggers are saying it's time to show the guy some support. As Wired's Eliot Van Buskirk says, Cogill's fate now rests in the band's hands. The Gunners have put out a fence-sitting statement:
"Presently, though we don't support this guy's actions at that level, our interest is in the original source. We can't comment publicly at this time as the investigation is ongoing."
Just another sign of how out of touch the music industry is with the real world.
Fair Use
Fair use is a copyright principle based on the belief that the public is entitled to freely use portions of copyrighted materials for purposes of commentary and criticism. For example, if you wish to criticize a novelist, you should have the freedom to quote a portion of the novelist's work without asking permission. Absent this freedom, copyright owners could stifle any negative comments about their work.
http://www.newswise.com/articles/economists-say-copyright-and-patent-laws-are-killing-innovation-hurting-economy
Fair Use
Uses That Are Generally Fair Uses
Subject to some general limitations discussed later in this article, the following types of uses are usually deemed fair uses:
Criticism and comment -- for example, quoting or excerpting a work in a review or criticism for purposes of illustration or comment.
News reporting -- for example, summarizing an address or article, with brief quotations, in a news report.
Research and scholarship -- for example, quoting a short passage in a scholarly, scientific, or technical work for illustration or clarification of the author's observations.
Nonprofit educational uses -- for example, photocopying of limited portions of written works by teachers for classroom use.
Parody -- that is, a work that ridicules another, usually well-known, work by imitating it in a comic way.
Copyright
Fox News Sued For Copyright Infringement; Complaint Mocks Murdoch's Comments On 'Stealing' Content
from the gonna-come-back-to-bite-you dept
It's always funny how those organizations that seem to be against the concept of fair use have it come back to bite them. You may remember, a few months ago, as part of his campaign against "aggregator" sites that "steal" from him, Murdoch commented that fair use would likely be barred in the courts if properly challenged, suggesting he didn't believe in fair use at all. We already noted the irony of this, given how many different aggregator sites Murdoch owns as part of News Corp. Now those statements may be causing a bit of a problem in court as well.
A bunch of folks have been sending in the news that a former advisor to Michael Jackson who apparently holds the copyright on certain interview footage is suing Fox News over airing parts of the interview recently. In response Fox has claimed "fair use," over the use in a news program -- and I actually agree that it seems like a case of fair use -- but the copyright holder actually uses Murdoch's words against him:
The filing chides Murdoch, who has threatened to sue the British Broadcasting Corp. and others for copyright infringement because he claims they are stealing content from his company's newspapers.
"Fox sanctimoniously operates unencumbered by the very copyright restrictions it seeks to impose on its competitors," the lawsuit states.
Once again, it appears that a copyright holder doesn't believe in fair use for others, but only for themselves.
Fair use on the Internet
A 2003 US court case, Kelly v. Arriba Soft Corporation, addressed and developed the relationship between thumbnails, inline linking, and fair use. In the lower District Court case, on a motion for summary judgment, Arriba Soft was found to have violated copyright, without a fair use defense, in its use of thumbnail pictures and inline linking from Kelly's website in Arriba's image search engine. That decision was appealed and contested by Internet rights activists such as the Electronic Frontier Foundation, who argued that it was clearly covered under fair use.
On appeal, the 9th Circuit Court of Appeals found in favour of the defendant. In reaching its decision, the court utilized the above-mentioned four-factor analysis. Firstly, it found the purpose of creating the thumbnail images as previews to be sufficiently transformative, noting that they were not meant to be viewed at high resolution like the original artwork was. Secondly, the fact that the photographs had already been published diminished the significance of their nature as creative works. Thirdly, although normally making a "full" replication of a copyrighted work may appear to violate copyright, here it was found to be reasonable and necessary in light of the intended use. Lastly, the court found that the market for the original photographs would not be substantially diminished by the creation of the thumbnails. To the contrary, the thumbnail searches could increase exposure of the originals. In looking at all these factors as a whole, the court found that the thumbnails were fair use and remanded the case to the lower court for trial after issuing a revised opinion on July 7, 2003. The remaining issues were resolved with a default judgment after Arriba Soft had experienced significant financial problems and failed to reach a negotiated settlement.
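The court's four-factor weighing can be sketched, very loosely, in code. This is an illustrative toy only, not a legal tool: the factor names follow 17 U.S.C. § 107, but the numeric scoring is my own simplification, since courts weigh the factors qualitatively, not arithmetically.

```python
# A toy model of the four-factor fair use balancing test described above.
# Factor names follow 17 U.S.C. § 107; the +1/0/-1 scoring is purely
# illustrative, not how a court actually reasons.

FACTORS = [
    "purpose and character of the use",
    "nature of the copyrighted work",
    "amount and substantiality of the portion used",
    "effect of the use upon the potential market",
]

def weigh_fair_use(scores):
    """Each score is +1 (favors fair use), 0 (neutral), or -1 (disfavors).
    Returns the total and which way the factors lean overall."""
    assert len(scores) == len(FACTORS)
    total = sum(scores)
    leaning = "fair use" if total > 0 else "infringement" if total < 0 else "unclear"
    return total, leaning

# Kelly v. Arriba Soft, as summarized above: transformative purpose (+1),
# already-published work (+1), full copying necessary for the use (0),
# no market harm and possibly increased exposure (+1).
total, leaning = weigh_fair_use([1, 1, 0, 1])
print(total, leaning)  # 3 fair use
```

The point of the sketch is only that the factors are weighed together as a whole, exactly as the opinion describes, rather than any single factor being decisive.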
In August 2008, US District Judge Jeremy Fogel of San Jose, California ruled that copyright holders cannot order the deletion of an online file without first determining whether the posting reflected "fair use" of the copyrighted material. The case involved Stephanie Lenz, a writer and editor from Gallitzin, Pennsylvania, who made a home video of her thirteen-month-old son dancing to Prince's song Let's Go Crazy and posted the video on YouTube. Four months later, Universal Music, the owner of the copyright to the song, ordered YouTube to remove the video under the Digital Millennium Copyright Act. Lenz notified YouTube immediately that her video was within the scope of fair use, and demanded that it be restored. YouTube complied after six weeks, not two weeks as required by the Digital Millennium Copyright Act. Lenz then sued Universal Music in California for her legal costs, claiming the music company had acted in bad faith by ordering the removal of a video that represented fair use of the song.[27]
Reference: http://en.wikipedia.org/wiki/Fair_use
Reference: http://www.techdirt.com/articles/20100108/1446417680.shtml
Reference: http://fairuse.stanford.edu/Copyright_and_Fair_Use_Overview/chapter9/
Reference: http://arstechnica.com/tech-policy/news/2011/11/piracy-problems-us-copyright-industries-show-terrific-health.ars
Tomas
Monday, February 13, 2012
Chapter 12 Knowledge Management...
Knowledge Management Comes Quite Naturally to Humans
While there are normally only five ways to organize information, known as LATCH (Location, Alphabet, Time, Category, or Hierarchy), these five ways have a lot of versatility (Wurman, 2001). For example, a youngster with a toy car collection may sort them by color, make, type, size, type of play, or a dozen other divisions. The youngster can even make up categories as new divisions, play activities, or wants appear. However, a computer is considered "intelligent" if it can sort a collection into one category. Yet many organizations are placing their bets on computer systems due to the amount of data such systems can hold and the speed at which they can sort and distribute it once the categories and data are made known to them.
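The toy-car example can be shown directly in code: the same collection re-organized under different LATCH-style category choices. The car attributes below are made up for illustration.

```python
# Organizing one collection by different LATCH categories ("C" for Category).
from collections import defaultdict

cars = [
    {"color": "red",  "make": "Ford",   "size": "small"},
    {"color": "blue", "make": "Toyota", "size": "large"},
    {"color": "red",  "make": "Toyota", "size": "small"},
]

def organize(collection, key):
    """Group a collection of dicts by any chosen attribute."""
    groups = defaultdict(list)
    for item in collection:
        groups[item[key]].append(item)
    return dict(groups)

# The youngster's flexibility: pick any division, or invent a new one.
by_color = organize(cars, "color")
by_make = organize(cars, "make")
print(sorted(by_color))  # ['blue', 'red']
print(sorted(by_make))   # ['Ford', 'Toyota']
```

A human switches between these groupings effortlessly; a computer system needs the category made explicit before it can sort at all, which is the point of the paragraph above.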
http://www.nwlink.com/~donclark/performance/dikw.jpg
History
KM efforts have a long history, including on-the-job discussions, formal apprenticeship, discussion forums, corporate libraries, professional training, and mentoring programs. More recently, with the increased use of computers in the second half of the 20th century, specific adaptations of technologies such as knowledge bases, expert systems, knowledge repositories, group decision support systems, intranets, and computer-supported cooperative work have been introduced to further enhance such efforts.[1]
In 1999, the term personal knowledge management was introduced which refers to the management of knowledge at the individual level (Wright 2005).
In terms of the enterprise, early collections of case studies recognized the importance of knowledge management dimensions of strategy, process, and measurement (Morey, Maybury & Thuraisingham 2002). Key lessons learned included: people and the cultural norms which influence their behaviors are the most critical resources for successful knowledge creation, dissemination, and application; cognitive, social, and organizational learning processes are essential to the success of a knowledge management strategy; and measurement, benchmarking, and incentives are essential to accelerate the learning process and to drive cultural change. In short, knowledge management programs can yield impressive benefits to individuals and organizations if they are purposeful, concrete, and action-oriented.
More recently, with the advent of Web 2.0, the concept of knowledge management has evolved towards a vision based more on people's participation and emergence. This line of evolution is termed Enterprise 2.0 (McAfee 2006). However, there is ongoing debate (Lakhani & McAfee 2007) as to whether Enterprise 2.0 is just a fad that does not bring anything new or useful, or whether it is, indeed, the future of knowledge management (Davenport 2008).
Momentum of Knowledge Management
The last few years have seen a rapidly growing interest in the topic of knowledge management. 'Leveraging Knowledge for Sustainable Advantage' was the title of one of the first conferences (in 1995) that brought knowledge management onto the management agenda. Since 1997, a surge of books, magazines, and websites has come onto the scene. Today (2003) most large organizations have some form of knowledge management initiative. Many companies have created knowledge teams and appointed CKOs (Chief Knowledge Officers). Knowledge is firmly on the strategic agenda.
Knowledge management systems
A knowledge management system (KMS) is the software framework (toolbox) that is intended to assist, via knowledge processing functions, those who desire to formulate and retrieve knowledge for different applications, such as system design and specification, term bank construction, documentation or ontology design for (multilingual) language processing. The various tools of such a framework should help users to originate and organise ideas or understand and communicate ideas more easily and accurately than can be done with most current tools. A KMS is an integrated multifunctional system that can support all main knowledge management and knowledge processing activities, such as:
Capturing;
Organising;
Classifying and understanding;
Debugging and editing;
Finding and retrieving;
Disseminating, transferring and sharing knowledge.
Current knowledge management systems, in particular those in the field of information retrieval, are
Too narrow in many respects. For example, one application, one type of user, one type of knowledge representation, one type of knowledge operation, etc.;
Too hard to use. For example, specialised knowledge is needed and long training curves are necessary;
Not widely known or available.
One main task of a KMS is to search for specific information. This is mainly done by an information retrieval component which will be central for our evaluation work.
Reference: http://www.issco.unige.ch/en/research/projects/ewg95//node213.html
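A minimal sketch of the KMS activities listed above (capturing, classifying, and finding/retrieving), with a naive keyword-match stand-in for the information retrieval component. The `KnowledgeBase` class and its method names are invented for illustration; a real KMS would use a proper full-text index and much richer knowledge representations.

```python
# Sketch of a KMS supporting a few of the activities listed above.

class KnowledgeBase:
    def __init__(self):
        self.items = []  # each item: (title, text, category)

    def capture(self, title, text, category="uncategorised"):
        """Capturing + classifying: store a knowledge item under a category."""
        self.items.append((title, text, category))

    def retrieve(self, query):
        """Finding and retrieving: the information retrieval component.
        Here just a case-insensitive substring match over item text."""
        q = query.lower()
        return [title for title, text, _ in self.items if q in text.lower()]

kb = KnowledgeBase()
kb.capture("Fair use", "Commentary and criticism may be fair use.", "law")
kb.capture("LATCH", "Organize by Location, Alphabet, Time, Category, Hierarchy.", "info design")
print(kb.retrieve("criticism"))  # ['Fair use']
```

Even this toy shows why real systems are "too narrow" in the ways the text lists: it supports one knowledge representation (plain text), one user type, and one retrieval operation.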
http://www.skyrme.com/insights/22km.htm
http://en.wikipedia.org/wiki/Knowledge_management
http://www.nwlink.com/~donclark/knowledge/km.html
Monday, February 6, 2012
Chapter 11
Examples of Information systems...
Information System Descriptions
Executive Support Systems
An Executive Support System ("ESS") is designed to help senior management make strategic decisions. It gathers, analyses and summarises the key internal and external information used in the business.
A good way to think about an ESS is to imagine the senior management team in an aircraft cockpit - with the instrument panel showing them the status of all the key business activities. ESS typically involve lots of data analysis and modelling tools such as "what-if" analysis to help strategic decision-making.
Management Information Systems
A management information system ("MIS") is mainly concerned with internal sources of information. MIS usually take data from the transaction processing systems (see below) and summarise it into a series of management reports.
MIS reports tend to be used by middle management and operational supervisors.
Decision-Support Systems
Decision-support systems ("DSS") are specifically designed to help management make decisions in situations where there is uncertainty about the possible outcomes of those decisions. DSS comprise tools and techniques to help gather relevant information and analyse the options and alternatives. DSS often involve the use of complex spreadsheets and databases to create "what-if" models.
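The "what-if" modelling that DSS tools support can be sketched as a tiny profit model: change an assumption, recompute, compare. All figures and variable names here are invented for illustration.

```python
# A minimal "what-if" model of the kind a DSS spreadsheet would hold.

def profit(units_sold, price, unit_cost, fixed_costs):
    """Simple contribution-margin profit model."""
    return units_sold * (price - unit_cost) - fixed_costs

# Base case.
base = profit(units_sold=1000, price=25.0, unit_cost=15.0, fixed_costs=6000)

# What if a price cut lifts volume by 20%?
scenario = profit(units_sold=1200, price=23.0, unit_cost=15.0, fixed_costs=6000)

print(base, scenario)  # 4000.0 3600.0 -- the price cut loses money here
```

The DSS value is not the arithmetic but the side-by-side comparison of alternatives under uncertainty, which is what the paragraph above describes.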
Knowledge Management Systems
Knowledge Management Systems ("KMS") exist to help businesses create and share information. These are typically used in a business where employees create new knowledge and expertise - which can then be shared by other people in the organisation to create further commercial opportunities. Good examples include firms of lawyers, accountants and management consultants.
KMS are built around systems which allow efficient categorisation and distribution of knowledge. For example, the knowledge itself might be contained in word processing documents, spreadsheets, PowerPoint presentations, internet pages, and so on. To share the knowledge, a KMS would use group collaboration systems such as an intranet.
Transaction Processing Systems
As the name implies, Transaction Processing Systems ("TPS") are designed to process routine transactions efficiently and accurately. A business will have several (sometimes many) TPS; for example:
- Billing systems to send invoices to customers
- Systems to calculate the weekly and monthly payroll and tax payments
- Production and purchasing systems to calculate raw material requirements
- Stock control systems to process all movements into, within and out of the business
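One routine TPS task from the list above, a weekly payroll run, can be sketched in a few lines. The hourly rates and the flat withholding rate are invented for illustration; real payroll rules are far more involved.

```python
# Sketch of a routine payroll transaction: same calculation, applied
# accurately and repeatedly to every timesheet -- the essence of a TPS.

def weekly_pay(hours, hourly_rate, tax_rate=0.20):
    """Gross weekly pay minus a flat withholding tax (illustrative)."""
    gross = hours * hourly_rate
    tax = gross * tax_rate
    return round(gross - tax, 2)

timesheets = [("Ann", 40, 18.0), ("Ben", 35, 22.0)]
payroll = {name: weekly_pay(h, rate) for name, h, rate in timesheets}
print(payroll)  # {'Ann': 576.0, 'Ben': 616.0}
```

A billing or stock-control system from the same list has the identical shape: a simple rule executed reliably over a high volume of routine transactions.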
Office Automation Systems
Office Automation Systems are systems that try to improve the productivity of employees who need to process data and information. Perhaps the best example is the wide range of software systems that exist to improve the productivity of employees working in an office (e.g. Microsoft Office XP) or systems that allow employees to work from home or whilst on the move.
Why do organizations need information systems?
Computers are essential today. We use them to check email, find answers to questions, watch media, bank, and more. Therefore we need systems that can organize and serve information whenever people around the world request it. Servers do that task: each website has a server, a computer that hosts the website. When you type Google.com, the server hosting that website receives data from your computer and sends data back, letting you access the server's data. That is how you see Google's homepage. Without servers there would be no websites.
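The request/response exchange described above can be demonstrated with Python's standard-library HTTP server. This serves one page locally, standing in for "the server that hosts the website"; the page content is of course made up.

```python
# A browser sends a request; the server receives it and sends the page back.
from http.server import BaseHTTPRequestHandler, HTTPServer

class HomePage(BaseHTTPRequestHandler):
    def do_GET(self):
        # The request from your computer arrives here...
        body = b"<html><body><h1>Hello from the server</h1></body></html>"
        # ...and the server sends data back: status, headers, then the page.
        self.send_response(200)
        self.send_header("Content-Type", "text/html")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

if __name__ == "__main__":
    # Visit http://localhost:8000 in a browser to see the exchange.
    HTTPServer(("localhost", 8000), HomePage).serve_forever()
```

Run it and load http://localhost:8000: the browser's request goes out, the server's data comes back, and the page appears, which is exactly the Google.com exchange in miniature.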
Information Systems
Information is the lifeblood of any organization. Damaged or lost data can cause disruptions in normal business activities leading to financial losses, law suits, etc. Information systems, which comprise hardware, software, data, applications, communication and people, help an organization to better manage and secure its critical corporate, customer and employee data. Information systems also improve integration and work processes...the benefits go on and on.
Answer
An information system is also a system but differs from other kinds of systems because its objective is to monitor and document the operations of other systems, which we can call target systems. An information system owes its existence to the target system. For example, production activities would be the target system for a production scheduling information system, human resources would be the target system of a human resource information system, and so on. We could say that every reactive system may have a subsystem that can be considered as an information system whose objective is to monitor and control such a system. The main functions of an information system may be input, processing, output, storage and control at work place.
reference: http://tutor2u.net/business/ict/intro_information_system_types.htm
reference: http://wiki.answers.com/Q/Why_do_organizations_need_information_systems
Monday, January 30, 2012
Information technology Pros and Cons...
Pros And Cons of Information Technology
By Scott Beckstead
Information Technology has contributed to the world to a great extent. Without its contribution, it is doubtful we could have achieved today's 'technological world.' Even so, some people are raising claims (which may be baseless) that Information Technology (IT) solutions are taking away the privacy of ordinary people and damaging the overall body of worldwide human knowledge.
So let us examine in this article some good sides (pros) and bad effects (cons) of Information Technology. Google's definition service says:
“the branch of engineering that deals with the use of computers and telecommunications to retrieve and store and transmit information”. – Google Definition.
So by the definition it provides, we understand that 'the engineering branch which helps us communicate, pass on our individual knowledge, and store and share it via computers and other modern technologies is called information technology.'
Information Technology Pros
1. The world gained flexibility
What we think, do, or plan must often be shared with our co-workers, colleagues, and friends, and internet technology has advanced this to a great extent. The telephone idea (by Alexander Graham Bell) has been modified into the cell phone, bringing more flexibility to communication and letting us talk to those dear to us whenever we need to!
2. The sense of responsibility has increased
Let us take Barack Obama, the US President, as the figure. Using networking sites (Twitter and Facebook), blogs, and social bookmarking, a leader can reach the world whenever necessary, and we can receive the news and updates about what he has done (or wants us to do) within a very short period.
3. Easier thinking & evolution in transportation
To think and to research, we need resources to find what people of the past have thought and what quotes, information, and theories they have left for us. We can find them with a single click in search engines (especially Google and Yahoo!). With a clear-cut idea of what came before, we get the chance to contribute new technological ideas and inventions to the world and to share what we have learnt throughout our lives.
And throughout the ages, technology has helped us evolve transportation, letting us travel from one place to another by road, highway, water, and air!
4. Saves thousands of lives daily
As the heading suggests, this point refers to development in the medical sector. Each day, people find relief through the proper use of medicine and hospital technology, including X-rays and laser treatments, with more on the way. With the coordination of the World Health Organization, various fatal diseases can be overcome and even eradicated from specific countries through quick plans and ideas.
5. Increases the sense of human rights
Technology can remind us of our human rights and basic needs, and give updates wherever relief or worldwide help is necessary. During earthquakes and terrible floods, when co-operation is needed, the World Wide Web can help us collect donations of any desired amount.
It is not possible to cover everything about the good and bad sides of information technology within a single page or article, because it has mixed into every aspect and corner of our lives. Instead, let us now look at the side effects and bad sides (cons) that the IT sector has brought to human society.
Information Technology Cons
It has taken away people’s Privacy
The IT sector has won people's hearts worldwide, and people now share and store all kinds of information and private data on their hard drives and in private online databases. But because of cyber-criminals, nothing is truly safe, online or offline; if someone becomes a bit careless, he or she may pay a high price for it. (It's serious.)
The online community is not safe for Family anymore
Underage children may share cell numbers and private email addresses, which can be harvested and passed on to criminals with a blueprint to harm society. People are also losing credit card privacy and the safety of other payment processing options. And there are sites, created by unscrupulous people, that can lead teenagers under eighteen down a different path, one that will bring harm to the nation.
It is going to damage a Human’s Natural Power
We can think, gather human principles (ethical knowledge), and build co-operative relationships between friends and families. But because of the harmful aspects of IT, people are becoming fully dependent on technology, and that can do huge damage to society by taking away natural thoughts and original ideas.
It can bring World Destruction without Efficient Administration
This is an extra point, written with various works of science fiction in mind. Great scholars have thought about the matter wisely. Until now, we humans have held the leading place in the world, administering computer technology. But a day MAY come when technology administers us in every aspect; it may even happen that we become the slaves of technology.
By listing these cons, I am not trying to say that technology is here only to bring harm; I myself am a technological man who spends all day browsing on a computer and talking on the phone. But as part of human society, we need to take a look at both sides of IT.
reference: http://113tidbits.com/the-pros-and-cons-of-information-technology/3696/
reference: http://technology-personal-tech.chailit.com/the-pros-and-cons-of-information-technology.html
Monday, January 23, 2012
Chapter 8
Search Engines...
How do search engines work?
The good news about the Internet and its most visible component, the World Wide Web, is that there are hundreds of millions of pages available, waiting to present information on an amazing variety of topics. The bad news about the Internet is that there are hundreds of millions of pages available, most of them titled according to the whim of their author, almost all of them sitting on servers with cryptic names. When you need to know about a particular subject, how do you know which pages to read? If you're like most people, you visit an Internet search engine.
Internet search engines are special sites on the Web that are designed to help people find information stored on other sites. There are differences in the ways various search engines work, but they all perform three basic tasks:
They search the Internet -- or select pieces of the Internet -- based on important words.
They keep an index of the words they find, and where they find them.
They allow users to look for words or combinations of words found in that index.
Early search engines held an index of a few hundred thousand pages and documents, and received maybe one or two thousand inquiries each day. Today, a top search engine will index hundreds of millions of pages, and respond to tens of millions of queries per day. In this article, we'll tell you how these major tasks are performed, and how Internet search engines put the pieces together in order to let you find the information you need on the Web.
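The three basic tasks above (crawl, index, look up) can be sketched with a toy inverted index. This is a minimal illustration, not how a production engine works; the page names and texts are invented for the example.

```python
# A toy inverted index: collect documents, index the words in them,
# and answer queries for combinations of words.
from collections import defaultdict

def build_index(pages):
    """Map each word to the set of page names that contain it."""
    index = defaultdict(set)
    for name, text in pages.items():
        for word in text.lower().split():
            index[word].add(name)
    return index

def search(index, query):
    """Return the pages containing every word in the query (AND semantics)."""
    words = query.lower().split()
    if not words:
        return set()
    results = set(index.get(words[0], set()))
    for word in words[1:]:
        results &= index.get(word, set())
    return results

pages = {
    "a.html": "search engines crawl the web",
    "b.html": "the web has millions of pages",
    "c.html": "engines index pages and answer queries",
}
index = build_index(pages)
print(search(index, "web pages"))   # only b.html contains both words
```

A real engine stores word positions and ranking signals alongside each entry, but the core lookup is this same index intersection.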
There are three types of search engines:
Crawler-based search engines
Human-powered directories or Search engine directory
Meta or hybrid search engines
Crawler-based search engines
Crawler-based search engines, such as Google (http://www.google.com), create their listings automatically. They "crawl" or "spider" the web, then people search through what they have found. If web pages are changed, crawler-based search engines eventually find these changes, and that can affect how those pages are listed. Page titles, body copy and other elements all play a role.
The life span of a typical web query normally lasts less than half a second, yet involves a number of different steps that must be completed before results can be delivered to a person seeking information. The following graphic (Figure 1) illustrates this life span (from http://www.google.com/corporate/tech.html):
1. The web server sends the query to the index servers. The content inside the index servers is similar to the index in the back of a book - it tells which pages contain the words that match the query.
2. The query travels to the doc servers, which actually retrieve the stored documents. Snippets are generated to describe each search result.
3. The search results are returned to the user in a fraction of a second.
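The steps of this query life span can be sketched as follows. All names here (the index and document contents, the snippet rule) are invented for illustration: look up matching document ids in the index, fetch the stored documents, and generate a short snippet for each result.

```python
# Hypothetical sketch of the query life span: index lookup, document
# retrieval, then snippet generation for each result.
INDEX = {"crawler": [1], "spider": [1, 2], "web": [1, 2]}   # word -> doc ids

DOCS = {
    1: "A crawler, also called a spider, scans the web.",
    2: "The spider follows links across the web.",
}

def answer(query):
    doc_ids = INDEX.get(query.lower(), [])        # 1. ask the "index servers"
    results = []
    for doc_id in doc_ids:                        # 2. ask the "doc servers"
        text = DOCS[doc_id]
        snippet = text[:40] + ("..." if len(text) > 40 else "")
        results.append((doc_id, snippet))         # 3. return id + snippet
    return results

for doc_id, snippet in answer("spider"):
    print(doc_id, snippet)
```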
Human-powered directories
A human-powered directory, such as the Open Directory Project (http://www.dmoz.org/about.html) depends on humans for its listings. (Yahoo!, which used to be a directory, now gets its information from the use of crawlers.) A directory gets its information from submissions, which include a short description to the directory for the entire site, or from editors who write one for sites they review. A search looks for matches only in the descriptions submitted. Changing web pages, therefore, has no effect on how they are listed. Techniques that are useful for improving a listing with a search engine have nothing to do with improving a listing in a directory. The only exception is that a good site, with good content, might be more likely to get reviewed for free than a poor site.
Hybrid search engines
Today, it is extremely common for crawler-type and human-powered results to be combined when conducting a search. Usually, a hybrid search engine will favor one type of listings over another. For example, MSN Search (http://www.imagine-msn.com/search/tour/moreprecise.aspx) is more likely to present human-powered listings from LookSmart (http://search.looksmart.com/). However, it also presents crawler-based results, especially for more obscure queries.
Five examples of search engines on the internet:
Google
Yahoo!
Bing
AltaVista
Lycos
Reference
http://computer.howstuffworks.com/internet/basics/search-engine2.htm
http://edtech.boisestate.edu/bschroeder/publicizing/three_types.htm
Posted by YUKIKAZE on Monday, October 17, 2011
Labels: Access to Library and Information system, Computer, Internet, Search Engine
Characteristics of good and biased websites...
Eight Characteristics of a Good Website
By Mandy Porta, posted on April 13, 2009
Is your website producing the results you hoped for? Are you serious about the effectiveness of your online investment? This article explains eight basic ingredients of a successful website. Take note, because missing an ingredient can result in a poor aftertaste for your website visitors.
1. Original, Fresh Content
Content is king in the web world. People visit websites for the primary purpose of finding content, so make sure you deliver. Website content should be unique and up to date. Fresh content will keep visitors and search engines coming back for more. Don’t forget to proofread!
2. Target Audience
From a quick scan of your website, visitors should be able to determine what you offer and how you can benefit them. A good website will have headlines and text that speak to the target audience's needs and wants. Many websites simply list what the company does without saying how it benefits the target audience. Keep your audience in mind when designing your website to be sure that it will appeal to them and encourage them to take action (whether that is to submit a contact form, sign up for a newsletter, or buy a product).
3. User-Friendly Navigation
A good website has content that is easy to find. Pages should be organized and named in a way that the target audience will easily understand. For instance, a services page would be better labeled “Services” than “Business Solutions.” Keep your navigation consistent from page to page to avoid any possible confusion. Double check all your links to make sure they are working. Make sure that your most popular content is no more than a click away from your homepage. If your website has a lot of content, provide a search box so visitors can quickly find what they are looking for.
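Checking that all your links work can be partially automated. The sketch below, assuming you already have a list of URLs extracted from your pages, uses only the Python standard library; a HEAD request checks the link without downloading the whole page.

```python
# Minimal link checker: returns True if a URL responds with a
# success or redirect status, False on any error.
import urllib.error
import urllib.request

def check_link(url, timeout=5):
    """Return True if the URL responds with a 2xx/3xx status."""
    try:
        req = urllib.request.Request(url, method="HEAD")
        with urllib.request.urlopen(req, timeout=timeout) as resp:
            return 200 <= resp.status < 400
    except (urllib.error.URLError, ValueError):
        return False
```

Run it over every URL found on the site and fix or remove the ones that come back `False`. (Some servers reject HEAD requests, so a production checker would fall back to GET.)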
4. Simple and Professional Design
A good website will have an attractive layout that is easy on the eyes. Be sure your colors contrast well and your text doesn’t require a magnifying glass to read. Personally, I can’t stand reading large amounts of content written in white on a black background. It strains my eyes. Reducing the contrast a bit can help (light grey text on dark grey background).
Lots of text can overwhelm a user. Breaking up text into subheads and bullet points will improve the layout of the page and make the text more scannable. No one has time to read every word on a page.
Use design elements to draw attention to or to enhance the content of a page. With every design element added, take a step back and make sure it serves a purpose and does not detract from the usability of the site. Put things where users expect them to be. However, do try to make your website look unique. Just remember that simple, professional design will be much more effective than flashy, overcrowded design.
5. Speed
How many seconds will you wait for a page to load before you give up and leave a website? Many factors can affect the loading time of a website including coding, number of graphics, the server speed, traffic volume on the website and the capabilities of a user’s computer.
Make sure your server has the proper amount of space/bandwidth for your website and that your website code is lightweight. Use large graphics sparingly. Use CSS styles in place of graphics where possible. Waiting for large graphics or a fancy flash animation to load on each page will surely turn away some visitors.
6. Search engine optimization
SEO is one of the most commonly neglected aspects of a website, but a website is useless if no one can find it. Think about the keywords that users may search for to find a product or service you offer. Do some research to see how often those keywords are searched for through a tool like Google’s Keyword Tool. Use keywords in titles, meta tags, headings, file names and in the content of your site. Search engine optimization can mean the difference between getting 500 visitors a month and 500+ visitors a day.
7. Link building
Links are an important factor in determining where your website appears in search engine results. Find as many legitimate sites as you can that will link you your website. Add your link to your business profile in directories like Google Local, Yahoo! Local, Merchant Circle, Insider Pages, Yellow Pages, LinkedIn and more. Submit articles and PR to sites that will include a link back to your website. The best way to get links to your site, however, is to provide unique and interesting content that people want to link to. You can make it easy for them to share your content by providing links or buttons such as the addthis.com button.
8. Tracking
A good website is a work in progress. A nice tool like Google Analytics will keep track of the number of people who come to your website, what pages they viewed, where they came from, what keywords they used in search engines, how many left after the first page and more. Unlike other media, websites can be easily tracked to see what is working and what isn’t. This data will help you to improve the quality and structure of your site.
Conclusion
Remember to include these characteristics in your next website design project. A professional-looking website with interesting content that is easy to navigate and can be found in search engines is sure to bring value to your business.
Feel free to comment. What other ingredients can spice up a website? What has been your most effective strategy in gaining website traffic?
Did you find my blog post helpful? Subscribe to the RSS feed, subscribe by email, or follow us on Twitter or Facebook to stay up to date with our latest posts on various marketing topics.
Related Articles
Information bias (psychology)
From Wikipedia, the free encyclopedia
For other uses, see Information bias (disambiguation).
This article needs additional citations for verification. Please help improve this article by adding citations to reliable sources. Unsourced material may be challenged and removed. (April 2008)
Information bias is a type of cognitive bias, and involves e.g. distorted evaluation of information. Information bias occurs due to people's curiosity and confusion of goals when trying to choose a course of action.
Contents [hide]
1 Over-evaluation of information
1.1 Globoma experiment
2 References
3 See also
[edit]Over-evaluation of information
An example of information bias is believing that the more information that can be acquired to make a decision, the better, even if that extra information is irrelevant for the decision.
Examples of information bias are prevalent in medical diagnosis. Subjects in experiments concerning medical diagnostic problems show an information bias in which they seek information that is unnecessary in deciding the course of treatment.
[edit]Globoma experiment
In an experiment,[1] subjects considered this diagnostic problem involving fictitious diseases:
A female patient is presenting symptoms and a history which both suggest a diagnosis of globoma, with about 80% probability. If it isn't globoma, it's either popitis or flapemia. Each disease has its own treatment which is ineffective against the other two diseases. A test called the ET scan would certainly yield a positive result if the patient had popitis, and a negative result if she has flapemia. If the patient has globoma, a positive and negative result are equally likely. If the ET scan was the only test you could do, should you do it? Why or why not?
Many subjects answered that they would conduct the ET scan even if it were costly, and even if it were the only test that could be done. However, the test in question does not affect the course of action as to what treatment should be done. Because the probability of globoma is so high with a probability of 80%, the patient would be treated for globoma no matter what the test says. Globoma is the most probable disease before or after the ET scan.
In this example, we can calculate the value of the ET scan. Out of 100 patients, a total of 80 people will have globoma regardless of whether the ET scan is positive or negative. Since it is equally likely for a patient with globoma to have a positive or negative ET scan result, 40 people will have a positive ET scan and 40 people will have a negative ET scan, which totals to 80 people having globoma. This means that a total of 20 people will have either popitis or flapemia regardless of the result of the ET scan. The number of patients with globoma will always be greater than the number of patients with popitis or flapemia in either case of a positive or negative ET scan so the ET scan is useless in determining what disease to treat. The ET scan will indicate that globoma should be treated regardless of the result.
[edit]Reference...
Key elements of an Effective Website
1. Appearrence
2. Content
3.Functionality
4. Website Usability
5. Search Engine Optimization
www.google.com
www.yahoo.com
www.facebook.com
www.youtube.com
www.msn.com
By Mandy Porta, posted on April 13, 2009
Is your website producing the results you hoped for? Are you serious about the effectiveness of your online investment? This article explains eight basic ingredients of a successful website. Take note, because missing an ingredient can result in a poor aftertaste for your website visitors.
1. Original, Fresh Content
Content is king in the web world. People visit websites for the primary purpose of finding content, so make sure you deliver. Website content should be unique and up to date. Fresh content will keep visitors and search engines coming back for more. Don’t forget to proofread!
2. Target Audience
From a quick scan of your website, visitors should be able to determine what you offer and how it can benefit them. A good website has headlines and text that speak to the target audience’s needs and wants. Many websites simply list what the company does without saying how it benefits the target audience. Keep your audience in mind when designing your website to be sure that it appeals to them and encourages them to take action (whether that is to submit a contact form, sign up for a newsletter or buy a product).
3. User-Friendly Navigation
A good website has content that is easy to find. Pages should be organized and named in a way that the target audience will easily understand. For instance, a services page would be better labeled “Services” than “Business Solutions.” Keep your navigation consistent from page to page to avoid any possible confusion. Double check all your links to make sure they are working. Make sure that your most popular content is no more than a click away from your homepage. If your website has a lot of content, provide a search box so visitors can quickly find what they are looking for.
4. Simple and Professional Design
A good website will have an attractive layout that is easy on the eyes. Be sure your colors contrast well and your text doesn’t require a magnifying glass to read. Personally, I can’t stand reading large amounts of content written in white on a black background; it strains my eyes. Reducing the contrast a bit can help (e.g., light grey text on a dark grey background).
Lots of text can overwhelm a user. Breaking up text into subheads and bullet points will improve the layout of the page and make the text more scannable. No one has time to read every word on a page.
Use design elements to draw attention to or to enhance the content of a page. With every design element added, take a step back and make sure it serves a purpose and does not detract from the usability of the site. Put things where users expect them to be. However, do try to make your website look unique. Just remember that simple, professional design will be much more effective than flashy, overcrowded design.
5. Speed
How many seconds will you wait for a page to load before you give up and leave a website? Many factors can affect the loading time of a website, including the code itself, the number of graphics, server speed, traffic volume on the website and the capabilities of a user’s computer.
Make sure your server has enough space and bandwidth for your website and that your website code is lightweight. Use large graphics sparingly, and use CSS styles in place of graphics where possible. Waiting for large graphics or a fancy Flash animation to load on each page will surely turn away some visitors.
6. Search Engine Optimization
SEO is one of the most commonly neglected aspects of a website, but a website is useless if no one can find it. Think about the keywords that users may search for to find a product or service you offer. Do some research to see how often those keywords are searched for through a tool like Google’s Keyword Tool. Use keywords in titles, meta tags, headings, file names and in the content of your site. Search engine optimization can mean the difference between getting 500 visitors a month and 500+ visitors a day.
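As a rough illustration of the advice above, here is a hypothetical sketch that checks whether target keywords appear in the places the paragraph recommends (title, meta description, headings and body text). The sample page, keyword list and helper names are all invented for illustration:

```python
# Hypothetical sketch: audit where each target keyword appears on a page.
# The sample HTML and keywords below are made up for illustration only.
from html.parser import HTMLParser

class KeywordAudit(HTMLParser):
    def __init__(self):
        super().__init__()
        self._tag = None
        self.fields = {"title": "", "meta": "", "headings": "", "body": ""}

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "meta" and attrs.get("name") == "description":
            self.fields["meta"] += " " + attrs.get("content", "")
        self._tag = tag

    def handle_data(self, data):
        # Route text to the field matching the enclosing tag.
        if self._tag == "title":
            self.fields["title"] += " " + data
        elif self._tag in ("h1", "h2", "h3"):
            self.fields["headings"] += " " + data
        else:
            self.fields["body"] += " " + data

def audit(html, keywords):
    """Return, for each keyword, the page areas where it appears."""
    parser = KeywordAudit()
    parser.feed(html)
    return {kw: [f for f, text in parser.fields.items()
                 if kw.lower() in text.lower()]
            for kw in keywords}

page = """<html><head><title>Custom Cake Shop</title>
<meta name="description" content="Order custom cakes online"></head>
<body><h1>Custom Cakes</h1><p>We bake wedding cakes to order.</p></body></html>"""

print(audit(page, ["custom cakes", "wedding"]))
```

A keyword that only turns up in the body, like "wedding" here, is a candidate for promotion into a title or heading.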
7. Link Building
Links are an important factor in determining where your website appears in search engine results. Find as many legitimate sites as you can that will link to your website. Add your link to your business profile in directories like Google Local, Yahoo! Local, Merchant Circle, Insider Pages, Yellow Pages, LinkedIn and more. Submit articles and press releases to sites that will include a link back to your website. The best way to get links to your site, however, is to provide unique and interesting content that people want to link to. You can make it easy for them to share your content by providing links or buttons such as the addthis.com button.
8. Tracking
A good website is a work in progress. A nice tool like Google Analytics will keep track of the number of people who come to your website, what pages they viewed, where they came from, what keywords they used in search engines, how many left after the first page and more. Unlike other media, websites can be easily tracked to see what is working and what isn’t. This data will help you to improve the quality and structure of your site.
Conclusion
Remember to include these characteristics in your next website design project. A professional-looking website with interesting content that is easy to navigate and can be found in search engines is sure to bring value to your business.
Feel free to comment. What other ingredients can spice up a website? What has been your most effective strategy in gaining website traffic?
Did you find my blog post helpful? Subscribe to the RSS feed, subscribe by email, or follow us on Twitter or Facebook to stay up to date with our latest posts on various marketing topics.
Related Articles
Information bias (psychology)
From Wikipedia, the free encyclopedia
Information bias is a type of cognitive bias involving a distorted evaluation of information. It occurs because of people's curiosity and confusion of goals when trying to choose a course of action.
Over-evaluation of information
An example of information bias is believing that the more information that can be acquired to make a decision, the better, even if that extra information is irrelevant for the decision.
Examples of information bias are prevalent in medical diagnosis. Subjects in experiments concerning medical diagnostic problems show an information bias in which they seek information that is unnecessary in deciding the course of treatment.
Globoma experiment
In an experiment,[1] subjects considered this diagnostic problem involving fictitious diseases:
A female patient is presenting symptoms and a history which both suggest a diagnosis of globoma, with about 80% probability. If it isn't globoma, it's either popitis or flapemia. Each disease has its own treatment which is ineffective against the other two diseases. A test called the ET scan would certainly yield a positive result if the patient has popitis, and a negative result if she has flapemia. If the patient has globoma, a positive and a negative result are equally likely. If the ET scan were the only test you could do, should you do it? Why or why not?
Many subjects answered that they would conduct the ET scan even if it were costly, and even if it were the only test that could be done. However, the test in question does not affect the choice of treatment. Because the probability of globoma is so high (80%), the patient would be treated for globoma no matter what the test says: globoma is the most probable disease both before and after the ET scan.
In this example, we can calculate the value of the ET scan. Out of 100 patients, 80 will have globoma regardless of the scan result. Since a patient with globoma is equally likely to produce a positive or a negative ET scan, 40 of those patients will test positive and 40 will test negative. The remaining 20 patients have either popitis or flapemia. Whichever way the scan comes out, the globoma patients will always outnumber the popitis or flapemia patients, so the scan is useless in deciding which disease to treat: it will point to treating globoma regardless of the result.
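The arithmetic above can be checked directly with Bayes' rule. In this sketch the 20 non-globoma patients are split evenly between popitis and flapemia; that split is an assumption (the source does not specify it), but the conclusion holds for any split:

```python
# Bayesian check of the globoma example. Assumption: the 20% non-globoma
# probability is split evenly between popitis and flapemia; the conclusion
# (globoma stays most probable either way) holds for any split.
def posterior(prior, likelihood):
    """Posterior probabilities given priors and P(result | disease)."""
    joint = {d: prior[d] * likelihood[d] for d in prior}
    total = sum(joint.values())
    return {d: joint[d] / total for d in joint}

prior = {"globoma": 0.80, "popitis": 0.10, "flapemia": 0.10}
positive = {"globoma": 0.5, "popitis": 1.0, "flapemia": 0.0}  # P(pos | disease)
negative = {"globoma": 0.5, "popitis": 0.0, "flapemia": 1.0}  # P(neg | disease)

after_pos = posterior(prior, positive)  # globoma: 0.4 / 0.5 = 0.8
after_neg = posterior(prior, negative)  # symmetric: also 0.8

print(after_pos)
print(after_neg)
```

Either result leaves globoma as the most probable diagnosis, which is exactly why ordering the scan is an information bias.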
References
Key Elements of an Effective Website
1. Appearance
2. Content
3. Functionality
4. Website Usability
5. Search Engine Optimization
www.google.com
www.yahoo.com
www.facebook.com
www.youtube.com
www.msn.com
Monday, October 3, 2011
Chapter 6
(How to write a report and citation: http://www.monash.edu.au/lls/llonline/writing/information-technology/report/1.3.3.xml)
1. Introduction
The purpose of this report is to survey the current state of scanner technology and to briefly discuss predicted advancements in the field.
By examining a range of recently published journal articles, magazine articles and internet sites on the topic of scanners, this report describes the main types of scanners in common use today and examines their performance in relation to four criteria: resolution, bit-depth, dynamic range and software. The report then considers the effect of further technological advances in these four areas, as well as the deployment of new sensor technology, on the future development of scanners.
The first scanner, initially referred to as a 'reading machine', was developed in 1960 by Jacob Rabinow, a Russian-born engineer. The device could scan printed material and then compare each character to a set of standards in a matrix using, for the first time, the "best match principle" to determine the original message (Blatner, Fleishman and Roth 1998, p.3). This reading machine was to form the basis for the development of current scanning, sorting and processing machines.
An early improvement on the reading machine was the drum scanner. These scanners used a type of scanning technology called photomultiplier tubes (PMT). Drum scanners are still used in industry today because of the high quality images they produce. The development of smaller, more economical scanners such as desktop scanners and scanners for domestic use followed the drum scanner as the number of computer users increased and computer technology advanced.
Scanners can now capture images from a wide variety of two and three dimensional sources. These images are converted to digitised computer files that can be stored on a hard-drive or floppy disk. With the aid of specific software, these images can then be manipulated and enhanced by the user. It is now possible to deploy electronic acquisition to create an entire layout (including all graphic elements) from the same computer. This means manual stripping is no longer required (Scanners, digital cameras and photo CDs 2000). Scanners are considered an invaluable tool for adding graphics and text to documents and have been readily adopted by both business and domestic users.
2. How scanners work
A scanner is a device that uses a light source to electronically convert an image into binary data (0s and 1s). This binary data can then be used to store the scanned image on a computer. A scanner recreates an image by using small electronic components referred to as the scanner's 'eyes' (Scanner tips 2000). The 'eyes' used in today's scanners are charge-coupled devices (CCDs) and photomultiplier tubes (PMTs). These electronic eyes measure the amount of light reflected from individual points on the page and translate it to digital signals that correspond to the brightness of each point (Englander 2000).
To create a file on the computer that represents a colour image, the scanner divides the image into a grid with many individual points called pixels or picture elements (Scanner tips 2000). A scanning head, a row of these 'eyes', reads over the grid and assigns a number to each pixel based on the main colour in that pixel, using green, blue and red. For example, an aqua pixel would be saved as a number representing the proportions of blue, green and red that make up the colour aqua (Scanners, digital cameras and photo CDs 2000).
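The per-pixel number the paragraph describes can be sketched as follows. This is an illustration rather than the report's method, and the helper names are invented: it packs 8-bit red, green and blue values into a single 24-bit integer, the bit depth discussed later in section 4.2.

```python
# Illustrative sketch: encode a pixel's red, green and blue proportions as
# one number, as the text describes a scanner doing for each grid point.
# 8 bits per channel yields the 24-bit depth covered in section 4.2.
def pack_rgb(r, g, b):
    """Combine three 0-255 channel values into one 24-bit integer."""
    return (r << 16) | (g << 8) | b

def unpack_rgb(value):
    """Recover the (red, green, blue) channels from a packed value."""
    return ((value >> 16) & 0xFF, (value >> 8) & 0xFF, value & 0xFF)

aqua = pack_rgb(0, 255, 255)   # an aqua pixel: no red, full green and blue
print(aqua)                    # 65535
print(unpack_rgb(aqua))        # (0, 255, 255)
```

A scanned image is then just a grid of such numbers, one per pixel.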
3. Types of scanners
There are five main types of scanners in common use today: drum scanners, flatbed scanners, sheet-fed scanners, slide scanners, and hand held scanners.
3.1 Drum scanners
Drum scanners were widely used in the past; however, they are much less commonly used today due to advances in scanner technology. As a result of their expense, these machines are primarily used by professionals in industry, where they are valued for the high-end quality images they produce and because they use PMT technology, which is more sophisticated than charge-coupled devices (CCDs) and contact image sensors (CISs). Drum scanners are difficult to operate: technicians scan an item by placing it on a glass cylinder that rotates at high speed around the sensor (Sullivan 1996).
3.2 Flatbed scanners
The most popular scanners for general use are flatbed scanners. This type of scanner is highly versatile because it is able to scan flat objects as well as small three dimensional objects. Flatbed scanners operate by placing the item to be scanned on a glass window while scanning heads move underneath it. A transparency adapter is used to scan transparent originals such as slides or x-rays, and an automatic document feeder is available for scanning large numbers of documents (Scanner tips 2000).
3.3 Sheet-fed scanners
Sheet-fed scanners have grown in popularity in recent years, particularly for small office or domestic use, as they are reasonably priced, can scan full-sized documents and are compact, requiring limited desk space (Scanner tips 2000). Most models of sheet-fed scanners have an inbuilt document feeder to overcome the problem of manually feeding one sheet of paper at a time. However, the actual process of scanning with a sheet-fed scanner may result in distortion, as the image to be scanned moves over the scanning heads (Scanner tips 2000). A further limitation of sheet-fed scanners is that they are unable to scan three dimensional objects.
3.4 Slide scanners
This type of scanner is used to scan items such as slides that need careful handling during scanning. Unlike other scanners, the scanning heads in slide scanners do not reflect light from the image, but rather pass light through it. This enables these scanners to produce superior results without distortions caused by reflective light. To be able to scan small and detailed items, these scanners have a large number of eyes on the scanning head which produces a high quality result. Slide scanners tend to be more expensive and less versatile than flatbed and sheet-fed scanners as they are limited to only scanning slides and film. These scanners, however, are well suited to users requiring high quality scans of large numbers of slides (Scanner tips 2000).
3.5 Hand held scanners
Hand held scanners are compact, portable scanners which are simply dragged across a page manually to capture an image. These scanners are easy to use and economical to purchase; however, their use is limited to text of up to four inches in diameter that does not require a high resolution. For this reason, hand held scanners are unsuitable for colour images. A further disadvantage of hand held scanners is that the user must have a steady hand when scanning or the resulting image will be distorted (Scanner tips 2000).
4. Scanner specifications
The performance of a scanner can be examined in relation to four main criteria: resolution, bit-depth, dynamic range and software.
4.1 Resolution
Resolution is a measure of how many pixels a scanner can sample in a given image. It is used to describe the amount of detail in an image (Figeiredo, McIllree and Thomas 1996). Higher resolution scanners are generally more expensive and produce superior results as they have a greater capacity to capture detail. Scanners have two types of resolutions: optical resolution and interpolated resolution.
Optical resolution, or hardware resolution, is a measure of how many pixels a scanner can actually read. A current model desktop scanner typically has a resolution of 300 x 300 dots per inch (dpi) (Anderson 1999). This means that this scanner has a scanning head with 300 sensors per inch, so it can sample 300 dpi in one direction and 300 dpi in the other direction by stopping the scanning head 300 times per inch in both directions. Some scanners stop the scanning head more frequently as it moves down the page, giving an optical resolution of 300 x 600 dpi; however, scanning more frequently in one direction does not improve the result of the scan. The basic requirement for scanning detailed images and line art from photos or other printed originals is an optical resolution of 600 dpi. When scanning slides and negatives the minimum optical resolution is 1200 dpi.
Interpolated resolution measures the number of pixels a scanner is able to predict. A scanner can turn a 300 x 300 dpi scan into a 600 x 600 dpi scan by looking in-between scanned pixels and guessing what that spot would have looked like if it had been scanned. This prediction is then used to insert new pixels in between the actual ones scanned. This technique is less precise than optical resolution; however it assists in improving the enlargement of scanned images.
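The "guessing in between" that interpolation performs can be illustrated with a minimal sketch. This uses a simple linear average; real scanner firmware uses more elaborate prediction schemes, and the function name is invented:

```python
# Minimal sketch of interpolated resolution: doubling the sample count of a
# row of scanned brightness values by averaging neighbouring pixels. The
# inserted values are predictions, not measurements, which is why
# interpolated resolution is less precise than optical resolution.
def interpolate_row(row):
    """Insert the average of each neighbouring pair between them."""
    out = []
    for left, right in zip(row, row[1:]):
        out.append(left)
        out.append((left + right) / 2)  # predicted, not measured
    out.append(row[-1])
    return out

scanned = [100, 120, 200]           # optical samples from the scanning head
print(interpolate_row(scanned))     # [100, 110.0, 120, 160.0, 200]
```

Note that the inserted values (110 and 160) contain no new information about the original image; they only smooth enlargements.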
4.2 Bit depth
Bit depth refers to the amount of information that a scanner records for each pixel when converting an image to digital form. Scanners differ in the amount of data they record for each pixel within an image. The simplest kinds of scanners only record data related to black and white details and have a bit depth of 1 (Anderson 1999). The minimum bit depth required for scanning photographs and documents is 24-bits, while slides, negatives or transparencies need a scanner with at least 30-bits.
Thus, for a scanner to produce a high quality colour scan, a higher bit depth is required. In general, current scanners have a bit depth of 24, which means that 8 bits of information can be collected for each of the three primary colours used in scanning: blue, red and green (Anderson 1999). This depth of colour information allows scanners to produce images close to photographic quality.
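The figures in this section follow from simple arithmetic: a bit depth of n allows 2^n distinct values per pixel. A quick sketch (the function name is invented for illustration):

```python
# Arithmetic behind the bit-depth figures in the text: the number of
# distinct values a pixel can take is 2 raised to the bit depth.
def distinct_colours(bit_depth):
    return 2 ** bit_depth

print(distinct_colours(1))    # 2: black and white only
print(distinct_colours(24))   # 16777216: 8 bits for each of R, G and B
print(distinct_colours(30))   # the minimum for slides and negatives
```

The jump from 16.7 million values at 24 bits to over a billion at 30 bits is what makes the extra depth worthwhile for transparencies.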
4.3 Dynamic range
Dynamic range refers to the range of tones a scanner can record, measured on a scale of 0.0 to 4.0, with 0.0 being perfect white and 4.0 being perfect black. Colour flatbed scanners usually have a dynamic range of 2.4, which is unable to provide high quality colour scans. A dynamic range between 2.8 and 3.2 is suited to professional purposes and can be found in high-end scanners. An even higher dynamic range of 3.0 to 3.8 can be provided by drum scanners.
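The 0.0 to 4.0 scale mentioned above is a logarithmic density scale; this is standard densitometry practice, though the report does not spell it out, so treat the sketch below as background rather than part of the report. Each whole unit corresponds to a tenfold reduction in reflected or transmitted light:

```python
import math

# Background sketch (standard densitometry, an assumption not stated in the
# report): optical density D = log10(1 / fraction of light returned), so
# each unit of density is a tenfold drop in light.
def density(light_fraction):
    """Optical density for the fraction of light reflected or transmitted."""
    return math.log10(1 / light_fraction)

print(density(1.0))                  # 0.0: all light returned, perfect white
print(round(density(0.0001), 6))     # 4.0: one ten-thousandth, near-perfect black
```

On this scale, the gap between a 2.4 flatbed and a 3.8 drum scanner is a factor of about 25 in the darkest tones the device can distinguish.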
4.4 Software
A scanner, like any type of hardware, requires software. The two most common pieces of software provided with scanners are optical character recognition (OCR) and image editing software. Optical character recognition software translates the information recorded in a scan, tiny dots, into an editable text file. Image editing software allows the tones and colours of an image to be manipulated for better printing and display, and also provides filters for applying special effects to scanned images.
5. Future developments
The quality of scanned images is constantly improving as characteristics such as resolution, bit-depth and dynamic range are enhanced and further developed. More sophisticated image editing and optical character recognition software development is also resulting in superior quality scans. Future advances are expected to result in the incorporation of specialized scanners into other types of technology such as the recently developed digital camera. This device allows the user to take pictures of three-dimensional objects much like a regular camera, except that instead of using film, the objects are scanned by the camera in a similar process to the functioning of a flatbed scanner.
The relatively new area of sensor technology in the form of a contact image sensor (CIS) (see Appendix 1) is expected to improve the functionality of scanners and the quality of images as it "replaces the cumbersome optical reduction technique with a single row of sensors" (Grotta and Wiener 1998, p. 1). Developers have already been able to produce a CIS scanner which is thinner, lighter, more energy efficient and cheaper to manufacture than a traditional CCD-based device. However, the quality of the scan is not as good as that of its counterparts. Further development of CIS technology is needed to improve image quality and colour, and to address the current limitation of 300 or 600 dpi resolution.
6. Conclusion
This report has identified five types of scanners currently available. Some are primarily used for professional purposes such as the drum scanner; others are used more broadly in the workplace and home such as flatbed scanners and to a lesser extent sheetfed scanners. Scanners for specialized purposes have also been identified such as slide and handheld scanners. The performance of these scanners is dependent upon their resolution, bit-depth, dynamic range and software. Scanners have improved significantly in recent years in terms of weight, size, price and speed, and the replacement of CCD technology with CIS technology is anticipated to produce further benefits to these areas as well as to scan quality. The impact of these improvements is expected to increase the accessibility of scanner technology to a wider range of users and its suitability for a wider range of purposes. In relation to this, the future of scanner technology seems to point to the convergence of different technologies. Specialized scanners are currently being incorporated into other types of technologies such as digital cameras, printers, and photocopiers. This can be expected to continue with other forms of technology in conjunction with further improvements to image quality, speed, price, size and weight.
7. Reference list
Anderson, D. (1999) The PC guide. [http://www.pctechguide.com/18scanners.htm].
Blatner, D., Fleishman, G. and Roth, G. (1998) Real world scanning and halftones, 2nd edition, Peachpit Press, USA.
Englander, I. (2000) The architecture of computer hardware and systems software, John Wiley, USA, p. 272.
Figeiredo, J., McIllree, J. and Thomas, N. (1996) Introducing information technology, 2nd edition, Jacaranda Press, Singapore, p. 145.
Grotta, D. and Weiner, S. (1998) What's now ...What's next. [http://www.zdnet.com/pcmag/features/scanners98/intro.html] PC Magazine, 20 October 1998. Accessed 8/4/00.
Prepress, scanners, digital cameras and photoCDs (1998). [http://www.prepress.pps.com/mem/lib/ptr/scanners.html]. Accessed 6/4/00.
Scansoft scanner tips (2000). [http://www.scannercentral.com/scanners/tips/tips1.asp]. Accessed 6/4/00.
Sullivan, M. (1996) Types of scanners. [http://hsdesign.com/scanning/types/types.html]. Accessed 8/4/00.
1. Introduction
The purpose of this report is to survey the current state of scanner technology and to briefly discuss predicted advancements in the field.
By examining a range of recently published journal articles, magazine articles and internet sites on the topic of scanners this report describes the main types of scanners in common use today and examines their performance in relation to four criteria: resolution, bit-depth, dynamic range and software. The report then considers the effect of further technological advances in these four areas, as well as the deployment of new sensor technology on the future development of scanners.
The first scanner, initially referred to as a 'reading machine', was developed in 1960 by Jacob Rabinow, a Russian born engineer. The device could scan printed material and then compare each character to a set of standards in a matrix using, for the first time, the "best match principle" to determine the original message (Blatner, Fleishman and Roth 1998, p.3). This reading machine was to form the basis for the development of current scanning, sorting and processing machines.
An early improvement on the reading machine was the drum scanner. These scanners used a type of scanning technology called photomultiplier tubes (PMT). Drum scanners are still used in industry today because of the high quality images they produce. The development of smaller, more economical scanners such as desktop scanners and scanners for domestic use followed the drum scanner as the number of computer users increased and computer technology advanced.
Scanners can now capture images from a wide variety of two and three dimensional sources. These images are converted to digitised computer files that can be stored on a hard-drive or floppy disk. With the aid of specific software, these images can then be manipulated and enhanced by the user. It is now possible to deploy electronic acquisition to create an entire layout (including all graphic elements) from the same computer. This means manual stripping is no longer required (Scanners, digital cameras and photo CDs 2000). Scanners are considered an invaluable tool for adding graphics and text to documents and have been readily adopted by both business and domestic users.
2. How scanners work
A scanner is a device that uses a light source to electronically convert an image into binary data (0s and 1s). This binary data can then be used to store the scanned image on a computer. A scanner recreates an image by using small electronic components referred to as the scanner's 'eyes' (Scanner tips 2000). The type of 'eyes' used in today's scanners are charge-coupled devices (CCD) and photomultiplier tubes (PMT). These electronic eyes measure the amount of light reflected from individual points on the page and translate it to digital signals that correspond to the brightness of each point (Englander 2000).
To create a file on the computer that represents a colour image, the scanner divides the image into a grid of many individual points called pixels, or picture elements (Scanner tips 2000). A scanning head, containing a row of 'eyes', reads over the grid and assigns a number to each pixel based on the colour in that pixel, expressed as proportions of red, green and blue. For example, an aqua pixel would be saved as numbers representing the proportions of blue, green and red that make up aqua (Scanners, digital cameras and photo CDs 2000).
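The idea of storing each pixel as red, green and blue proportions can be sketched in code. This is a minimal illustration, not any scanner's actual file format; the function names are invented for the example, and it assumes the common convention of one 8-bit value (0-255) per colour channel:

```python
# A minimal sketch (not a real scanner file format): each pixel is
# stored as three 8-bit values, one each for red, green and blue.
def encode_pixel(red, green, blue):
    """Pack three 0-255 channel values into a single 24-bit number."""
    for channel in (red, green, blue):
        if not 0 <= channel <= 255:
            raise ValueError("channel values must be 0-255")
    return (red << 16) | (green << 8) | blue

def decode_pixel(value):
    """Unpack a 24-bit number back into (red, green, blue)."""
    return (value >> 16) & 0xFF, (value >> 8) & 0xFF, value & 0xFF

# An aqua pixel: no red, full green, full blue.
aqua = encode_pixel(0, 255, 255)
```

A scanned image is then simply a grid of such numbers, one per pixel, which is why file size grows with both resolution and bit depth.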
3. Types of scanners
There are five main types of scanners in common use today: drum scanners, flatbed scanners, sheet-fed scanners, slide scanners, and hand held scanners.
3.1 Drum scanners
Drum scanners were widely used in the past; however, they are much less common today due to advances in scanner technology. Because of their expense, these machines are primarily used by professionals in industry, who value the high-end quality of the images they produce and their PMT technology, which is more sophisticated than charge-coupled devices (CCDs) and contact image sensors (CISs). Drum scanners are difficult to operate: a technician places the item to be scanned on a glass cylinder that rotates at high speed around the sensor (Sullivan 1996).
3.2 Flatbed scanners
The most popular scanners for general use are flatbed scanners. This type of scanner is highly versatile because it is able to scan flat objects as well as small three dimensional objects. Flat-bed scanners operate by placing the item to be scanned on a glass window while scanning heads move underneath it. A transparency adapter is used to scan transparent originals such as slides or x-rays, and an automatic document feeder is available for scanning large numbers of documents (Scanner tips 2000).
3.3 Sheet-fed scanners
Sheet-fed scanners have grown in popularity in recent years, particularly for small office or domestic use, as they are reasonably priced, can scan full-sized documents and are compact, requiring limited desk space (Scanner tips 2000). Most models of sheet-fed scanners have an inbuilt document feeder to overcome the problem of manually feeding one sheet of paper at a time. However, the actual process of scanning with a sheet-fed scanner may result in distortion, as the image to be scanned moves over the scanning heads (Scanner tips 2000). A further limitation of sheet-fed scanners is that they are unable to scan three dimensional objects.
3.4 Slide scanners
This type of scanner is used to scan items such as slides that need careful handling during scanning. Unlike other scanners, the scanning heads in slide scanners do not reflect light from the image, but rather pass light through it. This enables these scanners to produce superior results without distortions caused by reflective light. To be able to scan small and detailed items, these scanners have a large number of eyes on the scanning head which produces a high quality result. Slide scanners tend to be more expensive and less versatile than flatbed and sheet-fed scanners as they are limited to only scanning slides and film. These scanners, however, are well suited to users requiring high quality scans of large numbers of slides (Scanner tips 2000).
3.5 Hand held scanners
Hand held scanners are compact, portable scanners which are simply dragged across a page manually to capture an image. These scanners are easy to use and economical to purchase; however, their use is limited to text of up to four inches in diameter that does not require a high resolution. For this reason, hand held scanners are unsuitable for colour images. A further disadvantage of hand held scanners is that the user must have a steady hand when scanning or the resulting image will be distorted (Scanner tips 2000).
4. Scanner specifications
The performance of a scanner can be examined in relation to four main criteria: resolution, bit-depth, dynamic range and software.
4.1 Resolution
Resolution is a measure of how many pixels a scanner can sample in a given image. It is used to describe the amount of detail in an image (Figeiredo, McIllree and Thomas 1996). Higher resolution scanners are generally more expensive and produce superior results as they have a greater capacity to capture detail. Scanners have two types of resolutions: optical resolution and interpolated resolution.
Optical resolution, or hardware resolution, is a measure of how many pixels a scanner can actually read. A current model desktop scanner typically has a resolution of 300 x 300 dots per inch (dpi) (Anderson 1999). This means that this scanner has a scanning head with 300 sensors per inch, so it can sample 300 dpi in one direction and 300 dpi in the other direction by stopping the scanning head 300 times per inch in both directions. Some scanners stop the scanning head more frequently as it moves down the page, giving an optical resolution of 300 x 600 dpi; however, scanning more frequently in one direction does not improve the result of the scan. The basic requirement for scanning detailed images and line art from photos or other printed originals is an optical resolution of 600 dpi. When scanning slides and negatives the minimum optical resolution is 1200 dpi.
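The relationship between optical resolution and the size of the resulting scan follows directly from the definition above: the number of pixels sampled is the physical size of the original multiplied by the dots per inch in each direction. A small sketch (the function name is invented for illustration):

```python
def scan_dimensions(width_in, height_in, dpi_x, dpi_y):
    """Pixels sampled for an original of the given size (in inches)
    at a given optical resolution (dots per inch in each direction)."""
    return round(width_in * dpi_x), round(height_in * dpi_y)

# A 4 x 6 inch photo scanned at the typical 300 x 300 dpi:
photo = scan_dimensions(4, 6, 300, 300)  # 1200 x 1800 pixels
```

This also shows why scanning more frequently in only one direction (300 x 600 dpi) does not genuinely add detail: the sensor row still samples only 300 points per inch across the page.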
Interpolated resolution measures the number of pixels a scanner is able to predict. A scanner can turn a 300 x 300 dpi scan into a 600 x 600 dpi scan by looking in-between scanned pixels and guessing what that spot would have looked like if it had been scanned. This prediction is then used to insert new pixels in between the actual ones scanned. This technique is less precise than optical resolution; however it assists in improving the enlargement of scanned images.
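The "guessing" step of interpolation can be illustrated with a simple one-dimensional sketch. Real scanner drivers use more elaborate methods, but the simplest assumption, shown here, is that each inserted pixel is the average of its two scanned neighbours (linear interpolation):

```python
def interpolate_row(row):
    """Roughly double a row's resolution by inserting the average of
    each neighbouring pair of scanned values (linear interpolation)."""
    out = []
    for a, b in zip(row, row[1:]):
        out.append(a)
        out.append((a + b) / 2)  # the 'guessed' pixel between two real ones
    out.append(row[-1])
    return out

interpolate_row([0, 100, 200])  # [0, 50.0, 100, 150.0, 200]
```

The inserted values contain no new information from the original, which is why interpolated resolution is less precise than optical resolution even though it makes enlargements look smoother.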
4.2 Bit depth
Bit depth refers to the amount of information that a scanner records for each pixel when converting an image to digital form. Scanners differ in the amount of data they record for each pixel within an image. The simplest kinds of scanners only record data related to black and white details and have a bit depth of 1 (Anderson 1999). The minimum bit depth required for scanning photographs and documents is 24-bits, while slides, negatives or transparencies need a scanner with at least 30-bits.
Thus, for a scanner to produce a high quality colour scan, a higher bit depth is required. In general, current scanners have a bit depth of 24, which means that 8 bits of information are collected for each of the three primary colours used in scanning: red, green and blue (Anderson 1999). This bit depth allows scanners to produce images close to photographic quality.
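The number of distinct values a scanner can record per pixel grows exponentially with bit depth, which is what the figures above reflect. A one-line sketch makes the arithmetic concrete:

```python
def recordable_values(bit_depth):
    """Distinct values a scanner can record per pixel: 2 ** bit_depth."""
    return 2 ** bit_depth

recordable_values(1)   # 2 values: black and white only
recordable_values(24)  # 16,777,216 colours (8 bits per RGB channel)
```

At 24 bits, the roughly 16.7 million representable colours exceed what the eye can distinguish, which is why 24-bit scans approach photographic quality.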
4.3 Dynamic range
Dynamic range refers to the range of tones a scanner can record, measured on a scale of 0.0 to 4.0, with 0.0 being perfect white and 4.0 being perfect black. Colour flatbed scanners usually have a dynamic range of 2.4, which is insufficient for high quality colour scans. A dynamic range of 2.8 to 3.2 is suited to professional purposes and can be found in high-end scanners. An even higher dynamic range of 3.0 to 3.8 can be provided by drum scanners.
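The 0.0-4.0 scale is a logarithmic optical-density scale: density is the negative base-10 logarithm of the fraction of light returned from a point, so each whole unit means ten times less light. A small sketch under that standard definition (the function name is invented for illustration):

```python
import math

def optical_density(fraction_of_light):
    """Optical density: 0.0 when all light returns (perfect white),
    rising as less light returns (towards perfect black)."""
    return -math.log10(fraction_of_light)

optical_density(1.0)    # 0.0: perfect white
optical_density(0.004)  # ~2.4: the darkest tone a typical flatbed resolves
```

On this scale a dynamic range of 2.4 means the scanner can distinguish tones down to about 0.4% of the incident light; a drum scanner at 3.8 resolves tones roughly 25 times darker still.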
4.4 Software
A scanner, like any type of hardware, requires software. The two most common pieces of software provided with scanners are optical character recognition (OCR) software and image editing software. Optical character recognition software translates the information recorded in a scan, tiny dots, into a text file which can be edited. Image editing software allows the tones and colours of an image to be manipulated for better printing and display, and provides filters for applying special effects to scanned images.
5. Future developments
The quality of scanned images is constantly improving as characteristics such as resolution, bit-depth and dynamic range are enhanced and further developed. More sophisticated image editing and optical character recognition software development is also resulting in superior quality scans. Future advances are expected to result in the incorporation of specialized scanners into other types of technology such as the recently developed digital camera. This device allows the user to take pictures of three-dimensional objects much like a regular camera, except that instead of using film, the objects are scanned by the camera in a similar process to the functioning of a flatbed scanner.
The relatively new area of sensor technology in the form of the contact image sensor (CIS) (see Appendix 1) is expected to improve the functionality of scanners and the quality of images, as it "replaces the cumbersome optical reduction technique with a single row of sensors" (Grotta and Wiener 1998, p. 1). Developers have already been able to produce a CIS scanner which is thinner, lighter, more energy efficient and cheaper to manufacture than a traditional CCD-based device. However, the quality of the scan is not yet as good as that of its counterparts. Further development of CIS technology is needed to improve image quality and colour, and to overcome its current limitation of 300 or 600 dpi resolution.
6. Conclusion
This report has identified five types of scanners currently available. Some are primarily used for professional purposes such as the drum scanner; others are used more broadly in the workplace and home such as flatbed scanners and to a lesser extent sheetfed scanners. Scanners for specialized purposes have also been identified such as slide and handheld scanners. The performance of these scanners is dependent upon their resolution, bit-depth, dynamic range and software. Scanners have improved significantly in recent years in terms of weight, size, price and speed, and the replacement of CCD technology with CIS technology is anticipated to produce further benefits to these areas as well as to scan quality. The impact of these improvements is expected to increase the accessibility of scanner technology to a wider range of users and its suitability for a wider range of purposes. In relation to this, the future of scanner technology seems to point to the convergence of different technologies. Specialized scanners are currently being incorporated into other types of technologies such as digital cameras, printers, and photocopiers. This can be expected to continue with other forms of technology in conjunction with further improvements to image quality, speed, price, size and weight.
7. Reference list
Anderson, D. (1999) The PC Guide. [http://www.pctechguide.com/18scanners.htm].
Blatner, D., Fleishman, G. and Roth, G. (1998) Real world scanning and halftones, 2nd edition, Peachpit Press, USA.
Englander, I. (2000) The architecture of computer hardware and systems software, John Wiley, USA, p. 272.
Figeiredo, J., McIllree, J. and Thomas, N. (1996) Introducing information technology, 2nd edition, Jacaranda Press, Singapore, p. 145.
Grotta, D. and Wiener, S. (1998) What's now... what's next. PC Magazine, 20 October. [http://www.zdnet.com/pcmag/features/scanners98/intro.html]. Accessed 8/4/00.
Prepress, scanners, digital cameras and photo CDs (1998). [http://www.prepress.pps.com/mem/lib/ptr/scanners.html]. Accessed 6/4/00.
Scansoft scanner tips (2000). [http://www.scannercentral.com/scanners/tips/tips1.asp]. Accessed 6/4/00.
Sullivan, M. (1996) Types of scanners. [http://hsdesign.com/scanning/types/types.html]. Accessed 8/4/00.