
Computers and Mathematics

ALGORITHMS

In mathematics, an algorithm is a method of solving a problem by repeatedly applying a simpler computational procedure. A basic example is the process of long division in arithmetic. The term algorithm is now applied to many kinds of problem solving that employ a mechanical sequence of steps, as in setting up a computer program. The sequence may be displayed in the form of a flowchart to make it easier to follow.
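To make the idea concrete, here is a minimal sketch in Python (not from the original article) of long division as such a mechanical sequence of steps, each step using only the simpler operations of comparison, division, and subtraction; the function name long_division is ours.

def long_division(dividend, divisor):
    # Long division digit by digit: bring down the next digit, then
    # apply the simpler operations of integer division and subtraction.
    if divisor == 0:
        raise ZeroDivisionError("divisor must be nonzero")
    quotient_digits = []
    remainder = 0
    for digit in str(dividend):
        remainder = remainder * 10 + int(digit)  # bring down the next digit
        quotient_digits.append(str(remainder // divisor))
        remainder = remainder % divisor
    return int("".join(quotient_digits)), remainder

print(long_division(9876, 42))  # (235, 6), since 42 * 235 + 6 = 9876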

As with algorithms used in arithmetic, algorithms for computers can range from simple to highly complex. In all cases, however, the task that the algorithm is to accomplish must be definable. The definition may involve mathematical or logical terms, a compilation of data, or written instructions, but the task itself must be one that can be stated in some way. In terms of ordinary computer usage, this means that algorithms must be programmable, even if the tasks themselves turn out to have no solution.
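The point that a task need only be statable, not solvable, can be sketched in Python. Whether an odd perfect number exists is a famous open question, yet a search for one is fully programmable; the function names below are illustrative, not from the original text.

def is_perfect(n):
    # A number is perfect if it equals the sum of its proper divisors.
    return n == sum(d for d in range(1, n // 2 + 1) if n % d == 0)

def find_odd_perfect(limit):
    # Search the odd numbers below `limit`; the task is precisely
    # defined and programmable even if no solution exists at all.
    for n in range(1, limit, 2):
        if is_perfect(n):
            return n
    return None  # none found in this range; the question remains open

print(find_odd_perfect(10000))  # prints None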

In computational devices with built-in microcomputer logic, that logic is itself a form of algorithm. As computers increase in complexity, more and more software algorithms are taking the form of what is called hard software: they are increasingly becoming part of the basic circuitry of computers or easily attached adjuncts, as well as standing alone in special devices such as office payroll machines. Many different application algorithms are now available, and highly advanced systems such as artificial intelligence algorithms may become common in the future.

ARTIFICIAL INTELLIGENCE

Artificial Intelligence (AI), a term that in its broadest sense would indicate the ability of an artifact to perform the same kinds of functions that characterize human thought. The possibility of developing some such artifact has intrigued human beings since ancient times. With the growth of modern science, the search for AI has taken two major directions: psychological and physiological research into the nature of human thought, and the technological development of increasingly sophisticated computing systems.

In the latter sense, the term AI has been applied to computer systems and programs capable of performing tasks more complex than straightforward programming, although still far from the realm of actual thought. The most important fields of research in this area are information processing, pattern recognition, game-playing computers, and applied fields such as medical diagnosis. Current research in information processing deals with programs that enable a computer to understand written or spoken information and to produce summaries, answer specific questions, or route the information to users interested in particular topics. Essential to such programs is the ability of the system to generate grammatically correct sentences and to establish links between words, ideas, and associated concepts. Research has shown that whereas the logic of language structure (its syntax) submits to programming, the problem of meaning, or semantics, lies far deeper, in the direction of true AI.
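A toy Python sketch can show why syntax submits to programming while semantics does not: a small context-free grammar generates only grammatically correct sentences, yet nothing in it guarantees that the output is meaningful. The grammar and vocabulary below are invented for illustration.

import random

# A toy context-free grammar; terminals are plain words.
GRAMMAR = {
    "S": [["NP", "VP"]],
    "NP": [["Det", "N"]],
    "VP": [["V", "NP"]],
    "Det": [["the"], ["a"]],
    "N": [["computer"], ["algorithm"], ["idea"]],
    "V": [["processes"], ["summarizes"], ["answers"]],
}

def generate(symbol="S"):
    # Expand a nonterminal by randomly choosing one of its productions.
    if symbol not in GRAMMAR:  # a terminal word
        return [symbol]
    words = []
    for part in random.choice(GRAMMAR[symbol]):
        words.extend(generate(part))
    return words

# Always grammatical, not always meaningful: e.g. "the idea answers a computer".
print(" ".join(generate()))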

In medicine, programs have been developed that analyze the disease symptoms, medical history, and laboratory test results of a patient and then suggest a diagnosis to the physician. Such a diagnostic program is an example of so-called expert systems: programs designed to perform tasks in specialized areas as a human expert would. Expert systems take computers a step beyond straightforward programming, being based on a technique called rule-based inference, in which preestablished rule systems are used to process the data. Despite their sophistication, such systems still do not approach the complexity of true intelligent thought.
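The following Python sketch illustrates rule-based inference of the kind described: a rule fires whenever all of its conditions are among the known facts, and its conclusion is added until nothing changes (forward chaining). The symptoms and rules are invented and not drawn from any real diagnostic system.

# Each rule pairs a set of required facts with a conclusion.
RULES = [
    ({"fever", "cough"}, "respiratory_infection"),
    ({"respiratory_infection", "chest_pain"}, "possible_pneumonia"),
    ({"possible_pneumonia"}, "recommend_chest_xray"),
]

def infer(facts):
    # Forward chaining: fire every applicable rule until a fixed point.
    facts = set(facts)
    changed = True
    while changed:
        changed = False
        for conditions, conclusion in RULES:
            if conditions <= facts and conclusion not in facts:
                facts.add(conclusion)
                changed = True
    return facts

print(infer({"fever", "cough", "chest_pain"}))
# The result includes 'possible_pneumonia' and 'recommend_chest_xray'.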

Many scientists remain doubtful that true AI can ever be developed. The operation of the human mind is still little understood, and computer design may remain essentially incapable of analogously duplicating those unknown, complex processes. Various routes are being pursued in the effort to reach the goal of true AI. One approach is to apply the concept of parallel processing: interlinked and concurrent computer operations. Another is to create networks of experimental computer chips, called silicon neurons, that mimic the data-processing functions of brain cells. Using analog technology, the transistors in these chips emulate nerve-cell membranes in order to operate at the speed of neurons.

LINEAR PROGRAMMING

Linear Programming, a mathematical and operations-research technique used in administrative and economic planning to maximize or minimize a linear function of a large number of variables, subject to certain constraints. The development of high-speed electronic computers and data-processing techniques has brought about many recent advances in linear programming, and the technique is now widely used in industrial and military operations.

Linear programming is basically used to find a set of values, chosen from a prescribed set of numbers, that will maximize or minimize a given linear form subject to linear constraints. A standard illustration is production planning: a manufacturer must decide how much of each article to produce with limited resources, knowing that as many articles as are produced can be sold.
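A small worked example, assuming the SciPy library is available, shows the technique in practice. Since scipy.optimize.linprog minimizes, the profit is negated to turn maximization into minimization; all figures for profits and resource limits below are invented.

from scipy.optimize import linprog

# Hypothetical manufacturer: two products with profits of 40 and 30 per
# unit, limited machine hours and labor hours.
# Maximize 40*x1 + 30*x2, i.e. minimize -(40*x1 + 30*x2).
c = [-40, -30]

# Constraints, written as A_ub @ x <= b_ub:
#   2*x1 + 1*x2 <= 100   (machine hours)
#   1*x1 + 2*x2 <= 80    (labor hours)
A_ub = [[2, 1], [1, 2]]
b_ub = [100, 80]

result = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=[(0, None), (0, None)])
print(result.x)     # optimal quantities, here (40, 20)
print(-result.fun)  # maximum profit, here 2200

At the optimum both resource constraints are binding, which is typical of linear programs: the best solution lies at a vertex of the feasible region.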
