Computers and Mathematics


Algorithm

In mathematics, an algorithm is a method for solving a problem by repeatedly applying some simpler computational procedure. A basic example is the process of long division in arithmetic. The term algorithm is now applied to many kinds of problem solving that employ a mechanical sequence of steps, as in setting up a computer program. The sequence may be presented in the form of a flowchart to make it easier to follow.
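To make the idea concrete, here is a minimal sketch in Python of the digit-by-digit long-division procedure; the function name and structure are our own illustration, not a standard library routine.

    def long_division(dividend, divisor):
        """Divide digit by digit, as in pencil-and-paper long division."""
        quotient_digits = []
        remainder = 0
        for digit in str(dividend):          # bring down one digit at a time
            remainder = remainder * 10 + int(digit)
            quotient_digits.append(str(remainder // divisor))
            remainder = remainder % divisor  # carry the remainder forward
        return int("".join(quotient_digits)), remainder

    # Example: 9 goes into 1234 a total of 137 times with 1 left over.
    print(long_division(1234, 9))  # (137, 1)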

As with algorithms used in arithmetic, algorithms for computers can range from simple to very complex. In all cases, however, the task the algorithm is to accomplish must be definable. That is, the definition may involve mathematical or logical terms, or a compilation of data and written instructions, but the task itself must be one that can be stated in some way. In terms of ordinary computer usage, this means that the algorithm must be programmable, even if the task itself turns out to have no solution.
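A small sketch of this last point, again in Python with names invented for illustration: the search task below is fully definable and programmable, and the procedure runs to completion even when no solution exists, in which case it simply reports failure.

    def find_integer_root(n, limit=1000):
        """Search for an integer x with x * x == n; report failure if none exists."""
        for x in range(limit + 1):
            if x * x == n:
                return x
        return None  # the task was carried out even though no solution was found

    print(find_integer_root(144))  # 12
    print(find_integer_root(145))  # None: programmable, but no solution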

In computational devices with built-in microcomputer logic, this logic is a form of algorithm. As computers increase in complexity, more and more software-program algorithms are taking the form of what is called hard software. That is, they are increasingly becoming part of the basic circuitry of computers or of easily attached adjuncts, or they stand alone in special devices such as office payroll machines. Many different application algorithms are now available, and highly advanced systems such as artificial-intelligence algorithms may become common in the future.

Artificial Intelligence

Artificial intelligence (AI), a term that in its broadest sense would indicate the ability of an artifact to perform the same kinds of functions that characterize human thought. The possibility of developing some such artifact has intrigued people since ancient times. With the growth of modern science, the search for AI has taken two main directions: psychological and physiological research into the nature of human thought, and the technological development of increasingly sophisticated computing systems.

In the latter sense, the term AI has been applied to computer systems and programs capable of performing tasks more complex than straightforward programming, although still far from the realm of actual thought. The most important fields of research in this area are information processing, pattern recognition, game-playing computers, and applied fields such as medical diagnosis. Current research in information processing deals with programs that enable a computer to understand written or spoken information and to produce summaries, answer specific questions, or redistribute information to users interested in particular areas of this information. Essential to such programs is the system's ability to generate grammatically correct sentences and to establish links between words, ideas, and associated ideas. Research has shown that whereas the logic of language structure, its syntax, submits to programming, the problem of meaning, or semantics, lies far deeper, in the direction of true AI.
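As a rough illustration of one such task, the deliberately naive Python sketch below produces an extractive summary by scoring sentences on overall word frequency. Every name in it is our own, and it is a toy stand-in for real summarization research, which must grapple with exactly the syntax and semantics problems described above.

    import re
    from collections import Counter

    def summarize(text, n_sentences=1):
        """Toy extractive summary: keep the sentences whose words are most frequent."""
        sentences = re.split(r"(?<=[.!?])\s+", text.strip())
        freq = Counter(re.findall(r"[a-z']+", text.lower()))
        def score(sentence):
            return sum(freq[w] for w in re.findall(r"[a-z']+", sentence.lower()))
        ranked = sorted(sentences, key=score, reverse=True)
        return " ".join(ranked[:n_sentences])

Note the obvious weakness of the sketch: long, word-heavy sentences always win, because nothing in it understands what the words mean.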

In medicine, programs have been developed that analyze the disease symptoms, medical history, and laboratory test results of a patient and then suggest a diagnosis to the physician. Such diagnostic programs are one example of so-called expert systems, applications designed to perform tasks in specialized areas as a human would. Expert systems take computers a step beyond straightforward programming, being based on a technique called rule-based inference, in which preestablished rule systems are used to process the data. Despite their sophistication, such systems still do not approach the complexity of true intelligent thought.
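The following minimal Python sketch shows the flavor of rule-based inference: preestablished rules fire whenever their conditions are met, adding new conclusions until none remain. The rules themselves are invented for illustration and have no medical validity.

    # Hypothetical rules: (set of required facts) -> conclusion. Not medical advice.
    RULES = [
        ({"fever", "cough"}, "possible respiratory infection"),
        ({"possible respiratory infection", "chest pain"}, "suggest chest X-ray"),
    ]

    def infer(facts):
        """Forward-chaining: fire rules until no new conclusions are produced."""
        facts = set(facts)
        changed = True
        while changed:
            changed = False
            for conditions, conclusion in RULES:
                if conditions <= facts and conclusion not in facts:
                    facts.add(conclusion)
                    changed = True
        return facts

    print(infer({"fever", "cough", "chest pain"}))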

Many scientists remain doubtful that true AI can ever be developed. The operation of the human mind is still little understood, and computer design may remain essentially incapable of replicating those unknown, complex processes. Various routes are being pursued in the effort to reach the goal of true AI. One approach is to apply the concept of parallel processing, interlinked and concurrent computer operations. Another is to create networks of experimental computer chips, called silicon neurons, that mimic the data-processing functions of brain cells. Using analog technology, the transistors in these chips emulate nerve-cell membranes in order to operate at the speed of neurons.
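As a minimal sketch of the parallel-processing idea, Python's standard multiprocessing module can divide the independent pieces of a task among concurrent worker processes; the work function here is a trivial stand-in.

    from multiprocessing import Pool

    def work(x):
        """Stand-in task; each input is processed independently."""
        return x * x

    if __name__ == "__main__":
        with Pool(processes=4) as pool:          # four concurrent worker processes
            results = pool.map(work, range(10))  # pieces of the task run in parallel
        print(results)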

Linear programming

Linear programming, a mathematical and operations-research technique, is used in administrative and economic planning to optimize linear functions of a number of variables, subject to certain constraints. The development of high-speed electronic computers and data-processing techniques has brought about many recent advances in linear programming, and the technique is now widely used in industrial and military operations.

Linear programming is basically used to find a set of values, chosen from a prescribed set of numbers, that will maximize or minimize a given linear form subject to the constraints. The classic illustration is a production problem: a manufacturer who knows that as many articles as are produced can be sold seeks the output levels that yield the greatest profit without exceeding the available resources.
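A hedged sketch of such a problem, assuming a hypothetical manufacturer of two products and the resource figures below, using the linprog routine from SciPy (which minimizes, so the profits are negated):

    from scipy.optimize import linprog

    # Hypothetical data: products A and B earn 3 and 5 per unit.
    # Machine hours: 1*A + 2*B <= 14; labor hours: 3*A + 1*B <= 18.
    c = [-3, -5]                # linprog minimizes, so negate the profits
    A_ub = [[1, 2], [3, 1]]
    b_ub = [14, 18]
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=[(0, None), (0, None)])
    print(res.x, -res.fun)      # optimal production mix and maximum profit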
