Soft computing, as opposed to traditional (hard) computing, deals with approximate models and gives solutions to complex real-life problems. Unlike hard computing, soft computing is tolerant of imprecision, uncertainty, partial truth, and approximation. In effect, the role model for soft computing is the human mind. Soft computing is based on techniques such as fuzzy logic, genetic algorithms, artificial neural networks, machine learning, and expert systems. Although soft computing theory and techniques were first introduced in the 1980s, the field has since become a major research and study area in automatic control engineering. The techniques of soft computing are nowadays used successfully in many domestic, commercial, and industrial applications. With the advent of low-cost, high-performance digital processors and falling memory prices, it is clear that the techniques and application areas of soft computing will continue to expand.
Applications of Soft Computing:
1. Consumer appliances such as air conditioners, refrigerators, heaters, and washing machines.
2. Robotics, for example emotional pet robots.
3. Food-preparation devices such as microwave ovens and rice cookers.
4. Game-playing programs for games such as checkers and poker.
5. Handwriting recognition.
6. Data compression and image processing.
7. Architecture.
8. Decision-support systems.
Automata theory is the study of abstract machines and automata, as well as the computational problems that can be solved using them. It is a branch of theoretical computer science. The word automata (the plural of automaton) comes from the Greek word αὐτόματα, which means "self-making".
Automata Theory is a branch of computer science that deals with designing abstract, self-propelled computing devices that follow a predetermined sequence of operations automatically. An automaton with a finite number of states is called a Finite Automaton.
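As a concrete illustration (not from the text), a finite automaton can be sketched as a transition table plus a run function. The dictionary encoding and the chosen language (binary strings with an even number of 1s) are assumptions made for this example.

```python
# Sketch of a deterministic finite automaton (DFA), assuming a toy
# alphabet {'0', '1'}. This DFA accepts words with an even number of 1s.
EVEN_ONES = {
    "states": {"even", "odd"},
    "start": "even",
    "accept": {"even"},
    "delta": {  # transition function: (state, symbol) -> next state
        ("even", "0"): "even",
        ("even", "1"): "odd",
        ("odd", "0"): "odd",
        ("odd", "1"): "even",
    },
}

def accepts(dfa, word):
    """Run the DFA over the word, one symbol at a time, and report acceptance."""
    state = dfa["start"]
    for symbol in word:
        state = dfa["delta"][(state, symbol)]
    return state in dfa["accept"]
```

For instance, `accepts(EVEN_ONES, "0110")` is true (two 1s), while `accepts(EVEN_ONES, "1011")` is false (three 1s); the empty word is accepted because the start state is accepting.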
- Languages and automata are elegant and robust concepts that you will find in every area of computer science. Here are some practical, fairly complex problems that are approached via language theory.
- You want to spot duplicate occurrences of a phrase in a document and delete the second occurrence. In essence, you want to substitute a sequence in a language.
- Does a program contain an assertion violation? Does a device driver respect certain protocols when interacting with the kernel? The behaviour of a program is a set of executions; in other words, a language. The correctness property is another language. The program correctness problem amounts to a language inclusion check.
- Can your software get stuck in an infinite loop? Does a distributed algorithm contain a livelock? We need languages over infinite words, but the language inclusion view still applies.
- Run-time monitoring of reactive and mission-critical systems. You want to design a software monitor that oversees the operation of your chemical process or tracks updates to a financial database. These are at heart language inclusion and intersection problems.
- Pattern recognition with its numerous applications. You want to detect patterns in genomic data, in text, in a series of bug reports. These are problems where we are given words from an unknown language and have to guess the language. These are language inference problems.
- Given a set of XML documents, you want to reverse engineer a schema that applies to these documents. XML documents can be idealised as trees. A schema is then a specification of a tree language, and the schema inference problem is a language inference problem over tree languages.
- Many applications require automated arithmetic reasoning. Suppose we fix a logical theory such as Presburger arithmetic, in which we have the natural numbers, addition and the less-than predicate. A formula with n variables represents a set of n-dimensional vectors. A vector is a sequence of digits and can be encoded as a word. A predicate is then a set of words; a language. Logical operations such as conjunction, disjunction and negation become intersection, union and complement of languages (existential quantification is a kind of projection).
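The duplicate-phrase task from the list above (spot a repeated phrase and delete the later occurrences, i.e. substitute a sequence in a language) can be sketched with Python's standard `re` module. The helper name and the sample text are hypothetical.

```python
import re

def drop_repeats(text, phrase):
    """Keep the first occurrence of `phrase` and delete all later ones.

    Hypothetical helper: in language terms, this performs a substitution
    on a sequence occurring within words of the document's language.
    """
    seen = False

    def repl(match):
        nonlocal seen
        if seen:
            return ""              # later occurrence: delete it
        seen = True
        return match.group(0)      # first occurrence: keep it

    # re.escape treats the phrase literally, even if it contains
    # regex metacharacters; repl is called once per match.
    return re.sub(re.escape(phrase), repl, text)
```

For example, `drop_repeats("to be or not to be", "to be")` yields `"to be or not "`, keeping only the first occurrence of the phrase.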
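The correspondence in the last point, where conjunction becomes intersection and negation becomes complement, can be illustrated on regular languages with the classical product construction on DFAs. The dictionary encoding and the two toy languages below are assumptions made for this sketch.

```python
from itertools import product

def run(dfa, word):
    """Execute a DFA (dict-based encoding, an assumption) over a word."""
    state = dfa["start"]
    for symbol in word:
        state = dfa["delta"][(state, symbol)]
    return state in dfa["accept"]

# Toy DFA over {'0','1'}: accepts words of even length.
EVEN_LEN = {
    "states": {0, 1}, "start": 0, "accept": {0},
    "delta": {(0, "0"): 1, (0, "1"): 1, (1, "0"): 0, (1, "1"): 0},
}

# Toy DFA over {'0','1'}: accepts words ending in '1'.
ENDS_IN_1 = {
    "states": {"no", "yes"}, "start": "no", "accept": {"yes"},
    "delta": {("no", "0"): "no", ("no", "1"): "yes",
              ("yes", "0"): "no", ("yes", "1"): "yes"},
}

def intersect(a, b):
    """Product construction: L(result) = L(a) ∩ L(b), i.e. conjunction."""
    alphabet = {symbol for (_, symbol) in a["delta"]}
    pairs = set(product(a["states"], b["states"]))
    delta = {((p, q), s): (a["delta"][(p, s)], b["delta"][(q, s)])
             for (p, q) in pairs for s in alphabet}
    return {"states": pairs,
            "start": (a["start"], b["start"]),
            "accept": {(p, q) for p in a["accept"] for q in b["accept"]},
            "delta": delta}

def complement(dfa):
    """Negation = complement: flip the accepting states of a complete DFA."""
    return {**dfa, "accept": dfa["states"] - dfa["accept"]}
```

Here `intersect(EVEN_LEN, ENDS_IN_1)` accepts exactly the words that are both of even length and end in '1' (such as "01"), mirroring logical conjunction; disjunction works the same way with the accepting pairs where either component accepts.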
Many other branches of science also involve unbelievable levels of complexity, impossibly large degrees of variation, and apparently random processes, so it makes sense that automata theory can contribute to a better scientific understanding of these areas as well. The modern-day pioneer of cellular automata applications is Stephen Wolfram, who argues that the entire universe might eventually be describable as a machine with finite sets of states and rules and a single initial condition. He relates automata theory to a wide variety of scientific pursuits, including:
- Fluid Flow
- Snowflake and crystal formation
- Chaos theory
- Financial analysis
Automata theory is the basis for the theory of formal languages.
Automata theory is closely related to formal language theory. An automaton is a finite representation of a formal language that may be an infinite set.