Uncertainty is the shortage of precise knowledge: incomplete information and imprecise data describing the state of the environment, regardless of the cause of the deficiency [1]. Different models have been proposed to deal with uncertainty when solving real-life problems [2]. Fuzzy set theory was introduced by Professor Lotfi Zadeh in 1965; it considers the degree of membership of elements in a set [3]. Zadeh also introduced type-2 fuzzy sets in 1975, in which the membership grades are themselves fuzzy [4]. In 1983, intuitionistic fuzzy set theory was presented by Atanassov as an extension of standard fuzzy sets [5]. The vague set was first proposed by Gau and Buehrer in 1993; it rests on the idea that the degree of membership for truth and the complement of the degree of falsity are not necessarily the same [6]. Uncertain problems do not require fully precise models, since excessive precision can decrease the understanding of the outcome; many applications are deeply rooted in uncertain data, so interest in effective and efficient methods for dealing with uncertainty has increased [7]. Modeling uncertainty for solving real-life problems is one of the key problems of artificial intelligence [8].
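As a minimal illustrative sketch of the distinction drawn above (with assumed toy values, not taken from any cited work): a type-1 fuzzy set assigns an element a single membership grade, while an intuitionistic fuzzy set assigns both a membership and a non-membership grade, leaving an explicit hesitation margin.

```python
# Toy illustration with assumed values.
# Type-1 fuzzy set: a single membership grade mu(x) in [0, 1].
# Intuitionistic fuzzy set: membership mu(x) and non-membership nu(x)
# with mu(x) + nu(x) <= 1; the remainder
#     pi(x) = 1 - mu(x) - nu(x)
# is the hesitation (uncertainty) degree.
mu = 0.6            # degree to which x belongs to the set
nu = 0.3            # degree to which x does not belong to the set
pi = 1.0 - mu - nu  # hesitation margin

assert mu + nu <= 1.0  # intuitionistic consistency condition
```

In a classical fuzzy set the non-membership is forced to be exactly `1 - mu`, so the hesitation margin is always zero; the intuitionistic extension makes that residual uncertainty explicit.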
Uncertainty is the deficiency of the precise knowledge needed to reach a reliable result. Classical logic, by contrast, assumes that complete and accurate knowledge always exists [14]. Uncertainty affects decision making in unwanted ways, and attempts to reduce it carry risks of their own [21]. Uncertainty can be represented by three methods: numeric, where a scale between two extremes (e.g., 0 to 100) is used; graphical, where a gradient bar expresses an expert's opinion about certain events; and symbolic, where a linguistic scale and ranking, complemented with numbers, can be used [11]. This section explores different models for dealing with uncertainty and their characteristics.
The knowledge base consists of a collection of fuzzy if-then rules of the following form: $R^{l}$: if $x_1$ is $F_1^{l}$ and $x_2$ is $F_2^{l}$ and $\ldots$ and $x_n$ is $F_n^{l}$, then $y$ is $G^{l}$, $l=1,2,\cdots,N$, where $x=[x_1,\cdots,x_n]^{T}$ and $y$ are the FLS input and output, respectively. The fuzzy sets $F_i^{l}$ and $G^{l}$ are associated with the membership functions $\mu_{F_i^{l}}(x_i)$ and $\mu_{G^{l}}(y)$, respectively, and $N$ is the number of inference rules. Through singleton fuzzification, center-average defuzzification and product inference \cite{shaocheng2000fuzzy}, the FLS can be expressed as $y(x)=\theta^{T}\xi(x)$, where $\xi(x)$ is the vector of fuzzy basis functions. For any continuous function $f(x)$ defined on a compact set $\Omega\subset R^{n}$, there exists a fuzzy system $y(x)=\theta^{T}\xi(x)$ such that $\sup_{x\in\Omega}\vert f(x)-y(x)\vert\le\varepsilon$ for any given $\varepsilon>0$.
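The pipeline above (singleton fuzzification, product inference, center-average defuzzification) can be sketched in a few lines. This is an illustrative implementation, not the cited paper's code; Gaussian membership functions and the parameter names `centers`, `widths`, `theta` are assumptions made here for concreteness.

```python
import numpy as np

def fls_output(x, centers, widths, theta):
    """Evaluate y(x) = theta^T xi(x) for a fuzzy logic system with
    singleton fuzzification, product inference and center-average
    defuzzification.

    x:       (n,) input vector.
    centers: (N, n) Gaussian membership centers, one row per rule.
    widths:  (N, n) Gaussian membership widths.
    theta:   (N,) consequent centers (centers of the G^l sets).
    """
    # Membership grades mu_{F_i^l}(x_i) for every rule/input pair.
    mu = np.exp(-((x - centers) / widths) ** 2)   # shape (N, n)
    # Product inference: firing strength w_l of each rule.
    w = np.prod(mu, axis=1)                       # shape (N,)
    # Fuzzy basis functions xi_l(x) = w_l / sum_k w_k.
    xi = w / np.sum(w)
    # Center-average defuzzification.
    return float(theta @ xi)

# Two rules over one input: an input midway between the rule centers
# fires both rules equally, so the output is the average of theta.
y = fls_output(np.array([0.5]),
               np.array([[0.0], [1.0]]),
               np.array([[1.0], [1.0]]),
               np.array([0.0, 1.0]))
```

Because $\xi(x)$ is normalized, the output is always a convex combination of the consequent centers, which is what makes the linear-in-parameters form $\theta^{T}\xi(x)$ convenient for approximation results like the one stated above.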
Probability. Individuals make choices every day, from the moment they wake up to the minute they go to sleep, and many of these are probability judgments made without conscious awareness. Some people elect to take a different route to work, hoping to encounter less traffic, while others are comfortable taking less risk and traveling familiar territory. Probability is the chance or likelihood of an event occurring (Mirabella, 2011). The focus here is on the various types of probability, such as simple, joint, additional and conditional probabilities, in answering five distinct practical questions.
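The commuting scenario above can make these probability types concrete. The numbers below are hypothetical, chosen only for illustration:

```python
# Hypothetical commuting example (assumed values).
# Event A: heavy traffic on the usual route.
# Event B: arriving late to work.
p_a = 0.30        # simple probability P(A)
p_b = 0.20        # simple probability P(B)
p_a_and_b = 0.12  # joint probability P(A and B)

# Conditional probability: P(B | A) = P(A and B) / P(A)
p_b_given_a = p_a_and_b / p_a

# Additional (addition rule) probability:
# P(A or B) = P(A) + P(B) - P(A and B)
p_a_or_b = p_a + p_b - p_a_and_b
```

With these assumed values, knowing there is heavy traffic raises the chance of being late from 20% to 40%, which is exactly the kind of everyday probability judgment the paragraph describes.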