Artificial Intelligence & Expert Systems Prof. Sandeep Kaur Dept. of Computer Science, Mata Sahib Kaur Girls College, Gehal, India email@example.com Abstract: Artificial Intelligence is a branch of science which deals with helping machines find solutions to complex problems in a more human-like fashion. This generally involves borrowing characteristics from human intelligence and applying them as algorithms in a computer-friendly way. In other words, Artificial Intelligence's scientific goal is to understand intelligence by building computer programs that exhibit intelligent behavior. This paper presents some background on the potential of Artificial Intelligence, its implementation in various fields, and its history. We discuss issues
Even the term "interface" suggests the primary role of the face in communication between two entities. Studies have shown that facial expressions can significantly alter the interpretation of what is spoken as well as control the flow of a conversation. Improving the communication between humans and computers has been one of the driving forces in computer science over the years. This research has involved coordination between several fields of research, including psychology, biology, computer science, and engineering. The key to creating better communication between humans and machines involves developing more sophisticated techniques for interpreting
I’m very interested in studying the peculiarities and general structure of language, but I’m also very interested in the concept of artificial intelligence. I want to solve extraordinarily challenging, yet simply stated, problems in artificial intelligence and linguistics: challenges like, “If people, whether they are Japanese or English native speakers, don’t completely understand the meaning of words like ‘です’, how can we teach robots or computers to speak, listen, and comprehend difficult concepts, like the meaning of
We are designing a method which performs segmentation of handwritten characters and recognition using a neural network. The aim is to improve performance in terms of time and to achieve better accuracy. It has been found that recognition of handwritten Devanagari characters is quite difficult due to the presence of the shirorekha and the similarity in shapes among multiple characters. VIII. ACKNOWLEDGEMENT We are very thankful to Prof. Sushila Aghav for her valuable guidance and to Prof. R. K. Bedi, Head of the Computer Department, for his support.
During training, such data is stored in a database; during classification, each character is assigned to the appropriate class to which it belongs. Handwritten Character Recognition (HCR) has been a challenging research domain due to its diverse application environments. Handwriting has always been, and is expected to remain, a preferred means of communication. Effective HCR systems need to be designed to convert handwritten documents into an editable format. HCR systems aim at higher accuracy with considerably reduced computational and storage space requirements.
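The store-then-classify scheme described above can be sketched minimally as a nearest-neighbour lookup. The feature vectors below are hypothetical flattened binary pixel grids, and the two stored "classes" are stand-ins for a real training database:

```python
# Minimal sketch of the train/classify split described above, assuming
# characters are represented as flattened binary pixel grids (hypothetical data).

def hamming(a, b):
    """Distance between two equal-length binary feature vectors."""
    return sum(x != y for x, y in zip(a, b))

# "Training": store labelled feature vectors in a simple in-memory database.
database = {
    "A": [1, 0, 1, 1, 0, 1, 1, 1, 1],
    "L": [1, 0, 0, 1, 0, 0, 1, 1, 1],
}

def classify(features):
    """Assign an unseen character to the class of its nearest stored sample."""
    return min(database, key=lambda label: hamming(database[label], features))

print(classify([1, 0, 1, 1, 0, 1, 1, 1, 0]))  # nearest stored sample is "A"
```

A production HCR system would of course use a trained neural network rather than raw Hamming distance, but the control flow — store labelled samples during training, assign the closest class at classification time — is the same.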
Chapter 2 Literature Survey

2.1 General

Word sense disambiguation (WSD) was one of the most important problems during the early days of machine translation. WSD is the task of determining the proper meaning of a word as used in a particular context. WSD can be considered a classification problem in which the word senses are the classes. Moreover, automatic classification techniques can be used to recognize each occurrence of the word and assign it to a class drawn from external knowledge sources.

2.2 Literature Review

1. Gaona, Gelbukh and Bandyopadhyay advocate the use of knowledge-based approaches for better word sense disambiguation.
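The "senses as classes" framing can be illustrated with a toy Lesk-style knowledge-based classifier: pick the sense whose gloss overlaps most with the surrounding context. The senses and glosses below are made-up stand-ins for an external knowledge source such as WordNet:

```python
# Toy Lesk-style word sense disambiguation: each sense is a class, and a
# word occurrence is assigned to the class whose gloss best overlaps its
# context. Senses and glosses here are hypothetical illustrative examples.

senses = {
    "bank/finance": "an institution that accepts deposits and lends money",
    "bank/river": "the sloping land beside a body of water",
}

def disambiguate(context):
    """Classify one occurrence of 'bank' by gloss/context word overlap."""
    words = set(context.lower().split())
    def overlap(sense):
        return len(words & set(senses[sense].split()))
    return max(senses, key=overlap)

print(disambiguate("she sat on the bank of the river watching the water"))
```

Real knowledge-based WSD systems extend this idea with stemming, stop-word removal, and richer relations from the knowledge source, but the classification structure is the same.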
A. Data preprocessing

Text mining is the process of extracting useful information from textual data. Our data is preprocessed with the help of Natural Language Processing (NLP). NLP is an area of research and application that explores how computers can be used to understand and manipulate natural language text. Fig. 2 Block diagram of the citation recommendation system 1.
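A minimal sketch of the preprocessing step described above — lowercasing, tokenisation, and stop-word removal — might look like the following. The stop-word list is an illustrative subset, not the one used by the original system:

```python
# Minimal text-preprocessing sketch: lowercase, tokenise, drop stop words.
# The stop-word list is a small illustrative subset.

import re

STOPWORDS = {"the", "a", "an", "of", "is", "and", "to", "in"}

def preprocess(text):
    """Return the content tokens of a text, lowercased and stop-word-filtered."""
    tokens = re.findall(r"[a-z0-9]+", text.lower())
    return [t for t in tokens if t not in STOPWORDS]

print(preprocess("Text mining is the process of extracting useful information."))
# ['text', 'mining', 'process', 'extracting', 'useful', 'information']
```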
NLP is a field of computer science, artificial intelligence, and computational linguistics concerned with the interactions between computers and human (natural) languages. Automatic text summarization is the process of producing a summary of an original text automatically by machine. One important task in this field is to reduce the size of a text while preserving its information content. A summarizer is a system that produces a condensed representation of its input for user consumption. Summary construction is, in general, a complex task which ideally would involve deep natural language processing capacities
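The size-reduction task can be sketched with a simple extractive baseline: score each sentence by the frequency of its words in the whole text and keep the top-scoring ones. This is only a frequency-based baseline, not the deep-NLP approach the text alludes to:

```python
# Frequency-based extractive summariser sketch: rank sentences by the
# average corpus frequency of their words, keep the top n, and emit them
# in their original order.

import re
from collections import Counter

def summarize(text, n_sentences=1):
    sentences = [s.strip() for s in re.split(r"(?<=[.!?])\s+", text) if s.strip()]
    freq = Counter(re.findall(r"[a-z]+", text.lower()))
    def score(sentence):
        words = re.findall(r"[a-z]+", sentence.lower())
        return sum(freq[w] for w in words) / max(len(words), 1)
    ranked = sorted(sentences, key=score, reverse=True)
    chosen = set(ranked[:n_sentences])
    # Preserve original document order among the selected sentences.
    return " ".join(s for s in sentences if s in chosen)

print(summarize("Cats sleep. Cats sleep a lot. Dogs bark."))  # -> "Cats sleep."
```

Extraction of this kind preserves information content only crudely; abstractive summarisation, which rewrites rather than selects, is where the "deep natural language processing capacities" mentioned above become necessary.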
CHAPTER TWO REVIEW OF LITERATURE

2.1 INTRODUCTION

An attempt has been made to use CALL and CALT. Second language acquisition depends mainly on the individual's learning motivation, memory, and the difficulties faced due to psychological and character background. Since the computer has the capacity for storage, retrieval, and monitoring of vast data, it has become an ideal processing tool for most language teaching and research activities in the English-speaking world. Language skills are learned more productively if the items to be learned are in the target language and presented in spoken form before they are seen in written form.

2.2 HISTORICAL PHASES OF CALL

Three historical phases of CALL
Therefore, it is often described as a relatively new approach that investigates language in use with the aid of computers. However, since the computer is merely an automaton, human interaction remains the primary means of interpretation. Corpus linguistics has largely been accepted as an important way of analyzing language in different fields such as lexicography (Hanks, 2012), syntax (Roland, Dick, & Elman, 2007), cognitive linguistics (Gries & Stefanowitsch, 2007), and applied linguistics (Hunston,
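A typical computer-aided lookup in corpus linguistics is the key-word-in-context (KWIC) concordance, which the human analyst then interprets. A minimal sketch, using a made-up sentence rather than a real corpus:

```python
# Minimal KWIC concordance: show each occurrence of a keyword with a
# window of surrounding tokens. The "corpus" here is an invented example.

def kwic(corpus, keyword, width=2):
    """Return one '[keyword] in context' line per occurrence."""
    tokens = corpus.lower().split()
    lines = []
    for i, tok in enumerate(tokens):
        if tok == keyword:
            left = " ".join(tokens[max(0, i - width):i])
            right = " ".join(tokens[i + 1:i + 1 + width])
            lines.append(f"{left} [{tok}] {right}")
    return lines

for line in kwic("time flies like an arrow and fruit flies like a banana", "flies"):
    print(line)
```

The tool only retrieves and aligns occurrences; deciding what the two uses of "flies" mean remains, as the text notes, a matter of human interpretation.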