Search results for: inverse Laplace transform techniques
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 3420

330 Modern Detection and Description Methods for Natural Plants Recognition

Authors: Masoud Fathi Kazerouni, Jens Schlemper, Klaus-Dieter Kuhnert

Abstract:

Earth is sometimes called the green planet; it is a terrestrial planet and the fifth largest planet of the solar system. Plants are not distributed constantly and uniformly around the world, and even the variation of plant species differs within a single region. Plants are not limited to one field such as botany; they also appear in fields such as literature and mythology, and they hold useful and inestimable historical records. The world cannot be imagined without oxygen, most of which is produced by plants; no other living species could exist on Earth without them, and they also form the basic food staples. Regulation of the water cycle and oxygen production are further roles of plants, and these roles affect the environment and climate. Plants are also the main components of agricultural activities, from which many countries benefit, so they influence the political and economic situations and futures of countries. Owing to this importance, the study of plants is essential in various fields, and consideration of their different applications leads to a focus on their details as well. Automatic recognition of plants is a novel field that contributes to other research and to future studies. Moreover, plants survive in different places and regions by means of adaptations, which are special factors that help them in hard conditions. Weather is one of the parameters that affect plant life and existence in an area, and recognition of plants under different weather conditions opens a new window of research in the field. Only natural images can capture weather conditions as new factors, so a system built on them is generalized and useful. To keep the system general, the distance from the camera to the plants is considered as another factor, as is the change of light intensity in the environment over the day. Adding these factors makes inventing an accurate and reliable system a substantial challenge, yet the development of an efficient plant recognition system is essential and effective. One important component of a plant is the leaf, which can be used to implement automatic plant recognition systems without any human interaction. Due to the nature of the images used, a characteristic investigation of the plants is carried out, and leaves are selected as the first, most trustworthy characteristics. Four plant species are specified with the goal of classifying them with an accurate system. The current paper is devoted to the principal directions of the proposed methods and implemented system, the image dataset, and the results. The procedure of the algorithm and the classification is explained in detail. The first steps, feature detection and description of visual information, are performed using the Scale-Invariant Feature Transform (SIFT), HARRIS-SIFT, and FAST-SIFT methods. The accuracy of the implemented methods is computed and, in addition to this comparison, the robustness and efficiency of the results under different conditions are investigated and explained.
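The feature detection and description step can be illustrated with a short, hedged sketch. The snippet below (assuming OpenCV >= 4.4, where SIFT is in the main repository, and a hypothetical input file leaf.jpg) shows plain SIFT alongside the FAST-SIFT combination, in which FAST supplies keypoints and SIFT computes their descriptors; it is an illustration of the named combinations, not the authors' implementation.

    import cv2

    # Hypothetical grayscale input image of a leaf.
    img = cv2.imread("leaf.jpg", cv2.IMREAD_GRAYSCALE)

    # Plain SIFT: keypoint detection and description in one pass.
    sift = cv2.SIFT_create()
    kp_sift, desc_sift = sift.detectAndCompute(img, None)

    # FAST-SIFT: FAST detects corners quickly, SIFT describes them.
    fast = cv2.FastFeatureDetector_create()
    kp_fast = fast.detect(img, None)
    kp_fast, desc_fast = sift.compute(img, kp_fast)

    print(len(kp_sift), len(kp_fast))

The HARRIS-SIFT variant follows the same pattern, with Harris corners converted to keypoints before the SIFT description step.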

Keywords: SIFT combination, feature extraction, feature detection, natural images, natural plant recognition, HARRIS-SIFT, FAST-SIFT.

329 Manipulation of Ideological Items in the Audiovisual Translation of Voiced-Over Documentaries in the Arab World

Authors: S. Chabbak

Abstract:

In a widely globalized world, the influence of audiovisual translation on the culture and identity of audiences is unmistakable. However, in the Arab World, there is a noticeable disproportion between this growing influence and the research carried out in the field. As a matter of fact, the voiced-over documentary is one of the most abundantly translated genres in the Arab World and carries many ideological elements that are in many cases rendered through manipulation. However, voiced-over documentaries have hardly received any focused attention from researchers in the Arab World. This paper attempts to scrutinize the process of translation of voiced-over documentaries in the Arab World, from French into Arabic in the present case study, by sub-categorizing the ideological items subject to manipulation, identifying the techniques utilized in their translation and exploring the potential extra-linguistic factors that prompt translation agents to opt for manipulative translation. The investigation is based on a corpus of 94 episodes taken from a series entitled 360° GEO Reports, produced by the Franco-German network ARTE in French, and acquired, translated and aired by Al Jazeera Documentary Channel for Arab audiences. The results yielded 124 cases of manipulation in four sub-categories of ideological items, and the use of 10 different oblique procedures in the process of manipulative translation. The study also revealed that manipulation is in most instances dictated by the editorial line of the broadcasting channel, in addition to the religious, geopolitical and socio-cultural peculiarities of the target culture.

Keywords: Audiovisual translation, ideological items, manipulation, voiced-over documentaries.

328 The Effectiveness of Cognitive Behavioural Intervention in Alleviating Social Avoidance for Blind Students

Authors: Mohamed M. Elsherbiny

Abstract:

Social avoidance is one of the most important problems that face a good number of disabled students. It results from the negative attitudes of non-disabled students, teachers and others. Past research has shown that non-disabled individuals hold negative attitudes toward persons with disabilities. The present study aims to alleviate social avoidance by applying a cognitive behavioral intervention. 24 blind university students aged 19–24 were randomly chosen; we compared an experimental group (12 students) who went through the intervention program with a control group (12 students) who did not go through such an intervention. We used the Social Avoidance and Distress Scale (SADS) to assess social anxiety and distress behavior. The author used many techniques of cognitive behavioral intervention, such as modeling, cognitive restructuring, extension, contingency contracts, self-monitoring, assertiveness training, role play, and encouragement. A t-test was employed to test the research hypothesis. Results showed a significant difference between the experimental group and the control group after the intervention and at the follow-up stages on the Social Avoidance and Distress Scale, as well as a significant difference for the experimental group between the pre-intervention and follow-up stages of the scale. Results showed a decrease in social avoidance; accordingly, the cognitive behavioral intervention program was successful in decreasing social avoidance for blind students.

Keywords: Social avoidance, cognitive behavioral intervention, blind disability, disability.

327 A Novel SVM-Based OOK Detector in Low SNR Infrared Channels

Authors: J. P. Dubois, O. M. Abdul-Latif

Abstract:

Support Vector Machine (SVM) is a recent class of statistical classification and regression techniques playing an increasing role in detection problems across various engineering applications, notably statistical signal processing, pattern recognition, image analysis, and communication systems. In this paper, SVM is applied to an infrared (IR) binary communication system with different channel models, including Ricean multipath fading and a partially developed scattering channel, with additive white Gaussian noise (AWGN) at the receiver. The structure and performance of SVM in terms of the bit error rate (BER) metric are derived and simulated for these stochastic channel models, and the computational complexity of the implementation, in terms of average computational time per bit, is also presented. The performance of SVM is then compared to classical binary maximum likelihood detection using a matched filter driven by on-off keying (OOK) modulation. We found that the performance of SVM is superior to that of the traditional optimal detection schemes used in statistical communication, especially for very low signal-to-noise ratio (SNR) ranges. For large SNR, the performance of SVM is similar to that of the classical detectors. The implication of these results is that SVM can prove very beneficial to IR communication systems, which notoriously suffer from low SNR, at the cost of increased computational complexity.
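As a hedged illustration of the comparison described, the sketch below simulates OOK symbols in AWGN only (a simplification: no Ricean fading or scattering channel) and contrasts an SVM detector with simple threshold detection, which is the matched-filter rule for this sampled model; the SNR and sample counts are illustrative assumptions.

    import numpy as np
    from sklearn.svm import SVC

    rng = np.random.default_rng(0)
    snr_db = 0.0                                  # assumed low-SNR operating point
    amp, sigma = 1.0, 10 ** (-snr_db / 20)
    bits = rng.integers(0, 2, 5000)
    rx = amp * bits + sigma * rng.normal(size=bits.size)  # received samples

    # Train the SVM on the first half, test both detectors on the second.
    n = bits.size // 2
    svm = SVC(kernel="rbf").fit(rx[:n, None], bits[:n])
    ber_svm = np.mean(svm.predict(rx[n:, None]) != bits[n:])
    ber_mf = np.mean((rx[n:] > amp / 2).astype(int) != bits[n:])
    print(f"BER  SVM={ber_svm:.3f}  threshold={ber_mf:.3f}")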

Keywords: Least squares support vector machine, on-off keying, matched filter, maximum likelihood detector, wireless infrared communication.

326 Solving Process Planning, Weighted Earliest Due Date Scheduling and Weighted Due Date Assignment Using Simulated Annealing and Evolutionary Strategies

Authors: Halil Ibrahim Demir, Abdullah Hulusi Kokcam, Fuat Simsir, Özer Uygun

Abstract:

Traditionally, three important manufacturing functions, namely process planning, scheduling and due-date assignment, are performed sequentially and separately. Although there are numerous works on the integration of process planning and scheduling and plenty of works focusing on scheduling with due-date assignment, there are only a few works on integrated process planning, scheduling and due-date assignment. Whereas due dates are determined in the literature without taking the weights of the customers into account, here weighted due-date assignment is employed to achieve better performance. Jobs are scheduled according to the weighted earliest due date dispatching rule, and due dates are determined according to some popular due-date assignment methods by taking the weight of each job into account. Simulated annealing, evolutionary strategies, random search, a hybrid of random search and simulated annealing, and a hybrid of random search and evolutionary strategies are applied as solution techniques. The three important manufacturing functions are integrated step by step, and higher integration levels are found to perform better. Search metaheuristics are found to be very useful in improving the performance measure.
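A minimal simulated-annealing sketch of the weighted scheduling idea is given below; the job data, the TWK-style due-date rule and the cooling schedule are illustrative assumptions, not the paper's actual instances or parameter settings.

    import math, random

    random.seed(1)
    proc = [4, 2, 7, 3, 5]             # processing times
    w = [3, 1, 2, 5, 2]                # customer weights
    due = [2.0 * p for p in proc]      # assumed TWK-style due-date assignment

    def weighted_tardiness(order):
        t, cost = 0, 0.0
        for j in order:
            t += proc[j]
            cost += w[j] * max(0, t - due[j])
        return cost

    cur = list(range(len(proc)))
    best, temp = cur[:], 10.0
    for _ in range(2000):
        cand = cur[:]
        i, j = random.sample(range(len(cand)), 2)   # swap-move neighbourhood
        cand[i], cand[j] = cand[j], cand[i]
        delta = weighted_tardiness(cand) - weighted_tardiness(cur)
        if delta < 0 or random.random() < math.exp(-delta / temp):
            cur = cand
            if weighted_tardiness(cur) < weighted_tardiness(best):
                best = cur[:]
        temp *= 0.995                               # geometric cooling
    print(best, weighted_tardiness(best))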

Keywords: Evolutionary strategies, hybrid searches, process planning, simulated annealing, weighted due-date assignment, weighted scheduling.

325 The Effectiveness of Video Clips to Enhance Students’ Achievement and Motivation on History Learning and Facilitation

Authors: L. Bih Ni, D. Norizah Ag Kiflee, T. Choon Keong, R. Talip, S. Singh Bikar Singh, M. Noor Mad Japuni, R. Talin

Abstract:

The purpose of this study is to determine the effectiveness of video clips in enhancing students' achievement and motivation towards the learning and facilitation of history. We use a narrative literature review to illustrate the current state of the art in the focused areas of inquiry, and we adopt an experimental method, a systematic scientific research method in which the researchers manipulate one or more variables and control and measure any changes in other variables. For this purpose, two groups of 30 lower secondary students each were formed. The first group was taught using a computer presentation program with video clips and is considered the experimental group, while the second group was taught the same class using traditional dialogue and discussion techniques and is considered the control group. Both groups took pre- and post-tests on the material covered in the class. The findings show that the pre-test analysis did not reveal statistically significant differences, which proves the equality of the two groups, whereas the post-test analysis shows a statistically significant difference between the experimental group and the control group at a significance level of 0.05 in favor of the experimental group.

Keywords: Video clips, Historical Learning and Facilitation, Achievement, Motivation.

324 The Application of Real Options to Capital Budgeting

Authors: George Yungchih Wang

Abstract:

Real options theory suggests that managerial flexibility embedded within irreversible investments can account for a significant value in project valuation. Although the argument has become the dominant focus of capital investment theory over decades, recent survey literature in capital budgeting indicates that corporate practitioners still do not explicitly apply real options in investment decisions. In this paper, we explore how real options decision criteria can be transformed into equivalent capital budgeting criteria under uncertainty, assuming that the underlying stochastic process follows a geometric Brownian motion (GBM), a mixed diffusion-jump (MX), or a mean-reverting process (MR). These equivalent valuation techniques can be readily decomposed into conventional investment rules and "option impacts", the latter of which describe the effects on the optimal investment rules once the option value is considered. Based on numerical analysis and Monte Carlo simulation, three major findings are derived. First, it is shown that real options can be successfully integrated into the mindset of conventional capital budgeting. Second, the inclusion of option impacts tends to delay investment; the delay effect is the most significant under a GBM process and the least significant under an MR process. Third, it is optimal to adopt the new capital budgeting criteria in investment decision-making, and adopting a suboptimal investment rule without considering real options could lead to a substantial loss in value.
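The GBM case can be sketched with a short Monte Carlo simulation; the drift, volatility and investment cost below are illustrative assumptions, and discounting at the drift rate is a simplification rather than the paper's valuation.

    import numpy as np

    rng = np.random.default_rng(42)
    v0, mu, sigma, T, cost = 100.0, 0.04, 0.25, 1.0, 95.0
    n, steps = 100_000, 50
    dt = T / steps

    z = rng.normal(size=(n, steps))
    paths = v0 * np.exp(np.cumsum((mu - 0.5 * sigma**2) * dt
                                  + sigma * np.sqrt(dt) * z, axis=1))
    vT = paths[:, -1]

    npv_now = v0 - cost                   # conventional rule: invest today
    wait = np.exp(-mu * T) * np.maximum(vT - cost, 0).mean()  # value of waiting
    print(f"NPV now: {npv_now:.2f}   value of waiting: {wait:.2f}")

When the value of waiting exceeds the immediate NPV, the option impact delays investment, which is the delay effect the paper quantifies for the three processes.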

Keywords: Real options, capital budgeting, geometric Brownian motion, mixed diffusion-jump, mean-reverting process.

323 A Concept to Assess the Economic Importance of the On-Site Activities of ETICS

Authors: V. Sulakatko, F. U. Vogdt, I. Lill

Abstract:

Construction technology and on-site construction activities have a direct influence on the life cycle costs of energy-efficiently renovated apartment buildings. The systematic inadequacies of the External Thermal Insulation Composite System (ETICS) that occur during the construction phase increase the risk for all stakeholders, reduce mechanical durability and increase the life cycle costs of the building. The economic effect of these shortcomings can be minimised if the risk of the most significant on-site activities is recognised. The objective of the presented ETICS economic assessment concept is to evaluate the economic influence of on-site shortcomings and reveal their significance for foreseeable future repair costs. The model assembles repair techniques, discusses their direct cost calculation methods, argues for the proper usage of net present value over the life cycle of the building, and proposes a simulation tool to evaluate the risk of on-site activities. As the technique is dependent on the selected real interest rate, a sensitivity analysis is anticipated to determine the validity of the recommendations. After verification of the model on sample buildings by the industry, it is expected to increase the economic rationality of resource allocation and to reduce high-risk systematic shortcomings during the construction process of ETICS.
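The net-present-value core of the concept can be sketched in a few lines; the repair cash flows and real interest rates below are illustrative assumptions, with the loop standing in for the anticipated sensitivity analysis.

    def npv(repair_costs, real_rate):
        # repair_costs: {year: cost}; real_rate: e.g. 0.03 for 3 %.
        return sum(c / (1 + real_rate) ** y for y, c in repair_costs.items())

    costs = {5: 4000.0, 12: 15000.0, 25: 40000.0}  # hypothetical ETICS repairs
    for r in (0.01, 0.03, 0.05):                   # sensitivity to the real rate
        print(f"rate {r:.0%}: NPV = {npv(costs, r):,.0f}")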

Keywords: Activity-based cost estimating, Cost estimation, ETICS, Life cycle costing.

322 Embryo Transfer as an Assisted Reproductive Technology in Farm Animals

Authors: Diah Tri Widayati

Abstract:

Various assisted reproductive techniques have been developed and refined to obtain a large number of offspring from genetically superior animals or to obtain offspring from infertile (or subfertile) animals. Embryo transfer is one well-developed assisted reproductive technique, aimed at the increased productivity of selected females, disease control, importation and exportation of livestock, rapid screening of AI sires for genetically recessive characteristics, and treatment or circumvention of certain types of infertility. Embryo transfer is also a useful research tool for evaluating fetal and maternal interactions. The technique has been applied to nearly every species of domestic animal and to many species of wildlife and exotic animals, including humans and non-human primates. Successful embryo transfers have been limited to within-species, homologous replacement of the embryos. There are several examples of interspecific and intergeneric embryo transfers in which embryos implanted but did not develop to term: sheep and goat, mouse and rat. Immunological rejection and placental incompatibility between the embryo and the surrogate mother appear to restrict interspecific embryo transfer and interspecific pregnancy. Recently, preimplantation embryo manipulation procedures such as the technique of inner cell mass transfer have been applied. This technique may make it possible to overcome the reproductive barrier of interspecific embryo transfer and pregnancy, provided there is a protective mechanism that prevents recognition of the foreign fetus by the mother of the other species.

Keywords: Embryo transfer, assisted reproductive technology, intraspecific-interspecific pregnancy, inner cell mass.

321 Fast Painting with Different Colors Using Cross Correlation in the Frequency Domain

Authors: Hazem M. El-Bakry

Abstract:

In this paper, a new technique for fast painting with different colors is presented. The idea of painting relies on applying masks with different colors to the background. Fast painting is achieved by applying these masks in the frequency domain instead of the spatial (time) domain. New colors can be generated automatically as a result of the cross correlation operation. This idea has been applied successfully to faster detection of specific data (faces, objects, patterns, and codes) using neural algorithms. Here, instead of performing cross correlation between the input data (e.g., an image or a stream of sequential data) and the weights of neural networks, the cross correlation is performed between the colored masks and the background. Furthermore, this approach is developed to reduce the computation steps required by the painting operation. The principle of the divide and conquer strategy is applied through background decomposition: each background is divided into small sub-backgrounds, and each sub-background is processed separately using a single fast painting algorithm. Moreover, the fastest painting is achieved by using parallel processing techniques to paint the resulting sub-backgrounds with the same number of fast painting algorithms. In contrast to using only the fast painting algorithm, the speed-up ratio increases with the size of the background when the fast painting algorithm is combined with background decomposition. Simulation results show that painting in the frequency domain is faster than painting in the spatial domain.
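The core idea, cross correlation computed in the frequency domain, can be sketched as follows; the mask and background are random stand-ins, and the snippet shows only the FFT-based correlation, not the authors' full painting pipeline or decomposition.

    import numpy as np

    rng = np.random.default_rng(0)
    background = rng.random((512, 512))
    mask = rng.random((512, 512))

    # Correlation theorem: corr(f, g) = IFFT( FFT(f) * conj(FFT(g)) ).
    F = np.fft.fft2(background)
    G = np.fft.fft2(mask)
    corr = np.real(np.fft.ifft2(F * np.conj(G)))
    print(corr.shape, corr.max())

For an N x N background this costs O(N^2 log N) via the FFT instead of the O(N^4) of direct spatial correlation with an equally sized mask, which is where the speed-up comes from.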

Keywords: Fast Painting, Cross Correlation, Frequency Domain, Parallel Processing

320 Hiding Data in Images Using PCP

Authors: Souvik Bhattacharyya, Gautam Sanyal

Abstract:

In recent years, everything is trending toward digitalization, and with the rapid development of Internet technologies, digital media needs to be transmitted conveniently over the network. Attacks, misuse or unauthorized access to information is of great concern today, which makes the protection of documents through digital media a priority problem. This urges us to devise new data hiding techniques to protect and secure data of vital significance. In this respect, steganography often comes to the fore as a tool for hiding information. Steganography is a process that involves hiding a message in an appropriate carrier such as an image or audio file. The word is of Greek origin and means "covered or hidden writing". The goal of steganography is covert communication: the carrier can be sent to a receiver while no one except the authenticated receiver knows of the existence of the information. A considerable amount of work has been carried out by different researchers on steganography. In this work, the authors propose a novel steganographic method for hiding information within the spatial domain of a gray scale image. The proposed approach works by selecting the embedding pixels using some mathematical function, finding the 8-neighborhood of each selected pixel, and mapping each bit of the secret message to a neighbor pixel coordinate position in a specified manner. Before embedding, a check is performed to determine whether the selected pixel or its neighbor lies on the boundary of the image. This solution is independent of the nature of the data to be hidden and produces a stego image with minimum degradation.
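The exact pixel coordinate position (PCP) mapping is specific to the paper, so the sketch below is only a simplified stand-in: anchor pixels are chosen by a deterministic function, kept away from the image boundary, and one message bit is embedded in the least significant bit of each 8-neighbour.

    import numpy as np

    OFFSETS = [(-1,-1), (-1,0), (-1,1), (0,-1), (0,1), (1,-1), (1,0), (1,1)]

    def embed(img, bits, step=37):
        img = img.copy()
        h, w = img.shape
        # Stand-in selection function; the modulo keeps anchors off the boundary.
        anchors = ((k * step % (h - 2) + 1, k * step % (w - 2) + 1)
                   for k in range(1, 10**6))
        it = iter(bits)
        for r, c in anchors:
            for dr, dc in OFFSETS:
                b = next(it, None)
                if b is None:
                    return img
                img[r + dr, c + dc] = (img[r + dr, c + dc] & 0xFE) | b
        return img

    cover = np.random.default_rng(0).integers(0, 256, (64, 64), dtype=np.uint8)
    message = [1, 0, 1, 1, 0, 0, 1, 0, 0, 1, 0, 1]  # bits of a secret message
    stego = embed(cover, message)
    print(int(np.sum(cover != stego)), "pixels changed")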

Keywords: Cover Image, LSB, Pixel Coordinate Position (PCP), Stego Image.

319 Off-Policy Q-learning Technique for Intrusion Response in Network Security

Authors: Zheni S. Stefanova, Kandethody M. Ramachandran

Abstract:

With the increasing dependency on our computer devices, we face the necessity of adequate, efficient and effective mechanisms for protecting our networks. There are two main problems that Intrusion Detection Systems (IDS) attempt to solve: 1) to detect an attack by analyzing the incoming traffic and inspecting the network (intrusion detection), and 2) to produce a prompt response when the attack occurs (intrusion prevention). It is critical to create an intrusion detection model that detects a breach in the system on time, and it is challenging to make it provide an automatic response with acceptable delay at every single stage of the monitoring process. We cannot afford to adopt security measures with high computational cost, and we cannot accept a mechanism that reacts with a delay. In this paper, we propose an intrusion response mechanism that is based on artificial intelligence, and more precisely, reinforcement learning techniques (RLT). The RLT help us to create a decision agent that controls the process of interacting with the undetermined environment. The goal is to find an optimal policy, which represents the intrusion response; therefore, the reinforcement learning problem is solved using a Q-learning approach. Our agent produces an optimal immediate response while evaluating the network traffic. This Q-learning approach establishes the balance between exploration and exploitation and provides a unique, self-learning and strategic artificial intelligence response mechanism for IDS.
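The Q-learning core of the proposed mechanism can be illustrated with a small tabular sketch; the state set, action set and reward values below are illustrative assumptions rather than the paper's model of the network environment.

    import random

    states = ["normal", "suspicious", "attack"]
    actions = ["allow", "throttle", "block"]
    Q = {(s, a): 0.0 for s in states for a in actions}
    alpha, gamma, eps = 0.1, 0.9, 0.1   # learning rate, discount, exploration

    def reward(state, action):          # hypothetical security/usability trade-off
        if state == "attack":
            return 1.0 if action == "block" else -1.0
        if state == "normal":
            return 1.0 if action == "allow" else -0.5
        return 0.5 if action == "throttle" else -0.2

    random.seed(0)
    s = random.choice(states)
    for _ in range(5000):
        a = (random.choice(actions) if random.random() < eps   # explore
             else max(actions, key=lambda x: Q[(s, x)]))       # exploit
        r = reward(s, a)
        s2 = random.choice(states)      # stand-in for the next traffic observation
        Q[(s, a)] += alpha * (r + gamma * max(Q[(s2, x)] for x in actions)
                              - Q[(s, a)])
        s = s2

    # The learned policy is the chosen response per observed state.
    print({s: max(actions, key=lambda a: Q[(s, a)]) for s in states})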

Keywords: Intrusion prevention, network security, optimal policy, Q-learning.

318 Object Identification with Color, Texture, and Object-Correlation in CBIR System

Authors: Awais Adnan, Muhammad Nawaz, Sajid Anwar, Tamleek Ali, Muhammad Ali

Abstract:

The need for efficient information retrieval has increased more than ever in recent years because of the frequent use of digital information in our lives. There is a great deal of work in the area of textual information, but much less progress in multimedia information. For text-based information, data mining and data mart technologies, which started from the basic concept of the database somewhere around 1960, are now in operation. In image search, and especially in image identification, computerized systems are at a very early stage; even in the area of image search, we cannot see as much progress as in text-based search techniques. One main reason for this is the widespread roots of image search, in which many areas such as artificial intelligence, statistics, image processing and pattern recognition play a role; even human psychology, perception and cultural diversity have their share in the design of a good and efficient image recognition and retrieval system. A new object-based search technique is presented in this paper in which objects in the image are identified on the basis of their geometrical shapes and other features such as color and texture, and object correlation augments this search process. To focus on object identification, simple images are selected for the work to reduce the role of segmentation in the overall process; however, the same technique can also be applied to other images.
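As a hedged sketch of the kind of color and texture descriptors such a system combines, the snippet below computes a joint RGB histogram and a crude gradient-energy texture measure; these are stand-in features chosen for brevity, not the paper's actual descriptors or its shape analysis.

    import numpy as np

    def color_histogram(rgb, bins=8):
        # Joint RGB histogram, normalised to sum to 1.
        h, _ = np.histogramdd(rgb.reshape(-1, 3), bins=(bins,) * 3,
                              range=((0, 256),) * 3)
        return h.ravel() / h.sum()

    def texture_energy(gray):
        gx, gy = np.gradient(gray.astype(float))
        return float(np.mean(gx**2 + gy**2))

    rng = np.random.default_rng(0)
    img = rng.integers(0, 256, (64, 64, 3), dtype=np.uint8)  # stand-in image
    feat = np.concatenate([color_histogram(img),
                           [texture_energy(img.mean(axis=2))]])
    print(feat.shape)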

Keywords: Object correlation, Geometrical shape, Color, texture, features, contents.

317 Reasons for Doing Job outside Household and Difficulties Faced by the Working Women of Bangladesh

Authors: Md. Sayeed Akhter, Md. Akhtar Hossain Mazumder, Syeda Afreena Mamun

Abstract:

Bangladesh is a patriarchal and male-dominated country. Traditional, cultural, social, and religious values and practices have reinforced the lower status accorded to women in society and have limited their opportunities for education, technical and vocational training, and involvement in earning activities outside their households. Since independence, however, growing numbers of women have taken up jobs outside their households. This study attempts to find out the reasons for engaging in earning activities outside the household and the difficulties faced by upper- and lower-class working women in Bangladesh. To explore the objectives and research questions of the study, descriptive techniques were used. A survey was conducted among women working in Rajshahi city of Bangladesh, and face-to-face interviews were conducted to collect data. The findings illustrate that most of the upper-class working women took up jobs because they wanted to utilize their education and to bring solvency to the family, and they spend their income on meeting the needs of all the members of the family. On the other hand, most of the lower-class working women engaged in earning activities outside their households because they want to bring solvency to their families, and they spend their income on household expenditure. Women of both classes worried about their children because they had to stay at their workplaces for a long time. Therefore, day care centers should be established near their workplaces for their children.

Keywords: Working Women, Reasons for Doing Jobs, Working Environment, Difficulties Faced.

316 VHL, PBRM1 and SETD2 Genes in Kidney Cancer: A Molecular Investigation

Authors: Rozhgar A. Khailany, Mehri Igci, Emine Bayraktar, Sakip Erturhan, Metin Karakok, Ahmet Arslan

Abstract:

Kidney cancer is the most lethal urological cancer, accounting for 3% of adult malignancies. VHL, a tumor-suppressor gene, is best known to be associated with renal cell carcinoma (RCC); it functions as a negative regulator of hypoxia-inducible factors. Recent sequencing efforts have identified several novel frequent mutations of histone-modifying and chromatin-remodeling genes in ccRCC (clear cell RCC), including PBRM1 and SETD2. The PBRM1 gene encodes the BAF180 protein, which is involved in transcriptional activation and repression of selected genes. SETD2 encodes a histone methyltransferase, which may play a role in suppressing tumor development. In this study, RNAs of 30 paired tumor and normal samples, grouped according to the type of kidney cancer and the clinical characteristics of the patients, including gender and average age, were examined by RT-PCR, SSCP and sequencing techniques. VHL, PBRM1 and SETD2 expressions were relatively down-regulated; however, the difference was not statistically significant (Wilcoxon signed-rank test, p>0.05). Interestingly, no mutation was observed, contrary to previous studies. Understanding the molecular mechanisms involved in the pathogenesis of RCC has aided the development of molecular-targeted drugs for kidney cancer. Further analysis is required to identify the genes responsible, other than VHL, PBRM1 and SETD2, in kidney cancer.
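The statistical comparison reported above can be reproduced in outline with the Wilcoxon signed-rank test on paired tumor/normal values; the expression numbers below are made up for illustration only.

    from scipy.stats import wilcoxon

    tumor  = [0.8, 1.1, 0.6, 0.9, 0.7, 1.0, 0.5, 0.85]  # relative expression
    normal = [1.0, 1.2, 0.9, 1.1, 0.8, 1.1, 0.9, 1.00]  # matched normals

    stat, p = wilcoxon(tumor, normal)
    print(f"W = {stat:.1f}, p = {p:.3f} ->",
          "significant" if p < 0.05 else "not significant (p > 0.05)")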

Keywords: Kidney cancer, molecular biomarker, expression analysis, mutation screening.

315 Organic Agriculture Harmony in Nutrition, Environment and Health: Case Study in Iran

Authors: Sara Jelodarian

Abstract:

Organic agriculture is a kind of living and dynamic agriculture that was introduced in the early 20th century. Its fundamental basis is harmony with nature. This version of farming emphasizes removing growth hormones, chemical fertilizers, toxins, radiation and genetic manipulation and, instead, integrating modern scientific techniques (such as biological and microbial control) that lead to the production of healthy food, the preservation of the environment, and the use of agricultural by-products such as forage and manure. Support from governments for markets producing organic products, and taking advantage of the experiences of other societies successful in this field, can help advance the positive and effective aspects of this technology, especially in developing countries. This research suggests that by 2030, 25% of global agricultural land will be covered by organic farming. Consequently, Iran, with its rich genetic resources and various climates, can be a pioneer in promoting organic products. In addition, sustainable farming requires a blend of organic and other innovative systems. Important limitations to the acceptance of these systems exist, and a diversity of policy instruments will be required to support their development and implementation. The paper was based on a compilation of reports, issues, books and articles related to the subject, through library studies and research; likewise, we combined experimental and survey methods to obtain data.

Keywords: Development, production markets, progress, strategic role, technology.

314 Classification of Potential Biomarkers in Breast Cancer Using Artificial Intelligence Algorithms and Anthropometric Datasets

Authors: Aref Aasi, Sahar Ebrahimi Bajgani, Erfan Aasi

Abstract:

Breast cancer (BC) continues to be the most frequent cancer in females and causes the highest number of cancer-related deaths in women worldwide. Inspired by recent advances in studying the relationship between different patient attributes and the disease, in this paper we investigate different classification methods for better diagnosis of BC in its early stages. Datasets from the University Hospital Centre of Coimbra were chosen, and different machine learning (ML)-based and neural network (NN)-based classifiers were studied. We selected favorable features among the nine provided attributes of the clinical dataset by using a random forest algorithm. The dataset consists of both healthy controls and BC patients, and glucose, BMI, resistin, and age were found to have the most importance, in that order. Moreover, we analyzed these features with various ML-based classifier methods, including Decision Tree (DT), K-Nearest Neighbors (KNN), eXtreme Gradient Boosting (XGBoost), Logistic Regression (LR), Naive Bayes (NB), and Support Vector Machine (SVM), along with the NN-based Multi-Layer Perceptron (MLP) classifier. The results revealed that among the different techniques, the SVM and MLP classifiers have the highest accuracy, at 96% and 92%, respectively. These results show that the adopted procedure can be used effectively for the classification of cancer cells, and they encourage further experimental investigations with more collected data for other types of cancer.
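A minimal sketch of the reported pipeline, random-forest feature ranking followed by SVM and MLP classification, is shown below; synthetic data of the same shape stands in for the Coimbra dataset, so the printed accuracies will not match the paper's.

    import numpy as np
    from sklearn.datasets import make_classification
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.model_selection import train_test_split
    from sklearn.neural_network import MLPClassifier
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler
    from sklearn.svm import SVC

    # 116 samples with 9 attributes, mimicking the dataset's shape.
    X, y = make_classification(n_samples=116, n_features=9, n_informative=4,
                               random_state=0)
    rank = RandomForestClassifier(random_state=0).fit(X, y).feature_importances_
    top4 = np.argsort(rank)[::-1][:4]        # keep the 4 strongest features

    Xtr, Xte, ytr, yte = train_test_split(X[:, top4], y, random_state=0)
    for name, clf in [("SVM", SVC()), ("MLP", MLPClassifier(max_iter=2000))]:
        model = make_pipeline(StandardScaler(), clf).fit(Xtr, ytr)
        print(name, f"accuracy = {model.score(Xte, yte):.2f}")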

Keywords: Breast cancer, health diagnosis, Machine Learning, biomarker classification, Neural Network.

313 A New Model for Question Answering Systems

Authors: Mohammad Reza Kangavari, Samira Ghandchi, Manak Golpour

Abstract:

Most question answering (QA) systems are composed of three main modules: question processing, document processing and answer processing. The question processing module plays an important role in QA systems; if this module does not work properly, it causes problems for the other modules. The answer processing module is, moreover, an emerging topic in question answering, where systems are often required to rank and validate candidate answers. These techniques, which aim at finding short and precise answers, are often based on semantic classification. This paper discusses a new model for question answering which improves the two main modules of question processing and answer processing. Two important components form the basis of question processing: the first is question classification, which specifies the types of the question and of the expected answer; the second is reformulation, which converts the user's question into a question understandable by the QA system in a specific domain. The answer processing module consists of candidate answer filtering and candidate answer ordering components, and it also has a validation section for interacting with the user, which makes it more suitable for finding the exact answer. We describe the question and answer processing modules through modeling, implementing and evaluating the system. The system was implemented in two versions. Results show that version No. 1 gave correct answers to 70% of the questions (30 correct answers to 50 asked questions) and version No. 2 gave correct answers to 94% of the questions (47 correct answers to 50 asked questions).
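The question classification component can be illustrated with a deliberately simple rule-based sketch mapping interrogative cues to expected answer types; the rules and categories below are illustrative assumptions, not the paper's classifier.

    RULES = {
        "who": "PERSON",
        "where": "LOCATION",
        "when": "DATE",
        "how many": "NUMBER",
        "what": "DEFINITION",
        "why": "REASON",
    }

    def classify(question: str) -> str:
        q = question.lower().strip()
        for cue, answer_type in RULES.items():  # checked in insertion order
            if q.startswith(cue):
                return answer_type
        return "OTHER"

    for q in ["Who wrote Hamlet?", "Where is the Nile?", "How many moons has Mars?"]:
        print(q, "->", classify(q))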

Keywords: Answer Processing, Classification, Question Answering, Query Reformulation.

312 From Electroencephalogram to Epileptic Seizures Detection by Using Artificial Neural Networks

Authors: Gaetano Zazzaro, Angelo Martone, Roberto V. Montaquila, Luigi Pavone

Abstract:

Seizures are the main factor that affects the quality of life of epileptic patients. The diagnosis of epilepsy, and hence the identification of the epileptogenic zone, is commonly made by continuous electroencephalogram (EEG) signal monitoring. Seizure identification on EEG signals is performed manually by epileptologists, and this process is usually very long and error prone. The aim of this paper is to describe an automated method able to detect seizures in EEG signals, using the knowledge discovery in databases process and data mining methods and algorithms, which can support physicians during the seizure detection process. Our detection method is based on an artificial neural network classifier, trained by applying the multilayer perceptron algorithm, and on a software application, called Training Builder, that has been developed for the massive extraction of features from EEG signals. This tool covers all the data preparation steps, ranging from signal processing to data analysis techniques, including the sliding window paradigm, dimensionality reduction algorithms, information theory, and feature selection measures. The final model shows excellent performance, reaching an accuracy of over 99% in tests on data of a single patient retrieved from a publicly available EEG dataset.
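The sliding-window feature extraction such a tool automates can be outlined as follows; the stand-in signal, window length and crude features are illustrative assumptions, not Training Builder's actual feature set.

    import numpy as np
    from sklearn.neural_network import MLPClassifier

    rng = np.random.default_rng(0)
    eeg = rng.normal(size=30_000)              # stand-in single-channel EEG
    labels = np.arange(30_000) > 20_000        # pretend a seizure starts here

    win, step = 256, 128                       # window length, 50 % overlap
    feats, y = [], []
    for start in range(0, eeg.size - win, step):
        seg = eeg[start:start + win]
        feats.append([seg.mean(), seg.std(),
                      np.abs(np.fft.rfft(seg))[1:9].mean()])  # crude band power
        y.append(int(labels[start + win // 2]))

    X, y = np.array(feats), np.array(y)
    clf = MLPClassifier(hidden_layer_sizes=(16,), max_iter=1000).fit(X, y)
    print(f"training accuracy: {clf.score(X, y):.2f}")

On real EEG the features would separate seizure from non-seizure windows; on this random stand-in the score merely demonstrates that the pipeline runs.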

Keywords: Artificial Neural Network, Data Mining, Electroencephalogram, Epilepsy, Feature Extraction, Seizure Detection, Signal Processing.

311 Heat Treatment and Rest-Inserted Exercise Enhances EMG Activity of the Lower Limb

Authors: Jae Kyun Bang, Sung Jae Hwang, Chang Yong Ko, Chi Hyun Kim

Abstract:

Prolonged immobilization leads to significant weakness and atrophy of the skeletal muscle and can also impair the recovery of muscle strength following injury. Therefore, it is important to minimize the period under immobilization and accelerate the return to normal activity. This study examined the effects of heat treatment and rest-inserted exercise on the muscle activity of the lower limb during knee flexion/extension. Twelve healthy subjects were assigned to 4 groups: (1) heat treatment + rest-inserted exercise; (2) heat + continuous exercise; (3) no heat + rest-inserted exercise; and (4) no heat + continuous exercise. Heat treatment was applied for 15 mins prior to exercise. Continuous exercise groups performed knee flexion/extension at 0.5 Hz for 300 cycles without rest, whereas rest-inserted exercise groups performed the same exercise but with 2 mins of rest inserted every 60 cycles of continuous exercise. Changes in the rectus femoris and hamstring muscle activities were assessed at 0, 1, and 2 weeks of treatment by measuring the electromyography signals of isokinetic maximum voluntary contraction. Significant increases in the activities of both the rectus femoris and hamstring muscles were observed after 2 weeks of treatment only when both heat treatment and rest-inserted exercise were performed. These results suggest that a combination of treatment techniques, such as heat treatment and rest-inserted exercise, may expedite the recovery of muscle strength following immobilization.

Keywords: Electromyography, Heat Treatment, Muscle, Rest-Inserted Exercise.

310 Utilization of Whey for the Production of β-Galactosidase Using Yeast and Fungal Culture

Authors: Rupinder Kaur, Parmjit S. Panesar, Ram S. Singh

Abstract:

Whey is the lactose-rich by-product of the dairy industry and holds a good reservoir of nutrients, the most abundant being lactose, soluble proteins, lipids and mineral salts. The disposal of whey by milk plants, most of which lack a proper pre-treatment system, is a major issue; as a result, there can be a significant loss of a potential food and energy source. Thus, whey has been explored as a substrate for the synthesis of different value-added products such as enzymes. β-galactosidase is one such important enzyme and has become a major focus of research due to its ability to catalyze both the hydrolytic and the transgalactosylation reaction simultaneously. The enzyme is widely used in the dairy industry, as it catalyzes the transformation of lactose to glucose and galactose, making products suitable for lactose-intolerant people. The enzyme is intracellular in both bacteria and yeast, whereas in molds it has an extracellular location. The present work was carried out to utilize whey for the production of β-galactosidase using both yeast and fungal cultures. The yeast isolate Kluyveromyces marxianus WIG2 and various fungal strains were used in the present study. Different disruption techniques were also investigated for the extraction of the enzyme produced intracellularly in the yeast cells. Among the methods tested for the disruption of yeast cells, SDS-chloroform showed the maximum β-galactosidase activity. Among the tested fungal cultures, Aureobasidium pullulans NCIM 1050 was observed to be the maximum extracellular enzyme producer.

Keywords: β-galactosidase, fungus, yeast, whey.

309 Nonlinear Sensitive Control of Centrifugal Compressor

Authors: F. Laaouad, M. Bouguerra, A. Hafaifa, A. Iratni

Abstract:

In this work, we treat the control problems of complex chemical and petrochemical plants, taking the centrifugal compressor as an example of a system that is very complex in both its physical structure and its behaviour (the surge phenomenon). We propose to study the possibilities of applying recent control approaches to the compressor behaviour and, consequently, to evaluate their contribution in the practical and theoretical fields. Facing the complexity of the studied industrial process, we choose to resort to fuzzy logic for the analysis and treatment of its control problem, because these techniques constitute the only framework in which all types of imperfect knowledge (uncertainties, inaccuracies, etc.) can be treated jointly, offering suitable tools to characterise them. In the particular case of the centrifugal compressor, these imperfections appear as modelling errors, neglected dynamics, non-modellable dynamics and parametric variations. The purpose of this paper is to produce a complete robust nonlinear controller design method that stabilizes the compression process at its optimum steady state by manipulating the gas flow rate. In order to cope with both the parameter uncertainty and the structured nonlinearity of the plant, the proposed method consists of a linear steady-state regulation that ensures robust optimal control and of a nonlinear compensation that achieves exact input/output linearization.
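The fuzzy machinery the approach relies on can be sketched without any library: triangular membership functions and a centroid-style combination of rule outputs, applied here to a made-up surge-margin input; this is an illustration of the mechanism, not the paper's controller.

    def tri(x, a, b, c):
        # Triangular membership function with feet a, c and peak b.
        if x <= a or x >= c:
            return 0.0
        return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

    def control(surge_margin):
        # Rule strengths: how "low" or "safe" the current margin is.
        low = tri(surge_margin, -0.1, 0.0, 0.5)
        safe = tri(surge_margin, 0.2, 1.0, 1.8)
        # Rule consequents on the recycle-valve opening (0..1), centroid combined.
        open_more, close = 0.9, 0.1
        return (low * open_more + safe * close) / (low + safe + 1e-9)

    for m in (0.05, 0.3, 0.8):
        print(f"margin {m:.2f} -> valve opening {control(m):.2f}")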

Keywords: Compressor, Fuzzy logic, Surge control, Bilinear controller, Stability analysis, Nonlinear plant.

308 Received Signal Strength Indicator Based Localization of Bluetooth Devices Using Trilateration: An Improved Method for the Visually Impaired People

Authors: Muhammad Irfan Aziz, Thomas Owens, Uzair Khaleeq uz Zaman

Abstract:

Instantaneous, spatial localization for visually impaired people in dynamically changing environments with unexpected hazards and obstacles is the most demanding and challenging issue faced by navigation systems today. Since Bluetooth cannot utilize techniques like Time Difference of Arrival (TDOA) and Time of Arrival (TOA), it uses the received signal strength indicator (RSSI) to measure received signal strength (RSS). Measurements using RSSI can be improved significantly by improving the existing RSSI-related methodologies. Therefore, the current paper proposes an improved trilateration-based method for the localization of Bluetooth devices for visually impaired people. To validate the method, class 2 Bluetooth devices were used along with purpose-built software. Experiments were then conducted to obtain surface plots that showed the signal interference and other environmental effects. The results show the surface plots for all Bluetooth modules used, with the strong and weak points depicted by color codes in red, yellow and blue. It was concluded that the suggested improved method of measuring RSS using trilateration helped not only to measure signal strength effectively but also to highlight how the signal strength can be influenced by atmospheric conditions such as noise, reflections, etc.
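The two steps the method combines can be sketched as follows: RSSI is converted to distance with a log-distance path-loss model, and the position is solved by linearised least-squares trilateration; the anchor positions, RSSI readings and model constants are made-up assumptions.

    import numpy as np

    def rssi_to_dist(rssi, tx_power=-59.0, n=2.0):
        # Log-distance model: rssi = tx_power - 10 * n * log10(d).
        return 10 ** ((tx_power - rssi) / (10 * n))

    anchors = np.array([[0.0, 0.0], [10.0, 0.0], [0.0, 10.0]])
    rssi = np.array([-65.0, -72.0, -70.0])     # hypothetical readings
    d = rssi_to_dist(rssi)

    # Subtracting the first circle equation from the others linearises the system.
    A = 2 * (anchors[1:] - anchors[0])
    b = (d[0]**2 - d[1:]**2
         + np.sum(anchors[1:]**2, axis=1) - np.sum(anchors[0]**2))
    pos, *_ = np.linalg.lstsq(A, b, rcond=None)
    print("estimated position:", pos.round(2))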

Keywords: Bluetooth, indoor/outdoor localization, received signal strength indicator, visually impaired.

307 A Software Framework for Predicting Oil-Palm Yield from Climate Data

Authors: Mohd. Noor Md. Sap, A. Majid Awan

Abstract:

Intelligent systems based on machine learning techniques, such as classification and clustering, are gaining widespread popularity in real-world applications. This paper presents work on developing a software system for predicting crop yield, for example oil-palm yield, from climate and plantation data. At the core of our system is a method for unsupervised partitioning of data to find spatio-temporal patterns in climate data using kernel methods, which offer strength in dealing with complex data. This work draws inspiration from the notion that a non-linear transformation of data into some high-dimensional feature space increases the possibility of linear separability of the patterns in the transformed space and therefore simplifies the exploration of the associated structure in the data. Kernel methods implicitly perform a non-linear mapping of the input data into a high-dimensional feature space by replacing the inner products with an appropriate positive definite function. In this paper we present a robust weighted kernel k-means algorithm incorporating spatial constraints for clustering the data. The proposed algorithm can effectively handle noise, outliers and auto-correlation in the spatial data for effective and efficient data analysis, and can thus be used for predicting oil-palm yield by analyzing the various factors affecting the yield.
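The clustering core can be sketched as weighted kernel k-means with an RBF kernel, using only kernel entries to evaluate distances in feature space; the paper's spatial-constraint term is omitted here, and the data and weights are random stand-ins.

    import numpy as np

    rng = np.random.default_rng(0)
    X = np.vstack([rng.normal(0, 1, (30, 2)), rng.normal(5, 1, (30, 2))])
    w = rng.uniform(0.5, 1.5, len(X))               # per-point weights
    sq = np.sum((X[:, None] - X[None]) ** 2, axis=-1)
    K = np.exp(-sq / 2.0)                           # RBF kernel matrix

    labels = rng.integers(0, 2, len(X))
    for _ in range(10):
        dist = np.empty((len(X), 2))
        for c in range(2):
            m = labels == c
            sw = w[m].sum()
            # ||phi(x_i) - centre_c||^2 expanded with kernel entries only.
            dist[:, c] = (np.diag(K)
                          - 2 * (K[:, m] @ w[m]) / sw
                          + (w[m] @ K[np.ix_(m, m)] @ w[m]) / sw**2)
        labels = dist.argmin(axis=1)
    print(np.bincount(labels))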

Keywords: Pattern analysis, clustering, kernel methods, spatial data, crop yield

306 Automatic Tuning for a Systemic Model of Banking Originated Losses (SYMBOL) Tool on Multicore

Authors: Ronal Muresano, Andrea Pagano

Abstract:

Nowadays, mathematical and statistical applications are developed with more complexity and accuracy. However, this precision and complexity mean that applications need more computational power in order to be executed faster. In this sense, multicore environments play an important role in improving and optimizing the execution time of these applications, as they allow the inclusion of more parallelism inside the node. However, taking advantage of this parallelism is not an easy task, because we have to deal with problems such as core communications, data locality, memory sizes (cache and RAM), synchronizations, data dependencies in the model, etc. These issues become more important when we wish to improve the application's performance and scalability. Hence, this paper describes an optimization method, developed for the Systemic Model of Banking Originated Losses (SYMBOL) tool of the European Commission, which is based on analyzing the application's weaknesses in order to exploit the advantages of the multicore. All these improvements are done in an automatic and transparent manner, with the aim of improving the performance metrics of our tool. Finally, experimental evaluations show the effectiveness of our new optimized version, in which we achieved a considerable improvement in the execution time: it was reduced by around 96% in the best case tested, between the original serial version and the automatic parallel version.
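SYMBOL's code is not reproduced here; the sketch below only illustrates, in Python rather than the tool's compiled OpenMP setting, the pattern such an optimization exploits: independent simulation batches distributed over the cores of one node.

    import multiprocessing as mp
    import random

    def batch_losses(args):
        seed, n = args
        rnd = random.Random(seed)
        # Stand-in for one independent batch of simulated losses.
        return sum(max(0.0, rnd.gauss(0, 1)) for _ in range(n)) / n

    if __name__ == "__main__":
        tasks = [(seed, 100_000) for seed in range(8)]
        with mp.Pool() as pool:         # one worker per core by default
            results = pool.map(batch_losses, tasks)
        print(sum(results) / len(results))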

Keywords: Algorithm optimization, Bank Failures, OpenMP, Parallel Techniques, Statistical tool.

305 A Mobile Multihop Relay Dynamic TDD Scheme for Cellular Networks

Authors: Jong-Moon Chung, Hyung-Weon Cho, Ki-Yong Jin, Min-Hee Cho

Abstract:

In this paper, we present an analytical framework for evaluating the uplink performance of multihop cellular networks based on dynamic time division duplex (TDD). New wireless broadband protocols, such as WiMAX, WiBro, and 3G-LTE, apply TDD, and mobile communication protocols under standardization (e.g., IEEE 802.16j) are investigating mobile multihop relay (MMR) as a future technology. In this paper, a novel MMR TDD scheme is presented in which the dynamic range of the frame is shared among traffic resources of asymmetric nature and multihop relaying. The mobile communication channel interference model comprises inner and co-channel interference (CCI). The performance analysis focuses on the uplink because the effects of dynamic resource allocation show significant performance degradation, relative to time division multiple access (TDMA) schemes, only in the uplink due to CCI [1-3], whereas the downlink performance turns out to be the same or better. The analysis is based on the signal-to-interference power ratio (SIR) outage probability of dynamic TDD (D-TDD) and TDMA systems, which are the most widespread mobile communication multi-user control techniques. This paper presents the uplink SIR outage probability with multihop results and shows that a dynamic TDD scheme applying MMR can provide a performance improvement over single-hop applications if executed properly.
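In general form (an illustrative definition only, since the specific SIR distribution follows from the paper's channel model, which is not reproduced here), the outage metric the analysis is based on can be written as

    P_{\mathrm{out}} = \Pr\!\left[\,\mathrm{SIR} < \gamma_{\mathrm{th}}\,\right]
                     = \Pr\!\left[\frac{P_{\mathrm{signal}}}
                                       {P_{\mathrm{inner}} + P_{\mathrm{CCI}}}
                                  < \gamma_{\mathrm{th}}\right],

where \gamma_{\mathrm{th}} is the SIR threshold, P_{\mathrm{signal}} the received desired-signal power, and P_{\mathrm{inner}} and P_{\mathrm{CCI}} the inner and co-channel interference powers at the receiver.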

Keywords: Co-Channel Interference, Dynamic TDD, Mobile Multihop Relay, Cellular Network, Time Division Multiple Access.

304 Thermal Management of Space Power Electronics using TLM-3D

Authors: R. Hocine, K. Belkacemi, A. Boukortt, A. Boudjemai

Abstract:

When designing satellites, one of the major issues, aside from designing the primary subsystems, is to devise the thermal control. The thermal management of satellites requires solving different sets of modelling issues, and the main issue of thermal modelling for satellite design is making sure that all points of the satellite stay within the temperature limits for which they are designed. The insertion of power electronics into aerospace technologies is becoming widespread, and modern electronic systems used in space must be reliable and efficient, with thermal management unaffected by outer space constraints. Many advanced thermal management techniques that have been developed in recent years have applications in high-power electronic systems. This paper presents a three-dimensional modal Transmission Line Matrix (3D-TLM) implementation of transient heat flow in space power electronics. In such components, heat dissipation and good thermal management are essential, and simulation provides the cheapest tool to investigate all aspects of power handling. The 3D-TLM has been successful in modelling heat diffusion problems and has proven to be efficient in terms of stability and complex geometry. The results show a three-dimensional visualisation of self-heating phenomena in the device affected by outer space constraints, and possible approaches for increasing the heat dissipation capability of the power modules are presented.
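The paper's TLM formulation is not reproduced here; as a far simpler stand-in, the sketch below uses an explicit finite-difference scheme for 2D transient heat conduction to illustrate the kind of self-heating transient a 3D-TLM model resolves. All parameters are made up, and the wrap-around boundaries are a simplification.

    import numpy as np

    nx, ny, steps = 50, 50, 500
    alpha, dx, dt = 1e-4, 1e-3, 1e-3        # diffusivity, grid step, time step
    assert alpha * dt / dx**2 <= 0.25       # explicit-scheme stability limit

    T = np.full((nx, ny), 20.0)             # ambient temperature, deg C
    source = (slice(20, 30), slice(20, 30)) # dissipating power device

    for _ in range(steps):
        lap = (np.roll(T, 1, 0) + np.roll(T, -1, 0) +
               np.roll(T, 1, 1) + np.roll(T, -1, 1) - 4 * T) / dx**2
        T += dt * alpha * lap
        T[source] += 0.05                   # constant heat injection per step
    print(f"hot-spot temperature: {T.max():.1f} deg C")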

Keywords: Thermal management, conduction, heat dissipation, CTE, ceramic, heat spreader, nodes, 3D-TLM.

303 ECG Based Reliable User Identification Using Deep Learning

Authors: R. N. Begum, Ambalika Sharma, G. K. Singh

Abstract:

Identity theft has serious ramifications beyond data and personal information loss, which necessitates the implementation of robust and efficient user identification systems. Automatic biometric recognition systems are therefore the need of the hour, and electrocardiogram (ECG)-based systems are unquestionably a strong choice due to their appealing inherent characteristics. Convolutional Neural Networks (CNNs) are the recent state-of-the-art techniques for ECG-based user identification systems. However, the results obtained are significantly below standards, and the situation worsens as the number of users and types of heartbeats in the dataset grows. As a result, this study proposes a highly accurate and resilient ECG-based person identification system using CNN's dense learning framework, exploring explicitly the caliber of dense CNNs in the field of ECG-based human recognition. The study tests four different configurations of dense CNNs, trained on a dataset of recordings collected from eight popular ECG databases. With a worst-case False Acceptance Rate (FAR) of 0.04% and a worst-case False Rejection Rate (FRR) of 5%, the best performing network achieved an identification accuracy of 99.94%. The best network was also tested with various train/test split ratios. The findings show that DenseNets are not only extremely reliable but also highly efficient; thus, they might also be implemented in real-time ECG-based human recognition systems.
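The two reported error rates are computed from genuine and impostor comparison scores; the sketch below shows the computation on synthetic scores with an assumed threshold, not values taken from the paper's networks.

    import numpy as np

    rng = np.random.default_rng(0)
    genuine = rng.normal(0.9, 0.05, 1000)   # scores for true-identity trials
    impostor = rng.normal(0.4, 0.15, 1000)  # scores for wrong-identity trials
    threshold = 0.75

    far = np.mean(impostor >= threshold)    # impostors wrongly accepted
    frr = np.mean(genuine < threshold)      # genuine users wrongly rejected
    print(f"FAR = {far:.2%}, FRR = {frr:.2%}")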

Keywords: Biometrics, dense networks, identification rate, train/test split ratio.

302 Weighted-Distance Sliding Windows and Cooccurrence Graphs for Supporting Entity-Relationship Discovery in Unstructured Text

Authors: Paolo Fantozzi, Luigi Laura, Umberto Nanni

Abstract:

The problem of entity relation discovery, a well-covered topic in the literature for structured data, consists in searching within unstructured sources (typically, text) in order to find connections among entities, which can come from a whole dictionary or from a specific collection of named items. In many cases machine learning and/or text mining techniques are used for this goal, but these approaches might be unfeasible in computationally challenging problems, such as processing massive data streams. A faster approach consists in collecting the cooccurrences of any two words (entities) in order to create a graph of relations, a cooccurrence graph. Indeed, each cooccurrence highlights some grade of semantic correlation between the words, because it is more common to have related words close to each other than at opposite ends of the text. Some authors have used sliding windows for this problem: they count all the cooccurrences within a sliding window running over the whole text. In this paper we generalise this technique, arriving at a Weighted-Distance Sliding Window, in which each occurrence of two named items within the window is accounted with a weight depending on the distance between the items: a closer distance implies stronger evidence of a relationship. We develop an experiment to support this intuition by applying the technique to a dataset consisting of the text of the Bible, split into verses.
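The generalised technique can be sketched in a few lines: within each window, a pair of named items is accounted with a weight that decays with their distance, here 1/d as an assumed weight function (the paper's exact weighting is not reproduced).

    from collections import defaultdict

    def cooccurrence_graph(tokens, entities, window=5):
        graph = defaultdict(float)
        for i, t in enumerate(tokens):
            if t not in entities:
                continue
            for j in range(i + 1, min(i + window, len(tokens))):
                u = tokens[j]
                if u in entities and u != t:
                    d = j - i               # distance between the two items
                    graph[tuple(sorted((t, u)))] += 1.0 / d
        return graph

    text = "moses spoke to aaron and moses led the people with aaron".split()
    print(dict(cooccurrence_graph(text, {"moses", "aaron"})))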

Keywords: Cooccurrence graph, entity relation graph, unstructured text, weighted distance.

301 Estimation of the Bit Side Force by Using Artificial Neural Network

Authors: Mohammad Heidari

Abstract:

Horizontal wells are proven to be better producers because they can be extended for a long distance in the pay zone, and engineers have the technical means to forecast well productivity for a given horizontal length. However, experience has shown that the actual production rate is often significantly less than forecasted, and it is a difficult, if not impossible, task to identify the real reason why a horizontal well is not producing what was forecasted. Often the source of the problem lies in the drilling of the horizontal section, such as permeability reduction in the pay zone due to mud invasion or snaky well patterns created during drilling. Although drillers aim to drill a constant-inclination hole in the pay zone, the more frequent outcome is a sinusoidal wellbore trajectory. The two factors that play an important role in wellbore tortuosity are the inclination and the side force at the bit. A constant-inclination horizontal well can only be drilled if the bit face is maintained perpendicular to the longitudinal axis of the bottom hole assembly (BHA) while keeping the side force at the bit nil; this approach assumes that no formation force exists at the bit. Hence, an appropriate BHA can be designed if the bit side force and bit tilt are determined accurately. The Artificial Neural Network (ANN) is superior to existing analytical techniques, and in this study neural networks have been employed as a general approximation tool for the estimation of the bit side forces. A number of samples are analyzed with the ANN for the bit side force parameter, and the results are compared with exact analysis. A back-propagation neural network (BPN) is used for the approximation of the bit side forces, and the resulting low relative error on the test data indicates the usability of the BPN in this area.
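The approximation idea can be sketched with a back-propagation-trained MLP regressor mapping BHA parameters to bit side force; the input ranges and the synthetic target function below are made-up stand-ins for the exact analysis the network would be trained against.

    import numpy as np
    from sklearn.neural_network import MLPRegressor
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler

    rng = np.random.default_rng(0)
    # Hypothetical inputs: inclination [deg], stabilizer distance [m], WOB [kN].
    X = rng.uniform([80, 5, 50], [95, 25, 250], size=(500, 3))
    y = (0.4 * X[:, 0] - 1.2 * X[:, 1] + 0.02 * X[:, 2]
         + rng.normal(0, 0.5, 500))         # stand-in "exact" side force

    net = make_pipeline(StandardScaler(),
                        MLPRegressor(hidden_layer_sizes=(20, 20),
                                     max_iter=5000, random_state=0))
    net.fit(X[:400], y[:400])
    rel_err = np.abs(net.predict(X[400:]) - y[400:]) / np.abs(y[400:])
    print(f"mean relative test error: {rel_err.mean():.3f}")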

Keywords: Artificial Neural Network, BHA, Horizontal Well, Stabilizer.
