Search results for: Texture Feature
208 Predicting Protein-Protein Interactions from Protein Sequences Using Phylogenetic Profiles
Authors: Omer Nebil Yaveroglu, Tolga Can
Abstract:
In this study, a high-accuracy protein-protein interaction prediction method is developed. The importance of the proposed method is that it uses only the sequence information of proteins to predict interactions. The method extracts phylogenetic profiles of proteins from their sequence information. By combining the phylogenetic profiles of two proteins, based on the existence of homologs in different species, and fitting this combined profile into a statistical model, it is possible to make predictions about the interaction status of the two proteins. For this purpose, we apply a collection of pattern recognition techniques to the dataset of combined phylogenetic profiles of protein pairs. Support Vector Machines, Feature Extraction using ReliefF, Naive Bayes Classification, K-Nearest Neighbor Classification, Decision Trees, and Random Forest Classification are the methods we applied to find the classifier that best predicts the interaction status of protein pairs. Random Forest Classification outperformed all other methods with a prediction accuracy of 76.93%.
Keywords: Protein Interaction Prediction, Phylogenetic Profile, SVM, ReliefF, Decision Trees, Random Forest Classification
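As a rough illustration of the classification step described in this abstract, the sketch below combines two binary phylogenetic profiles and trains a Random Forest on the combined vectors; the profiles, labels and parameters are synthetic placeholders rather than the study's data.
```python
# Minimal sketch: profiles and labels below are synthetic placeholders.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n_pairs, n_species = 1000, 60

# 1 = a homolog of the protein exists in that species, 0 = it does not
profiles_a = rng.integers(0, 2, size=(n_pairs, n_species))
profiles_b = rng.integers(0, 2, size=(n_pairs, n_species))

# One simple way to combine two profiles: joint presence/absence per species
combined = np.hstack([profiles_a & profiles_b, profiles_a | profiles_b])
labels = rng.integers(0, 2, size=n_pairs)            # 1 = interacting pair (placeholder)

X_train, X_test, y_train, y_test = train_test_split(combined, labels, test_size=0.3, random_state=0)
clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_train, y_train)
print("accuracy:", clf.score(X_test, y_test))
```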
207 Road Safety in Great Britain: An Exploratory Data Analysis
Authors: Jatin Kumar Choudhary, Naren Rayala, Abbas Eslami Kiasari, Fahimeh Jafari
Abstract:
Great Britain has one of the safest road networks in the world. However, the consequences of any death or serious injury are devastating for loved ones, as well as for those who help the severely injured. This paper aims to analyse Great Britain's road safety situation and show response measures for areas where the total damage caused by accidents can be significantly and quickly reduced. Although the UK has had a good record in reducing fatalities over the past 30 years, there is still a considerable number of road deaths. The government continues to work to reduce road deaths by empowering responsible road users and by identifying and addressing the factors that make roads less safe. This study presents an exploratory analysis that could provide policy makers with invaluable insights into how accidents happen and how they can be mitigated. We use the STATS19 data published by the UK government. Since STATS19 does not provide enough information about accident locations, we first expand the features of the dataset using OpenStreetMap and Visual Crossing. This paper also provides a discussion regarding new road safety methods.
Keywords: Road safety, data analysis, OpenStreetMap, feature expansion.
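A hedged sketch of the feature-expansion step mentioned in this abstract is given below: STATS19-style accident records are joined with externally sourced weather observations in pandas. All file names and column names are hypothetical; the real STATS19, OpenStreetMap and Visual Crossing schemas differ.
```python
# Illustrative join of accident records with weather data; all file and
# column names here are hypothetical placeholders.
import pandas as pd

accidents = pd.read_csv("stats19_accidents.csv", parse_dates=["date"])
weather = pd.read_csv("visual_crossing_weather.csv", parse_dates=["date"])

# Round coordinates to a coarse grid so accident sites can be matched to the
# nearest weather observation cell (a simplification for illustration).
for df in (accidents, weather):
    df["lat_bin"] = df["latitude"].round(1)
    df["lon_bin"] = df["longitude"].round(1)

enriched = accidents.merge(weather, on=["date", "lat_bin", "lon_bin"],
                           how="left", suffixes=("", "_wx"))
print(enriched[["accident_severity", "temp", "precip"]].head())
```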
206 Automatic Reusability Appraisal of Software Components using Neuro-fuzzy Approach
Authors: Parvinder S. Sandhu, Hardeep Singh
Abstract:
Automatic reusability appraisal could be helpful in evaluating the quality of developed or developing reusable software components and in identifying reusable components from existing legacy systems, which can save the cost of developing the software from scratch. However, the issue of how to identify reusable components from existing systems has remained relatively unexplored. In this paper, we present a two-tier approach that studies the structural attributes of a component as well as its usability or relevancy to a particular domain. Latent semantic analysis is used for the feature vector representation of various software domains. It exploits the fact that feature vector codes can be seen as documents containing terms (the identifiers present in the components), so text modeling methods that capture co-occurrence information in low-dimensional spaces can be used. Further, we devised a Neuro-Fuzzy hybrid Inference System, which takes structural metric values as input and calculates the reusability of the software component. A decision tree algorithm is used to decide the initial set of fuzzy rules for the Neuro-fuzzy system. The results obtained are convincing enough to propose the system for economical identification and retrieval of reusable software components.
Keywords: Clustering, ID3, LSA, Neuro-fuzzy System, SVD
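The sketch below illustrates the latent semantic analysis step in general terms (not the paper's implementation): identifier "documents" are vectorized and projected into a low-dimensional space with truncated SVD; the identifier lists are invented examples.
```python
# LSA over invented identifier "documents", one per software domain.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.decomposition import TruncatedSVD

component_docs = [
    "open_file read_buffer close_file file_handle",          # file-handling domain
    "connect_socket send_packet recv_packet close_socket",   # networking domain
    "draw_window resize_window paint_widget event_loop",     # GUI domain
]

tfidf = TfidfVectorizer(token_pattern=r"[A-Za-z_]+")
term_doc = tfidf.fit_transform(component_docs)

lsa = TruncatedSVD(n_components=2, random_state=0)
domain_vectors = lsa.fit_transform(term_doc)   # low-dimensional LSA feature vectors
print(domain_vectors)
```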
205 A Hybrid Classification Method using Artificial Neural Network Based Decision Tree for Automatic Sleep Scoring
Authors: Haoyu Ma, Bin Hu, Mike Jackson, Jingzhi Yan, Wen Zhao
Abstract:
In this paper we propose a new classification method for automatic sleep scoring using an artificial neural network based decision tree. It treats the sleep scoring process as a series of two-class problems and solves them with a decision tree made up of a group of neural network classifiers, each of which uses a special feature set and is aimed at only one specific sleep stage in order to maximize the classification effect. A single electroencephalogram (EEG) signal is used for our analysis rather than multiple biological signals, which greatly simplifies the data acquisition process. Experimental results demonstrate that the average epoch-by-epoch agreement between the visual scoring and the proposed method in separating 30 s wakefulness+S1, REM, S2 and SWS epochs was 88.83%. This study shows that the proposed method performed well in all four stages and can effectively limit error propagation at the same time. It could, therefore, be an efficient method for automatic sleep scoring. Additionally, since it requires only a small volume of data, it could be suited to pervasive applications.
Keywords: Sleep, Sleep stage, Automatic sleep scoring, Electroencephalography, Decision tree, Artificial neural network
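A rough sketch of the "decision tree of binary neural network classifiers" idea, under simplifying assumptions: each node separates one sleep stage from the remaining epochs using its own classifier, and whatever reaches the end is labelled as the final stage. The EEG features and labels are random placeholders.
```python
# Cascade of one-vs-rest MLP nodes; features and labels are random stand-ins.
import numpy as np
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(1)
X = rng.normal(size=(600, 12))                      # per-epoch EEG features (fake)
y = rng.choice(["Wake+S1", "REM", "S2", "SWS"], size=600)

stage_order = ["Wake+S1", "REM", "S2"]              # SWS is what remains at the end

def train_cascade(X, y):
    """Train one binary MLP per stage, peeling stages off in order."""
    nodes, X_rem, y_rem = [], X, y
    for stage in stage_order:
        target = (y_rem == stage).astype(int)
        clf = MLPClassifier(hidden_layer_sizes=(16,), max_iter=1000,
                            random_state=0).fit(X_rem, target)
        nodes.append((stage, clf))
        keep = y_rem != stage                       # pass the rest to the next node
        X_rem, y_rem = X_rem[keep], y_rem[keep]
    return nodes

def predict_cascade(nodes, x):
    for stage, clf in nodes:
        if clf.predict(x.reshape(1, -1))[0] == 1:
            return stage
    return "SWS"

nodes = train_cascade(X, y)
print(predict_cascade(nodes, X[0]))
```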
204 Design of a Service-Enabled Dependable Integration Environment
Authors: Fuyang Peng, Donghong Li
Abstract:
The aim of information systems integration is to make all the data sources, applications and business flows integrated into the new environment so that unwanted redundancies are reduced and bottlenecks and mismatches are eliminated. Two issues have to be dealt with to meet such requirements: the software architecture that supports resource integration, and the adaptor development tool that helps the integration and migration of legacy applications. In this paper, a service-enabled dependable integration environment (SDIE) is presented, which has two key components, i.e., a dependable service integration platform and a legacy application integration tool. For the dependable service integration platform, the service integration bus, the service management framework, the dependable engine for service composition, and the service registry and discovery components are described. For the legacy application integration tool, its basic organization, functionalities and the dependability measures taken are presented. Due to its service-oriented integration model, light-weight extensible container, service component combination-oriented p-lattice structure, and other features, SDIE has advantages in openness, flexibility, performance-price ratio and feature support over commercial products, and is better than most open source integration software in functionality, performance and dependability support.
Keywords: Application integration, dependability, legacy, SOA.
203 Control Chart Pattern Recognition Using Wavelet Based Neural Networks
Authors: Jun Seok Kim, Cheong-Sool Park, Jun-Geol Baek, Sung-Shick Kim
Abstract:
Control chart pattern recognition is one of the most important tools for identifying the process state in statistical process control. An abnormal process state can be classified by recognizing the unnatural patterns that arise from assignable causes. In this study, a wavelet based neural network approach is proposed for the recognition of control chart patterns that have various characteristics. The proposed control chart pattern recognizer comprises three stages. First, multi-resolution wavelet analysis is used to generate time-shape and time-frequency coefficients that contain detailed information about the patterns. Second, distance based features are extracted by a bi-directional Kohonen network to produce a reduced and robust representation. Third, a back-propagation network classifier is trained on these features. The accuracy of the proposed method is demonstrated by a performance evaluation with numerical results.
Keywords: Control chart pattern recognition, Multi-resolution wavelet analysis, Bi-directional Kohonen network, Back-propagation network, Feature extraction.
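The sketch below illustrates a simplified version of this pipeline: multi-resolution wavelet coefficients as features followed by a back-propagation (MLP) classifier; the bi-directional Kohonen stage is omitted and the control chart patterns are synthetic.
```python
# Wavelet features + back-propagation classifier on synthetic chart patterns.
import numpy as np
import pywt
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(2)

def make_pattern(kind, n=64):
    t = np.arange(n)
    base = rng.normal(0, 1, n)                       # in-control noise
    if kind == "trend":
        return base + 0.1 * t
    if kind == "shift":
        return base + np.where(t > n // 2, 3.0, 0.0)
    return base                                      # "normal"

def wavelet_features(x):
    coeffs = pywt.wavedec(x, "db4", level=3)         # multi-resolution analysis
    return np.concatenate(coeffs)

kinds = ["normal", "trend", "shift"]
X = np.array([wavelet_features(make_pattern(k)) for k in kinds for _ in range(100)])
y = np.repeat(kinds, 100)

clf = MLPClassifier(hidden_layer_sizes=(20,), max_iter=2000, random_state=0).fit(X, y)
print("training accuracy:", clf.score(X, y))
```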
202 Leveraging Quality Metrics in Voting Model Based Thread Retrieval
Authors: Atefeh Heydari, Mohammadali Tavakoli, Zuriati Ismail, Naomie Salim
Abstract:
Seeking and sharing knowledge on online forums have made them popular in recent years. Although online forums are valuable sources of information, retrieving reliable threads with high quality content is an issue due to the variety of sources of messages. The majority of existing information retrieval systems ignore the quality of retrieved documents, particularly in the field of thread retrieval. In this research, we present an approach that employs various quality features in order to investigate the quality of retrieved threads. Different aspects of content quality, including completeness, comprehensiveness, and politeness, are assessed using these features, which leads to finding not only textually but also conceptually relevant threads for a user query within a forum. To analyse the influence of the features, we used an adapted version of the voting model for thread search as a retrieval system. We equipped it with each feature individually, and also with various combinations of features in turn, during multiple runs. The results show that incorporating the quality features significantly enhances the effectiveness of the retrieval system.
Keywords: Content quality, Forum search, Thread retrieval, Voting techniques.
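A hedged sketch of a voting-style thread scorer is shown below: message-level retrieval scores vote for their parent thread and a thread quality score re-weights the result. The scores and quality values are made up for illustration; the paper's actual voting model and quality features are richer.
```python
# CombSUM-like voting of message scores into thread scores, re-weighted by
# a quality score; all numbers are invented for illustration.
from collections import defaultdict

# (thread_id, message_relevance_score) for the top retrieved messages
retrieved_messages = [("t1", 2.1), ("t1", 1.4), ("t2", 3.0), ("t3", 0.9), ("t3", 0.8)]
quality = {"t1": 0.9, "t2": 0.4, "t3": 0.7}   # e.g. completeness / politeness scores

votes = defaultdict(float)
for thread_id, score in retrieved_messages:
    votes[thread_id] += score                 # each relevant message is a vote

ranking = sorted(votes, key=lambda t: votes[t] * quality.get(t, 1.0), reverse=True)
print(ranking)
```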
201 A Hidden Markov Model-Based Isolated and Meaningful Hand Gesture Recognition
Authors: Mahmoud Elmezain, Ayoub Al-Hamadi, Jörg Appenrodt, Bernd Michaelis
Abstract:
Gesture recognition is a challenging task of extracting meaningful gestures from continuous hand motion. In this paper, we propose an automatic system that recognizes isolated gestures, as well as meaningful gestures within continuous hand motion, for Arabic numbers from 0 to 9 in real-time based on Hidden Markov Models (HMM). In order to handle isolated gestures, HMMs using Ergodic, Left-Right (LR) and Left-Right Banded (LRB) topologies are applied over the discrete vector feature that is extracted from stereo color image sequences. These topologies are considered with different numbers of states ranging from 3 to 10. A new system is developed to recognize meaningful gestures based on zero-codeword detection with static velocity motion for continuous gestures. The LRB topology, in conjunction with the Baum-Welch (BW) algorithm for training and the forward algorithm with the Viterbi path for testing, gives the best performance. Experimental results show that the proposed system can successfully recognize isolated and meaningful gestures with average recognition rates of 98.6% and 94.29%, respectively.
Keywords: Computer Vision & Image Processing, Gesture Recognition, Pattern Recognition, Application
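The sketch below illustrates decoding with a Left-Right Banded topology: the transition matrix allows only self-loops and one-step advances, and a small hand-written Viterbi finds the most likely state path. The transition, emission and observation values are toy numbers, not trained parameters.
```python
# Viterbi decoding over an LRB transition matrix; all values are toy numbers.
import numpy as np

n_states, n_symbols = 5, 4

# LRB transition matrix: mass only on the diagonal and first super-diagonal.
A = np.zeros((n_states, n_states))
for i in range(n_states - 1):
    A[i, i], A[i, i + 1] = 0.6, 0.4
A[-1, -1] = 1.0

B = np.full((n_states, n_symbols), 1.0 / n_symbols)   # uniform emissions (toy)
pi = np.zeros(n_states); pi[0] = 1.0                  # LRB chains start in state 0

def viterbi(obs, A, B, pi):
    """Most likely state path for a discrete observation sequence (log domain)."""
    T, N = len(obs), len(pi)
    logd = np.full((T, N), -np.inf)
    back = np.zeros((T, N), dtype=int)
    with np.errstate(divide="ignore"):
        logA, logB, logpi = np.log(A), np.log(B), np.log(pi)
    logd[0] = logpi + logB[:, obs[0]]
    for t in range(1, T):
        scores = logd[t - 1][:, None] + logA          # scores[i, j]: prev i -> cur j
        back[t] = scores.argmax(axis=0)
        logd[t] = scores.max(axis=0) + logB[:, obs[t]]
    path = [int(logd[-1].argmax())]
    for t in range(T - 1, 0, -1):
        path.append(int(back[t, path[-1]]))
    return path[::-1]

print(viterbi([0, 1, 2, 3, 1, 0], A, B, pi))
```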
200 An Efficient Algorithm for Delay Delay-variation Bounded Least Cost Multicast Routing
Authors: Manas Ranjan Kabat, Manoj Kumar Patel, Chita Ranjan Tripathy
Abstract:
Many multimedia communication applications require a source to transmit messages to multiple destinations subject to a quality of service (QoS) delay constraint. To support delay constrained multicast communications, computer networks need to guarantee an upper bound on the end-to-end delay from the source node to each of the destination nodes. This is known as the multicast delay problem. On the other hand, if the same message fails to arrive at each destination node at the same time, inconsistency and unfairness problems may arise among users. This is related to the multicast delay-variation problem. The problem of finding a minimum cost multicast tree with delay and delay-variation constraints has been proven to be NP-Complete. In this paper, we propose an efficient heuristic algorithm, namely, the Economic Delay and Delay-Variation Bounded Multicast (EDVBM) algorithm, based on a novel heuristic function, to construct an economic delay and delay-variation bounded multicast tree. A noteworthy feature of this algorithm is that it has a very high probability of finding the optimal solution in polynomial time with low computational complexity.
Keywords: EDVBM, Heuristic algorithm, Multicast tree, QoS routing, Shortest path.
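The sketch below is an illustration only, not the EDVBM algorithm: each destination is reached by its cheapest path whose delay stays within the bound, and the resulting delay variation is then checked; the graph and its (cost, delay) edge weights are made up.
```python
# Greedy delay-bounded cheapest paths plus a delay-variation check; the
# graph values are invented and this is not the paper's EDVBM heuristic.
import heapq

def cheapest_delay_bounded_path(graph, src, dst, max_delay):
    """Label-setting search over (cost, delay) states; first pop of dst is cheapest feasible."""
    heap = [(0, 0, src, [src])]                      # cost, delay, node, path
    best = {}
    while heap:
        cost, delay, node, path = heapq.heappop(heap)
        if node == dst:
            return cost, delay, path
        if best.get(node, float("inf")) <= delay:    # dominated: costlier and no less delay
            continue
        best[node] = delay
        for nxt, (c, d) in graph[node].items():
            if delay + d <= max_delay:
                heapq.heappush(heap, (cost + c, delay + d, nxt, path + [nxt]))
    return None

graph = {
    "s": {"a": (1, 2), "b": (4, 1)},
    "a": {"d1": (1, 2), "b": (1, 1)},
    "b": {"d2": (2, 3)},
    "d1": {}, "d2": {},
}
paths = {d: cheapest_delay_bounded_path(graph, "s", d, max_delay=6) for d in ("d1", "d2")}
delays = [p[1] for p in paths.values() if p]
print(paths)
print("delay variation:", max(delays) - min(delays))
```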
199 PAPR Reduction Method for OFDM Signal by Using Dummy Sub-carriers
Authors: Pisit Boonsrimuang, Arjin Numsomran, Tawil Paungma, Hideo Kobayashi
Abstract:
One of the disadvantages of using OFDM is the large peak-to-average power ratio (PAPR) of its time domain signal. A signal with a large PAPR would cause severe degradation of the bit error rate (BER) performance due to inter-modulation noise in a non-linear channel. This paper proposes an improved DSI (Dummy Sequence Insertion) method, which can achieve better PAPR and BER performances. The feature of the proposed method is to optimize the phase of each dummy sub-carrier so as to reduce the PAPR by changing all predetermined phase coefficients in the time domain signal, which is calculated separately for the data sub-carriers and the dummy sub-carriers. To achieve better PAPR performance, this paper also proposes to employ a time-frequency domain swapping algorithm for fine adjustment of the phase coefficients of the dummy sub-carriers, which requires less processing complexity and achieves better PAPR and BER performances than the conventional DSI method. This paper presents various computer simulation results to verify the effectiveness of the proposed method in comparison with conventional methods in the non-linear channel.
Keywords: OFDM, PAPR, dummy sub-carriers, non-linear
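A toy numerical sketch of the underlying idea follows: the PAPR of an OFDM symbol is computed and a brute-force search over dummy sub-carrier phases lowers it. The carrier layout and phase grid are arbitrary illustration choices, not the optimization or the swapping algorithm described in the abstract.
```python
# PAPR of an OFDM symbol and a brute-force dummy-phase search; layout and
# phase grid are arbitrary illustration choices.
import numpy as np
from itertools import product

rng = np.random.default_rng(3)
n_data, n_dummy = 60, 4
data = rng.choice([1 + 1j, 1 - 1j, -1 + 1j, -1 - 1j], size=n_data)   # QPSK symbols

def papr_db(freq_symbols, oversample=4):
    n = len(freq_symbols) * oversample
    x = np.fft.ifft(freq_symbols, n)                 # oversampled time-domain signal
    p = np.abs(x) ** 2
    return 10 * np.log10(p.max() / p.mean())

best_phases, best_papr = None, np.inf
for phases in product([0, np.pi / 2, np.pi, 3 * np.pi / 2], repeat=n_dummy):
    dummy = np.exp(1j * np.array(phases))
    value = papr_db(np.concatenate([data, dummy]))
    if value < best_papr:
        best_papr, best_phases = value, phases

print("baseline PAPR (dB):", round(papr_db(np.concatenate([data, np.ones(n_dummy)])), 2))
print("best PAPR (dB):", round(best_papr, 2))
```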
198 On Pattern-Based Programming towards the Discovery of Frequent Patterns
Authors: Kittisak Kerdprasop, Nittaya Kerdprasop
Abstract:
The problem of frequent pattern discovery is defined as the process of searching for patterns, such as sets of features or items, that appear in data frequently. Finding such frequent patterns has become an important data mining task because it reveals associations, correlations, and many other interesting relationships hidden in a database. Most of the proposed frequent pattern mining algorithms have been implemented in imperative programming languages. Such a paradigm is inefficient when the set of patterns is large and the frequent patterns are long. We suggest a high-level declarative style of programming applied to the problem of frequent pattern discovery. We consider two languages: Haskell and Prolog. Our intuitive idea is that the problem of finding frequent patterns should be efficiently and concisely implemented via a declarative paradigm, since pattern matching is a fundamental feature supported by most functional languages and by Prolog. Our frequent pattern mining implementations in the Haskell and Prolog languages confirm our hypothesis about the conciseness of the programs. Comparative performance studies on lines of code, speed and memory usage of declarative versus imperative programming are reported in the paper.
Keywords: Frequent pattern mining, functional programming, pattern matching, logic programming.
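The paper's implementations are in Haskell and Prolog; as a rough illustration of the same level-wise idea in a declarative style, the Python sketch below enumerates frequent itemsets with comprehensions over a toy transaction set.
```python
# Level-wise (Apriori-like) frequent itemset enumeration over toy transactions.
from itertools import combinations

transactions = [{"a", "b", "c"}, {"a", "c"}, {"a", "d"}, {"b", "c"}, {"a", "b", "c"}]
min_support = 2

def support(itemset):
    return sum(itemset <= t for t in transactions)

def frequent_itemsets():
    items = sorted({i for t in transactions for i in t})
    level = [frozenset([i]) for i in items if support(frozenset([i])) >= min_support]
    while level:
        yield from level
        # candidate (k+1)-itemsets: unions of frequent k-itemsets, kept if frequent
        candidates = {a | b for a, b in combinations(level, 2) if len(a | b) == len(a) + 1}
        level = sorted((c for c in candidates if support(c) >= min_support), key=sorted)

for fs in frequent_itemsets():
    print(sorted(fs), support(fs))
```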
197 Feature Extraction from Aerial Photos
Authors: Mesut Gündüz, Ferruh Yildiz, Ayşe Onat
Abstract:
In a Geographic Information System, one of the sources for obtaining the needed geographic data is digitizing analog maps and evaluating aerial and satellite photos. In this study, a method is discussed that can be used to extract vectorial features and create vectorized drawing files from aerial photos. Software was also developed for this purpose. Converting from raster to vector, known as vectorization, is the most important step when creating vectorized drawing files. In the developed algorithm, preprocessing is first performed on the aerial photo: converting to grayscale if necessary, reducing noise, applying some filters and detecting the edges of objects. After these steps, every pixel that constitutes the photo is followed from the upper left to the lower right by examining its neighborhood relationships, and one-pixel-wide lines or polylines are obtained. The detected lines have to be erased to prevent confusion while continuing vectorization, because if they are not erased they can be perceived as new lines; however, erasing them can cause discontinuity in the vector drawing, so the image is converted from 2-bit to 8-bit and the detected pixels are expressed as a different bit value. In conclusion, the aerial photo can be converted to a vector form that includes lines and polylines and can be opened in any CAD application.
Keywords: Vectorization, Aerial Photos, Vectorized Drawing File.
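The sketch below illustrates the preprocessing stage described above (grayscale conversion, noise reduction, edge detection) with OpenCV on a synthetic image; the contour tracing and simplification calls stand in for, and are not, the authors' pixel-following and line-erasing algorithm.
```python
# Raster-to-vector preprocessing on a synthetic stand-in image.
import cv2
import numpy as np

# Synthetic "aerial photo": a light rectangular footprint on a dark background
image = np.zeros((200, 200, 3), np.uint8)
cv2.rectangle(image, (50, 50), (150, 120), (200, 200, 200), -1)

gray = cv2.cvtColor(image, cv2.COLOR_BGR2GRAY)        # convert to grayscale
denoised = cv2.GaussianBlur(gray, (5, 5), 0)          # reduce noise
edges = cv2.Canny(denoised, 50, 150)                  # one-pixel-wide edges

# Trace connected edge pixels into polylines and simplify them
contours, _ = cv2.findContours(edges, cv2.RETR_LIST, cv2.CHAIN_APPROX_SIMPLE)
polylines = [cv2.approxPolyDP(c, epsilon=2.0, closed=False) for c in contours]
print("extracted polylines:", len(polylines))
```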
196 How Celebrities can be used in Advertising to the Best Advantage?
Authors: Laimona Sliburyte
Abstract:
The ever increasing product diversity and competition on the market of goods and services has dictated the pace of growth in the number of advertisements. Despite their admittedly diminished effectiveness over recent years, advertisements remain the favored method of sales promotion. Consequently, the challenge for an advertiser is to explore every possible avenue for making an advertisement more noticeable, attractive and compelling for consumers. One way to achieve this is through invoking celebrity endorsements. On the one hand, the use of a celebrity to endorse a product involves substantial costs; on the other hand, it does not immediately guarantee the success of an advertisement. The question of how celebrities can be used in advertising to the best advantage is therefore of utmost importance. Celebrity endorsements have become commonplace: empirical evidence indicates that approximately 20 to 25 per cent of advertisements feature some famous person as a product endorser. The popularity of celebrity endorsements demonstrates the relevance of the topic, especially in the context of the current global economic downturn, when companies are forced to save in order to survive, yet simultaneously to invest heavily in advertising and sales promotion. The issue of the effective use of celebrity endorsements also figures prominently in the academic discourse. The study presented below is thus aimed at exploring what qualities (characteristics) of a celebrity endorser have an impact on the effectiveness of the advertisement in which he/she appears, and how.
Keywords: Advertising, celebrity, celebrity endorsements, effectiveness of celebrity.
195 Automated Heart Sound Classification from Unsegmented Phonocardiogram Signals Using Time Frequency Features
Authors: Nadia Masood Khan, Muhammad Salman Khan, Gul Muhammad Khan
Abstract:
Cardiologists perform cardiac auscultation to detect abnormalities in heart sounds. Since accurate auscultation is a crucial first step in screening patients with heart diseases, there is a need to develop computer-aided detection/diagnosis (CAD) systems to assist cardiologists in interpreting heart sounds and to provide second opinions. In this paper, different algorithms are implemented for automated heart sound classification using unsegmented phonocardiogram (PCG) signals. Support vector machine (SVM), artificial neural network (ANN) and cartesian genetic programming evolved artificial neural network (CGPANN) are explored in this study without the application of any segmentation algorithm. The signals are first pre-processed to remove any unwanted frequencies. Both time and frequency domain features are then extracted for training the different models. The different algorithms are tested in multiple scenarios and their strengths and weaknesses are discussed. Results indicate that SVM outperforms the rest with an accuracy of 73.64%.
Keywords: Pattern recognition, machine learning, computer aided diagnosis, heart sound classification, feature extraction.
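A simplified sketch of the processing chain described above: band-pass filtering of the PCG signal, a few time- and frequency-domain features, and an SVM classifier. The sampling rate, filter band, signals and labels are assumptions and synthetic stand-ins.
```python
# PCG filtering, simple time/frequency features, and an SVM; all data synthetic.
import numpy as np
from scipy.signal import butter, filtfilt, welch
from sklearn.svm import SVC

fs = 2000                                             # sampling rate (Hz), assumed
b, a = butter(4, [25, 400], btype="bandpass", fs=fs)  # keep a typical heart-sound band

def features(signal):
    x = filtfilt(b, a, signal)
    freqs, psd = welch(x, fs=fs, nperseg=256)
    return np.array([
        x.std(),                                      # time-domain energy spread
        np.abs(x).mean(),                             # mean absolute amplitude
        freqs[np.argmax(psd)],                        # dominant frequency
        psd.sum(),                                    # total spectral power
    ])

rng = np.random.default_rng(4)
signals = rng.normal(size=(200, 4 * fs))              # 4-second recordings (fake)
labels = rng.integers(0, 2, size=200)                 # 0 = normal, 1 = abnormal

X = np.array([features(s) for s in signals])
clf = SVC(kernel="rbf").fit(X, labels)
print("training accuracy:", clf.score(X, labels))
```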
194 Food for Thought: Preparing the Brain to Eat New Foods through “Messy” Play
Authors: L. Bernabeo, T. Loftus
Abstract:
Many children experience phases of picky eating, food aversions and/or avoidance. For families with children who have special needs, these experiences are often exacerbated, which can lead to feelings that negatively impact a caregiver's relationship with their child. Within the scope of speech-language pathology practice, knowledge of both emotional and feeding development is key. This paper will explore the significance of "messy play" within typical feeding development, and the challenges that may arise if a child does not have the opportunity to engage in this type of exploratory play. This paper will consider several contributing factors that can result in a "picky eater." Further, research has shown that individuals with special needs, including autism, possess a neurological makeup that differs from that of a typical individual. Because autism is a disorder of relating and communicating due to differences in the limbic system, an individual with special needs may respond to a typical feeding experience as if it were a traumatic event. As a result, broadening one's dietary repertoire may seem to be an insurmountable challenge. This paper suggests that introducing new foods through exploratory play can help broaden and strengthen the diets, as well as improve the feeding experience, of individuals with autism. The DIRFloortimeⓇ methodology stresses the importance of following a child's lead. Within this developmental model, there is a special focus on a person's individual differences, including the unique way they process the world around them, as well as the significance of therapy occurring within the context of a strong and motivating relationship. Using this child-centered approach, we can support our children in expanding their diets, while simultaneously building upon their cognitive and creative development through playful and respectful interactions that include exposure to foods that differ in color, texture, and smell. Further, this paper explores the importance of exploration, self-feeding and messy play for brain development, both in the context of typically developing individuals and those with disordered development.
Keywords: Autism, development, exploration, feeding, play.
193 Sustainable Renovation and Restoration of the Rural Based on the View Point of Psychology
Abstract:
The countryside has long been recognized as a characteristic symbol that persists in human memory. As times have changed, and because it has failed to meet the growing needs of modern life, the vast rural area has begun to decline. However, its historical image, accumulated through ancient tradition, provides people with origins of existence on the spiritual level, such as "identity" and "belonging"; it brings people closer to one another spiritually and psychologically through a shared experience of the past, and thus weakens the sense of cultural loss caused by the disappearance of memory symbols. Therefore, in the modernization process, how to restore its vitality and transform and plan it in a sustainable way has become a hot topic in architecture and urban planning. This paper aims to break the constraints of individual disciplines and, from an interdisciplinary perspective, uses the research methods of systems science to analyze and discuss theories and methods for rural form factors based on the viewpoint of memory in psychology, in order to find an appropriate way to transform the rural and give full play to the role of the countryside both in actual use and in shaping historical spirit.
Keywords: The rural, sustainable renovation, restoration, psychology, memory.
192 A Local Invariant Generalized Hough Transform Method for Integrated Circuit Visual Positioning
Authors: Fei Long Wei, Hua Yang, Hai Tao Zhang, Zhou Ping Yin
Abstract:
In this study, a local invariant generalized Hough transform (LI-GHT) method is proposed for integrated circuit (IC) visual positioning. The original generalized Hough transform (GHT) is robust to external noise; however, it is not suitable for visual positioning of IC chips due to the four-dimensionality (4D) of its parameter space, which leads to a substantial storage requirement and high computational complexity. The proposed LI-GHT method can reduce the dimensionality of the parameter space to 2D thanks to the rotational invariance of the local invariant geometric feature, and it can estimate the position and rotation angle of IC chips accurately in real-time under noise and blur. The experimental results show that the proposed LI-GHT can estimate the position and rotation angle of IC chips with high accuracy and at high speed. The proposed LI-GHT algorithm was implemented in the IC visual positioning system of radio frequency identification (RFID) packaging equipment.
Keywords: Integrated Circuit Visual Positioning, Generalized Hough Transform, Local Invariant Generalized Hough Transform, IC packing equipment.
191 Analytical Cutting Forces Model of Helical Milling Operations
Authors: Changyi Liu, Gui Wang, Matthew Dargusch
Abstract:
Helical milling operations are used to generate or enlarge boreholes by means of a milling tool. The bore diameter can be adjusted through the diameter of the helical path. The kinematics of helical milling on a three-axis machine tool is analysed first. The relationships between the processing parameters and the cutting tool geometry and the machined hole features are formulated. The feed motion of the cutting tool is decomposed into a planar circular feed and an axial linear motion. In this paper, the time-varying cutting forces acting on the side cutting edges and the end cutting edges of the flat-end cylindrical mill are analysed separately using a discrete method. These two components are then combined to produce a cutting force model that considers the complicated interaction between the cutter and the workpiece. The time-varying cutting force model describes the instantaneous cutting force during processing. This model could be used to predict cutting forces, calculate the static deflection of the cutter and workpiece, and could also be the foundation of a dynamics model for predicting the chatter limit of helical milling operations.
Keywords: Helical milling, Hole machining, Cutting force, Analytical model, Time domain
190 Geometric Simplification Method of Building Energy Model Based on Building Performance Simulation
Authors: Yan Lyu, Yiqun Pan, Zhizhong Huang
Abstract:
In the design stage of a new building, an energy model of the building is often required to analyse its energy efficiency performance. In practice, a certain degree of geometric simplification has to be made when establishing building energy models, since the detailed geometric features of a real building are hard to describe perfectly in most energy simulation engines, such as ESP-r, eQuest or EnergyPlus. In fact, a detailed description is not necessary when extremely high accuracy is not demanded. Therefore, this paper analyzed the relationship between the error of the simulation results from building energy models and the geometric simplification of the models. Finally, the following two parameters are selected as the indices to characterize the geometric features in building energy simulation: the southward projected area and the total side surface area of the building. Based on the parameterization method, the simplification from an arbitrary column-shaped building to a typically shaped (cuboid) building can be made for energy modeling. The results of this study indicate that no more than 7% prediction error in the annual cooling/heating load will be caused by the geometric simplification for buildings whose ratio of southward projection length to the total perimeter of the bottom is 0.25~0.35, which means this method is applicable for building performance simulation.
Keywords: building energy model, simulation, geometric simplification, design, regression
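A back-of-the-envelope sketch of the two indices named above for a prism ("column") building: the southward projected area is approximated as the east-west extent times the height and the total side surface area as the footprint perimeter times the height, after which an equivalent cuboid is derived. The footprint coordinates and the orientation convention are assumptions for illustration.
```python
# Geometric indices for an invented prism footprint; x = east, y = north (assumed).
import numpy as np

footprint = np.array([(0, 0), (20, 0), (20, 12), (8, 12), (8, 20), (0, 20)], float)  # metres
height = 15.0

def perimeter(poly):
    return sum(np.linalg.norm(poly[i] - poly[(i + 1) % len(poly)]) for i in range(len(poly)))

southward_projected_area = (footprint[:, 0].max() - footprint[:, 0].min()) * height
total_side_area = perimeter(footprint) * height

# Equivalent cuboid preserving both indices (one possible simplification)
width_ew = southward_projected_area / height
depth_ns = total_side_area / (2 * height) - width_ew
print(f"S-projected area: {southward_projected_area:.1f} m2, side area: {total_side_area:.1f} m2")
print(f"equivalent cuboid footprint: {width_ew:.1f} m x {depth_ns:.1f} m")
```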
189 Enhancing the Connectedness in Ad-hoc Mesh Networks using the Terranet Technology
Authors: Obeidat I., Bsoul M., Khasawneh A., Kilani Y.
Abstract:
This paper simulates ad-hoc mesh networks in rural areas, where such networks receive great attention because installing the infrastructure of regular networks in these areas is not feasible due to the high cost. The distance between communicating nodes is the main obstacle that an ad-hoc mesh network faces. For example, in Terranet technology, two nodes can communicate only if they are at most one kilometer apart. If the distance between them is more than one kilometer, then each node in the ad-hoc mesh network has to act as a router that forwards the data it receives to other nodes. In this paper, we try to find the critical number of nodes that makes the network fully connected in a particular area, and then propose a method to encourage intermediate nodes to accept acting as routers that forward data from the sender to the receiver. Much work has been done on technological changes in peer-to-peer networks, but the focus of this paper is on another aspect: finding the minimum number of nodes needed for a particular area to be fully connected, and then encouraging users to switch on their phones and accept working as routers for other nodes. Our method raises the rate of successful calls to 81.5% of attempted calls.
Keywords: Adjacency matrix, Ad-hoc mesh network, Connectedness, Terranet technology
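The sketch below mirrors the "critical number of nodes" experiment implied in the abstract: nodes are scattered over a square area, the adjacency matrix is built with a 1 km radio range, and the probability of full connectivity is estimated by Monte Carlo. The area size and trial counts are arbitrary illustration values.
```python
# Monte Carlo estimate of full connectivity from a distance-based adjacency matrix.
import numpy as np
from collections import deque

def fully_connected(positions, radio_range=1.0):
    n = len(positions)
    dists = np.linalg.norm(positions[:, None, :] - positions[None, :, :], axis=2)
    adjacency = (dists <= radio_range) & ~np.eye(n, dtype=bool)
    seen, queue = {0}, deque([0])
    while queue:                                      # BFS from node 0
        u = queue.popleft()
        for v in np.flatnonzero(adjacency[u]):
            if v not in seen:
                seen.add(v); queue.append(v)
    return len(seen) == n

rng = np.random.default_rng(5)
area_km = 5.0                                          # 5 km x 5 km area (illustrative)
for n_nodes in (20, 40, 80, 160):
    trials = [fully_connected(rng.uniform(0, area_km, size=(n_nodes, 2))) for _ in range(200)]
    print(n_nodes, "nodes -> connected in", f"{np.mean(trials):.0%}", "of trials")
```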
188 Visualization of Flow Behaviour in Micro-Cavities during Micro Injection Moulding
Authors: Reza Gheisari, Paulo J. Bartolo, Nicholas Goddard
Abstract:
Polymeric micro-cantilevers (Cs) are rapidly becoming popular for MEMS applications such as chemo- and biosensing, as well as purely electromechanical applications such as microrelays. Polymer materials offer suitable physical and chemical properties combined with low-cost mass production. Hence, micro-cantilevers made of polymers offer greater biocompatibility and adaptability to rapid prototyping, along with adequate mechanical properties. This research studies the effects of three process factors and one size factor on the filling behaviour in micro cavities, and the role of each in the replication of micro parts using different polymer materials, i.e. polypropylene (PP) SABIC 56M10 and acrylonitrile butadiene styrene (ABS) Magnum 8434. In particular, the following factors are considered: barrel temperature, mould temperature, injection speed and the thickness of the micro features. The study revealed that the barrel temperature and the injection speed are the key factors affecting the flow length of micro features replicated in PP and ABS. For both materials, an increase in feature size improves the melt flow. However, the melt fill of micro features does not increase linearly with the increase of their thickness.
Keywords: Flow length, micro-cantilevers, micro injection moulding, microfabrication.
187 Conduction Accompanied With Transient Radiative Heat Transfer Using Finite Volume Method
Authors: A. Ashok, K. Satapathy, B. Prerana Nashine
Abstract:
The objective of this research work is to investigate one-dimensional transient radiative transfer equations coupled with conduction using the finite volume method. Within the finite-volume framework, we obtain a conservative discretization of the terms in order to preserve the overall conservative property of finite-volume schemes. The coupling of the conductive and radiative equations, which results in the fluxes, is governed by the magnitude of the emissivity, the extinction coefficient, and the temperature of the medium, as well as the geometry of the problem. The problem under consideration, a radiation-dominated slab coupled with transient conduction, has been solved using the finite volume method. The boundary conditions are chosen so as to give a good model of the discretized form of the radiative transfer equation. An important feature of the present method is its flexibility in specifying the control angles in the FVM, while keeping the solution procedure simple. The effects of various model parameters on the distributions of temperature, radiative and conductive heat fluxes, and incident radiation energy are examined. The finite volume method is found to effectively evaluate the propagation of radiation intensity through a participating medium.
Keywords: Radiative transfer equation, finite volume method, conduction, transient radiation.
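The sketch below covers only the conduction half of the coupling: a one-dimensional transient conduction equation discretized with finite volumes and explicit time stepping, where the divergence of the radiative flux would enter as the source term S (left at zero here). The material values and boundary temperatures are illustrative assumptions, not the paper's case.
```python
# 1-D transient conduction by finite volumes; S is the placeholder for the
# radiative source term. Material values and boundaries are illustrative.
import numpy as np

n_cells, length = 50, 0.1                   # slab of 0.1 m split into control volumes
dx = length / n_cells
k, rho, cp = 10.0, 2000.0, 800.0            # conductivity, density, specific heat
alpha = k / (rho * cp)
dt = 0.25 * dx**2 / alpha                   # explicit stability margin

T = np.full(n_cells, 300.0)                 # initial temperature field (K)
T_left, T_right = 600.0, 300.0              # boundary temperatures (K)
S = np.zeros(n_cells)                       # -d(q_rad)/dx would be added here (W/m3)

for _ in range(2000):
    q = np.empty(n_cells + 1)               # conductive flux at each face (W/m2)
    q[0] = -k * (T[0] - T_left) / (dx / 2)
    q[1:-1] = -k * (T[1:] - T[:-1]) / dx
    q[-1] = -k * (T_right - T[-1]) / (dx / 2)
    T = T + dt * (q[:-1] - q[1:] + S * dx) / (rho * cp * dx)

print("temperature at slab centre:", round(T[n_cells // 2], 1), "K")
```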
186 Synthetic Aperture Radar Remote Sensing Classification Using the Bag of Visual Words Model to Land Cover Studies
Authors: Reza Mohammadi, Mahmod R. Sahebi, Mehrnoosh Omati, Milad Vahidi
Abstract:
Classification of high resolution polarimetric Synthetic Aperture Radar (PolSAR) images plays an important role in land cover and land use management. Recently, classification algorithms based on the Bag of Visual Words (BOVW) model have attracted significant interest among scholars and researchers within and outside the field of remote sensing. In this paper, the BOVW model with pixel-based low-level features has been implemented to classify a subset of the San Francisco Bay PolSAR image, acquired by RADARSAT-2 in C-band. We have used a segment-based decision-making strategy and compared the result with that of a traditional Support Vector Machine (SVM) classifier. The 90.95% overall classification accuracy obtained with the proposed algorithm shows that it is comparable with state-of-the-art methods. In addition to increasing the classification accuracy, the proposed method has decreased the undesirable speckle effect of SAR images.
Keywords: Bag of Visual Words, classification, feature extraction, land cover management, Polarimetric Synthetic Aperture Radar.
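A generic Bag of Visual Words sketch is shown below (not the paper's exact pipeline): low-level descriptors are clustered into a visual vocabulary, each segment is represented as a histogram of visual words, and an SVM classifies the histograms. The descriptors and labels are random placeholders for real PolSAR features.
```python
# Generic BOVW pipeline: vocabulary by KMeans, word histograms, SVM classifier.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.svm import SVC

rng = np.random.default_rng(6)
n_segments, descriptors_per_segment, descriptor_dim, vocab_size = 80, 50, 8, 16

segments = [rng.normal(size=(descriptors_per_segment, descriptor_dim))
            for _ in range(n_segments)]                # random stand-in descriptors
labels = rng.integers(0, 3, size=n_segments)           # e.g. water / urban / vegetation

# Build the visual vocabulary from all descriptors
vocab = KMeans(n_clusters=vocab_size, n_init=10, random_state=0)
vocab.fit(np.vstack(segments))

def bovw_histogram(descriptors):
    words = vocab.predict(descriptors)
    hist = np.bincount(words, minlength=vocab_size).astype(float)
    return hist / hist.sum()                            # normalised word frequencies

X = np.array([bovw_histogram(s) for s in segments])
clf = SVC(kernel="linear").fit(X, labels)
print("training accuracy:", clf.score(X, labels))
```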
185 Effects of Natural Frequency and Rotational Speed on Dynamic Stress in Spur Gear
Authors: Ali Raad Hassan, G. Thanigaiyarasu, V. Ramamurti
Abstract:
Natural frequencies and the dynamic response of a spur gear sector are investigated using a two-dimensional finite element model that offers significant advantages for dynamic gear analyses. The gear teeth are analyzed for different operating speeds. A primary feature of this modeling is the determination of mesh forces using a detailed contact analysis for each time step as the gears roll through the mesh. The transient mode superposition method has been used to find the horizontal and vertical components of displacement and the dynamic stress. The finite element analysis software ANSYS has been used on the proposed model to find the natural frequencies by the Block Lanczos technique and the displacements and dynamic stresses by the transient mode superposition method. A comparison of the theoretical (natural frequency and static stress) results with the finite element analysis results has also been made. The effect of the rotational speed of the gears on the dynamic response of the gear tooth has been studied and design limits have been discussed.
Keywords: Natural frequency, Modal and transient analysis, Spur gear, Dynamic stress.
184 Development of Basic Patternmaking Using Parametric Modelling and AutoLISP
Authors: Haziyah Hussin, Syazwan Abdul Samad, Rosnani Jusoh
Abstract:
This study is aimed at the automation of basic patternmaking for traditional clothes for the purpose of mass production, using AutoCAD with the AutoLISP feature in the Hazi Attire software. A standard dress form (industrial form) in small (S), medium (M) and large (L) sizes is measured using a full body scanning machine. The pattern for the clothes is then designed parametrically based on the measured dress form. The Hazi Attire program is used within the framework of AutoCAD to generate the basic pattern of the front bodice, back bodice, front skirt, back skirt and sleeve block (sloper). The generation of the pattern is based on the parameters input by the user; in this study, the parameters were determined from the measured size of the dress form. The finalized pattern parameters show that the pattern fits perfectly on the dress form. Since the pattern is generated almost instantly, this proves that by using AutoLISP programming, the manufacturing lead time for the mass production of traditional clothes can be decreased.
Keywords: Apparel, AutoLISP, Malay Traditional Clothes, Pattern Generation.
183 Assessment of the Adaptive Pushover Analysis Using Displacement-based Loading in Prediction the Seismic Behaviour of the Unsymmetric-Plan Buildings
Authors: M.O. Makhmalbaf, F. Mohajeri Nav, M. Zabihi Samani
Abstract:
The recent drive towards the use of performance-based methodologies in the design and assessment of structures in seismic areas has significantly increased the demand for the development of reliable nonlinear inelastic static pushover analysis tools. As a result, adaptive pushover methods have been developed during the last decade, which, unlike their conventional pushover counterparts, feature the ability to account for the effect that higher modes of vibration and progressive stiffness degradation might have on the distribution of seismic storey forces. Even in advanced pushover methods, little attention has been paid to unsymmetric structures. This study evaluates the seismic demands for three-dimensional unsymmetric-plan buildings determined by the Displacement-based Adaptive Pushover (DAP) analysis, which was introduced by Antoniou and Pinho [2004]. The capability of the DAP procedure in capturing the torsional effects due to the irregularities of the structures is investigated by comparing its estimates with the exact results obtained from Incremental Dynamic Analysis (IDA). The capability of the procedure in predicting the seismic behaviour of the structure is also discussed.
Keywords: Nonlinear static procedures, Unsymmetric-Plan Buildings, Torsional effects, IDA.
182 An Automation of Check Focusing on CRUD for Requirements Analysis Model in UML
Authors: Shinpei Ogata, Yoshitaka Aoki, Hirotaka Okuda, Saeko Matsuura
Abstract:
A key to the success of high-quality software development is to define a valid and feasible requirements specification. We have proposed a method of model-driven requirements analysis using the Unified Modeling Language (UML). The main feature of our method is to automatically generate a Web user interface mock-up from the UML requirements analysis model so that we can confirm the validity of the input/output data for each page and the page transitions of the system by directly operating the mock-up. This paper proposes a support method to check the validity of the data life cycle by using the model checking tool "UPPAAL", focusing on CRUD (Create, Read, Update and Delete). Exhaustive checking improves the quality of the requirements analysis model, which is validated by the customers through the automatically generated mock-up. The effectiveness of our method is discussed through a case study of requirements modeling for two small projects: a library management system and a supportive sales system for textbooks in a university.
Keywords: CRUD, Model Checking, Model Driven Development, Requirements Analysis, Unified Modeling Language, UPPAAL.
181 Emotion Classification by Incremental Association Language Features
Authors: Jheng-Long Wu, Pei-Chann Chang, Shih-Ling Chang, Liang-Chih Yu, Jui-Feng Yeh, Chin-Sheng Yang
Abstract:
Major Depressive Disorder has been a burden of medical expense in Taiwan, as is the situation around the world. Major Depressive Disorder can be classified into different categories based on previous human activities. With machine learning, we can classify emotion in text in advance, which can help medical diagnosis recognize the variance in Major Depressive Disorder automatically. Incremental association language features capture the characteristics and relationships of words discovered in sentences. There is an overlapping-category problem in classification. In this paper, we aim to improve classification performance on the principle of avoiding overlapping-category problems. We present an approach, called Association Language Features by its Category (ALFC), that discovers words in sentences which occur with high frequency and do not overlap between categories. Experimental results show that ALFC distinguishes Major Depressive Disorder well and has better performance. We also compare the approach with a baseline and with mutual information, which use single words alone or a correlation measure.
Keywords: Association language features, Emotion Classification, Overlap-Category Feature, Natural Language Processing.
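A hedged sketch of the ALFC idea as described here: keep words that are frequent within a category and do not appear among any other category's frequent words. The tiny corpus and threshold are invented for illustration.
```python
# Per-category frequent, non-overlapping word features on an invented corpus.
from collections import Counter

corpus = {
    "depressed": ["i feel hopeless and tired", "nothing matters i feel empty"],
    "neutral":   ["the meeting is at noon", "i will buy groceries later"],
}
min_count = 2

def category_features(corpus, min_count):
    counts = {cat: Counter(w for doc in docs for w in doc.split())
              for cat, docs in corpus.items()}
    frequent = {cat: {w for w, c in cnt.items() if c >= min_count}
                for cat, cnt in counts.items()}
    features = {}
    for cat, words in frequent.items():
        others = set().union(*(v for k, v in frequent.items() if k != cat))
        features[cat] = words - others                # enforce no category overlap
    return features

print(category_features(corpus, min_count))
```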
180 A Recognition Method for Spatio-Temporal Background in Korean Historical Novels
Authors: Seo-Hee Kim, Kee-Won Kim, Seung-Hoon Kim
Abstract:
The most important elements of a novel are its characters, events and background. The background represents the time, place and situation in which the characters appear, and conveys the events and atmosphere more realistically. If readers have proper knowledge about the background of a novel, it may help them understand its atmosphere and choose a novel they want to read. In this paper, we target Korean historical novels, because the spatio-temporal background plays an especially important role in historical novels among the genres of Korean novels. To the best of our knowledge, there is no previous study aimed at Korean novels. In this paper, we build a Korean historical national dictionary. Our dictionary contains historical places and the temple names of kings over many generations, as well as currently existing spatial and temporal words in Korean history. We also present a method for recognizing the spatio-temporal background based on patterns of phrasal words in Korean sentences. Our rules utilize postpositions for spatial background recognition and temple names for temporal background recognition. Knowledge of the recognized background can help readers understand the flow of events and the atmosphere, and can be used to visualize the elements of novels.
Keywords: Data mining, Korean historical novels, Korean linguistic feature, spatio-temporal background.
179 Developing a Customizable Serious Game and Its Applicability in the Classroom
Authors: Anita Kéri
Abstract:
Recent developments in the field of education have led to a renewed interest in teaching methodologies and practices. Gamification is fast becoming a key instrument in the education of new generations, and besides other methods, serious games have become the center of attention. Ready-built serious games are available for most higher education institutions to buy and implement. However, monetary restraints and the unalterable nature of the games might deter most higher education institutions from applying these serious games. Therefore, there is a continuously growing need for a customizable serious game developed on the basis of a concrete need analysis and experts' opinions. There has been little evidence so far of serious games created on the basis of a relevant and current need analysis from higher education institution teachers, professional practitioners and students themselves. Therefore, the aim of this paper is to analyze the needs of higher education institution educators, with special emphasis on the applicability of serious games in their classrooms, and to explore options for the development of a customizable serious game framework. The paper undertakes to analyze workshop discussions on implementing serious games in education and to propose a customizable serious game framework applicable in the education of the new generation. Research results show that the most important feature of a serious game is its customizability. The fact that practitioners are able to manage different scenarios and upload their own content to a game seems to be a key to the increasingly widespread application of serious games in the classroom.
Keywords: Education, gamification, game-based learning, serious games.