Search results for: Text Features.
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 2056

1096 Humanoid Personalized Avatar Through Multiple Natural Language Processing

Authors: Jin Hou, Xia Wang, Fang Xu, Viet Dung Nguyen, Ling Wu

Abstract:

There has been a growing interest in implementing humanoid avatars in networked virtual environments. However, most existing avatar communication systems do not take avatars' social backgrounds into consideration. This paper proposes a novel humanoid avatar animation system to represent personalities and facial emotions of avatars based on culture, profession, mood, age, taste, and so forth. We extract semantic keywords from the input text through natural language processing, and then the animations of personalized avatars are retrieved and displayed according to the order of the keywords. Our primary work is focused on giving avatars runtime instructions from multiple natural languages. Experiments with Chinese, Japanese and English input based on the prototype show that interactive avatar animations can be displayed in real time and made available online. This system provides a more natural and interesting means of human communication, and is therefore expected to be used for cross-cultural communication, multiuser online games, and other entertainment applications.
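
The keyword-driven retrieval step described above can be illustrated with a minimal sketch; the animation table, keyword vocabulary and lookup logic below are hypothetical placeholders, and the system's actual multilingual parsing of Chinese, Japanese and English input is not reproduced.

```python
# Minimal sketch of keyword-driven animation retrieval (illustrative only).
# The animation table and keyword vocabulary are hypothetical placeholders.

ANIMATIONS = {
    "bow": "bow_politely.anim",
    "wave": "wave_hand.anim",
    "laugh": "laugh_loudly.anim",
}

def extract_keywords(text):
    """Tiny stand-in for the NLP step: keep vocabulary matches in input order."""
    return [tok for tok in text.lower().split() if tok in ANIMATIONS]

def animations_for(text, profile=None):
    """Return animation clips in the order the keywords appear in the text.
    A `profile` (culture, profession, mood, ...) could later re-rank the clips."""
    return [ANIMATIONS[k] for k in extract_keywords(text)]

print(animations_for("I wave and then bow"))   # ['wave_hand.anim', 'bow_politely.anim']
```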

Keywords: personalized avatar, multiple natural language processing, social backgrounds, animation, human computer interaction

1095 Nonlinear Power Measurement Algorithm of the Input Mix Components of the Noise Signal and Pulse Interference

Authors: Alexey V. Klyuev, Valery P. Samarin, Viktor F. Klyuev, Andrey V. Klyuev

Abstract:

A power measurement algorithm for the input mix components of the noise signal and pulse interference is considered. The algorithm efficiency analysis has been carried out for different interference-to-signal ratios. Algorithm performance features have been explored through numerical experiments.
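
The abstract does not spell out the algorithm itself, so the following is only an illustrative sketch of what measuring the power of the mix components involves, using simple robust amplitude thresholding as a stand-in separation step; all constants and the synthetic data are assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic input mix: Gaussian noise signal plus sparse pulse interference.
n = 10_000
noise = rng.normal(0.0, 1.0, n)
pulses = np.zeros(n)
pulses[rng.choice(n, 50, replace=False)] = rng.normal(0.0, 20.0, 50)
mix = noise + pulses

# Stand-in separation step: flag samples far outside the noise spread as pulses,
# using a robust (median-based) estimate of the noise standard deviation.
sigma_hat = np.median(np.abs(mix)) / 0.6745
is_pulse = np.abs(mix) > 5.0 * sigma_hat

# Power (mean square value) of each component of the mix.
noise_power = np.mean(mix[~is_pulse] ** 2)
pulse_power = np.sum(mix[is_pulse] ** 2) / n
print(f"noise power ~ {noise_power:.2f}, pulse interference power ~ {pulse_power:.2f}")
```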

Keywords: Noise signal, pulse interference, signal power, spectrum width, detection.

1094 Fundamental Concepts of Theory of Constraints: An Emerging Philosophy

Authors: Ajay Gupta, Arvind Bhardwaj, Arun Kanda

Abstract:

Dr Eliyahu Goldratt did the pioneering work in the development of the Theory of Constraints. Since then, many more researchers around the globe have been working to enhance this body of knowledge. In this paper, an attempt has been made to compile the salient features of this theory from the work done by Goldratt and other researchers. The paper provides a good starting point for potential researchers interested in working on the Theory of Constraints. It will also help practicing managers by clarifying their concepts of the theory and will facilitate its successful implementation in their working areas.

Keywords: Drum-Buffer-Rope, Goldratt, Production Scheduling, Theory of Constraints.

1093 Towards an Intelligent Ontology Construction Cost Estimation System: Using BIM and New Rules of Measurement Techniques

Authors: F. H. Abanda, B. Kamsu-Foguem, J. H. M. Tah

Abstract:

Construction cost estimation is one of the most important aspects of construction project design. For generations, the process of cost estimating has been manual, time-consuming and error-prone. This has partly led to most cost estimates being unclear and riddled with inaccuracies that at times lead to over- or underestimation of construction cost. The development of standard sets of measurement rules that are understandable by all those involved in a construction project has not totally solved these challenges. Emerging Building Information Modelling (BIM) technologies can exploit standard measurement methods to automate the cost estimation process and improve accuracy. This requires standard measurement methods to be structured in an ontological and machine-readable format so that BIM software packages can easily read them. Most standard measurement methods are still text-based in textbooks and require manual editing into tables or spreadsheets during cost estimation. The aim of this study is to explore the development of an ontology based on the New Rules of Measurement (NRM) commonly used in the UK for cost estimation. The methodology adopted is Methontology, one of the most widely used ontology engineering methodologies. The challenges in this exploratory study are also reported and recommendations for future studies proposed.
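
As a rough illustration of what an ontological, machine-readable encoding of an NRM-style measurement rule might look like, the sketch below uses rdflib; the namespace, class and property names are hypothetical and do not reproduce the ontology developed in the paper.

```python
from rdflib import Graph, Literal, Namespace, RDF, RDFS

# Hypothetical namespace and terms; the paper's actual NRM ontology differs.
NRM = Namespace("http://example.org/nrm#")

g = Graph()
g.bind("nrm", NRM)

# A class for work items measured under NRM and one example instance.
g.add((NRM.WorkItem, RDF.type, RDFS.Class))
g.add((NRM.ExternalWall, RDF.type, NRM.WorkItem))
g.add((NRM.ExternalWall, RDFS.label, Literal("External wall")))
g.add((NRM.ExternalWall, NRM.unitOfMeasurement, Literal("m2")))
g.add((NRM.ExternalWall, NRM.measurementRule,
       Literal("Measure the wall area, deducting openings larger than a stated size")))

# BIM tools could consume this serialization instead of a text-based rule book.
print(g.serialize(format="turtle"))
```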

Keywords: BIM, Construction projects, Cost estimation, NRM, Ontology.

1092 An Empirical Analysis of the Impact of Selected Macroeconomic Variables on Capital Formation in Libya (1970–2010)

Authors: Khaled Ramadan Elbeydi

Abstract:

This study is carried out to provide an insight into the impact of selected macroeconomic variables on gross fixed capital formation in Libya using annual data over the period 1970-2010. The importance of this study comes from its ability to show the relative importance of the factors that affect Libyan gross fixed capital formation. This understanding would give decision makers an indication of which policies they must focus on to stimulate the economy. An Autoregressive Distributed Lag (ARDL) modeling process is employed to investigate the impact of Gross Domestic Product, the Monetary Base and Trade Openness on Gross Fixed Capital Formation in Libya. The results of this study reveal that there is an equilibrium relationship between capital formation and its determinants. The results also indicate that GDP and trade openness largely explain the pattern of capital formation in Libya. The findings and recommendations provide vital information relevant for policy formulation and implementation aimed at improving capital formation in Libya.
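
A sketch of the ARDL estimation step is shown below using the ARDL class available in recent statsmodels releases; the series are synthetic placeholders for the Libyan data, the lag orders are illustrative, and order selection plus the bounds test for a long-run relationship are only indicated in comments.

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.ardl import ARDL   # available in recent statsmodels releases

# Placeholder annual series for 1970-2010; the paper uses actual Libyan data.
rng = np.random.default_rng(1)
years = pd.Index(range(1970, 2011), name="year")
data = pd.DataFrame({
    "gfcf": rng.normal(size=len(years)).cumsum(),       # gross fixed capital formation
    "gdp": rng.normal(size=len(years)).cumsum(),        # gross domestic product
    "m0": rng.normal(size=len(years)).cumsum(),         # monetary base
    "openness": rng.normal(size=len(years)).cumsum(),   # trade openness
}, index=years)

# ARDL with illustrative lag orders; in practice the orders are chosen by an
# information criterion and a bounds test checks for a level (long-run) relationship.
model = ARDL(data["gfcf"], lags=1,
             exog=data[["gdp", "m0", "openness"]], order=1)
res = model.fit()
print(res.summary())
```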

Keywords: ARDL, Bounds test, capital formation, Cointegration, Libya.

1091 A Simplified Distribution for Nonlinear Seas

Authors: M. A. Tayfun, M. A. Alkhalidi

Abstract:

The exact theoretical expression describing the probability distribution of nonlinear sea-surface elevations derived from the second-order narrowband model has a cumbersome form that requires numerical computations and is not well suited to theoretical or practical applications. Here, the same narrowband model is reexamined to develop a simpler closed-form approximation suitable for theoretical and practical applications. The salient features of the approximate form are explored, and its relative validity is verified with comparisons to other readily available approximations and to oceanic data.
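
Since the closed form itself is not given in the abstract, the sketch below only simulates the underlying second-order narrowband model (eta = a cos phi + 0.5 k a^2 cos 2phi, with a Rayleigh envelope and uniform phase) to show the skewed, heavier-tailed elevation distribution that such an approximation is meant to describe analytically; the steepness value is an arbitrary assumption.

```python
import numpy as np

rng = np.random.default_rng(7)

# Second-order narrowband model: eta = a*cos(phi) + 0.5*k*a**2*cos(2*phi),
# with a Rayleigh envelope a and uniform phase phi. The steepness below is an
# arbitrary illustration, not a value taken from the paper.
sigma = 1.0                    # rms of the linear (first-order) elevation
mu = 0.10                      # steepness parameter, mu = k*sigma (assumed)
k = mu / sigma

n = 200_000
a = rng.rayleigh(scale=sigma, size=n)
phi = rng.uniform(0.0, 2.0 * np.pi, size=n)
eta = a * np.cos(phi) + 0.5 * k * a**2 * np.cos(2.0 * phi)

z = eta / eta.std()
print(f"skewness ~ {np.mean(z**3):.3f} (a linear, Gaussian sea would give 0)")

# Empirical exceedance probabilities show the heavier upper tail that the
# closed-form approximation is meant to capture analytically.
gauss = rng.normal(size=n)
for level in (2.0, 3.0, 4.0):
    print(f"P(z > {level}): nonlinear ~ {np.mean(z > level):.5f}, "
          f"Gaussian ~ {np.mean(gauss > level):.5f}")
```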

Keywords: Ocean waves, probability distributions, second-order nonlinearities, skewness coefficient, wave steepness.

1090 Automatic Reusability Appraisal of Software Components using Neuro-fuzzy Approach

Authors: Parvinder S. Sandhu, Hardeep Singh

Abstract:

Automatic reusability appraisal could be helpful in evaluating the quality of developed or developing reusable software components and in identifying reusable components from existing legacy systems; this can save the cost of developing software from scratch. But the issue of how to identify reusable components from existing systems has remained relatively unexplored. In this paper, we present a two-tier approach that studies the structural attributes of a component as well as its usability or relevancy to a particular domain. Latent semantic analysis is used for the feature vector representation of various software domains. It exploits the fact that feature-vector codes can be seen as documents containing terms (the identifiers present in the components), so text modeling methods that capture co-occurrence information in low-dimensional spaces can be used. Further, we devised a neuro-fuzzy hybrid inference system, which takes structural metric values as input and calculates the reusability of the software component. A decision tree algorithm is used to decide the initial set of fuzzy rules for the neuro-fuzzy system. The results obtained are convincing enough to propose the system for economical identification and retrieval of reusable software components.
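
The LSA feature-vector step can be sketched as below with a TF-IDF matrix over component identifiers followed by a truncated SVD; the identifier "documents" are made up, and the neuro-fuzzy inference system and the decision-tree-derived rule base are not reproduced.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.decomposition import TruncatedSVD

# Each "document" is the bag of identifiers extracted from a software component.
components = [
    "open_file read_buffer close_file parse_header",
    "connect_socket send_packet receive_packet close_socket",
    "open_file write_buffer flush close_file",
    "init_matrix multiply_matrix invert_matrix",
]

# LSA: term-document matrix followed by a low-rank SVD projection.
tfidf = TfidfVectorizer(token_pattern=r"[A-Za-z_]+")
X = tfidf.fit_transform(components)
lsa = TruncatedSVD(n_components=2, random_state=0)
domain_vectors = lsa.fit_transform(X)

# Components close together in this low-dimensional space share a domain,
# which is the relevancy signal fed to the neuro-fuzzy reusability stage.
print(domain_vectors.round(3))
```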

Keywords: Clustering, ID3, LSA, Neuro-fuzzy System, SVD

1089 Hydraulic Studies on Core Components of PFBR

Authors: G. K. Pandey, D. Ramadasu, I. Banerjee, V. Vinod, G. Padmakumar, V. Prakash, K. K. Rajan

Abstract:

Detailed thermal hydraulic investigations are essential for the safe and reliable functioning of liquid metal cooled fast breeder reactors. These investigations are even more important for components with complex profiles, since there is no direct correlation available in the literature to evaluate the hydraulic characteristics of such components directly. In those cases, available correlations for similar profiles or geometries may lead to significant uncertainty in the outcome. Hence an experimental approach can be adopted to evaluate these hydraulic characteristics more precisely for better prediction in reactor core components. The Prototype Fast Breeder Reactor (PFBR), a sodium cooled pool type reactor, is at an advanced stage of construction at Kalpakkam, India. Several components of this reactor core require hydraulic investigation before their use in the reactor. These hydraulic investigations on full scale models, carried out by experimental approaches using water as the simulant fluid, are discussed in this paper.

Keywords: Fast Breeder Reactor, Cavitation, pressure drop, Reactor components.

1088 Supply Chain Decarbonisation – A Cost-Based Decision Support Model in Slow Steaming Maritime Operations

Authors: Eugene Y. C. Wong, Henry Y. K. Lau, Mardjuki Raman

Abstract:

CO2 emissions from maritime transport operations represent a substantial part of total greenhouse gas emissions. Vessels are increasingly designed for better energy efficiency, and minimizing CO2 emission in maritime operations plays an important role in supply chain decarbonisation. This paper reviews initiatives on slow steaming operations towards the reduction of carbon emissions. It investigates the relationship and impact among slow steaming cost reduction, carbon emission reduction, and shipment delay. A scenario-based, cost-driven decision support model is developed to facilitate the selection of the optimal slow steaming option, considering the cost of bunker fuel consumption, available speed, carbon emission, and shipment delay. The incorporation of the social cost of cargo is reviewed and suggested. Additional measures on the effects of vessel sizes, routing, and type of fuel towards decarbonisation are discussed.
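
A minimal sketch of the cost trade-off evaluation is given below, assuming the commonly used cubic relation between fuel consumption and speed; every figure is a placeholder rather than the paper's calibrated model.

```python
# Illustrative slow-steaming trade-off; the cubic fuel-speed relation is a
# common approximation and every figure below is a placeholder, not the
# paper's calibrated decision support model.

DESIGN_SPEED_KN = 24.0
FUEL_AT_DESIGN_TPD = 100.0      # tonnes of bunker fuel per day at design speed
FUEL_PRICE_USD_T = 600.0
CO2_PER_TONNE_FUEL = 3.1        # tonnes CO2 per tonne of fuel burned
DELAY_COST_USD_DAY = 30_000.0   # proxy for the social/commercial cost of delay
DISTANCE_NM = 10_000.0

def evaluate(speed_kn):
    days = DISTANCE_NM / (speed_kn * 24.0)
    fuel = FUEL_AT_DESIGN_TPD * (speed_kn / DESIGN_SPEED_KN) ** 3 * days
    delay_days = days - DISTANCE_NM / (DESIGN_SPEED_KN * 24.0)
    cost = fuel * FUEL_PRICE_USD_T + delay_days * DELAY_COST_USD_DAY
    return {"speed_kn": speed_kn, "days": round(days, 1),
            "co2_t": round(fuel * CO2_PER_TONNE_FUEL),
            "total_cost_usd": round(cost)}

for option in (24.0, 21.0, 18.0, 15.0):   # normal, slow, extra-slow, super-slow
    print(evaluate(option))
```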

Keywords: Slow steaming, carbon emission, maritime logistics, sustainability, green supply chain.

1087 Digital Preservation in Nigerian University Libraries: A Comparison between the University of Nigeria, Nsukka and Ahmadu Bello University, Zaria

Authors: Suleiman Musa, Shuaibu Sidi Safiyanu

Abstract:

This study examined digital preservation in Nigerian university libraries through a comparison between the University of Nigeria, Nsukka (UNN) and Ahmadu Bello University, Zaria (ABU, Zaria). The study utilized primary data obtained from librarians at the two selected institutions. Findings revealed varying results in terms of skills acquired by librarians before and after digitization at the two institutions. The study reports that journal publications, textbooks, CD-ROMs, conference papers and proceedings, theses, dissertations and seminar papers are among the information resources available for digitization. The study further documents that copyright issues, power failure, and unavailability of needed materials are among the challenges facing library digitization at the institutions. On the basis of the findings, the study concluded that library digitization enhances efficiency in the organization and retrieval of information services. The study therefore recommended that software should be upgraded with backups, librarians should be trained in the digital process, antivirus software should be installed, and technical collaboration between the library and MIS should be enhanced.

Keywords: Digitalization, preservation, libraries, comparison.

1086 AI-based Radio Resource and Transmission Opportunity Allocation for 5G-V2X HetNets: NR and NR-U networks

Authors: Farshad Zeinali, Sajedeh Norouzi, Nader Mokari, Eduard A. Jorswieck

Abstract:

The capacity of fifth-generation (5G) vehicle-to-everything (V2X) networks poses significant challenges. To address this challenge, this paper utilizes New Radio (NR) and New Radio Unlicensed (NR-U) networks to develop a vehicular heterogeneous network (HetNet). We propose a framework, named joint BS assignment and resource allocation (JBSRA), for mobile V2X users and also consider coexistence schemes based on a flexible duty cycle (DC) mechanism for unlicensed bands. Our objective is to maximize the average throughput of vehicles while guaranteeing the throughput of WiFi users. In simulations based on deep reinforcement learning (DRL) algorithms such as deep deterministic policy gradient (DDPG) and deep Q network (DQN), our proposed framework outperforms existing solutions that rely on a fixed DC or on schemes without consideration of unlicensed bands.

Keywords: Vehicle-to-everything, resource allocation, BS assignment, new radio, new radio unlicensed, coexistence NR-U and WiFi, deep deterministic policy gradient, Deep Q-network, Duty cycle mechanism.

1085 IVE: Virtual Humans AI Prototyping Toolkit

Authors: Cyril Brom, Zuzana Vlckova

Abstract:

The IVE toolkit has been created for facilitating research, education and development in the field of virtual storytelling and computer games. Primarily, the toolkit is intended for modelling action selection mechanisms of virtual humans, investigating level-of-detail AI techniques for large virtual environments, and exploring joint behaviour and the role-passing technique (Sec. V). Additionally, the toolkit can be used as an AI middleware without any changes. The main facility of IVE is that it serves for prototyping both the AI and the virtual worlds themselves. The purpose of this paper is to describe IVE's features in general and to present our current work - including an educational game - on this platform.

Keywords: AI middleware, simulation, virtual world

1084 Analytical Modeling of Globular Protein-Ferritin in α-Helical Conformation: A White Noise Functional Approach

Authors: Vernie C. Convicto, Henry P. Aringa, Wilson I. Barredo

Abstract:

This study presents a conformational model of the helical structures of globular proteins, particularly ferritin, in the framework of the white noise path integral formulation, using Associated Legendre functions, Bessel functions and convolutions of Bessel and trigonometric functions as modulating functions. The model incorporates the chirality features of proteins and their helix-turn-helix sequence structural motif.

Keywords: Globular protein, modulating function, white noise, winding probability.

1083 Extracting Road Signs using the Color Information

Authors: Wen-Yen Wu, Tsung-Cheng Hsieh, Ching-Sung Lai

Abstract:

In this paper, we propose a method to extract road signs. Firstly, the grabbed image is converted into the HSV color space to detect the road signs. Secondly, morphological operations are used to reduce noise. Finally, the road sign is extracted using its geometric properties. The feature extraction of road signs is done using color information. The proposed method has been tested in real situations. From the experimental results, it is seen that the proposed method can extract road sign features effectively.
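
A minimal OpenCV sketch of this pipeline (HSV conversion, color thresholding, morphological filtering, geometric filtering) is shown below; the HSV ranges target red signs and, like the area and aspect-ratio limits, are assumed values.

```python
import cv2

def extract_red_signs(bgr_image):
    """Sketch of the pipeline: HSV threshold -> morphology -> geometric filter."""
    hsv = cv2.cvtColor(bgr_image, cv2.COLOR_BGR2HSV)

    # Assumed HSV ranges for red (red wraps around hue 0 on OpenCV's 0-179 scale).
    mask = cv2.inRange(hsv, (0, 80, 60), (10, 255, 255)) | \
           cv2.inRange(hsv, (170, 80, 60), (179, 255, 255))

    # Morphological opening/closing to suppress small noise blobs.
    kernel = cv2.getStructuringElement(cv2.MORPH_ELLIPSE, (5, 5))
    mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, kernel)
    mask = cv2.morphologyEx(mask, cv2.MORPH_CLOSE, kernel)

    # Geometric property: keep reasonably large, roughly compact regions.
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    boxes = []
    for c in contours:
        x, y, w, h = cv2.boundingRect(c)
        if cv2.contourArea(c) > 300 and 0.5 < w / float(h) < 2.0:
            boxes.append((x, y, w, h))
    return boxes

if __name__ == "__main__":
    frame = cv2.imread("road_scene.jpg")      # hypothetical input image
    if frame is not None:
        print(extract_red_signs(frame))
```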

Keywords: Color information, image processing, road sign.

1082 An Ontology Based Question Answering System on Software Test Document Domain

Authors: Meltem Serhatli, Ferda N. Alpaslan

Abstract:

Processing data by computers and performing reasoning tasks is an important aim in Computer Science. The Semantic Web is one step towards it. The use of ontologies to enhance information semantically is the current trend. Huge amounts of domain-specific, unstructured online data need to be expressed in a machine-understandable and semantically searchable format. Currently, users are often forced to search manually through the results returned by keyword-based search services. They also want to use their native languages to express what they search for. In this paper, an ontology-based automated question answering system on the software test document domain is presented. The system allows users to enter a question about the domain in natural language and returns the exact answer to the question. Conversion of the natural language question into an ontology-based query is the challenging part of the system. To achieve this, a new algorithm for converting free-text questions into ontology-based search engine queries is proposed. The algorithm is based on identifying the question type and parsing the words of the question sentence.
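
A highly simplified sketch of the question-to-query conversion is given below: the question type is inferred from its leading words and a SPARQL template is filled in over a toy software-test ontology built with rdflib; all ontology terms are hypothetical and the paper's parsing step is reduced to simple string matching.

```python
from rdflib import Graph, Literal, Namespace, RDF

# Tiny, hypothetical software-test ontology (the paper's ontology is richer).
ST = Namespace("http://example.org/testdoc#")
g = Graph()
g.add((ST.TC_001, RDF.type, ST.TestCase))
g.add((ST.TC_001, ST.verifies, ST.LoginRequirement))
g.add((ST.TC_001, ST.hasResult, Literal("passed")))

def question_to_sparql(question):
    """Very small stand-in for question-type detection + query construction."""
    q = question.lower()
    if q.startswith(("which", "what")) and "test case" in q:
        # Factoid question about test cases -> SELECT template.
        return f"""
        PREFIX st: <{ST}>
        SELECT ?tc WHERE {{ ?tc a st:TestCase ; st:hasResult "passed" . }}
        """
    raise ValueError("question type not supported in this sketch")

query = question_to_sparql("Which test cases passed?")
for row in g.query(query):
    print(row.tc)
```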

Keywords: Description Logics, ontology, question answering, reasoning.

1081 Empirical Process Monitoring Via Chemometric Analysis of Partially Unbalanced Data

Authors: Hyun-Woo Cho

Abstract:

Real-time or in-line process monitoring frameworks are designed to give early warnings of a fault along with meaningful identification of its assignable causes. In the artificial intelligence and machine learning fields of pattern recognition, various promising approaches have been proposed, such as kernel-based nonlinear machine learning techniques. This work presents a kernel-based empirical monitoring scheme for batch-type production processes with the small-sample-size problem of partially unbalanced data. Measurement data from normal operations are easy to collect, whilst data on special events or faults are difficult to collect. In such situations, noise filtering techniques can be helpful in enhancing process monitoring performance. Furthermore, preprocessing of the raw process data is used to remove unwanted variation in the data. The performance of the monitoring scheme was demonstrated using three-dimensional batch data. The results showed that the monitoring performance was improved significantly in terms of the detection success rate for process faults.
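
A sketch of one kernel-based monitoring statistic is shown below: kernel PCA is fitted on normal batch data and a Hotelling-T2-style score with an empirical control limit flags new samples. The data, kernel parameters and limit are illustrative, and this is a generic stand-in rather than the paper's exact scheme (its batch unfolding and noise filtering steps are only noted in comments).

```python
import numpy as np
from sklearn.decomposition import KernelPCA
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(3)

# Placeholder data: rows are (unfolded) batch measurements; real batch data
# would first be unfolded from its three-way structure and noise-filtered.
X_normal = rng.normal(size=(200, 10))
X_new = np.vstack([rng.normal(size=(5, 10)),
                   rng.normal(loc=3.0, size=(5, 10))])   # last 5 rows act as faults

scaler = StandardScaler().fit(X_normal)
kpca = KernelPCA(n_components=5, kernel="rbf", gamma=0.1)
T_normal = kpca.fit_transform(scaler.transform(X_normal))

# Hotelling-T2-style statistic on the kernel principal components.
var = T_normal.var(axis=0)
def t2(scores):
    return np.sum(scores**2 / var, axis=1)

limit = np.percentile(t2(T_normal), 99)          # empirical 99% control limit
alarms = t2(kpca.transform(scaler.transform(X_new))) > limit
print(alarms)
```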

Keywords: Process Monitoring, kernel methods, multivariate filtering, data-driven techniques, quality improvement.

1080 An Adaptive Virtual Desktop Service in Cloud Computing Platform

Authors: Shuen-Tai Wang, Hsi-Ya Chang

Abstract:

Cloud computing has become more and more mature over the last few years, and consequently the demand for better cloud services is increasing rapidly. One of the research topics for improving cloud services is desktop computing in a virtualized environment. This paper aims at the development of an adaptive virtual desktop service on a cloud computing platform, based on our previous research on virtualization technology. We implement cloud virtual desktop and application software streaming technologies that make it possible to provide Virtual Desktop as a Service (VDaaS). Given the development of remote desktop virtualization, the user's desktop can be shifted from the traditional PC environment to the cloud-enabled environment, where it is stored on a remote virtual machine rather than locally. This proposed effort has the potential to provide an efficient, resilient and elastic environment for online cloud services. Users no longer need to bear the burden of platform maintenance, and the overall cost of hardware and software licenses is drastically reduced. Moreover, this flexible remote desktop service represents the next significant step towards the mobile workplace, and it lets users access their desktop environments from virtually anywhere.

Keywords: Cloud Computing, Virtualization, Virtual Desktop, VDaaS.

1079 EEG Spikes Detection, Sorting, and Localization

Authors: Mazin Z. Othman, Maan M. Shaker, Mohammed F. Abdullah

Abstract:

This study introduces a new method for detecting, sorting, and localizing spikes from multiunit EEG recordings. The method combines the wavelet transform, which localizes distinctive spike features, with the Super-Paramagnetic Clustering (SPC) algorithm, which allows automatic classification of the data without assumptions such as low variance or Gaussian distributions. Moreover, the method is capable of setting amplitude thresholds for spike detection. The method was applied to several real EEG data sets, and accordingly the spikes were detected, clustered, and their occurrence times determined.
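
The detection-features-sorting chain can be sketched as below with PyWavelets for the spike features; since super-paramagnetic clustering is not available in standard libraries, k-means is used as an explicit stand-in, and the synthetic trace and thresholds are assumptions.

```python
import numpy as np
import pywt
from sklearn.cluster import KMeans

rng = np.random.default_rng(5)

# Synthetic multiunit-like trace: background noise plus a few inserted spikes.
n = 50_000
trace = rng.normal(0, 1, n)
spike_shape = np.exp(-np.arange(30) / 5.0) * np.sin(np.arange(30) / 3.0) * 15
for t in rng.choice(n - 30, 40, replace=False):
    trace[t:t + 30] += spike_shape

# 1) Detection: amplitude threshold from a robust noise estimate.
thr = 4.0 * np.median(np.abs(trace)) / 0.6745
idx = np.where(trace > thr)[0]
idx = idx[np.insert(np.diff(idx) > 30, 0, True)]     # keep first sample of each event

# 2) Features: wavelet coefficients of each aligned spike window.
windows = np.array([trace[i - 10:i + 20] for i in idx if 10 <= i <= n - 20])
features = np.array([np.hstack(pywt.wavedec(w, "db4", level=2)) for w in windows])

# 3) Sorting: k-means as a stand-in for super-paramagnetic clustering.
labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(features)
print(len(windows), "spikes detected; cluster sizes:", np.bincount(labels))
```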

Keywords: EEG time localizations, EEG spike detection, superparamagnetic algorithm, wavelet transform.

1078 The Ability of Forecasting the Term Structure of Interest Rates Based On Nelson-Siegel and Svensson Model

Authors: Tea Poklepović, Zdravka Aljinović, Branka Marasović

Abstract:

Due to the importance of the yield curve and its estimation, it is essential to have valid methods for yield curve forecasting in cases where there are scarce issues of securities and/or weak trading on a secondary market. Therefore in this paper, after the estimation of weekly yield curves on the Croatian financial market from October 2011 to August 2012 using the Nelson-Siegel and Svensson models, yield curves are forecasted using a vector autoregressive model and neural networks. In general, it can be concluded that both forecasting methods have good prediction ability, where forecasting of yield curves based on the Nelson-Siegel estimation model gives better results, in the sense of lower mean squared error, than forecasting based on the Svensson model. Also, in this case neural networks provide slightly better results. Finally, it can be concluded that the most appropriate way of yield curve prediction is neural networks using the Nelson-Siegel estimation of yield curves.
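
A sketch of fitting the Nelson-Siegel form to a single weekly yield curve with scipy is shown below; the maturities and yields are made up, and the forecasting stage (a VAR model or neural networks on the fitted factors) is only indicated in a closing comment.

```python
import numpy as np
from scipy.optimize import curve_fit

def nelson_siegel(tau, beta0, beta1, beta2, lam):
    """Nelson-Siegel yield curve: level, slope and curvature factors."""
    x = tau / lam
    loading = (1.0 - np.exp(-x)) / x
    return beta0 + beta1 * loading + beta2 * (loading - np.exp(-x))

# Illustrative maturities (years) and yields (%) for one week; not real data.
maturities = np.array([0.25, 0.5, 1, 2, 3, 5, 7, 10])
yields = np.array([3.1, 3.3, 3.6, 4.0, 4.3, 4.7, 4.9, 5.0])

params, _ = curve_fit(nelson_siegel, maturities, yields,
                      p0=[5.0, -2.0, 1.0, 1.5], maxfev=10_000)
print("beta0, beta1, beta2, lambda =", np.round(params, 3))

# Repeating the fit week by week yields factor time series (beta0, beta1, beta2)
# that can then be forecast with a VAR model or a neural network, as in the paper.
```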

Keywords: Nelson-Siegel model, Neural networks, Svensson model, Vector autoregressive model, Yield curve.

1077 Kinematic Hardening Parameters Identification with Respect to Objective Function

Authors: Marina Franulovic, Robert Basan, Bozidar Krizan

Abstract:

Constitutive modeling of material behavior is becoming increasingly important in the prediction of possible failures in highly loaded engineering components and, consequently, in the optimization of their design. In order to account for the large number of phenomena that occur in the material during operation, such as the kinematic hardening effect in the low cycle fatigue behavior of steels, complex nonlinear material models are used ever more frequently, despite the complexity of determining their parameters. As a method for the determination of these parameters, a genetic algorithm is a good choice because of its capability to provide a very good approximation of the solution in systems with a large number of unknown variables. For the application of a genetic algorithm to parameter identification, the inverse analysis must first be defined. It is used as a tool to fine-tune calculated stress-strain values against experimental ones. In order to choose a proper objective function for the inverse analysis from among already existing and newly developed functions, research is performed to investigate its influence on material behavior modeling.
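
A minimal sketch of genetic-algorithm parameter identification against a least-squares objective is given below; the "material model" is a simple saturating placeholder rather than the actual kinematic-hardening constitutive law, and the GA settings are arbitrary assumptions.

```python
import numpy as np

rng = np.random.default_rng(11)

# Placeholder "material model": stress as a saturating function of strain.
# The real application uses a nonlinear kinematic-hardening constitutive law.
def model_stress(strain, params):
    c, gamma = params
    return (c / gamma) * (1.0 - np.exp(-gamma * strain))

strain = np.linspace(0.0, 0.02, 50)
true_params = np.array([80_000.0, 400.0])
experimental = model_stress(strain, true_params) + rng.normal(0.0, 2.0, strain.size)

def objective(params):                       # least-squares objective function
    return np.mean((model_stress(strain, params) - experimental) ** 2)

# Simple real-coded GA: truncation selection, averaging crossover, Gaussian mutation.
bounds = np.array([[10_000.0, 200_000.0], [50.0, 1000.0]])
pop = rng.uniform(bounds[:, 0], bounds[:, 1], size=(60, 2))
for _ in range(80):
    fitness = np.array([objective(p) for p in pop])
    parents = pop[np.argsort(fitness)[:30]]                       # keep the best half
    children = 0.5 * (parents[rng.integers(0, 30, 30)] +
                      parents[rng.integers(0, 30, 30)])           # averaging crossover
    children += rng.normal(0.0, 0.02, children.shape) * (bounds[:, 1] - bounds[:, 0])
    pop = np.vstack([parents, np.clip(children, bounds[:, 0], bounds[:, 1])])

best = pop[np.argmin([objective(p) for p in pop])]
print("identified parameters:", np.round(best, 1), "true parameters:", true_params)
```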

Keywords: Genetic algorithm, kinematic hardening, material model, objective function

1076 Smartphone-Based Human Activity Recognition by Machine Learning Methods

Authors: Yanting Cao, Kazumitsu Nawata

Abstract:

As smartphones are continually upgraded, their software and hardware are getting smarter, so smartphone-based human activity recognition can be described in a more refined, complex and detailed way. In this context, we analyzed a set of experimental data obtained by observing and measuring 30 volunteers performing six activities of daily living (ADL). Due to the large sample size, and especially the 561-feature vector with time- and frequency-domain variables, cleaning these intractable features and training a proper model become extremely challenging. After a series of feature selections and parameter adjustments, a well-performing SVM classifier has been trained.
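
A sketch of the feature-selection and SVM-training step on data shaped like the 561-feature set is shown below; synthetic data stands in for the real smartphone recordings and the hyperparameter grid is illustrative.

```python
import numpy as np
from sklearn.feature_selection import SelectKBest, f_classif
from sklearn.model_selection import GridSearchCV, train_test_split
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(2)

# Stand-in for the 561-feature smartphone dataset: 600 windows, 6 activities.
X = rng.normal(size=(600, 561))
y = rng.integers(0, 6, size=600)
X[:, :20] += y[:, None] * 0.8            # make a few features informative

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

pipe = Pipeline([
    ("scale", StandardScaler()),
    ("select", SelectKBest(f_classif, k=100)),   # drop uninformative features
    ("svm", SVC()),
])
grid = GridSearchCV(pipe, {"svm__C": [1, 10], "svm__gamma": ["scale", 0.01]}, cv=3)
grid.fit(X_tr, y_tr)
print("test accuracy:", round(grid.score(X_te, y_te), 3))
```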

Keywords: smart sensors, human activity recognition, artificial intelligence, SVM

1075 Bleeding Detection Algorithm for Capsule Endoscopy

Authors: Yong-Gyu Lee, Gilwon Yoon

Abstract:

Automatic detection of bleeding is of practical importance since capsule endoscopy produces an extremely large number of images. Algorithm development for bleeding detection in the digestive tract is difficult due to differing contrasts among the images, food dregs, secretion and other factors. In this study, weighting factors were assigned, derived from the independent features of contrast and brightness that distinguish bleeding from normal tissue. Spectral analysis based on these weighting factors was fast and accurate. The results were a sensitivity of 87% and a specificity of 90% when the accuracy was determined for each pixel over 42 endoscope images.

Keywords: bleeding, capsule endoscopy, image analysis, weighted spectrum

1074 An Improved k Nearest Neighbor Classifier Using Interestingness Measures for Medical Image Mining

Authors: J. Alamelu Mangai, Satej Wagle, V. Santhosh Kumar

Abstract:

The exponential increase in the volume of medical image databases has imposed new challenges on clinical routine in maintaining patient history, diagnosis, treatment and monitoring. With the advent of data mining and machine learning techniques, it is possible to automate and/or assist physicians in clinical diagnosis. In this research, a medical image classification framework using data mining techniques is proposed. It involves feature extraction, feature selection, feature discretization and classification. In the classification phase, the performance of the traditional kNN (k nearest neighbor) classifier is improved using a feature weighting scheme and distance-weighted voting instead of simple majority voting. Feature weights are calculated using the interestingness measures used in association rule mining. Experiments on retinal fundus images show that the proposed framework improves the classification accuracy of traditional kNN from 78.57% to 92.85%.
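
The two modifications can be sketched as below: per-feature weights (computed here from class-conditional statistics as a generic stand-in for the paper's association-rule interestingness measures) and distance-weighted voting in the kNN prediction; the toy data only stands in for extracted image features.

```python
import numpy as np
from collections import Counter

def fit_feature_weights(X, y):
    """Stand-in weights: separation of class means relative to overall spread.
    The paper instead derives weights from association-rule interestingness
    measures on discretized features."""
    overall_std = X.std(axis=0) + 1e-9
    class_means = np.array([X[y == c].mean(axis=0) for c in np.unique(y)])
    return class_means.std(axis=0) / overall_std

def predict_weighted_knn(X_train, y_train, x, w, k=5):
    d = np.sqrt(((X_train - x) ** 2 * w).sum(axis=1))         # weighted distance
    nearest = np.argsort(d)[:k]
    votes = Counter()
    for i in nearest:
        votes[y_train[i]] += 1.0 / (d[i] + 1e-9)              # distance-weighted vote
    return votes.most_common(1)[0][0]

# Toy usage on random data shaped like extracted image features.
rng = np.random.default_rng(4)
X_train = rng.normal(size=(100, 8))
y_train = rng.integers(0, 2, 100)
X_train[:, 0] += y_train * 2.0                                # one informative feature
w = fit_feature_weights(X_train, y_train)
print(predict_weighted_knn(X_train, y_train, X_train[0], w))
```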

Keywords: Medical Image Mining, Data Mining, Feature Weighting, Association Rule Mining, k nearest neighbor classifier.

1073 A Knowledge-Based E-mail System Using Semantic Categorization and Rating Mechanisms

Authors: Azleena Mohd Kassim, Muhamad Rashidi A. Rahman, Yu-N. Cheah

Abstract:

Knowledge-based e-mail systems focus on incorporating a knowledge management approach in order to enhance traditional e-mail systems. In this paper, we present a knowledge-based e-mail system called KS-Mail where people not only send and receive e-mail conventionally but are also able to create a sense of knowledge flow. We introduce semantic processing of the e-mail contents by automatically assigning categories and providing links to semantically related e-mails. This is done to enrich the knowledge value of each e-mail as well as to ease the organization of the e-mails and their contents. At the application level, we have also built components like the service manager, evaluation engine and search engine to handle the e-mail processes efficiently by providing the means to share and reuse knowledge. For this purpose, we present the KS-Mail architecture, and elaborate on the details of the e-mail server and the application server. We present the ontology mapping technique used to achieve the categorization of e-mail contents as well as the protocols that we have developed to handle the transactions in the e-mail system. Finally, we discuss further the implementation of the modules presented in the KS-Mail architecture.

Keywords: E-mail rating, knowledge-based system, ontology mapping, text categorization.

1072 An Investigation on Vegetable Oils as Potential Insulating Liquid

Authors: C. Kocatepe, E. Taslak, C. F. Kumru, O. Arıkan

Abstract:

While choosing an insulating oil, characteristic features such as thermal cooling, endurance, efficiency and environmental friendliness should be considered. Mineral oils are referred to as petroleum-based oils. In this study, vegetable oils are investigated as an alternative insulating liquid to mineral oil. The dissipation factor, breakdown voltage, relative dielectric constant and resistivity of mineral, rapeseed and nut oils were measured as functions of frequency and voltage. Experimental studies were performed according to the ASTM D924 and IEC 60156 standards.

Keywords: Breakdown voltage, dielectric dissipation factor, mineral oil, vegetable oils.

1071 Indoor Localization by Pattern Matching Method Based On Extended Database

Authors: Gyumin Hwang, Jihong Lee

Abstract:

This paper studies a CSS-based indoor localization system, which is easy to implement, inexpensive to build and, additionally, covers a larger area than other systems. However, this system is affected by reflected distance data. This problem in localization is caused by the multi-path effect. Errors caused by multi-path are difficult to correct because the indoor environment cannot be fully described. In this paper, in order to solve the multi-path problem, we have supplemented the localization system with a pattern matching method based on an extended database. Thereby, this method improves the precision of the estimate. The method is verified by experiments in a gymnasium. The database was constructed at 1 m intervals, and 16 sample data were collected from random positions inside the region of the DB points. As a result, this paper shows higher accuracy than the existing method through graphs and tables.
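
A sketch of the pattern-matching step is given below: a new measurement is compared against an extended fingerprint database and the grid point with the smallest Mahalanobis distance is returned; the database, anchor layout and noise covariance are synthetic assumptions, not the paper's gymnasium setup.

```python
import numpy as np
from scipy.spatial.distance import mahalanobis

rng = np.random.default_rng(6)

# Extended database: for each reference position (1 m grid), the mean feature
# vector (e.g. ranges to four CSS anchors) obtained from collected samples.
anchors = [(0, 0), (4, 0), (0, 4), (4, 4)]
grid = [(x, y) for x in range(5) for y in range(5)]
db_means = {p: np.array([np.hypot(p[0] - ax, p[1] - ay) for ax, ay in anchors])
            for p in grid}

# Covariance of the measurement noise, estimated from the collected samples.
cov_inv = np.linalg.inv(np.diag([0.3, 0.3, 0.3, 0.3]))

def locate(measurement):
    """Return the DB position whose fingerprint is closest in Mahalanobis distance."""
    return min(db_means, key=lambda p: mahalanobis(measurement, db_means[p], cov_inv))

true_pos = (2, 3)
noisy = db_means[true_pos] + rng.normal(0, 0.4, 4)   # multi-path-like disturbance
print("estimated position:", locate(noisy))
```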

Keywords: Chirp Spread Spectrum (CSS), Indoor Localization, Pattern-Matching, Time of Arrival (ToA), Multi-Path, Mahalanobis Distance, Reception Rate, Simultaneous Localization and Mapping (SLAM), Laser Range Finder (LRF).

1070 The Design and Development of Multimedia Pronunciation Learning Management System

Authors: Fei Ping Por, Soon Fook Fong

Abstract:

The proposed Multimedia Pronunciation Learning Management System (MPLMS) in this study is a technology with profound potential for inducing improvement in pronunciation learning. The MPLMS optimizes digitised phonetic symbols with the integration of text, sound and mouth movement video. The components are designed and developed in an online management system which turns the web into a dynamic, user-centric collection of consistent and timely information for quality sustainable learning. The aim of this study is to design and develop the MPLMS, which serves as an innovative tool to improve English pronunciation. This paper discusses the iterative methodology and the three-phase Alessi and Trollip model in the development of the MPLMS. To align with the flexibility of educational software development, the iterative approach comprising plan, design, develop, evaluate and implement is followed. To ensure the instructional appropriateness of the MPLMS, the instructional system design (ISD) model of Alessi and Trollip serves as a platform to guide the important instructional factors and processes. It is expected that the results of future empirical research will support the efficacy of the MPLMS and its place as the premier pronunciation learning system.

Keywords: Design, development, multimedia, pronunciation, learning management system

1069 Importance of Mobile Technology in Successful Adoption and Sustainability of a Chronic Disease Support System

Authors: Reza Ariaeinejad, Norm Archer

Abstract:

Self-management is becoming a new emphasis for healthcare systems around the world. But there are many different problems with adoption of new health-related intervention systems. The situation is even more complicated for chronically ill patients with disabilities, illiteracy, and impairment in judgment in addition to their conditions, or having multiple co-morbidities. Providing online decision support to manage patient health and to provide better support for chronically ill patients is a new way of dealing with chronic disease management. In this study, the importance of mobile technology through an m-Health system that supports self-management interventions including the care provider, family and social support, education and training, decision support, recreation, and ongoing patient motivation to promote adherence and sustainability of the intervention are discussed. A proposed theoretical model for adoption and sustainability of system use is developed, based on UTAUT2 and IS Continuance of Use models, both of which have been pre-validated through longitudinal studies. The objective of this paper is to show the importance of using mobile technology in adoption and sustainability of use of an m-Health system which will result in commercially sustainable self-management support for chronically ill patients.

Keywords: M-health, e-health, self-management, disease.

1068 A Trends Analysis of Image Processing in Unmanned Aerial Vehicle

Authors: Jae-Neung Lee, Keun-Chang Kwak

Abstract:

This paper describes an analysis of domestic and international trends in image processing for data from UAVs (unmanned aerial vehicles) and also explains UAVs and quadcopters. Overseas examples of image processing using UAVs include image processing for counting the total number of vehicles, edge/target detection, detection and evasion algorithms, image processing using SIFT (scale-invariant feature transform) matching, and the application of median filtering and thresholding. In Korea, many studies are underway, including visualization of new urban buildings.

Keywords: Image Processing, UAV, Quadcopter, Target detection.

1067 An Automation of Check Focusing on CRUD for Requirements Analysis Model in UML

Authors: Shinpei Ogata, Yoshitaka Aoki, Hirotaka Okuda, Saeko Matsuura

Abstract:

A key to the success of high-quality software development is to define a valid and feasible requirements specification. We have proposed a method of model-driven requirements analysis using the Unified Modeling Language (UML). The main feature of our method is to automatically generate a Web user interface mock-up from the UML requirements analysis model so that we can confirm the validity of input/output data for each page and of page transitions on the system by directly operating the mock-up. This paper proposes a support method for checking the validity of a data life cycle by using the model checking tool "UPPAAL", focusing on CRUD (Create, Read, Update and Delete). Exhaustive checking improves the quality of the requirements analysis model, which is validated by the customers through the automatically generated mock-up. The effectiveness of our method is discussed through a case study of requirements modeling for two small projects: a library management system and a supportive sales system for textbooks at a university.

Keywords: CRUD, Model Checking, Model Driven Development, Requirements Analysis, Unified Modeling Language, UPPAAL.
