Search results for: Expert evaluations
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 408

168 Big Bang – Big Crunch Learning Method for Fuzzy Cognitive Maps

Authors: Engin Yesil, Leon Urbas

Abstract:

Modeling complex dynamic systems, for which mathematical models are very difficult to establish, requires new and modern methodologies that exploit existing expert knowledge, human experience, and historical data. Fuzzy cognitive maps are very suitable, simple, and powerful tools for the simulation and analysis of such dynamic systems. However, human experts are subjective and can handle only relatively simple fuzzy cognitive maps; therefore, there is a need to develop new approaches for the automated generation of fuzzy cognitive maps from historical data. In this study, a new learning algorithm called Big Bang-Big Crunch is proposed, for the first time in the literature, for the automated generation of fuzzy cognitive maps from data. Two real-world examples, namely a process control system and a radiation therapy process, together with one synthetic model, are used to demonstrate the effectiveness and usefulness of the proposed methodology.
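
A minimal sketch of the Big Bang-Big Crunch idea applied to learning an FCM weight matrix follows; the fitness function, the concept update rule, and all parameter values are illustrative assumptions, not the authors' exact formulation.

```python
import numpy as np

# Hypothetical historical concept-activation data: rows are time steps, columns are concepts.
data = np.random.rand(50, 4)

def simulate(W, x0, steps):
    """Run a sigmoid-activated FCM from x0 for a number of steps."""
    x = x0.copy()
    traj = [x]
    for _ in range(steps):
        x = 1.0 / (1.0 + np.exp(-(x + x @ W)))  # a common FCM update rule (assumed here)
        traj.append(x)
    return np.array(traj)

def fitness(W):
    """Mean squared error between simulated and observed concept values."""
    sim = simulate(W, data[0], len(data) - 1)
    return np.mean((sim - data) ** 2)

n = data.shape[1]
rng = np.random.default_rng(0)
center = np.zeros((n, n))           # initial centre of mass
radius = 1.0                        # initial search radius, shrinks each iteration
pop_size, iterations = 40, 100

for k in range(1, iterations + 1):
    # Big Bang: scatter candidate weight matrices around the current centre.
    candidates = [np.clip(center + radius / k * rng.standard_normal((n, n)), -1, 1)
                  for _ in range(pop_size)]
    costs = np.array([fitness(W) for W in candidates])
    # Big Crunch: collapse the population to its fitness-weighted centre of mass.
    weights = 1.0 / (costs + 1e-12)
    center = sum(w * W for w, W in zip(weights, candidates)) / weights.sum()

print("learned FCM weight matrix:\n", np.round(center, 2))
```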

Keywords: Big Bang-Big Crunch optimization, Dynamic Systems, Fuzzy Cognitive Maps, Learning.

167 Determinants of Brand Equity: Offering a Model to Chocolate Industry

Authors: Emari Hossien

Abstract:

This study examined the underlying dimensions of brand equity in the chocolate industry. For this purpose, the researchers developed a model to identify which factors are influential in building brand equity. The second purpose was to assess the mediating effect of brand loyalty and brand image between brand attitude, brand personality, and brand association on the one hand and brand equity on the other. The study employed structural equation modeling to investigate the causal relationships between the dimensions of brand equity and brand equity itself. It specifically measured the way in which consumers' perceptions of the dimensions of brand equity affect the overall brand equity evaluations. Data were collected from a sample of consumers of the chocolate industry in Iran. The results of this empirical study indicate that brand loyalty and brand image are important components of brand equity in this industry. Moreover, the role of brand loyalty and brand image as mediating factors in the formation of brand equity is supported. The principal contribution of the present research is that it provides empirical evidence of the multidimensionality of consumer-based brand equity, supporting Aaker's and Keller's conceptualization of brand equity. The present research also enriches brand equity building by incorporating brand personality and brand image, as recommended by previous researchers. Moreover, the creation of a brand equity index for the Iranian chocolate industry in particular is novel.

Keywords: brand equity, brand personality, structural equation modeling, Iran.

166 Performance Study on Audio Codec and Session Transfer of Open Source VoIP applications

Authors: Cheng-Suan Lee, Khong Neng Choong, So Gean Koh, Chee Onn Chow, Mazlan Abbas

Abstract:

Voice over Internet Protocol (VoIP) applications, commonly known as softphones, have been capturing an increasingly large share of today's telecommunication market, and the trend is expected to continue with the enhancement of additional features. This includes leveraging existing presence services and location and contextual information to enable more ubiquitous and seamless communications. In this paper, we discuss the concept of seamless session transfer for real-time applications such as VoIP and IPTV, and our prototype implementation of this concept on a selected open source VoIP application. The first part of this paper concerns the performance evaluation and assessment of some commonly found open source VoIP applications, namely Ekiga, Kphone, Linphone, and Twinkle, so as to identify one of them for implementing our design of seamless session transfer. Subjective testing has been carried out to evaluate the audio performance of these VoIP applications and rank them according to their Mean Opinion Score (MOS) results. The second part of this paper discusses the performance evaluation of our prototype implementation of session transfer using Linphone.
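
As a rough illustration of how the subjective rankings might be computed, the sketch below averages listener ratings (1-5 absolute category rating scale) into a Mean Opinion Score per application and sorts the results; the rating values are placeholders, not the study's measurements.

```python
from statistics import mean

# Hypothetical listener ratings on the 1-5 Absolute Category Rating scale.
ratings = {
    "Ekiga":    [4, 3, 4, 4, 3],
    "Kphone":   [3, 3, 2, 3, 3],
    "Linphone": [4, 4, 5, 4, 4],
    "Twinkle":  [3, 4, 3, 4, 3],
}

# MOS is simply the arithmetic mean of the individual opinion scores.
mos = {app: mean(scores) for app, scores in ratings.items()}

for app, score in sorted(mos.items(), key=lambda kv: kv[1], reverse=True):
    print(f"{app:<10} MOS = {score:.2f}")
```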

Keywords: audio codec, softphone, session transfer.

165 The Spiral_OWL Model – Towards Spiral Knowledge Engineering

Authors: Hafizullah A. Hashim, Aniza. A

Abstract:

The Spiral development model has been used successfully in many commercial systems and in a good number of defense systems. This is due to its cost-effective incremental commitment of funds, via an analogy of the spiral model to stud poker, and to the fact that it can be used to develop hardware or to integrate software, hardware, and systems. To support adaptive, semantic collaboration between domain experts and knowledge engineers, a new knowledge engineering process, called Spiral_OWL, is proposed. This model is based on the idea of iterative refinement, annotation, and structuring of the knowledge base. The Spiral_OWL model is generated on the basis of the spiral model and knowledge engineering methodology. A central paradigm of the Spiral_OWL model is its concentration on risk-driven determination of the knowledge engineering process. The collaboration aspect comes into play during the knowledge acquisition and knowledge validation phases. The design rationale for the Spiral_OWL model is an easy-to-implement, well-organized, and iterative development cycle that expands as a spiral.

Keywords: Domain Expert, Knowledge Base, Ontology, Software Process.

164 An Efficient Technique for Extracting Fuzzy Rules from Neural Networks

Authors: Besa Muslimi, Miriam A. M. Capretz, Jagath Samarabandu

Abstract:

Artificial neural networks (ANNs) have the ability to model input-output relationships from raw data. This characteristic makes them invaluable in industry domains where such knowledge is scarce at best. In recent decades, in order to overcome the black-box character of ANNs, researchers have attempted to extract the knowledge embedded within ANNs in the form of rules that can be used in inference systems. This paper presents a new technique that is able to extract a small set of rules from a two-layer ANN. The extracted rules yield high classification accuracy when implemented within a fuzzy inference system. The technique targets industry domains with less complex problems, for which no expert knowledge exists and for which a simpler solution is preferred to a complex one. The proposed technique is more efficient, simpler, and more broadly applicable than most previously proposed techniques.
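
The abstract does not detail the paper's extraction procedure; the sketch below only illustrates the general idea of deriving simple rules from a trained two-layer network by keeping, for each hidden unit, the inputs whose weights exceed a threshold. The dataset, the threshold, and the rule format are assumptions for illustration.

```python
import numpy as np
from sklearn.datasets import load_iris
from sklearn.neural_network import MLPClassifier

X, y = load_iris(return_X_y=True)
feature_names = ["sepal_len", "sepal_wid", "petal_len", "petal_wid"]

# A small two-layer network: one hidden layer, then the output layer.
net = MLPClassifier(hidden_layer_sizes=(4,), max_iter=2000, random_state=0).fit(X, y)
W_in, W_out = net.coefs_[0], net.coefs_[1]   # input->hidden and hidden->output weights

threshold = 0.5  # assumed cut-off for "important" connections
for h in range(W_in.shape[1]):
    # Keep only the input features strongly connected to this hidden unit.
    important = [(feature_names[i], W_in[i, h]) for i in range(W_in.shape[0])
                 if abs(W_in[i, h]) > threshold]
    dominant_class = int(np.argmax(W_out[h]))
    conditions = " AND ".join(
        f"{name} is {'HIGH' if w > 0 else 'LOW'}" for name, w in important)
    if conditions:
        print(f"IF {conditions} THEN class is {dominant_class}")
```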

Keywords: fuzzy rule extraction, fuzzy systems, knowledge acquisition, pattern recognition, artificial neural networks.

163 Managing the Cloud Procurement Process – Findings from a Case Study

Authors: Andreas Jede, Frank Teuteberg

Abstract:

Cloud computing (CC) has already gained broad appreciation in research and practice. Whereas the willingness to integrate cloud services into various IT environments remains unbroken, previous CC procurement processes have mostly run in an unorganized and non-standardized way. In practice, a sufficiently specific yet applicable business process for the important acquisition phase is often lacking, and research has not yet appropriately remedied this deficiency. Therefore, this paper introduces a field-tested approach for CC procurement. Based on an extensive literature review and augmented by expert interviews, we designed a model that is validated and further refined through an in-depth real-life case study. For the detailed process description, we apply the event-driven process chain (EPC) notation. The valuable insights gained from the case study may help CC research shift toward a more socio-technical focus. For practice, in addition to useful organizational instructions, we provide extended checklists and lessons learned.

Keywords: Cloud Procurement Process, IT-Organization, Event-driven Process Chain, In-depth Case Study.

162 Decision Support System for a Pilot Flash Flood Early Warning System in Central Chile

Authors: D. Pinto, L. Castro, M.L. Cruzat, S. Barros, J. Gironás, C. Oberli, M. Torres, C. Escauriaza, A. Cipriano

Abstract:

Flash floods, together with landslides, are a common natural threat for people living in mountainous regions and foothills. One way to deal with this constant menace is the use of early warning systems, which have become a very important mitigation strategy for natural disasters. In this work we present our proposal for a pilot flash flood early warning system for Santiago, Chile, the first stage of a more ambitious project that shall later also include early warning of landslides. To give context for our approach, we first analyze three existing flash flood early warning systems, focusing on their general architectures. We then present our proposed system, with the main focus on the decision support system, which integrates empirical models and fuzzy expert systems to achieve reliable risk estimations.

Keywords: Decision Support System, Early Warning Systems, Flash Flood, Natural Hazard.

161 Thermodynamic Approach of Lanthanide-Iron Double Oxides Formation

Authors: Vera Varazashvili, Murman Tsarakhov, Tamar Mirianashvili, Teimuraz Pavlenishvili, Tengiz Machaladze, Mzia Khundadze

Abstract:

The standard Gibbs energy of formation ΔGfor(298.15 K) of lanthanide-iron double oxides of garnet-type crystal structure, R3Fe5O12 (RIG, where R is a rare earth ion), from the initial oxides is evaluated. The calculation is based on the standard entropies S(298.15 K) and standard enthalpies of formation ΔH(298.15 K) of the compounds involved in the garnet synthesis process. The Gibbs energy of formation is presented as a temperature function ΔGfor(T) for the range 300-1600 K. The necessary starting thermodynamic data were obtained from calorimetric study of the heat capacity – temperature functions and by using a semi-empirical method for calculating the ΔH(298.15 K) of formation. The thermodynamic functions at standard temperature – enthalpy, entropy and Gibbs energy – are recommended as reference data for technological evaluations. Across the structural series of rare earth-iron garnets, the correlation between thermodynamic properties and the characteristics of the lanthanide ions is elucidated.
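
For reference, the standard relations behind such an evaluation (generic thermodynamic identities, not numbers from this study) can be written as:

```latex
% Gibbs energy of formation from the reaction enthalpy and entropy at 298.15 K
\Delta G_{for}(298.15\,\mathrm{K}) = \Delta H_{for}(298.15\,\mathrm{K})
  - 298.15\,\mathrm{K}\cdot \Delta S_{for}(298.15\,\mathrm{K})

% Extension to temperature T via Kirchhoff's law, using the heat-capacity change
% \Delta C_p(T) of the formation reaction obtained from calorimetric C_p(T) data
\Delta G_{for}(T) = \Delta H_{for}(298.15\,\mathrm{K})
  + \int_{298.15}^{T} \Delta C_p(T')\,\mathrm{d}T'
  - T\left[\Delta S_{for}(298.15\,\mathrm{K})
  + \int_{298.15}^{T} \frac{\Delta C_p(T')}{T'}\,\mathrm{d}T'\right]
```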

Keywords: Calorimetry, entropy, enthalpy, heat capacity, Gibbs energy of formation, rare earth iron garnets.

160 Preliminary Toxicological Evaluations of Polypeptide-K Isolated from Momordica Charantia in Laboratory Rats

Authors: M Nazrul-Hakim, A Yaacob, Y Adam, A Zuraini

Abstract:

This study examined the toxicological effects and safety of polypeptide k isolated from the seeds of Momordica charantia in laboratory rats. Thirty male Sprague Dawley rats (12 weeks old, body weight 180-200 g) were randomly divided into 3 groups (1000 mg/kg, 500 mg/kg and 0 mg/kg). Rats were acclimatized to laboratory conditions for 7 days, and on day 8 the rats were dosed orally with polypeptide k (in 2% DMSO/normal saline) while the controls received the dosing vehicle only. Rats were then observed for 72 hours before being sacrificed. Rats were anaesthetized with pentobarbital (50 mg/kg i.p.), 2-3.0 mL of blood was taken by cardiac puncture, and the rats were sacrificed by anaesthetic overdose. Immediately afterwards, the organs (heart, lungs, liver, kidneys) were weighed and taken for histology. Organ sections were then evaluated by a histopathologist. Serum samples were assayed for liver function (ALT and γ-GT) and kidney function (BUN and creatinine). All rats showed normal behavior after dosing, and no statistically significant changes were observed in any blood parameter or organ weight. Histological examinations revealed normal organ structures. In conclusion, dosing of rats up to 1000 mg/kg did not have any effect on rat behavior, liver or kidney function, or the histology of the selected organs.

Keywords: Polypeptide k, safety, histology, toxicology, Momordica charantia.

159 A Complexity-Based Approach in Image Compression using Neural Networks

Authors: Hadi Veisi, Mansour Jamzad

Abstract:

In this paper we present an adaptive method for image compression that is based on the complexity level of the image. The basic compressor/de-compressor structure of this method is a multilayer perceptron artificial neural network. In the adaptive approach, different back-propagation artificial neural networks are used as compressors and de-compressors; this is done by dividing the image into blocks, computing the complexity of each block, and then selecting one network for each block according to its complexity value. Three complexity measures, called Entropy, Activity, and Pattern-based, are used to determine the level of complexity in image blocks, and their ability to estimate complexity is evaluated and compared. In training and evaluation, each image block is assigned to a network based on its complexity value. Best-SNR is another alternative for selecting the compressor network for image blocks in the evaluation phase; it chooses, among the trained networks, the one that yields the best SNR when compressing the input image block. In our evaluations, the best results are obtained when overlapping of the blocks is allowed and the compressor networks are chosen according to Best-SNR. In this case, the results demonstrate the superiority of this method compared with previous similar works and the JPEG standard coding.
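
A minimal sketch of the block-complexity routing idea follows: it splits an image into blocks, scores each block with an entropy measure, and assigns it to one of several complexity classes, each of which would be handled by its own compressor network. The block size, the thresholds, and the number of classes are assumptions, not the paper's settings.

```python
import numpy as np

def block_entropy(block, bins=32):
    """Shannon entropy of a block's gray-level histogram (one possible complexity measure)."""
    hist, _ = np.histogram(block, bins=bins, range=(0.0, 1.0))
    p = hist / hist.sum()
    p = p[p > 0]
    return float(-(p * np.log2(p)).sum())

# Placeholder image and 8x8 block decomposition.
rng = np.random.default_rng(0)
image = rng.random((64, 64))
B = 8
blocks = [image[i:i+B, j:j+B] for i in range(0, 64, B) for j in range(0, 64, B)]

# Assumed complexity thresholds splitting blocks into low / medium / high classes.
thresholds = (3.0, 4.5)
assignment = []
for blk in blocks:
    h = block_entropy(blk)
    cls = 0 if h < thresholds[0] else (1 if h < thresholds[1] else 2)
    assignment.append(cls)

for c in range(3):
    print(f"compressor network {c}: {assignment.count(c)} blocks")
```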

Keywords: Adaptive image compression, Image complexity, Multi-layer perceptron neural network, JPEG Standard, PSNR.

158 Heuristic Search Algorithm (HSA) for Enhancing the Lifetime of Wireless Sensor Networks

Authors: Tripatjot S. Panag, J. S. Dhillon

Abstract:

The lifetime of a wireless sensor network can be effectively increased by using scheduling operations. Once the sensors are randomly deployed, the task at hand is to find the largest number of disjoint sets of sensors such that every sensor set provides complete coverage of the target area. At any instant, only one of these disjoint sets is switched on, while all the others are switched off. This paper proposes a heuristic search method to find the maximum number of disjoint sets that completely cover the region. A population of randomly initialized members is made to explore the solution space. A set of heuristics is applied to guide the members toward possible solutions in their neighborhood, and these heuristics accelerate the convergence of the algorithm. The best solution explored by the population is recorded and continuously updated. The proposed algorithm has been tested on applications that require the sensing of multiple target points, referred to as point coverage applications. Results show that the proposed algorithm outperforms the existing algorithms: it always finds the optimum solution, and does so with fewer fitness function evaluations than the existing approaches.
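
The abstract does not spell out the heuristics, so the sketch below only illustrates the underlying problem with a simple greedy baseline: it repeatedly assembles a disjoint subset of sensors that covers every target point, removes it from the pool, and stops when no further complete cover can be built. The sensor positions, sensing range, and target points are made up for the example.

```python
import numpy as np

rng = np.random.default_rng(1)
sensors = rng.random((30, 2))          # hypothetical sensor positions in the unit square
targets = rng.random((5, 2))           # hypothetical target points to be covered
RANGE = 0.45                           # assumed sensing range

# covers[i] = set of target indices that sensor i can sense.
covers = [{t for t in range(len(targets))
           if np.linalg.norm(sensors[i] - targets[t]) <= RANGE}
          for i in range(len(sensors))]

def build_cover(available):
    """Greedily pick sensors from 'available' until all targets are covered, or fail."""
    chosen, uncovered = [], set(range(len(targets)))
    while uncovered:
        best = max(available, key=lambda i: len(covers[i] & uncovered), default=None)
        if best is None or not (covers[best] & uncovered):
            return None                # no sensor adds coverage: no complete cover exists
        chosen.append(best)
        uncovered -= covers[best]
        available = available - {best}
    return chosen

available = set(range(len(sensors)))
disjoint_sets = []
while True:
    cover = build_cover(available)
    if cover is None:
        break
    disjoint_sets.append(cover)
    available -= set(cover)            # disjointness: a sensor belongs to one set only

print(f"{len(disjoint_sets)} disjoint covers found:", disjoint_sets)
```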

Keywords: Coverage, disjoint sets, heuristic, lifetime, scheduling, wireless sensor networks, WSN.

157 Effect of Oxygen on Biochar Yield and Properties

Authors: Ramlan Zailani, Halim Ghafar, Mohamad Sofian So'aib

Abstract:

Air infiltration in mass-scale industrial applications of biochar production is inevitable, and the presence of oxygen during the carbonization process is detrimental to biochar yield and properties. Experiments were carried out on several wood species in a fixed-bed pyrolyser under oxygen fractions ranging from 0% to 11%, obtained by varying the nitrogen and oxygen composition of the pyrolysing gas mixture. The bed temperature and holding time were also varied. Process optimization was carried out with Response Surface Methodology (RSM), employing a Central Composite Design (CCD) in the Design Expert 6.0 software. The effects of the oxygen ratio and the holding time on biochar yield were statistically significant within the range studied. From the analysis, an optimum biochar yield of 15.2% for mangrove wood was predicted at a pyrolysis temperature of 403 °C, an oxygen percentage of 2.3%, and a holding time of two hours. This prediction agreed well with the experimental finding of 15.1% biochar yield.
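
A minimal sketch of the response-surface idea is shown below: it fits a second-order polynomial model of yield as a function of temperature, oxygen fraction, and holding time, then locates the predicted optimum on a grid. The design points and yields are placeholders, not the study's measurements.

```python
import numpy as np
from itertools import combinations_with_replacement

# Placeholder design points: (temperature degC, oxygen %, holding time h) -> yield %.
X = np.array([[350, 0, 1], [350, 5, 2], [400, 2, 2], [400, 8, 1],
              [450, 0, 2], [450, 5, 1], [400, 2, 3], [500, 11, 2],
              [375, 3, 1.5], [425, 6, 2.5], [400, 4, 2], [450, 9, 1.5]], dtype=float)
y = np.array([13.8, 14.1, 15.0, 12.9, 14.0, 13.2, 14.6, 10.5, 14.3, 12.8, 14.4, 11.9])

def quadratic_features(X):
    """Full second-order model: intercept, linear, interaction and squared terms."""
    cols = [np.ones(len(X))]
    cols += [X[:, i] for i in range(X.shape[1])]
    cols += [X[:, i] * X[:, j]
             for i, j in combinations_with_replacement(range(X.shape[1]), 2)]
    return np.column_stack(cols)

beta, *_ = np.linalg.lstsq(quadratic_features(X), y, rcond=None)

# Brute-force the fitted surface on a grid to find the predicted optimum.
T, O, H = np.meshgrid(np.linspace(350, 500, 31), np.linspace(0, 11, 23),
                      np.linspace(1, 3, 21), indexing="ij")
grid = np.column_stack([T.ravel(), O.ravel(), H.ravel()])
pred = quadratic_features(grid) @ beta
best = grid[np.argmax(pred)]
print(f"predicted optimum: T={best[0]:.0f} degC, O2={best[1]:.1f}%, "
      f"hold={best[2]:.1f} h, yield={pred.max():.1f}%")
```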

Keywords: Mangrove wood, slow pyrolysis, oxygen infiltration.

156 DEA ANN Approach in Supplier Evaluation System

Authors: Dilek Özdemir, Gül Tekin Temur

Abstract:

In Supply Chain Management (SCM), strengthening partnerships with suppliers is a significant factor for enhancing competitiveness; hence, firms increasingly emphasize supplier evaluation processes. Supplier evaluation systems are basically developed in terms of criteria such as quality, cost, delivery, and flexibility. Because there are many variables to be analyzed, this process becomes hard to execute and requires expertise. On this account, this study aims to develop an expert system for the supplier evaluation process by designing an Artificial Neural Network (ANN) that is supported by Data Envelopment Analysis (DEA). The methods are applied to data on 24 suppliers that have long-term relationships with a medium-sized company from the German iron and steel industry. The supplier data consist of variables such as material quality (MQ), discount of amount (DOA), discount of cash (DOC), payment term (PT), delivery time (DT), and annual revenue (AR). The efficiency scores generated by DEA are added to the supplier evaluation system and used as system outputs.
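
A minimal sketch of the DEA step is given below, solving the input-oriented CCR model with scipy's linear programming routine to score each supplier's efficiency; such scores could then serve as ANN training targets. The split of variables into inputs and outputs and the numbers themselves are illustrative assumptions, not the study's data.

```python
import numpy as np
from scipy.optimize import linprog

# Hypothetical supplier data.
# Inputs (cost-like): delivery time (days), payment term (days).
# Outputs (benefit-like): material quality score, annual revenue (M EUR).
inputs  = np.array([[12, 30], [10, 45], [15, 30], [8, 60], [20, 15]], dtype=float)
outputs = np.array([[0.90, 5.1], [0.85, 6.3], [0.95, 4.2], [0.80, 7.0], [0.70, 3.5]])

def ccr_efficiency(o, inputs, outputs):
    """Input-oriented CCR efficiency of supplier o (multiplier form, solved as an LP)."""
    n_out, n_in = outputs.shape[1], inputs.shape[1]
    c = np.concatenate([-outputs[o], np.zeros(n_in)])       # maximise u . y_o
    A_eq = [np.concatenate([np.zeros(n_out), inputs[o]])]   # v . x_o = 1
    b_eq = [1.0]
    A_ub = np.hstack([outputs, -inputs])                    # u . y_j - v . x_j <= 0
    b_ub = np.zeros(len(inputs))
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq,
                  bounds=[(0, None)] * (n_out + n_in), method="highs")
    return -res.fun

efficiencies = [ccr_efficiency(o, inputs, outputs) for o in range(len(inputs))]
for o, e in enumerate(efficiencies):
    print(f"supplier {o}: efficiency = {e:.3f}")
# These efficiency scores would then be used as the outputs an ANN is trained to predict.
```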

Keywords: Artificial Neural Network (ANN), Data Envelopment Analysis (DEA), Supplier Evaluation System.

155 Comparative Studies of Support Vector Regression between Reproducing Kernel and Gaussian Kernel

Authors: Wei Zhang, Su-Yan Tang, Yi-Fan Zhu, Wei-Ping Wang

Abstract:

Support vector regression (SVR) has been regarded as a state-of-the-art method for approximation and regression. The importance of the kernel function, the so-called admissible support vector kernel (SV kernel) in SVR, has motivated many studies on its composition. The Gaussian kernel (RBF) is regarded as the "best" choice of SV kernel by non-experts in SVR, although there is no evidence to support this statement beyond its good performance in some practical applications. It is well known that the reproducing kernel (R.K.) is also an SV kernel and possesses many important properties, e.g., positive definiteness, the reproducing property, and the possibility of composing complex R.K.s from simpler ones. However, there are a limited number of R.K.s with explicit forms and consequently few quantitative comparison studies in practice. In this paper, two R.K.s, i.e., SV kernels, composed from the sum and product of a translation-invariant kernel in a Sobolev space, are proposed. An exploratory study on the performance of SVR based on a general R.K. is presented through a systematic comparison to that of the RBF kernel using multiple criteria and synthetic problems. The results show that the R.K. is an equivalent or even better SV kernel than RBF for problems with more input variables (more than 5, and especially more than 10) and higher nonlinearity.
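
A minimal sketch of such a comparison is shown below, using scikit-learn's SVR with a callable kernel. The Laplacian-type kernel k(x, z) = exp(-||x - z||_1), used here as a stand-in translation-invariant positive-definite kernel built as a product of 1-D kernels, and the synthetic test function are illustrative assumptions, not the kernels proposed in the paper.

```python
import numpy as np
from sklearn.svm import SVR
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_squared_error

rng = np.random.default_rng(0)
X = rng.uniform(-1, 1, size=(400, 10))                  # 10 input variables
y = (np.sin(3 * X[:, 0]) * X[:, 1] + np.sum(X[:, 2:] ** 2, axis=1)
     + 0.05 * rng.standard_normal(400))
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

def laplacian_kernel(A, B):
    """k(x, z) = exp(-||x - z||_1): a product of 1-D translation-invariant kernels."""
    d = np.abs(A[:, None, :] - B[None, :, :]).sum(axis=2)
    return np.exp(-d)

for name, model in [("RBF kernel ", SVR(kernel="rbf", C=10.0)),
                    ("custom R.K.", SVR(kernel=laplacian_kernel, C=10.0))]:
    model.fit(X_tr, y_tr)
    mse = mean_squared_error(y_te, model.predict(X_te))
    print(f"{name} test MSE = {mse:.4f}")
```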

Keywords: admissible support vector kernel, reproducing kernel, reproducing kernel Hilbert space, support vector regression.

154 An Intelligent System Framework for Generating Activity List of a Project Using WBS Mind map and Semantic Network

Authors: H. Iranmanesh, M. Madadi

Abstract:

Work Breakdown Structure (WBS) is one of the most vital planning processes of project management, since it is considered to be the foundation of other processes such as scheduling, controlling, and assigning responsibilities. In fact, the WBS, or activity list, is the heart of a project, and the omission of a single task can lead to irrecoverable results. Several tools exist for generating a project WBS. One of the most powerful is mind mapping, which is the basis of this article. A mind map is a method for thinking together and helps a project manager stimulate the minds of project team members to generate the project WBS. Here we generate the WBS of a sample building construction project with the aid of a mind map and an artificial intelligence (AI) programming language. Since the mind map structure cannot represent data in a computerized way, we convert it into a semantic network that can be used by the computer, and then extract the final WBS from the semantic network with the Prolog programming language. This method results in a comprehensive WBS and decreases the probability of omitting project tasks.
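
The paper performs its extraction in Prolog; as a rough Python analogue, the sketch below stores a mind-map-derived semantic network as (subject, relation, object) triples and walks the 'has_part' relation to print an indented, numbered activity list. The project nodes and relation names are made-up placeholders.

```python
# Semantic network as (subject, relation, object) triples derived from a mind map.
triples = [
    ("Building project", "has_part", "Substructure"),
    ("Building project", "has_part", "Superstructure"),
    ("Substructure",     "has_part", "Excavation"),
    ("Substructure",     "has_part", "Foundation"),
    ("Superstructure",   "has_part", "Framing"),
    ("Superstructure",   "has_part", "Roofing"),
    ("Foundation",       "requires", "Excavation"),  # non-hierarchical link, ignored here
]

def children(node):
    """Direct sub-activities of a node along the 'has_part' relation."""
    return [o for s, r, o in triples if s == node and r == "has_part"]

def print_wbs(node, code="1", indent=0):
    """Depth-first walk that numbers and indents activities like a WBS outline."""
    print("  " * indent + f"{code} {node}")
    for i, child in enumerate(children(node), start=1):
        print_wbs(child, f"{code}.{i}", indent + 1)

print_wbs("Building project")
```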

Keywords: Expert System, Mind map, Semantic network, Work breakdown structure.

153 Context Generation with Image Based Sensors: An Interdisciplinary Enquiry on Technical and Social Issues and their Implications for System Design

Authors: Julia Moehrmann, Gunter Heidemann, Oliver Siemoneit, Christoph Hubig, Uwe-Philipp Kaeppeler, Paul Levi

Abstract:

Image data holds a large amount of different context information; however, as of today, these resources remain largely untouched. It is thus the aim of this paper to present a basic technical framework which allows a quick and easy exploitation of context information from image data, especially by non-expert users. Furthermore, the proposed framework is discussed in detail with respect to important social and ethical issues which impose special requirements on system design. Finally, a first sensor prototype is presented which meets the identified requirements. Additionally, the necessary implications for the software and hardware design of the system are discussed, rendering a sensor system which could be regarded as a good, acceptable, and justifiable technical solution, thereby enabling the extraction of context information from image data.

Keywords: Context-aware computing, ethical and social issues, image recognition, requirements in system design.

152 Segmenting Ultrasound B-Mode Images Using RiIG Distributions and Stochastic Optimization

Authors: N. Mpofu, M. Sears

Abstract:

In this paper, we propose a novel algorithm for delineating the endocardial wall from a human heart ultrasound scan. We assume that the gray levels in the ultrasound images are independent and identically distributed random variables with different Rician Inverse Gaussian (RiIG) distributions. Both synthetic and real clinical data will be used for testing the algorithm. Algorithm performance will be evaluated in two ways: first, by an expert radiologist's evaluation of a soft copy of an ultrasound scan during the scanning process, and second, by a doctor's conclusion after reviewing a printed copy of the same scan. Successful implementation of this algorithm should make it possible to differentiate normal from abnormal soft tissue and help identify the disease, its stage, and how best to treat the patient. We hope that an automated system that uses this algorithm will be useful in public hospitals, especially in Third World countries where problems such as a shortage of skilled radiologists and a shortage of ultrasound machines are common. These public hospitals are usually the first and last stop for most patients in these countries.

Keywords: Endocardial Wall, Rician Inverse Gaussian Distributions, Segmentation, Ultrasound Images.

151 Determination of the Content of Teachers’ Presentism through a Web-Based Delphi Method

Authors: Tsai-Hsiu Lin

Abstract:

Presentism is one of the orientations of teachers' teaching culture; however, few researchers have explored it in Taiwan. The objective of this study is to establish an expert-based determination of the content of teachers' presentism in Taiwan. The author reviewed the works of Jackson, Lortie, and Hargreaves and employed Hargreaves' three forms of teachers' presentism as a framework to design the questionnaire of this study. The questionnaire on teachers' presentism comprised 42 statements. A three-round web-based Delphi survey was proposed to 14 participants (two teacher educators, two educational administrators, three school principals, and seven schoolteachers); 13 participants (92.86%) completed all three rounds of the study. The participants were invited to indicate the importance of each statement. The Delphi study used means and standard deviations to present information concerning the collective judgments of the respondents. Finally, the author obtained consensual results for 67% of the statements (28/42). However, the outcome of this study could be the result of identifying a series of general statements rather than an in-depth exposition of the topic.

Keywords: Delphi Technique, teachers’ presentism, sociology of teaching, teaching culture.

150 Total Lipid of Mutant Synechococcus sp. PCC 7002

Authors: Azlin S Azmi, Mus’ab Zainal, Sarina Sulaiman, Azura Amid, Zaki Zainudin

Abstract:

Microalgae lipid is a promising feedstock for biodiesel production. The objective of this work was to study the growth factors affecting the marine mutant Synechococcus sp. (PCC 7002) for high lipid production. Four growth factors were investigated: nitrogen-phosphorus-potassium (NPK) concentration, light intensity, temperature, and NaNO3 concentration; their effects on mutant strain growth and lipid production were studied. Design Expert v8.0 was used to design the experiments and analyze the data. The experimental design selected was Min-Run Res IV, which consists of 12 runs, and the response surfaces measured were specific growth rate and lipid concentration. The extraction of lipid was conducted with a chloroform/methanol solvent system. Based on the study, mutant Synechococcus sp. PCC 7002 gave the highest specific growth rate of 0.0014 h-1 at 0% NPK, 2500 lux, 40 °C and 0% NaNO3. On the other hand, the highest lipid concentration was obtained at 0% NPK, 3500 lux, 30 °C and 1% NaNO3.

Keywords: Cyanobacteria, lipid, mutant, marine Synechococcus sp. PCC 7002, specific growth rate.

149 Dengue Death Review: A Tool to Adjudge the Cause of Dengue Mortality and Use of the Tool for Prevention of Dengue Deaths

Authors: Gagandeep Singh Grover, Vini Mahajan, Bhagmal, Priti Thaware, Jaspreet Takkar

Abstract:

Dengue is a mosquito-borne viral disease endemic in many countries in the tropics and sub-tropics. The state of Punjab in India shows cyclical and seasonal variation in dengue cases, and the case fatality rate of dengue has ranged from 0.6 to 1.0 in past years. The department has initiated a review of dengue deaths in order to establish the exact cause of death in each case of dengue. The study was undertaken to identify the associated co-morbidities and factors causing death in cases of dengue. The study used a predesigned proforma on which the medical and laboratory records were compiled and then reviewed by an expert committee of doctors. The study revealed that dengue cases with co-morbidities have longer hospital stays. Fluid overload and co-morbidities were found to be major factors leading to death; however, in confirmed cases of dengue, hepatorenal shutdown was found to be the major cause of mortality. The data obtained will help sensitize treating physicians in order to decrease dengue mortality in the future.

Keywords: Dengue, death, morbidities, DHF, DSS.

148 Stackelberg Security Game for Optimizing Security of Federated Internet of Things Platform Instances

Authors: Violeta Damjanovic-Behrendt

Abstract:

This paper presents an approach for optimal cyber security decisions to protect instances of a federated Internet of Things (IoT) platform in the cloud. The presented solution implements the repeated Stackelberg Security Game (SSG) and a model called Stochastic Human behaviour model with AttRactiveness and Probability weighting (SHARP). SHARP employs the Subjective Utility Quantal Response (SUQR) for formulating a subjective utility function, which is based on the evaluation of alternative solutions during decision-making. We augment the repeated SSG (including SHARP and SUQR) with a reinforcement learning algorithm called Naïve Q-Learning. Naïve Q-Learning belongs to the category of active and model-free Machine Learning (ML) techniques in which the agent (either the defender or the attacker) attempts to find an optimal security solution. In this way, we combine game-theoretic (GT) and ML algorithms for discovering optimal cyber security policies. The proposed security optimization components will be validated in a collaborative cloud platform that is based on the Industrial Internet Reference Architecture (IIRA) and its recently published security model.
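
To make the reinforcement learning ingredient concrete, here is a minimal tabular Q-learning sketch in which a defender repeatedly chooses which of three IoT platform instances to harden against a randomly simulated attacker; the payoff values, the attacker model, and the learning parameters are illustrative assumptions, not the SHARP/SUQR behaviour model from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)
n_targets = 3
# Assumed defender reward: +1 if the attacked target was defended, else a target-specific loss.
loss_if_breached = np.array([-3.0, -1.0, -2.0])

# Simplified attacker (not SUQR): prefers the most damaging targets, with some noise.
attack_probs = np.exp(-loss_if_breached) / np.exp(-loss_if_breached).sum()

Q = np.zeros(n_targets)        # stateless Q-values, one per "defend target i" action
alpha, epsilon = 0.1, 0.2      # learning rate and exploration rate (assumed)

for episode in range(5000):
    # Epsilon-greedy choice of which target to defend this round.
    action = rng.integers(n_targets) if rng.random() < epsilon else int(np.argmax(Q))
    attacked = rng.choice(n_targets, p=attack_probs)
    reward = 1.0 if attacked == action else loss_if_breached[attacked]
    # Q-learning update; with no next state here, the target is just the immediate reward.
    Q[action] += alpha * (reward - Q[action])

print("learned Q-values per defended target:", np.round(Q, 2))
print("defender's preferred target to harden:", int(np.argmax(Q)))
```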

Keywords: Security, internet of things, cloud computing, Stackelberg security game, machine learning, Naïve Q-learning.

147 High-Accuracy Satellite Image Analysis and Rapid DSM Extraction for Urban Environment Evaluations (Tripoli-Libya)

Authors: Abdunaser Abduelmula, Maria Luisa M. Bastos, José A. Gonçalves

Abstract:

Modelling the Earth's surface and evaluating urban environments with 3D models is an important research topic. New stereo capabilities of high-resolution optical satellite images, such as the tri-stereo mode of Pleiades, combined with new image matching algorithms, are now available and can be applied in urban area analysis. In addition, photogrammetry software packages have gained new, more efficient matching algorithms, such as SGM, as well as improved filters to deal with shadow areas, and can achieve denser and more precise results. This paper describes a comparison between 3D data extracted from tri-stereo and dual-stereo satellite images, combined with pixel-based matching and the Wallis filter. The aim was to improve the accuracy of 3D models, especially in urban areas, in order to assess whether satellite images are appropriate for a rapid evaluation of urban environments. The results showed that the 3D models obtained from Pleiades tri-stereo outperformed, both in terms of accuracy and detail, the results obtained from a Geo-eye pair. The assessment was made against reference digital surface models derived from high-resolution aerial photography. This suggests that tri-stereo images can be successfully used for the proposed urban change analyses.

Keywords: 3D Models, Environment, Matching, Pleiades.

146 Artificial Neural Network Development by means of Genetic Programming with Graph Codification

Authors: Daniel Rivero, Julián Dorado, Juan R. Rabuñal, Alejandro Pazos, Javier Pereira

Abstract:

The development of Artificial Neural Networks (ANNs) is usually a slow process in which the human expert has to test several architectures until finding the one that achieves the best results for a given problem. This work presents a new technique that uses Genetic Programming (GP) for automatically generating ANNs. To do this, the GP algorithm had to be changed to work with graph structures, so that ANNs can be developed. This technique also allows simplified networks to be obtained that solve the problem with a small group of neurons. In order to measure the performance of the system and to compare the results with other ANN development methods based on Evolutionary Computation (EC) techniques, several tests were performed on problems drawn from some of the most widely used test databases. The comparisons show that the system achieves good results, comparable with the existing techniques, and that in most cases it performed better than those techniques.

Keywords: Artificial Neural Networks, Evolutionary Computation, Genetic Programming.

145 DTMF Based Robot Assisted Tele Surgery

Authors: Vikas Pandey, T. L. Joshy, Vyshak Vijayan, N. Babu

Abstract:

A new and cost-effective robotic device was designed for remote tele-surgery using dual-tone multi-frequency (DTMF) technology. A tele-system based on DTMF has a large capacity for sending and receiving data in hardware and software. The robot consists of DC motors for arm movements and is controlled manually through a mobile phone using DTMF technology. The system enables the surgeon at the base station to send commands through a mobile phone to the patient's robotic system, which includes two robotic arms that translate the input into actual instrument manipulation. A mobile phone is attached to an 8051 microcontroller, which can activate the robot through relays. Remote robot-assisted tele-surgery eliminates geographic constraints on obtaining surgical expertise where it is needed and allows an expert surgeon to teach or proctor the performance of a surgical technique by real-time intervention.

Keywords: Robot, Microcontroller, DTMF, Tele surgery.

144 Multidimensional and Data Mining Analysis for Property Investment Risk Analysis

Authors: Nur Atiqah Rochin Demong, Jie Lu, Farookh Khadeer Hussain

Abstract:

Property investment in the real estate industry carries high risk due to the uncertainty factors that affect the decisions made, and due to its high cost. The analytic hierarchy process has existed for some time, in which an expert's opinion is used to measure the uncertainty of the risk factors for risk analysis. However, different levels of expert experience create different opinions and lead to conflict among the experts in the field. The objective of this paper is to propose a new technique to measure the uncertainty of the risk factors based on a multidimensional data model and data mining techniques as a deterministic approach. The proposed technique consists of a basic framework which includes four modules: user, technology, end-user access tools, and applications. Property investment risk analysis is defined here as a micro-level analysis, since the features of the individual property are considered in the analysis in this paper.

Keywords: Uncertainty factors, data mining, multidimensional data model, risk analysis.

143 Knowledge Management Challenges within Traditional Procurement System

Authors: M. Takhtravanchi, C. Pathirage

Abstract:

In the construction industry, project members are conveyors of project knowledge, which is often not managed properly for use in future projects. As construction projects are temporary and unique, project members are often recruited elsewhere once a project is completed. Therefore, poor management of knowledge across construction projects leads to a considerable amount of knowledge loss, ignoring which would be detrimental to project performance. This issue is more prominent in projects undertaken through the traditional procurement system, as this system does not incentivize project members to integrate; thus, disputes exist between the design and construction phases owing to the poor management of knowledge between those two phases. This paper aims to highlight the challenges of knowledge management that exist within the traditional procurement system. Expert interviews were conducted, and the challenges were identified and analysed with the Interpretive Structural Modelling (ISM) approach in order to summarise the relationships among them. Two identified key challenges are the culture of an organisation and knowledge management policies. Knowledge of the challenges and their relationships will help project managers and stakeholders to better understand the importance of knowledge management.
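
As a rough illustration of the ISM step, the sketch below takes an assumed binary direct-influence matrix over a few hypothetical challenges, computes the transitive-closure reachability matrix, and partitions the challenges into ISM levels; the challenge names and influence relations are placeholders, not the interview findings.

```python
import numpy as np

challenges = ["Organisational culture", "KM policies", "Time pressure", "Lack of trust"]
# Assumed direct-influence matrix: M[i, j] = 1 means challenge i influences challenge j
# (the diagonal is set to 1 by convention).
M = np.array([[1, 1, 1, 1],
              [0, 1, 1, 0],
              [0, 0, 1, 0],
              [0, 0, 1, 1]])

# Transitive closure (Warshall) to obtain the final reachability matrix.
R = M.copy()
for k in range(len(R)):
    R = (R | (R[:, [k]] & R[[k], :])).astype(int)

# Level partitioning: an element whose reachability set equals the intersection of its
# reachability and antecedent sets belongs to the current (top) level.
remaining = set(range(len(challenges)))
level = 1
while remaining:
    reach = {i: {j for j in remaining if R[i, j]} for i in remaining}
    ante = {i: {j for j in remaining if R[j, i]} for i in remaining}
    this_level = [i for i in remaining if reach[i] == reach[i] & ante[i]]
    print(f"Level {level}:", [challenges[i] for i in this_level])
    remaining -= set(this_level)
    level += 1
```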

Keywords: Challenges, construction industry, knowledge management, traditional procurement system.

142 Uncertainty Analysis of a Hardware in Loop Setup for Testing Products Related to Building Technology

Authors: Balasundaram Prasaant, Ploix Stephane, Delinchant Benoit, Muresan Cristian

Abstract:

Hardware in Loop (HIL) testing is done to test and validate a particular product, especially in building technology, where it is particularly important to test products for their efficiency. The test rig in the HIL simulator may contribute some uncertainty to the measured efficiency; these uncertainties include physical uncertainties and scenario-based uncertainties. In this paper, a simple uncertainty analysis framework for an HIL setup is shown, considering only the physical uncertainties. The entire modeling of the HIL setup is done in Dymola. The uncertain sources are considered based on the available knowledge of the components and also on expert knowledge. For the propagation of uncertainty, Monte Carlo simulation is used, since it is the most reliable and easy to use. This article shows how an HIL setup can be modeled and how uncertainty propagation can be performed on it. Such an approach is not common in building energy analysis.
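
A minimal sketch of Monte Carlo uncertainty propagation is given below for a toy efficiency measurement (output power over input power) whose sensor readings carry assumed Gaussian uncertainties; the measurement model and the standard deviations are placeholders, not the Dymola model from the paper.

```python
import numpy as np

rng = np.random.default_rng(42)
N = 100_000                        # number of Monte Carlo samples

# Assumed nominal readings and sensor standard deviations (physical uncertainties).
p_out_nom, p_out_std = 4.2, 0.08   # kW, heat output sensor
p_in_nom, p_in_std = 5.0, 0.05     # kW, electrical input sensor

# Sample the uncertain inputs and push them through the measurement model.
p_out = rng.normal(p_out_nom, p_out_std, N)
p_in = rng.normal(p_in_nom, p_in_std, N)
efficiency = p_out / p_in

mean, std = efficiency.mean(), efficiency.std(ddof=1)
lo, hi = np.percentile(efficiency, [2.5, 97.5])
print(f"efficiency = {mean:.3f} +/- {std:.3f} (95% interval: {lo:.3f} .. {hi:.3f})")
```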

Keywords: Energy in Buildings, Hardware in Loop, Modelica (Dymola), Monte Carlo Simulation, Uncertainty Propagation.

141 Visualization of Quantitative Thresholds in Stocks

Authors: Siddhant Sahu, P. James Daniel Paul

Abstract:

Technical analysis, comprising various technical indicators, is a holistic way of representing the price movement of stocks in the market. Various forms of indicators have evolved from the primitive ones over the past decades, and there have been many attempts to introduce volume as a major determinant of strong patterns in market forecasting. The law of demand defines the relationship between volume and price, and most traders are familiar with the volume game. Including the time dimension in the law of demand provides a different visualization of the theory. While attempting this, it was found that there are different thresholds in the market for different companies, and these thresholds have a significant influence on the price. This article is an attempt to determine those thresholds for companies using three-dimensional graphs, for the purpose of optimizing portfolios. It also emphasizes the importance of volume as a key factor in predicting strong price movements and bullish and bearish markets. It uses a comprehensive data set of major companies which form a major chunk of the Indian automotive sector and are thus used as an illustration.
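
A minimal sketch of the kind of three-dimensional price-volume-time view described here, drawn with matplotlib for a made-up price series; the data, and the idea of marking a fixed volume threshold, are illustrative assumptions rather than the article's actual analysis.

```python
import numpy as np
import matplotlib.pyplot as plt

rng = np.random.default_rng(7)
days = np.arange(250)                                   # trading days
price = 100 + np.cumsum(rng.normal(0, 1, len(days)))    # made-up closing prices
volume = rng.lognormal(mean=13, sigma=0.4, size=len(days))

fig = plt.figure()
ax = fig.add_subplot(projection="3d")
ax.scatter(days, volume, price, s=8)
ax.set_xlabel("time (trading day)")
ax.set_ylabel("volume (shares)")
ax.set_zlabel("closing price")

# Hypothetical volume threshold above which price moves are treated as significant.
threshold = np.quantile(volume, 0.9)
high_vol = volume > threshold
ax.scatter(days[high_vol], volume[high_vol], price[high_vol], s=20, marker="^")
plt.show()
```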

Keywords: Technical Analysis, Expert System, Law of demand, Stocks, Portfolio Analysis, Indian Automotive Sector.

140 Experimental Investigation and Optimization of Nanoparticle Mass Concentration and Heat Input of Loop Heat Pipe

Authors: P. Gunnasegaran, M. Z. Abdullah, M. Z. Yusoff, Nur Irmawati

Abstract:

This study presents the experimental investigation and optimization of nanoparticle mass concentration and heat input based on the total thermal resistance (Rth) of a loop heat pipe (LHP) employed for PC CPU cooling. Silica nanoparticles (SiO2) in water, with particle mass concentrations ranging from 0% (pure water) to 1%, are considered as the working fluid within the LHP. The experimental design and optimization are accomplished with the design-of-experiments tool Response Surface Methodology (RSM). The results show that the nanoparticle mass concentration and the heat input have a significant effect on the Rth of the LHP. For a given heat input, the Rth is found to decrease as the nanoparticle mass concentration increases up to 0.5%, and to increase thereafter. It is also found that the Rth decreases when the heat input is increased from 20 W to 60 W. The results are optimized with the objective of minimizing the Rth using the Design-Expert software; the optimized nanoparticle mass concentration and heat input are 0.48% and 59.97 W, respectively, the minimum thermal resistance being 2.66 °C/W.

Keywords: Loop heat pipe, nanofluid, optimization, thermal resistance.

139 Ontology-Driven Generation of Radiation Protection Procedures

Authors: Chamseddine Barki, Salam Labidi, Hanen Boussi Rahmouni

Abstract:

In this article, we present the principles and a suitable methodology for the design of a medical ontology that captures radiological and dosimetric knowledge applied in diagnostic radiology and radiation therapy. Our ontology, which we named «Onto.Rap», addresses radiation protection in medical and radiology centers by providing standardized regulatory oversight. Thanks to its added value in knowledge sharing, reuse, and ease of maintenance, this ontology helps to solve several problems, among them the confusion between radiological procedures that a practitioner might face while performing a patient's radiological exam, as well as the difficulties of interpreting the applicable patient radioprotection standards. Here the ontology, thanks to its concept simplification and expressiveness capabilities, can ensure an efficient classification of radiological procedures. It also provides an explicit representation of the relations between the different components of the studied concept. In fact, an ontology-based radioprotection expert system, when used in a radiological center, could implement systematic radioprotection best practices during patient exams and a regulatory compliance auditing service afterwards.

Keywords: Ontology, radiology, medicine, knowledge, radiation protection, audit.
