Search results for: internet data center.
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 8134

6394 On the Mechanism of Broadening of the Optical Spectrum of a Solvated Electron in Ammonia

Authors: V.K. Mukhomorov

Abstract:

The solvated electron is self-trapped (a polaron) owing to its strong interaction with the quantum polarization field. If the electron and the quantum field are strongly coupled, a collective localized state of the field and the quasi-particle is formed. In such a formation the electron motion is rather intricate. On the one hand, the electron oscillates within a rather deep polarization potential well and undergoes optical transitions; on the other, it moves together with the center of inertia of the system and participates in the thermal random walk. The problem is to separate these motions correctly while rigorously taking into account the conservation laws. This can be conveniently done using the Bogolyubov-Tyablikov method of canonical transformation to collective coordinates. This transformation removes the translational degeneracy and allows one to develop a successive-approximation algorithm for the energy and the wave function while simultaneously fulfilling the law of conservation of the total momentum of the system. The resulting equations determine the electron transitions and depend explicitly on the translational velocity of the quasi-particle as a whole. The frequency of the optical transition is calculated for the solvated electron in ammonia, and an estimate is made of the thermally induced spectral bandwidth.

Keywords: Canonical transformations, solvated electron, width of the optical spectrum.

6393 Building an Integrated Relational Database from Swiss Nutrition National Survey and Swiss Health Datasets for Data Mining Purposes

Authors: Ilona Mewes, Helena Jenzer, Farshideh Einsele

Abstract:

Objective: The objective of this study was to integrate two large databases, the Swiss nutrition national survey (menuCH) and the Swiss health national survey 2012, for data mining purposes. Each database contains demographic base data. An integrated Swiss database was built in order to later discover critical food consumption patterns linked with lifestyle diseases known to be strongly tied to food consumption. Design: The Swiss nutrition national survey (menuCH), with approximately 2000 respondents from two different surveys, one by phone and the other by questionnaire, together with the Swiss health national survey 2012, with 21500 respondents, was pre-processed, cleaned and finally integrated into a unique relational database. Results: The result of this study is an integrated relational database built from the Swiss nutritional and health databases.
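
To make the integration step concrete, the following is a minimal Python/pandas sketch of how two survey extracts could be cleaned and loaded into one relational database; the file names, column names and age bins are hypothetical stand-ins for the real menuCH and Swiss health survey schemas, not the authors' actual pipeline.

```python
import sqlite3
import pandas as pd

# Hypothetical extracts of the two surveys; real file and column names differ.
menuch = pd.read_csv("menuch_respondents.csv")    # nutrition survey (~2000 respondents)
health = pd.read_csv("health_survey_2012.csv")    # health survey (~21500 respondents)

def clean(df):
    """Harmonize the demographic base data shared by both surveys."""
    df = df.dropna(subset=["age", "sex", "canton"]).copy()
    df["sex"] = df["sex"].str.lower().str.strip()
    df["age_group"] = pd.cut(df["age"], bins=[17, 34, 49, 64, 120],
                             labels=["18-34", "35-49", "50-64", "65+"])
    return df

# Load both cleaned surveys into one relational database; the shared
# demographic attributes act as the join keys for later mining queries.
con = sqlite3.connect("swiss_integrated.db")
clean(menuch).to_sql("nutrition_survey", con, if_exists="replace", index=False)
clean(health).to_sql("health_survey", con, if_exists="replace", index=False)
con.close()
```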

Keywords: Health informatics, data mining, nutritional and health databases, nutritional and chronic disease databases.

6392 The DAQ Debugger for iFDAQ of the COMPASS Experiment

Authors: Y. Bai, M. Bodlak, V. Frolov, S. Huber, V. Jary, I. Konorov, D. Levit, J. Novy, D. Steffen, O. Subrt, M. Virius

Abstract:

In general, state-of-the-art Data Acquisition Systems (DAQ) in high energy physics experiments must satisfy high requirements in terms of reliability, efficiency and data rate capability. This paper presents the development and deployment of a debugging tool named DAQ Debugger for the intelligent, FPGA-based Data Acquisition System (iFDAQ) of the COMPASS experiment at CERN. Utilizing a hardware event builder, the iFDAQ is designed to read out data at the experiment's average maximum rate of 1.5 GB/s. In complex software such as the iFDAQ, comprising thousands of lines of code, the debugging process is absolutely essential to reveal all software issues. Unfortunately, conventional debugging of the iFDAQ is not possible during real data taking. The DAQ Debugger is a tool for identifying a problem, isolating the source of the problem, and then either correcting the problem or determining a way to work around it. It provides a layer for easy integration into any process and has no impact on process performance. Based on the handling of system signals, the DAQ Debugger represents an alternative to the conventional debuggers provided by most integrated development environments. Whenever a problem occurs, it generates reports containing all the information needed for deeper investigation and analysis. The DAQ Debugger was fully incorporated into all iFDAQ processes during the 2016 run. It helped to reveal remaining software issues and significantly improved the stability of the system in comparison with the previous run. In the paper, we present the DAQ Debugger from several perspectives and discuss it in detail.
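
The abstract does not give implementation details; purely as an illustration of a signal-handler-based reporting layer, a minimal Python sketch is shown below (the actual iFDAQ tool is built on C++/Qt). The report file name and the set of handled signals are assumptions.

```python
import signal
import traceback
import faulthandler
from datetime import datetime

REPORT = "daq_debug_report.txt"          # assumed report location
_report_file = open(REPORT, "a")
faulthandler.enable(_report_file)        # dumps tracebacks on fatal faults (SIGSEGV, SIGFPE, ...)

def write_report(signum, frame):
    """On a handled signal, append a report with the call stack for later
    analysis, then restore the default handler and re-raise the signal."""
    with open(REPORT, "a") as f:
        f.write(f"--- {datetime.now().isoformat()} signal {signum} ---\n")
        traceback.print_stack(frame, file=f)
    signal.signal(signum, signal.SIG_DFL)
    signal.raise_signal(signum)

# The signals monitored here are an assumption; the real tool hooks the
# signals relevant to the iFDAQ processes.
for sig in (signal.SIGTERM, signal.SIGABRT, signal.SIGINT):
    signal.signal(sig, write_report)
```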

Keywords: DAQ debugger, data acquisition system, FPGA, system signals, Qt framework.

6391 Estimation of Geotechnical Parameters by Comparing Monitoring Data with Numerical Results: Case Study of Arash–Esfandiar-Niayesh Under-Passing Tunnel, Africa Tunnel, Tehran, Iran

Authors: Aliakbar Golshani, Seyyed Mehdi Poorhashemi, Mahsa Gharizadeh

Abstract:

Under-passing tunnels are strongly influenced by the surrounding soils. Specifying real soil behavior is complex, because many uncertainties exist in soil properties and, additionally, the soil constitutive models may be inappropriate. Such factors may cause settlements in the numerical analysis that are incompatible with the values obtained in actual construction. This paper reports a case study of a specific tunnel constructed by NATM. The tunnel has a depth of 11.4 m, a height of 12.2 m, and a width of 14.4 m with 2.5 lanes. The numerical modeling was based on a 2D finite element program, and the soil behavior was modeled with the hardening soil model. According to the field observations, the numerically estimated settlement at the ground surface was approximately four times larger than the measured one after the complete installation of the initial lining, indicating that some unknown factors affect the values. Consequently, the geotechnical parameters were revised by a numerical back-analysis using laboratory and field test data and based on the obtained monitoring data. The result confirms that the soil parameters are typically underestimated on the conservative side and, additionally, that the constitutive models cannot be applied properly to all soil conditions.
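
As an illustration of the numerical back-analysis idea only, the sketch below fits hypothetical soil parameters by minimizing the misfit between monitoring data and a stand-in settlement model; the settlement formula, parameter set and all numbers are placeholders, not the paper's 2D finite element model or measured values.

```python
import numpy as np
from scipy.optimize import minimize

# Measured surface settlements (mm) at monitoring points -- placeholder values.
measured = np.array([12.0, 18.5, 22.0, 17.0, 11.5])
x_points = np.array([-10.0, -5.0, 0.0, 5.0, 10.0])   # offsets from tunnel axis (m)

def predicted_settlement(params, x):
    """Stand-in for the FE model: surface settlement at offsets x for a given
    soil stiffness E (MPa) and friction angle phi (deg). In practice this call
    would run the 2D finite element analysis."""
    E, phi = params
    trough_width = 8.0 + 0.1 * phi          # assumed empirical relation
    s_max = 250.0 / E                       # stiffer soil -> smaller settlement
    return s_max * np.exp(-(x ** 2) / (2 * trough_width ** 2))

def misfit(params):
    # Sum of squared differences between monitoring data and model output.
    return np.sum((predicted_settlement(params, x_points) - measured) ** 2)

result = minimize(misfit, x0=[30.0, 30.0], method="L-BFGS-B",
                  bounds=[(5.0, 200.0), (20.0, 45.0)])
print("Back-analysed E (MPa), phi (deg):", result.x)
```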

Keywords: NATM tunnel, initial lining, field test data, laboratory test data, monitoring data, numerical back-analysis.

6390 Application of Neural Networks in Financial Data Mining

Authors: Defu Zhang, Qingshan Jiang, Xin Li

Abstract:

This paper deals with the application of a well-known neural network technique, the multilayer back-propagation (BP) neural network, to financial data mining. A modified neural network forecasting model is presented, and an intelligent mining system is developed. The system can forecast buying and selling signals according to the predicted future trend of the stock market and thereby provide decision support for stock investors. A simulation over seven years of the Shanghai Composite Index shows that the return achieved by this mining system is about three times as large as that achieved by the buy-and-hold strategy, so it is advantageous to apply neural networks to forecast financial time series, and different investors could benefit from it.
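
For illustration, a minimal sketch of a back-propagation-style forecasting loop with a buy/sell rule is given below, using synthetic prices and scikit-learn's MLPRegressor; it is not the authors' modified BP model nor the Shanghai Composite data.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

# Synthetic stand-in for an index price series.
rng = np.random.default_rng(0)
prices = 1000 * np.exp(np.cumsum(rng.normal(0.0003, 0.01, 2000)))

window = 10
X = np.array([prices[i:i + window] for i in range(len(prices) - window - 1)])
y = prices[window + 1:]                       # next-day price to forecast

split = int(0.8 * len(X))
model = MLPRegressor(hidden_layer_sizes=(20,), max_iter=2000, random_state=0)
model.fit(X[:split], y[:split])

# Trading rule: buy when the forecast is above today's price, sell otherwise.
forecast = model.predict(X[split:])
today = X[split:, -1]
signals = np.where(forecast > today, "buy", "sell")
print(signals[:10])
```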

Keywords: Data mining, neural network, stock forecasting.

6389 Adaptive Early Packet Discarding Policy Based on Two Traffic Classes

Authors: Rawya Rizk, Rehab Abdel-Kader, Rabab Ramadan

Abstract:

Unlike the best-effort service provided by the Internet today, next-generation wireless networks will support real-time applications. This paper proposes an adaptive early packet discard (AEPD) policy to improve the performance of real-time TCP traffic over ATM networks and to avoid the fragmentation problem. Three main aspects are incorporated in the proposed policy. First, quality-of-service (QoS) guarantees are provided for real-time applications by implementing priority scheduling. Second, the problem of partially corrupted packets is resolved by differentiating the buffered cells of one packet from another. Third, a threshold is adapted dynamically using fuzzy logic based on the traffic behavior, so as to maintain a high throughput under a variety of load conditions. The simulation is run for two priority classes of input traffic: a real-time and a non-real-time class. Simulation results show that the proposed AEPD policy improves throughput and fairness compared with a static threshold under the same traffic conditions.
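
A toy sketch of the fuzzy threshold adaptation idea is given below; the membership functions and rule base are invented for illustration and do not reproduce the AEPD policy's actual rules.

```python
def tri(x, a, b, c):
    """Triangular membership function."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def adapt_threshold(buffer_occupancy, arrival_rate, buffer_size):
    """Return a discard threshold (fraction of buffer) from two fuzzy inputs.
    Assumed rule base: light load -> high threshold, heavy load and a nearly
    full buffer -> low threshold, otherwise a medium threshold."""
    occ = buffer_occupancy / buffer_size          # 0..1
    load = min(arrival_rate, 1.0)                 # normalized offered load

    light = tri(load, -0.5, 0.0, 0.5)
    heavy = tri(load, 0.5, 1.0, 1.5)
    nearly_full = tri(occ, 0.6, 1.0, 1.4)
    medium = max(0.0, 1 - abs(load - 0.5) * 2)

    # Weighted (centroid-style) combination of the rule outputs.
    rules = [(light, 0.9), (heavy * nearly_full, 0.4), (medium, 0.7)]
    num = sum(w * out for w, out in rules)
    den = sum(w for w, _ in rules) or 1.0
    return num / den

# Example: moderately loaded switch with a 70% full buffer.
print(adapt_threshold(buffer_occupancy=700, arrival_rate=0.6, buffer_size=1000))
```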

Keywords: Early packet discard, Fuzzy logic, packet dropping policies, quality-of-service (QoS), TCP over ATM

6388 Adaptive Kalman Filter for Noise Estimation and Identification with Bayesian Approach

Authors: Farhad Asadi, S. Hossein Sadati

Abstract:

The Bayesian approach can be used for parameter identification and extraction in state-space models, and its ability to analyze sequences of data in dynamical systems has been demonstrated in the literature. In this paper, an adaptive Kalman filter with a Bayesian approach for identification of the measurement noise variance is developed. It is then applied to estimation of the dynamical state and measurement data in a discrete linear dynamical system. At each time step, the algorithm estimates the measurement noise variance and the system state with a Kalman filter. An approximation is then designed at each step separately, and consequently sufficient statistics of the state and noise variances are computed with a fixed-point iteration of the adaptive Kalman filter. Several simulations show the influence of the measurement noise variance on the algorithm. First, the effect of the noise variance and its distribution on detection and identification performance is simulated for a Kalman filter without the Bayesian formulation. Then, the simulation is applied to the adaptive Kalman filter with its ability to track the noise variance of the measurement data. In these simulations, the influence of the noise distribution of the measurement data at each step is estimated, and the true variance of the data obtained by the algorithm is compared in different scenarios. Afterwards, a typical nonlinear state-space model with induced measurement noise is simulated with this approach. Finally, the performance and the important limitations of the algorithm in these simulations are explained.
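
A minimal scalar sketch of noise-variance tracking inside a Kalman filter is shown below; it uses a simple innovation-based update as a stand-in for the paper's Bayesian fixed-point iteration, and the random-walk model and all numbers are assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)

# Scalar random-walk state with unknown measurement noise variance R_true.
n, q, R_true = 300, 0.01, 0.5
x = np.cumsum(rng.normal(0, np.sqrt(q), n))
z = x + rng.normal(0, np.sqrt(R_true), n)

x_est, P, R_est = 0.0, 1.0, 1.0      # initial state, covariance, noise guess
alpha = 0.05                         # forgetting factor for variance tracking

R_track = []
for k in range(n):
    # Predict (random-walk model: state prediction is unchanged).
    P = P + q
    # Update the measurement-noise estimate from the innovation statistics
    # (a simple stand-in for the Bayesian fixed-point iteration in the paper).
    innov = z[k] - x_est
    R_est = (1 - alpha) * R_est + alpha * max(innov**2 - P, 1e-6)
    # Kalman update with the current noise estimate.
    K = P / (P + R_est)
    x_est = x_est + K * innov
    P = (1 - K) * P
    R_track.append(R_est)

print("final R estimate:", R_track[-1], "true R:", R_true)
```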

Keywords: Adaptive filtering, Bayesian approach, Kalman filtering, variance tracking.

6387 Investigation of Maritime Accidents with Exploratory Data Analysis in the Strait of Çanakkale (Dardanelles)

Authors: Gizem Kodak

Abstract:

The Strait of Çanakkale (Dardanelles), together with the Strait of Istanbul and the Sea of Marmara, forms the Turkish Straits System. In other words, the Strait of Çanakkale is the southern gate of the system that connects the Black Sea countries with the rest of the world. Due to the heavy maritime traffic, it is important to examine the accident characteristics in the region scientifically. In particular, the results indicated by descriptive statistics are of critical importance for strengthening the safety of navigation. At this point, exploratory data analysis offers strategic outputs in terms of defining the problem and knowing the strengths and weaknesses with respect to possible accident risk. The study aims to determine the accident characteristics in the Strait of Çanakkale through temporal and spatial analysis of historical data, using Exploratory Data Analysis (EDA) as the research method. The results will reveal the general characteristics of maritime accidents in the region and form the infrastructure for future studies.
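
For illustration, a short pandas sketch of the kind of temporal and spatial exploratory summaries described above follows; the file name and column names (date_time, latitude, longitude, accident_type, ship_type) are hypothetical and do not correspond to the study's actual data set.

```python
import pandas as pd

# Hypothetical accident records; the real historical data have different fields.
accidents = pd.read_csv("canakkale_accidents.csv", parse_dates=["date_time"])

# Temporal pattern: accident counts by year, and by month vs. hour of day.
accidents["year"] = accidents["date_time"].dt.year
accidents["month"] = accidents["date_time"].dt.month
accidents["hour"] = accidents["date_time"].dt.hour
print(accidents.groupby("year").size())
print(accidents.groupby(["month", "hour"]).size().unstack(fill_value=0))

# Spatial pattern: bin positions into a coarse grid along the strait.
accidents["lat_bin"] = accidents["latitude"].round(2)
accidents["lon_bin"] = accidents["longitude"].round(2)
hotspots = (accidents.groupby(["lat_bin", "lon_bin"])
                     .size()
                     .sort_values(ascending=False)
                     .head(10))
print(hotspots)

# Descriptive cross-tabulation of accident type against ship type.
print(pd.crosstab(accidents["accident_type"], accidents["ship_type"]))
```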

Keywords: Maritime Accidents, EDA, Strait of Çanakkale, navigational safety.

6386 Effect of Heat-Moisture Treatment on the Formation and Properties of Resistant Starches From Mung Bean (Phaseolus radiatus) Starches

Authors: Su-Ling Li, Qun-Yu Gao

Abstract:

Mung bean starches were subjected to heat-moisture treatment (HMT) at different moisture contents (15%, 20%, 25%, 30% and 35%) at 120 °C for 12 h. The impact on the yield of resistant starch (RS), the microstructure, and the physicochemical and functional properties was investigated. Compared to native starch, the RS content of the heat-moisture treated starches increased significantly, with HMT-20 showing the highest RS level of all the starches. Clear birefringence was displayed at the center of the native starch granules. For HMT starches, pronounced birefringence was exhibited on the periphery of the granules; however, it disappeared at the centre of some granules. The shape of the HMT starches was not changed and the integrity of the starch granules was preserved under all conditions, although concavities could be observed on HMT starches under scanning electron microscopy. After HMT, the apparent amylose contents increased and the starch macromolecules were degraded in comparison with native starch. The swelling power of HMT starches was reduced, but their solubility was higher than that of native starch. Both native and HMT starches showed an A-type X-ray diffraction pattern; furthermore, the peaks at 15.0 and 22.9 Å showed higher intensities than those of native starch.

Keywords: Resistant starch, mung bean (Phaseolus radiatus) starch, heat-moisture treatment, physicochemical properties.

6385 Characteristics and Mechanical Properties of Bypass-Current MIG Welding-Brazed Dissimilar Al/Ti Joints

Authors: Bintao Wu, Xiangfang Xu, Yugang Miao, Duanfeng Han

Abstract:

Joining of 1 mm thick aluminum 6061 to titanium TC4 was conducted using bypass-current MIG welding-brazing, and a stable welding process and good bead appearance were obtained. The joint profile and microstructure of the Ti/Al joints were observed by optical microscopy and SEM, and the structure of the interfacial reaction layers was then analyzed in detail. It was found that the intermetallic compound layer at the top of the interface takes the form of short, dense columnar crystals. A quantity of AlTi was observed in the interfacial layer near the Ti base metal, while intermetallic compounds such as Al3Ti and TiSi3 formed near the Al base metal, and an Al11Ti5 transition phase was found in the center of the interface layer due to the uneven distribution inside the weld pool during welding. Tensile test results show that the average tensile strength of the joints is up to 182.6 MPa, which reaches about 97.6% of that of the aluminum base metal. Fracture tends to occur in the base metal with a certain amount of necking.

Keywords: Bypass-current MIG welding-brazed, Al alloy, Ti alloy, joint characteristics, mechanical properties.

6384 Performance Study on Audio Codec and Session Transfer of Open Source VoIP Applications

Authors: Cheng-Suan Lee, Khong Neng Choong, So Gean Koh, Chee Onn Chow, Mazlan Abbas

Abstract:

Voice over Internet Protocol (VoIP) applications, commonly known as softphones, have been developing an increasingly large market in today's telecommunication world, and the trend is expected to continue with the enhancement of additional features. This includes leveraging existing presence services, location and contextual information to enable more ubiquitous and seamless communications. In this paper, we discuss the concept of seamless session transfer for real-time applications such as VoIP and IPTV, and our prototype implementation of this concept on a selected open source VoIP application. The first part of the paper conducts performance evaluations and assessments across several commonly found open source VoIP applications, namely Ekiga, Kphone, Linphone and Twinkle, in order to identify one of them for implementing our design of seamless session transfer. Subjective testing was carried out to evaluate the audio performance of these VoIP applications and rank them according to their Mean Opinion Score (MOS) results. The second part of the paper discusses the performance evaluation of our prototype implementation of session transfer using Linphone.

Keywords: audio codec, softphone, session transfer.

6383 When Construction Material Traders Go Electronic: An Analysis of SMEs in the Malaysian Construction Industry

Authors: Dzul Fahmi Nordin, Rosmini Omar

Abstract:

This paper analyzes the perception of e-commerce application services by construction material traders in Malaysia. Five attributes were tested: usability, reputation, trust, privacy and familiarity. The study methodology consists of a survey questionnaire and statistical analysis that includes reliability analysis, factor analysis, ANOVA and regression analysis. The respondents were construction material traders, including hardware stores, in the Klang Valley, Kuala Lumpur. The findings indicate that usability and familiarity with e-commerce services in Malaysia have an insignificant influence on the acceptance of e-commerce applications, whereas the reputation, trust and privacy attributes have a significant influence on e-commerce acceptance by construction material traders. The e-commerce applications studied included customer databases, e-selling, e-marketing, e-payment, e-buying and online advertising. It is assumed that traders have basic knowledge of and exposure to ICT services, i.e., internet service and computers. The study concludes that reputation, privacy and trust are the three website attributes that influence the acceptance of e-commerce by construction material traders.

Keywords: Electronic Commerce (e-Commerce), Information and Communications Technology (ICT), Small Medium Enterprise (SME)

6382 SEM Image Classification Using CNN Architectures

Authors: G. Türkmen, Ö. Tekin, K. Kurtuluş, Y. Y. Yurtseven, M. Baran

Abstract:

A scanning electron microscope (SEM) is a type of electron microscope used mainly in nanoscience and nanotechnology. Automatic image recognition and classification are among the general areas of application concerning SEM. In line with these usages, the present paper proposes a deep learning algorithm that classifies SEM images into nine categories by means of an online application to simplify the process. The NFFA-EUROPE - 100% SEM data set, containing approximately 21,000 images, was used to train and test the algorithm, with an 80%/20% split, respectively. Validation was carried out using a separate data set obtained from the Middle East Technical University (METU) in Turkey. To increase the accuracy of the results, the Inception ResNet-V2 model was used with a fine-tuning approach. Using a confusion matrix, it was observed that the coated-surface category has a negative effect on the accuracy of the results, since it contains images that also belong to other categories in the data set, thereby confusing the model when detecting category-specific patterns. For this reason, the coated-surface category was removed from the training data set, increasing the accuracy up to 96.5%.
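
A condensed sketch of fine-tuning Inception ResNet-V2 for SEM image classification with tf.keras is shown below; the directory layout, batch size, epochs and learning rates are assumptions and do not reproduce the authors' training setup.

```python
import tensorflow as tf

num_classes = 9          # SEM categories (8 after removing "coated surface")
img_size = (299, 299)    # assumed input size for Inception ResNet-V2

# Labeled SEM images arranged in one sub-directory per category (assumption).
train_ds = tf.keras.utils.image_dataset_from_directory(
    "sem_images/train", image_size=img_size, batch_size=32)

base = tf.keras.applications.InceptionResNetV2(
    include_top=False, weights="imagenet", input_shape=img_size + (3,))
base.trainable = False   # stage 1: start from frozen ImageNet features

inputs = tf.keras.Input(shape=img_size + (3,))
x = tf.keras.applications.inception_resnet_v2.preprocess_input(inputs)
x = base(x, training=False)
x = tf.keras.layers.GlobalAveragePooling2D()(x)
outputs = tf.keras.layers.Dense(num_classes, activation="softmax")(x)
model = tf.keras.Model(inputs, outputs)

model.compile(optimizer=tf.keras.optimizers.Adam(1e-3),
              loss="sparse_categorical_crossentropy", metrics=["accuracy"])
model.fit(train_ds, epochs=5)

# Stage 2 of fine-tuning: unfreeze the backbone and continue training with a
# small learning rate.
base.trainable = True
model.compile(optimizer=tf.keras.optimizers.Adam(1e-5),
              loss="sparse_categorical_crossentropy", metrics=["accuracy"])
model.fit(train_ds, epochs=3)
```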

Keywords: Convolutional Neural Networks, deep learning, image classification, scanning electron microscope.

6381 Data and Control Flow Analysis of VDMµ Specifications

Authors: Mubina Nazmeen, Iram Rubab

Abstract:

Formal specification languages are widely used for system specification and testing. Highly critical systems, such as real-time systems, avionics, and medical systems, are represented using formal specification languages. Formal-specification-based testing is mostly performed using black box approaches, thus testing only the set of inputs and outputs of the system. A formal specification language such as VDMµ can also be used for white box testing, since it provides as many constructs as any other high-level programming language. In this work, we perform data and control flow analysis of VDMµ class specifications. The proposed work is illustrated with a SavingAccount example.

Keywords: VDM-SL, VDMµ, data flow graph, control flow graph, testing, formal specification.

6380 Excess Loop Delay Calibration in Bandpass Continuous-Time Delta-Sigma Modulators Based on a Q-Enhanced LC Filter

Authors: Sorore Benabid

Abstract:

Q-enhanced LC filters are the most commonly used architecture in Bandpass (BP) Continuous-Time (CT) Delta-Sigma (ΣΔ) modulators, due to their high-frequency operation, their higher linearity compared with active filters, and the high quality factor obtained by the Q-enhancement technique. This technique consists in using a negative resistance that compensates for the ohmic losses of the on-chip inductor. However, it introduces a zero in the filter transfer function which affects the modulator performance in terms of Dynamic Range (DR), stability and in-band noise (Signal-to-Noise Ratio (SNR)). In this paper, we study the effect of this zero and demonstrate that a calibration of the excess loop delay (ELD) is required to ensure the best performance of the modulator. System-level simulations are carried out for a 2nd-order BP CT ΣΔ modulator at a center frequency of 300 MHz. Simulation results indicate that the optimal ELD should be reduced by 13% to achieve the maximum SNR and DR compared to the ideal LC-based ΣΔ modulator.

Keywords: Continuous-time bandpass delta-sigma modulators, excess loop delay, on-chip inductor, Q-enhanced LC filter.

6379 A New Traffic Pattern Matching for DDoS Traceback Using Independent Component Analysis

Authors: Yuji Waizumi, Tohru Sato, Yoshiaki Nemoto

Abstract:

Recently, Denial of Service (DoS) attacks and Distributed DoS (DDoS) attacks, which are a stronger form of DoS attack launched from multiple hosts, have become security threats on the Internet. One important countermeasure against these attacks is to identify the attack source and block the attack traffic. In general, this is difficult because information about the attack source is falsified; therefore, a method of identifying the attack source by tracing the route of the attack traffic is necessary. A traceback method that uses traffic patterns, taking changes in the number of packets over time as the criterion for the attack traceback, has been proposed. This traceback method can trace the attack by matching the shapes of the input traffic patterns with the shape of the output traffic pattern observed at a network branch point such as a router. A traffic pattern is the shape of the traffic over time and is unfalsifiable information. However, the traceback methods proposed to date cannot achieve sufficient tracing accuracy, because they directly use traffic patterns that are influenced by non-attack traffic. In this paper, a new traffic pattern matching method using Independent Component Analysis (ICA) is proposed.
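
To illustrate the role of ICA only, the sketch below separates a synthetic attack burst from mixed link traffic with scikit-learn's FastICA; the sources, mixing matrix and matching step are invented for illustration and are not the authors' traceback procedure.

```python
import numpy as np
from sklearn.decomposition import FastICA

rng = np.random.default_rng(0)
t = np.arange(600)                      # time slots

# Hypothetical source traffic patterns (packets per slot): an attack burst
# and two independent background (non-attack) flows.
attack = np.where((t > 200) & (t < 350), 400.0, 0.0)
web    = 120 + 30 * np.sin(2 * np.pi * t / 60)
bulk   = 80 + 20 * rng.standard_normal(len(t))
sources = np.c_[attack, web, bulk]

# Traffic observed on three upstream links of a router is a mixture of the
# sources with unknown mixing proportions.
mixing = rng.uniform(0.1, 1.0, size=(3, 3))
observed = sources @ mixing.T

# ICA recovers statistically independent components from the mixtures; the
# component whose shape matches the attack burst identifies the input link
# carrying the attack traffic.
ica = FastICA(n_components=3, random_state=0)
recovered = ica.fit_transform(observed)

corr = [abs(np.corrcoef(recovered[:, i], attack)[0, 1]) for i in range(3)]
print("component most similar to the attack pattern:", int(np.argmax(corr)))
```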

Keywords: Distributed Denial of Service, Independent Component Analysis, Traffic pattern

6378 Using Data Mining in Automotive Safety

Authors: Carine Cridelich, Pablo Juesas Cano, Emmanuel Ramasso, Noureddine Zerhouni, Bernd Weiler

Abstract:

Safety is one of the most important considerations when buying a new car. While active safety aims at avoiding accidents, passive safety systems such as airbags and seat belts protect the occupants in case of an accident. In addition to legal regulations, organizations like Euro NCAP provide consumers with an independent assessment of the safety performance of cars and drive the development of safety systems in the automobile industry. Those ratings are mainly based on injury assessment reference values derived from physical parameters measured in dummies during a car crash test. The components and sub-systems of a safety system are designed to achieve the required restraint performance. Sled tests and other types of tests are then carried out by car makers and their suppliers to confirm the protection level of the safety system. A Knowledge Discovery in Databases (KDD) process is proposed in order to minimize the number of tests. The KDD process is based on the data emerging from sled tests carried out according to Euro NCAP specifications. About 30 parameters of the passive safety systems from different data sources (crash data, dummy protocol) are first analysed together with experts' opinions. A procedure is proposed to manage missing data and is validated on real data sets. Finally, a procedure is developed to estimate a set of rough initial parameters of the passive system before testing, aiming at reducing the number of tests.

Keywords: KDD process, passive safety systems, sled test, dummy injury assessment reference values, frontal impact

6377 Advance in Monitoring and Process Control of Surface Roughness

Authors: Somkiat Tangjitsitcharoen, Siripong Damrongthaveesak

Abstract:

This paper presents an advance in the monitoring and process control of surface roughness on CNC machines for the turning and milling processes. An integration of in-process monitoring and process control of the surface roughness during machining is proposed and developed, based on the cutting force ratio. The author's previously developed surface roughness models for the turning and milling processes are adopted to predict the in-process surface roughness; they consist of the cutting speed, the feed rate, the tool nose radius, the depth of cut, the rake angle, and the cutting force ratio. The cutting force ratios obtained from turning and milling are utilized to estimate the in-process surface roughness. Dynamometers are installed on the tool turret of the CNC turning machine and on the table of the 5-axis machining center to monitor the cutting forces. An in-process control scheme has been developed and proposed to control the predicted surface roughness. Cutting tests have proved that the proposed integration of in-process monitoring and process control can be used to check the surface roughness during cutting by utilizing the cutting force ratio.

Keywords: Turning, milling, monitoring, surface roughness, cutting force ratio.

6376 A Specification-Based Approach for Retrieval of Reusable Business Component for Software Reuse

Authors: Meng Fanchao, Zhan Dechen, Xu Xiaofei

Abstract:

Software reuse can be considered the most realistic and promising way to improve software engineering productivity and quality. Automated assistance for software reuse involves the representation, classification, retrieval and adaptation of components. The representation and retrieval of components are important to software reuse in Component-Based Software Development (CBSD). However, current industrial component models mainly focus on implementation techniques and ignore the semantic information about components, so it is difficult to retrieve the components that satisfy users' requirements. This paper presents a method of business component retrieval based on specification matching to support software reuse in enterprise information systems. First, a reuse-oriented business component model is proposed. In our model, the business data type is represented as a sign data type based on XML, which can express variable business data types and thus describe the variety of business operations. Based on this model, we propose specification match relationships at two levels: the business operation level and the business component level. At the business operation level, we use the input business data types, output business data types and the taxonomy of business operations to evaluate the similarity between business operations. At the business component level, we propose five specification matches between business components. To retrieve reusable business components, we propose a measure of similarity degree to calculate the similarities between business components. Finally, an SQL-like business component retrieval command is proposed to help users retrieve approximate business components from a component repository.

Keywords: Business component, business operation, business data type, specification matching.

6375 Fuzzy Multi-Component DEA with Shared and Undesirable Fuzzy Resources

Authors: Jolly Puri, Shiv Prasad Yadav

Abstract:

Multi-component data envelopment analysis (MC-DEA) is a popular technique for measuring the aggregate performance of decision making units (DMUs) along with their components. However, conventional MC-DEA is limited to crisp input and output data, which may not always be available in exact form. In real life problems, data may be imprecise or fuzzy. Therefore, in this paper, we propose (i) a fuzzy MC-DEA (FMC-DEA) model in which shared and undesirable fuzzy resources are incorporated, (ii) a transformation of the proposed FMC-DEA model into a pair of crisp models using the α-cut approach, (iii) definitions of the fuzzy aggregate performance of a DMU and the fuzzy efficiencies of its components as fuzzy numbers, and (iv) a numerical example to validate the proposed approach.
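
For reference, the α-cut used in such transformations can be stated as follows (written here for triangular fuzzy data as a common special case; the paper itself is not restricted to this form):

\[(\tilde{A})_{\alpha}=\{x \mid \mu_{\tilde{A}}(x)\ge \alpha\}=[A_{\alpha}^{L},\,A_{\alpha}^{U}],\qquad 0<\alpha\le 1,\]

and, for a triangular fuzzy number \(\tilde{A}=(a,b,c)\),

\[A_{\alpha}^{L}=a+\alpha\,(b-a),\qquad A_{\alpha}^{U}=c-\alpha\,(c-b).\]

Replacing each fuzzy input and output by the interval \([A_{\alpha}^{L},A_{\alpha}^{U}]\) turns the FMC-DEA model into a pair of crisp models whose solutions give lower and upper bounds on the aggregate and component efficiencies at the chosen α level.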

Keywords: Multi-component DEA, fuzzy multi-component DEA, fuzzy resources.

6374 Combined Safety and Cybersecurity Risk Assessment for Intelligent Distributed Grids

Authors: Anders Thorsèn, Behrooz Sangchoolie, Peter Folkesson, Ted Strandberg

Abstract:

As more parts of the power grid become connected to the internet, the risk of cyberattacks increases. To identify the cybersecurity threats and subsequently reduce vulnerabilities, the common practice is to carry out a cybersecurity risk assessment. For safety classified systems and products, there is also a need for safety risk assessments in addition to the cybersecurity risk assessment to identify and reduce safety risks. These two risk assessments are usually done separately, but since cybersecurity and functional safety are often related, a more comprehensive method covering both aspects is needed. Some work addressing this has been done for specific domains like the automotive domain, but more general methods suitable for, e.g., Intelligent Distributed Grids, are still missing. One such method from the automotive domain is the Security-Aware Hazard Analysis and Risk Assessment (SAHARA) method that combines safety and cybersecurity risk assessments. This paper presents an approach where the SAHARA method has been modified to be more suitable for larger distributed systems. The adapted SAHARA method has a more general risk assessment approach than the original SAHARA. The proposed method has been successfully applied on two use cases of an intelligent distributed grid.

Keywords: Intelligent distribution grids, threat analysis, risk assessment, safety, cybersecurity.

6373 Semantic Spatial Objects Data Structure for Spatial Access Method

Authors: Kalum Priyanath Udagepola, Zuo Decheng, Wu Zhibo, Yang Xiaozong

Abstract:

Modern spatial database management systems require a unique Spatial Access Method (SAM) in order to solve complex spatial queries efficiently. In this case, the spatial data structure takes a prominent place in the SAM. An inadequate data structure leads to poor algorithmic choices and a deficient understanding of algorithm behavior on the spatial database. A key step in developing a better semantic spatial object data structure is to quantify the performance effects of the semantic and outlier detections that are not reflected in previous tree structures (the R-Tree and its variants). This paper explores a novel SSRO-Tree SAM following the topo-semantic approach. The paper shows how to identify and handle semantic spatial objects together with outlier objects during page overflow/underflow, using gain/loss metrics. We introduce a new SSRO-Tree algorithm which achieves better performance in practice than the R*-Tree and RO-Tree algorithms when considering selection queries.

Keywords: Outlier, semantic spatial object, spatial objects, SSRO-Tree, topo-semantic.

6372 Q-Map: Clinical Concept Mining from Clinical Documents

Authors: Sheikh Shams Azam, Manoj Raju, Venkatesh Pagidimarri, Vamsi Kasivajjala

Abstract:

Over the past decade, there has been a steep rise in data-driven analysis in major areas of medicine, such as clinical decision support systems, survival analysis, patient similarity analysis, and image analytics. Most of the data in the field are well structured and available in numerical or categorical formats which can be used for experiments directly. But on the opposite end of the spectrum, there exists a wide expanse of data that is intractable for direct analysis owing to its unstructured nature; it is found in the form of discharge summaries, clinical notes and procedural notes, which are in human-written narrative format and have neither a relational model nor any standard grammatical structure. An important step in the utilization of these texts for such studies is to transform and process the data to retrieve structured information from the haystack of irrelevant data using information retrieval and data mining techniques. To address this problem, the authors present Q-Map, a simple yet robust system that can sift through massive datasets with unregulated formats to retrieve structured information aggressively and efficiently. It is backed by an effective mining technique based on a string matching algorithm indexed on curated knowledge sources, which is both fast and configurable. The authors also briefly examine its comparative performance with MetaMap, one of the most reputed tools for medical concept retrieval, and present the advantages the former displays over the latter.
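
As a toy illustration of dictionary-indexed string matching for concept retrieval, a short Python sketch follows; the miniature concept index and matching rule are stand-ins for Q-Map's curated knowledge sources (e.g., UMLS) and its actual algorithm.

```python
import re

# Hypothetical miniature knowledge source; the real curated sources are far
# larger and map many surface forms to standard concept identifiers.
concept_index = {
    "myocardial infarction": "C0027051",
    "heart attack": "C0027051",
    "diabetes mellitus": "C0011849",
    "hypertension": "C0020538",
}

def extract_concepts(note: str):
    """Return (surface form, concept id, span) tuples found in a clinical note
    by longest-first exact string matching over the curated index."""
    text = note.lower()
    hits = []
    for phrase in sorted(concept_index, key=len, reverse=True):
        for m in re.finditer(r"\b" + re.escape(phrase) + r"\b", text):
            hits.append((phrase, concept_index[phrase], m.span()))
    return hits

note = ("Patient with a history of hypertension and diabetes mellitus, "
        "admitted after a suspected heart attack.")
for surface, cui, span in extract_concepts(note):
    print(cui, surface, span)
```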

Keywords: Information retrieval (IR), unified medical language system (UMLS), Syntax Based Analysis, natural language processing (NLP), medical informatics.

6371 The Impact of Physics Taught with Simulators and Texts in Brazilian High School: A Study in the Adult and Youth Education

Authors: Leandro Marcos Alves Vaz

Abstract:

The teaching of physics in Brazilian public schools strongly emphasizes the theoretical aspects of this science, showing its philosophical and mathematical basis but neglecting its experimental character. Perhaps the lack of science laboratories explains this practice. In this work, we present a method of teaching physics using the computer. As alternatives to real experiments, we use trials through simulators, many of which are free software available on the internet. In order to study the use of simulators in teaching, and knowing that simulations cannot cover all topics in a given subject, we combined these programs with phenomenological and/or experimental texts in order to mitigate this limitation. This study proposes the use of simulators and debate based on phenomenological/experimental texts on the electrostatics theme in groups of the 3rd year of EJA (Adult and Youth Education), in order to verify the advantages of this methodology. Some benefits of hybridizing the traditional method with these tools were: greater motivation of the students in learning, development of experimental notions, proactive socialization towards learning, greater ease in understanding some concepts, and the creation of collaborative activities that can reduce the timidity of some of the students.

Keywords: Physics teaching, simulators, youth and adult education, experimentation, electrostatic.

6370 Non-Deterministic Characterization of Optical Channels

Authors: V. A. C. Vale, E. T. L. Cöuras Ford

Abstract:

The use of optical technologies in telecommunications has been increasing due to their ability to transmit large amounts of data over long distances. However, as in all data transmission systems, optical communication channels suffer from undesirable and non-deterministic effects, so it is essential to understand them. This research allows the assessment and characterization of these effects, as well as their beneficial uses.

Keywords: Optical communication, optical fiber, non-deterministic effects.

6369 Generalized Method for Estimating Best-Fit Vertical Alignments for Profile Data

Authors: Said M. Easa, Shinya Kikuchi

Abstract:

When the profile information of an existing road is missing or not up to date and the parameters of the vertical alignment are needed for engineering analysis, the engineer has to recreate the geometric design features of the road alignment using collected profile data. The profile data may be collected using traditional surveying methods, global positioning systems, or digital imagery. This paper develops a method that estimates the parameters of the geometric features that best characterize existing vertical alignments in terms of tangents and the expressions of the curves, which may be symmetrical, asymmetrical, reverse, or complex vertical curves. The method is implemented using an Excel-based optimization that minimizes the differences between the observed profile and the profiles estimated from the equations of the vertical curves. The method uses a 'wireframe' representation of the profile, which makes it applicable to all types of vertical curves. A secondary contribution of this paper is to introduce the properties of the equal-arc asymmetrical curve that has recently been developed in the highway geometric design field.
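
A minimal sketch of the underlying least-squares idea for a single symmetrical parabolic curve between two tangents follows (SciPy is used here in place of the paper's Excel-based optimization, and the profile data are synthetic); the paper's method additionally covers asymmetrical, reverse and complex curves.

```python
import numpy as np
from scipy.optimize import least_squares

def profile(params, x):
    """Tangent, symmetrical parabolic vertical curve, tangent.
    params = (x0, L, g1, g2, e0): PVC station, curve length, grades, PVC elevation."""
    x0, L, g1, g2, e0 = params
    d = x - x0
    before = e0 + g1 * d
    on_curve = e0 + g1 * d + (g2 - g1) / (2 * L) * d ** 2
    after = e0 + g1 * L + (g2 - g1) * L / 2 + g2 * (d - L)
    return np.where(d < 0, before, np.where(d <= L, on_curve, after))

# Observed profile data (station, elevation); placeholder values standing in
# for survey / GPS / digital imagery measurements.
x_obs = np.linspace(0, 400, 41)
true = (120.0, 160.0, 0.03, -0.02, 100.0)
z_obs = profile(true, x_obs) + np.random.default_rng(0).normal(0, 0.02, x_obs.size)

# Least-squares estimation of the alignment parameters; the objective is the
# same as in the spreadsheet-based optimization.
fit = least_squares(lambda p: profile(p, x_obs) - z_obs,
                    x0=[100.0, 100.0, 0.02, -0.01, 99.0],
                    bounds=([0, 10, -0.2, -0.2, 0], [400, 400, 0.2, 0.2, 200]))
print("estimated x0, L, g1, g2, e0:", np.round(fit.x, 3))
```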

Keywords: Optimization, parameters, data, reverse, spreadsheet, vertical curves

6368 Anonymous Editing Prevention Technique Using Gradient Method for High-Quality Video

Authors: Jiwon Lee, Chanho Jung, Si-Hwan Jang, Kyung-Ill Kim, Sanghyun Joo, Wook-Ho Son

Abstract:

Since advances in digital imaging technologies have led to the development of high-quality digital devices, there are many illegal copies of copyrighted video content on the Internet, and unauthorized editing occurs frequently. Thus, we propose an editing prevention technique for high-quality (HQ) video that can prevent these illegally edited copies from spreading. The proposed technique applies spatial and temporal gradient methods to improve the fidelity and detection performance. Also, the scheme duplicates the embedding signal temporally to alleviate the signal reduction caused by geometric and signal-processing distortions. Experimental results show that the proposed scheme achieves better performance than previously proposed schemes and has high fidelity. The proposed scheme can be used for unauthorized-access prevention in visual communication or in traitor-tracing applications which need a fast detection process to prevent illegally edited video content from spreading.

Keywords: Editing prevention technique, gradient method, high-quality video, luminance change, visual communication.

6367 Protocol and Method for Preventing Attacks from the Web

Authors: Ryuya Uda

Abstract:

Nowadays, computer worms, viruses and Trojan horses have become widespread; they are collectively called malware. A decade ago, such malware merely damaged computers by deleting or rewriting important files, but recent malware seems to be born to earn money. Some malware works to collect personal information so that malicious people can find secret information such as passwords for online banking, evidence for a scandal, or contact addresses related to the target. Moreover, the relation between money and malware has become more complex, and many kinds of malware spawn bots to obtain springboards. Meanwhile, for ordinary internet users, countermeasures against malware come up against a blank wall. Pattern matching wastes too many computer resources, since matching tools have to deal with a large number of patterns derived from subspecies, and virus-making tools can automatically generate subspecies of malware. Moreover, metamorphic and polymorphic malware are no longer special. Recently, malware-checking sites have appeared that check content in place of the user's PC. However, a new type of malicious site has appeared that evades checks by these malware-checking sites. In this paper, existing protocols and methods related to the web are reconsidered in terms of protection from current attacks, and a new protocol and method are proposed for the purpose of securing the web.

Keywords: Information Security, Malware, Network Security, World Wide Web

6366 Aggregation Scheduling Algorithms in Wireless Sensor Networks

Authors: Min Kyung An

Abstract:

In Wireless Sensor Networks, which consist of tiny wireless sensor nodes with limited battery power, one of the most fundamental applications is data aggregation, which collects nearby environmental conditions and aggregates the data to a designated destination called a sink node. Important issues concerning data aggregation are time efficiency and energy consumption due to the limited energy, and therefore the related problem, named Minimum Latency Aggregation Scheduling (MLAS), has been the focus of many researchers. Its objective is to compute the minimum latency schedule, that is, a schedule with the minimum number of timeslots such that the sink node can receive the aggregated data from all the other nodes without any collision or interference. For this problem, two interference models, the graph model and the more realistic physical interference model known as Signal-to-Interference-plus-Noise Ratio (SINR), have been adopted with different power models (uniform power and non-uniform power, with or without power control) and different antenna models (omni-directional and directional antennas). In this survey article, as the problem has been proven to be NP-hard, we present and compare several state-of-the-art approximation algorithms in the various models, using latency as the performance measure.
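
To make the scheduling problem concrete, a toy greedy schedule under the graph interference model is sketched below; the network, the BFS aggregation tree and the greedy slot assignment are illustrative only and are not one of the approximation algorithms surveyed in the paper.

```python
from collections import deque

# Toy connectivity graph (graph interference model); node 0 is the sink.
edges = [(0, 1), (0, 2), (1, 3), (1, 4), (2, 5), (3, 6), (4, 5)]
adj = {}
for u, v in edges:
    adj.setdefault(u, set()).add(v)
    adj.setdefault(v, set()).add(u)

# BFS tree rooted at the sink gives each node a parent to aggregate towards.
parent, depth = {0: None}, {0: 0}
q = deque([0])
while q:
    u = q.popleft()
    for w in adj[u]:
        if w not in depth:
            parent[w], depth[w] = u, depth[u] + 1
            q.append(w)

def conflicts(sender, receiver, slot_tx):
    """Collision test under the graph interference model: a new transmission
    (sender -> receiver) clashes with an existing one (s -> r) if a node is
    already busy, if the receiver is shared, or if one transmitter is within
    one hop of the other pair's receiver."""
    for s, r in slot_tx:
        if s == receiver or r == sender or r == receiver:
            return True
        if s in adj[receiver] or sender in adj[r]:
            return True
    return False

# Greedy schedule: each node transmits once, strictly after all of its
# children (aggregation constraint), in the earliest collision-free slot.
schedule, node_slot = {}, {}
for v in sorted(depth, key=depth.get, reverse=True):
    if parent[v] is None:
        continue
    children = [c for c in adj[v] if parent.get(c) == v]
    slot = 1 + max((node_slot[c] for c in children), default=-1)
    while conflicts(v, parent[v], schedule.get(slot, [])):
        slot += 1
    node_slot[v] = slot
    schedule.setdefault(slot, []).append((v, parent[v]))

for slot in sorted(schedule):
    print("slot", slot, schedule[slot])
print("latency (number of timeslots):", 1 + max(node_slot.values()))
```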

Keywords: Data aggregation, convergecast, gathering, approximation, interference, omni-directional, directional.

6365 Robotic Assistance in Nursing Care: Survey on Challenges and Scenarios

Authors: Pascal Gliesche, Kathrin Seibert, Christian Kowalski, Dominik Domhoff, Max Pfingsthorn, Karin Wolf-Ostermann, Andreas Hein

Abstract:

Robotic assistance in nursing care is an increasingly important area of research and development. Facing a shortage of labor and an increasing number of people in need of care, the German Nursing Care Innovation Center (Pflegeinnovationszentrum, PIZ) aims to address these challenges from the side of technology. Little is known about nurses' experiences with existing robotic assistance systems. Nurses' perspectives on starting points for the development of robotic solutions that target recurring, burdensome tasks in everyday nursing care are of particular interest. This paper presents findings focusing on robotics resulting from an explanatory mixed-methods study on nurses' experiences with, and their expectations for, innovative technologies in nursing care in stationary and ambulant care facilities and hospitals in Germany. Based on the findings, eight scenarios for robotic assistance are identified based on the real needs of practitioners. An initial system addressing a single use case is described to show perspectives for the use of robots in nursing care.

Keywords: Robotics and automation, engineering management, engineering in medicine and biology, medical services, public healthcare.
