Search results for: fictitious domain method
8449 A Family Cars' Life Cycle Cost (LCC)-Oriented Hybrid Modelling Approach Combining ANN and CBR
Authors: Xiaochuan Chen, Jianguo Yang, Beizhi Li
Abstract:
Design for cost (DFC) is a method that reduces life cycle cost (LCC) from the designer's perspective. The multiple domain features mapping (MDFM) methodology was given in DFC; using MDFM, design features can be used to estimate the LCC. From the DFC perspective, the design features of family cars were obtained, such as overall dimensions, engine power and emission volume. At the conceptual design stage, the cars' LCC was estimated using the back propagation (BP) artificial neural network (ANN) method and case-based reasoning (CBR). Hamming space was used to measure the similarity among cases in the CBR method. The Levenberg-Marquardt (LM) algorithm and a genetic algorithm (GA) were used in the ANN. The differences between the CBR and ANN LCC estimation models are presented. Each method has its own shortcomings when used separately; by combining ANN and CBR, improved estimation accuracy is obtained. First, the ANN is used to select the design features that affect LCC. Second, the LCC estimates produced by the ANN raise the accuracy of LCC estimation in the CBR method. Third, the ANN estimates the errors in the CBR results and corrects them when the accuracy is insufficient. Finally, an economy family car and a sport utility vehicle (SUV) are given as LCC estimation cases using this hybrid approach combining ANN and CBR.
Keywords: Case-based reasoning, life cycle cost (LCC), artificial neural networks (ANN), family cars.
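As an illustration of the CBR retrieval step described above, here is a minimal sketch of similarity-weighted LCC estimation over a hypothetical case base, with design features binary-coded so that similarity can be measured in Hamming space; the feature encoding, case data and weighting scheme are assumptions for demonstration, not the authors' model.

```python
import numpy as np

def hamming_similarity(a, b):
    """Similarity in Hamming space: fraction of binary-coded feature positions that agree."""
    a, b = np.asarray(a), np.asarray(b)
    return float(np.mean(a == b))

def estimate_lcc(query, case_base, k=2):
    """Estimate the LCC of a query design as the similarity-weighted mean of the
    k most similar stored cases (the basic CBR retrieval-and-reuse step)."""
    ranked = sorted(case_base, key=lambda c: hamming_similarity(query, c["features"]), reverse=True)[:k]
    weights = np.array([hamming_similarity(query, c["features"]) for c in ranked])
    costs = np.array([c["lcc"] for c in ranked])
    return float(np.dot(weights, costs) / weights.sum())

# Hypothetical binary-coded design features (size class, engine power class, emission class, ...)
case_base = [
    {"features": [1, 0, 1, 1, 0], "lcc": 41_000.0},
    {"features": [1, 1, 1, 0, 0], "lcc": 47_500.0},
    {"features": [0, 0, 1, 1, 1], "lcc": 38_200.0},
]
print(estimate_lcc([1, 0, 1, 0, 0], case_base))
```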
8448 Dynamic Model of a Buck Converter with a Sliding Mode Control
Authors: S. Chonsatidjamroen , K-N. Areerak, K-L. Areerak
Abstract:
This paper presents an averaged model of a buck converter derived from the generalized state-space averaging method. Sliding mode control is used to regulate the output voltage of the converter and is taken into account in the model. The proposed model requires far less computational time than the full topology model. Intensive time-domain simulations of the exact topology model are used as the benchmark. The results show good agreement between the proposed model and the switching model in both the transient and steady-state responses. The reported model is suitable for optimal controller design using artificial intelligence techniques.
Keywords: Generalized state-space averaging method, buck converter, sliding mode control, modeling, simulation.
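To make the idea of an averaged model concrete, the following is a minimal sketch of a state-space averaged buck converter integrated with SciPy; the component values and the fixed duty cycle standing in for the sliding mode controller are assumptions for illustration, not the parameters of the reported model.

```python
import numpy as np
from scipy.integrate import solve_ivp

# Assumed component values (illustrative only)
L, C, R, Vin = 1e-3, 100e-6, 10.0, 24.0
duty = 0.5  # constant duty cycle stands in for the sliding mode controller

def averaged_buck(t, x):
    """State-space averaged buck converter; x = [inductor current, capacitor voltage]."""
    iL, vC = x
    diL = (duty * Vin - vC) / L
    dvC = (iL - vC / R) / C
    return [diL, dvC]

sol = solve_ivp(averaged_buck, (0.0, 0.02), [0.0, 0.0], max_step=1e-5)
print(f"steady-state output voltage = {sol.y[1, -1]:.2f} V (expected duty * Vin = 12 V)")
```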
8447 X-ray Crystallographic Analysis of MinC N-Terminal Domain from Escherichia coli
Authors: Jun Yop An, Kyoung Ryoung Park, Jung-Gyu Lee, Hyung-Seop Youn, Jung-Yeon Kang, Gil Bu Kang, Soo Hyun Eom
Abstract:
MinC plays an important role in the bacterial cell division system by inhibiting FtsZ assembly. However, the molecular mechanism of its action is poorly understood. The E. coli MinC N-terminal domain was purified and crystallized using 1.4 M sodium citrate pH 6.5 as a precipitant. X-ray diffraction data were collected from a native crystal and processed to 2.3 Å. The crystal belonged to space group P2₁2₁2₁, with unit cell parameters a = 52.7, b = 54.0, c = 64.7 Å. Assuming the presence of two molecules in the asymmetric unit, the Matthews coefficient is 1.94 Å³ Da⁻¹, which corresponds to a solvent content of 36.5%. The overall structure of the MinC N-terminal domain is observed as a dimer formed through an anti-parallel β-strand interaction.
Keywords: MinC, cell division, crystallization.
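The quoted solvent content follows from the Matthews coefficient via the standard relation (solvent fraction = 1 − 1.23/Vm); the following minimal check reproduces the value, with the 1.23 constant being the usual protein partial specific volume term and not a number taken from this abstract.

```python
def solvent_content(vm, protein_term=1.23):
    """Standard Matthews relation: solvent fraction = 1 - 1.23 / Vm,
    with Vm the Matthews coefficient in cubic angstroms per dalton."""
    return 1.0 - protein_term / vm

print(f"{solvent_content(1.94):.1%}")  # about 36.6%, matching the ~36.5% quoted above
```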
8446 Delaunay Triangulations Efficiency for Conduction-Convection Problems
Authors: Bashar Albaalbaki, Roger E. Khayat
Abstract:
This work is a comparative study of the effect of Delaunay triangulation algorithms on the discretization error for conduction-convection conservation problems. A structured triangulation and many unstructured Delaunay triangulations using three popular algorithms for node placement are used. The numerical method employed is the vertex-centered finite volume method. It is found that when the computational domain can be meshed using a structured triangulation, the discretization error is lower for structured triangulations than for unstructured ones only at low Peclet numbers, i.e., when conduction is dominant. However, as the Peclet number is increased and convection becomes more significant, the unstructured triangulations reduce the discretization error. Also, no statistical correlation between triangulation angle extrema and the discretization error is found using 200 samples of randomly generated Delaunay and non-Delaunay triangulations. Thus, the angle extrema cannot be an indicator of the discretization error on their own and need to be combined with other triangulation quality measures, which is the subject of further studies.
Keywords: Conduction-convection problems, Delaunay triangulation, discretization error, finite volume method.
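As a concrete illustration of the triangulation quality measure discussed above, here is a minimal sketch that builds a Delaunay triangulation of random points with SciPy and reports its angle extrema; the random node placement is for demonstration only and is not one of the three placement algorithms studied in the paper.

```python
import numpy as np
from scipy.spatial import Delaunay

rng = np.random.default_rng(0)
points = rng.random((200, 2))      # random node placement (not one of the paper's three strategies)
tri = Delaunay(points)

def triangle_angles(p):
    """Interior angles (radians) of a triangle given its three vertices."""
    angles = []
    for i in range(3):
        a, b, c = p[i], p[(i + 1) % 3], p[(i + 2) % 3]
        u, v = b - a, c - a
        cos_angle = np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v))
        angles.append(np.arccos(np.clip(cos_angle, -1.0, 1.0)))
    return angles

all_angles = np.concatenate([triangle_angles(points[s]) for s in tri.simplices])
print(f"min angle = {np.degrees(all_angles.min()):.1f} deg, max angle = {np.degrees(all_angles.max()):.1f} deg")
```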
8445 Eliciting and Confirming Data, Information, Knowledge and Wisdom in a Specialist Health Care Setting: The WICKED Method
Authors: S. Impey, D. Berry, S. Furtado, M. Galvin, L. Grogan, O. Hardiman, L. Hederman, M. Heverin, V. Wade, L. Douris, D. O'Sullivan, G. Stephens
Abstract:
Healthcare is a knowledge-rich environment. This knowledge, while valuable, is not always accessible outside the borders of individual clinics. This research aims to address part of this problem (at a study site) by constructing a maximal data set (knowledge artefact) for motor neurone disease (MND). This data set is proposed as an initial knowledge base for a concurrent project to develop an MND patient data platform. It represents the domain knowledge at the study site for the duration of the research (12 months). A knowledge elicitation method was also developed from the lessons learned during this process: the WICKED method. WICKED is an anagram of the words eliciting and confirming data, information, knowledge, wisdom. It is also a reference to the concept of wicked problems, which are complex and challenging, as is eliciting expert knowledge. The method was evaluated at a second site, and its benefits and limitations were noted. The benefits include that the method provides a systematic way to manage data, information, knowledge and wisdom (DIKW) from various sources, including healthcare specialists and existing data sets. The limitations are the time required and the fact that the data set produced represents only the DIKW known during the research period. Future work is underway to address these limitations.
Keywords: Healthcare, knowledge acquisition, maximal data sets, action design science.
8444 Is Management Science doing Enough to Improve Healthcare?
Authors: Lalit Garg, Sally McClean, Maria Barton
Abstract:
Healthcare issues continue to pose huge problems and incur massive costs. As a result, many challenging problems remain unresolved. In this paper, we carry out an extensive scientific survey of different areas of management and planning in an attempt to identify where there has already been a substantial contribution from management science methods to healthcare problems, and where there is clear potential for more work to be done. The focus is on the read-across to the healthcare domain from such approaches as applied generally to management and planning, and on how the methods can be used to improve patient care. We conclude that, since the healthcare domain differs significantly from traditional areas of management and planning, in some cases there is a need to modify the approaches so as to incorporate the complexities of healthcare and fully exploit the potential for improvement.
Keywords: Management science, management and planning, transforming services, healthcare.
8443 Thermal Fracture Analysis of Fibrous Composites with Variable Fiber Spacing Using Jk-Integral
Authors: Farid Saeidi, Serkan Dag
Abstract:
In this study, fracture analysis of a fibrous composite laminate with variable fiber spacing is carried out using the Jk-integral method. The laminate is assumed to be under thermal loading. The Jk-integral is formulated using the constitutive relations of plane orthotropic thermoelasticity. The developed domain-independent form of the Jk-integral is then integrated into the general-purpose finite element analysis software ANSYS. Numerical results are generated so as to assess the influence of variable fiber spacing on the mode I and II stress intensity factors, the energy release rate, and the T-stress. For verification, some of the results are compared to those obtained using the displacement correlation technique (DCT).
Keywords: Jk-integral, variable fiber spacing, thermoelasticity, T-stress, finite element method, fibrous composite.
8442 Integral Domains and Their Algebras: Topological Aspects
Authors: Shai Sarussi
Abstract:
Let S be an integral domain with field of fractions F and let A be an F-algebra. An S-subalgebra R of A is called S-nice if R ∩ F = S and the localization of R with respect to S \ {0} is A. Denoting by W the set of all S-nice subalgebras of A, and defining a notion of open sets on W, one can view W as a T0-Alexandroff space. Thus, the algebraic structure of W can be viewed from the point of view of topology. It is shown that every nonempty open subset of W has a maximal element in it, which is also a maximal element of W. Moreover, a supremum of an irreducible subset of W always exists. As a notable connection with valuation theory, one considers the case in which S is a valuation domain and A is an algebraic field extension of F; if S is indecomposed in A, then W is an irreducible topological space, and W contains a greatest element.
Keywords: Algebras over integral domains, Alexandroff topology, valuation domains, integral domains.
8441 Changes in EEG and HRV during Event-Related Attention
Authors: Sun K. Yoo, Chung K. Lee
Abstract:
Determination of attentional status is important because work performance and unexpected accidents are highly related to attention. The autonomic and central nervous systems can reflect changes in a person's attentional status. Reducing the number of suitable physiological parameters, among the signal parameters related to the autonomic and central nervous systems, is critical for the optimal design of attention-monitoring devices. In this paper, we analyze EEG (electroencephalography) and HRV (heart rate variability) signals to demonstrate the relation between the brain signal and the cardiovascular signal during event-related attention, which will later be used in selecting the minimum set of attention-related parameters. Time and frequency domain parameters from the HRV signal and frequency domain parameters from the EEG signal are used as input to the optimum feature parameter selector.
Keywords: EEG, HRV, attentional status.
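For readers unfamiliar with the HRV parameters mentioned above, here is a minimal sketch that computes common time-domain (SDNN, RMSSD) and frequency-domain (LF/HF via a Welch PSD) measures from synthetic RR intervals; the specific parameter set and recordings used in the paper are not reproduced.

```python
import numpy as np
from scipy.signal import welch

rng = np.random.default_rng(1)
rr = 0.8 + 0.05 * rng.standard_normal(300)           # synthetic RR intervals in seconds

# Time-domain parameters
sdnn = np.std(rr, ddof=1) * 1000.0                   # ms
rmssd = np.sqrt(np.mean(np.diff(rr) ** 2)) * 1000.0  # ms

# Frequency-domain parameters: resample the RR series evenly, then estimate the PSD (Welch)
t = np.cumsum(rr)
fs = 4.0
t_even = np.arange(t[0], t[-1], 1.0 / fs)
rr_even = np.interp(t_even, t, rr)
f, psd = welch(rr_even - rr_even.mean(), fs=fs, nperseg=256)
df = f[1] - f[0]
lf = psd[(f >= 0.04) & (f < 0.15)].sum() * df        # low-frequency band power
hf = psd[(f >= 0.15) & (f < 0.40)].sum() * df        # high-frequency band power

print(f"SDNN = {sdnn:.1f} ms, RMSSD = {rmssd:.1f} ms, LF/HF = {lf / hf:.2f}")
```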
8440 Addressing Scalability Issues of Named Entity Recognition Using Multi-Class Support Vector Machines
Authors: Mona Soliman Habib
Abstract:
This paper explores the scalability issues associated with solving the Named Entity Recognition (NER) problem using Support Vector Machines (SVM) and high-dimensional features. The performance results of a set of experiments conducted using binary and multi-class SVM with increasing training data sizes are examined. The NER domain chosen for these experiments is the biomedical publications domain, especially selected due to its importance and inherent challenges. A simple machine learning approach is used that eliminates prior language knowledge such as part-of-speech or noun phrase tagging, thereby allowing for its applicability across languages. No domain-specific knowledge is included. The accuracy measures achieved are comparable to those obtained using more complex approaches, which constitutes a motivation to investigate ways to improve the scalability of multi-class SVM in order to make the solution more practical and usable. Improving the training time of multi-class SVM would make support vector machines a more viable and practical machine learning solution for real-world problems with large datasets. An initial prototype results in a great improvement of the training time at the expense of memory requirements.
Keywords: Named entity recognition, support vector machines, language independence, bioinformatics.
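As a minimal sketch of the kind of multi-class SVM token classifier the abstract describes, the following uses scikit-learn's LinearSVC over high-dimensional sparse features built from simple word-shape cues only (no part-of-speech or noun-phrase tags); the toy tokens and entity labels are invented for illustration and are unrelated to the biomedical corpus used in the paper.

```python
from sklearn.feature_extraction import DictVectorizer
from sklearn.pipeline import make_pipeline
from sklearn.svm import LinearSVC

def features(token):
    """Simple word-shape features only (no part-of-speech or noun-phrase tags)."""
    return {"lower": token.lower(),
            "is_capitalized": token[0].isupper(),
            "has_digit": any(c.isdigit() for c in token),
            "has_hyphen": "-" in token}

# Toy token-level training data with invented entity labels
tokens = ["IL-2", "activates", "T", "cells", "in", "humans", "p53", "regulates", "apoptosis"]
labels = ["PROTEIN", "O", "CELL", "CELL", "O", "O", "PROTEIN", "O", "O"]

clf = make_pipeline(DictVectorizer(), LinearSVC())   # sparse features + one-vs-rest multi-class SVM
clf.fit([features(t) for t in tokens], labels)
print(clf.predict([features("IL-2"), features("binds")]))
```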
8439 Image Haze Removal Using Scene Depth Based Spatially Varying Atmospheric Light in Haar Lifting Wavelet Domain
Authors: Prabh Preet Singh, Harpreet Kaur
Abstract:
This paper presents a method for single image dehazing based on the dark channel prior (DCP). The property that the intensity of the dark channel gives an approximate thickness of the haze is used to estimate the transmission and the atmospheric light. Instead of a constant atmospheric light, the proposed method employs scene depth to estimate a spatially varying atmospheric light, as truly occurs in nature. The haze imaging model together with the soft matting method has been used in this work to produce a high-quality haze-free image. Experimental results demonstrate that the proposed approach produces better results than the classic DCP approach, as the color fidelity and contrast of the haze-free image are improved and no over-saturation in the sky region is observed. Further, the lifting Haar wavelet transform is employed to reduce the overall execution time by a factor of two to three compared to the conventional approach.
Keywords: Depth based atmospheric light, dark channel prior, lifting wavelet.
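For context, here is a minimal sketch of the dark channel computation and the classic constant atmospheric light estimate that the proposed method builds on; the scene-depth-based spatially varying atmospheric light and the lifting-wavelet acceleration described above are not reproduced, and the random array merely stands in for a real hazy image.

```python
import numpy as np
from scipy.ndimage import minimum_filter

def dark_channel(img, patch=15):
    """Dark channel prior: per-pixel minimum over the color channels, then a local minimum filter."""
    return minimum_filter(img.min(axis=2), size=patch)

def estimate_atmospheric_light(img, dark, top_fraction=0.001):
    """Classic constant estimate: average the pixels with the brightest dark-channel values."""
    n = max(1, int(dark.size * top_fraction))
    idx = np.unravel_index(np.argsort(dark, axis=None)[-n:], dark.shape)
    return img[idx].mean(axis=0)

rng = np.random.default_rng(2)
hazy = rng.random((120, 160, 3)).astype(np.float32)   # stand-in for a hazy RGB image in [0, 1]
dark = dark_channel(hazy)
A = estimate_atmospheric_light(hazy, dark)
transmission = 1.0 - 0.95 * dark_channel(hazy / A)    # t(x) = 1 - omega * dark_channel(I / A)
print("atmospheric light:", A, "transmission range:", transmission.min(), transmission.max())
```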
8438 An Active Rectifier with Time-Domain Delay Compensation to Enhance the Power Conversion Efficiency
Authors: Shao-Ku Kao
Abstract:
This paper presents an active rectifier with time-domain delay compensation to enhance the power conversion efficiency. A delay calibration circuit is designed to convert the delay time to a voltage and to adaptively control the on/off delay under a variable input voltage. The circuit is designed in a 0.18 µm CMOS process. The input voltage range is from 2 V to 3.6 V, with an output voltage from 1.8 V to 3.4 V. The efficiency remains above 85% for loads from 50 Ω to 1500 Ω at a 3.6 V input voltage. The maximum efficiency is 92.4% at an output power of 38.6 mW for a 3.6 V input voltage.
Keywords: Wireless power transfer, active diode, delay compensation, time to voltage converter, PCE.
8437 Quality Estimation of Video Transmitted over an Additive WGN Channel Based on Digital Watermarking and Wavelet Transform
Authors: Mohamed S. El-Mahallawy, Attalah Hashad, Hazem Hassan Ali, Heba Sami Zaky
Abstract:
This paper presents an evaluation of a wavelet-based digital watermarking technique used to estimate the quality of video sequences transmitted over an Additive White Gaussian Noise (AWGN) channel in terms of a classical objective metric, the Peak Signal-to-Noise Ratio (PSNR), without the need for the original video. In this method, a watermark is embedded into the Discrete Wavelet Transform (DWT) domain of the original video frames using a quantization method. The degradation of the extracted watermark can be used to estimate the video quality in terms of PSNR with good accuracy. We calculated the PSNR for video frames contaminated with AWGN and compared the values with those estimated using the watermarking-DWT based approach. It is found that the calculated and estimated quality measures of the video frames are highly correlated, suggesting that this method can provide a good quality measure for video frames transmitted over an AWGN channel without the need for the original video.
Keywords: AWGN, DWT, PSNR, watermarking, video quality.
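As a point of reference for the metric being estimated, here is a minimal sketch of the full-reference PSNR computation on a synthetic luma frame contaminated with AWGN; the watermark embedding and extraction used in the paper to estimate PSNR blindly are not shown.

```python
import numpy as np

def psnr(reference, distorted, peak=255.0):
    """Peak Signal-to-Noise Ratio (dB) between two frames of equal size."""
    mse = np.mean((reference.astype(np.float64) - distorted.astype(np.float64)) ** 2)
    return float("inf") if mse == 0 else 10.0 * np.log10(peak ** 2 / mse)

rng = np.random.default_rng(3)
frame = rng.integers(0, 256, size=(288, 352), dtype=np.uint8)        # synthetic luma frame
noisy = np.clip(frame + rng.normal(0.0, 10.0, frame.shape), 0, 255)  # AWGN-contaminated copy
print(f"PSNR = {psnr(frame, noisy):.2f} dB")
```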
8436 A Methodology for Creating a Conceptual Model Under Uncertainty
Authors: Bogdan Walek, Jiri Bartos, Cyril Klimes
Abstract:
This article deals with conceptual modeling under uncertainty. First, information systems are classified and defined, focusing on those for which the construction of a conceptual model is suitable for the design of the future information system database. Furthermore, the disadvantages of the traditional approach to creating a conceptual model and database design are analyzed. A comprehensive methodology for the creation of a conceptual model based on an analysis of client requirements and the selection of a suitable domain model is proposed here. The article also presents the expert system used for the construction of the conceptual model, which is a suitable tool for database designers when creating a conceptual model.
Keywords: Conceptual model, conceptual modeling, database, methodology, uncertainty, information system, entity, attribute, relationship, conceptual domain model, fuzzy.
8435 Structural Design Strategy of Double-Eccentric Butterfly Valve using Topology Optimization Techniques
Authors: Jun-Oh Kim, Seol-Min Yang, Seok-Heum Baek, Sangmo Kang
Abstract:
In this paper, the shape design process is briefly discussed, emphasizing the use of topology optimization in the conceptual design stage. The basic idea is to view feasible domains in terms of sensitivity region concepts. In this method, the main process consists of two steps: as the design moves further inside the feasible domain using the Taguchi method, and the topology optimization thus becomes more successful, the sensitivity region becomes larger. In the design of a double-eccentric butterfly valve, hydrodynamic performance and disc structure are discussed, where the use of topology optimization has proven to dramatically improve an existing design and significantly decrease the development time of a shape design. Computational Fluid Dynamics (CFD) analysis results demonstrate the validity of this approach.
Keywords: Double-eccentric butterfly valve, CFD, topology optimization.
8434 Asymptotic Approach for Rectangular Microstrip Patch Antenna with Magnetic Anisotropy and Chiral Substrate
Authors: Zebiri Chemseddine, Benabdelaziz Fatiha
Abstract:
The effect of a chiral bianisotropic substrate on the complex resonant frequency of a rectangular microstrip resonator has been studied on the basis of the integral equation formulation. The analysis is based on the numerical resolution of the integral equation using the Galerkin procedure for the moment method in the spectral domain. This work aims, first, to study the effect of the chirality of a bianisotropic substrate on the resonant frequency and the half-power bandwidth; second, the effect of a magnetic anisotropy on the resonant frequency and the half-power bandwidth is investigated via an asymptotic approach for a very weak substrate. The obtained results are compared with previously published work [11-9]; they are in good agreement.
Keywords: Microstrip antenna, bianisotropic media, resonant frequency, moment method.
8433 Fast Calculation for Particle Interactions in SPH Simulations: Outlined Sub-domain Technique
Authors: Buntara Sthenly Gan, Naohiro Kawada
Abstract:
A simple and easy algorithm is presented for the fast calculation of the kernel functions required in fluid simulations using the Smoothed Particle Hydrodynamics (SPH) method. The proposed algorithm improves on the linked-list algorithm and adopts the pair-wise interaction technique, both of which are widely used for evaluating kernel functions in fluid simulations using the SPH method. The algorithm is easy to implement without any complexities in programming. Some benchmark examples are used to show the simulation time saved by using the proposed algorithm. Parametric studies on the number of divisions for sub-domains, the smoothing length and the total number of particles are conducted to show the effectiveness of the present technique. A compact formulation is proposed for practical usage.
Keywords: Technique, fluid simulation, smoothed particle hydrodynamics (SPH), particle interaction.
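As background for the neighbour-search step such algorithms accelerate, here is a minimal sketch of a 2D cell-linked-list search that collects all particle pairs closer than the smoothing length; the outlined sub-domain technique proposed above is not reproduced, and the particle positions are random stand-ins.

```python
import numpy as np
from collections import defaultdict

def cell_list_pairs(positions, h):
    """Cell-linked-list neighbour search: only particles in the same or an adjacent
    cell (cell size = smoothing length h) are tested for interaction."""
    cells = defaultdict(list)
    for idx, p in enumerate(positions):
        cells[tuple((p // h).astype(int))].append(idx)
    pairs = []
    for (cx, cy), members in cells.items():
        for dx in (-1, 0, 1):
            for dy in (-1, 0, 1):
                for j in cells.get((cx + dx, cy + dy), []):
                    for i in members:
                        if i < j and np.linalg.norm(positions[i] - positions[j]) < h:
                            pairs.append((i, j))
    return pairs

rng = np.random.default_rng(4)
pos = rng.random((500, 2))                       # random particle positions in the unit square
print(len(cell_list_pairs(pos, h=0.05)), "interacting pairs")
```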
8432 The Effects of Signal Level of the Microwave Generator on the Brillouin Gain Spectrum in BOTDA and BOTDR
Authors: M. Yucel, M. Yucel, N. F. Ozturk, H. H. Goktas, C. Gemci, F. V. Celebi
Abstract:
In this study, the Brillouin Gain Spectrum (BGS) is experimentally analyzed in Brillouin Optical Time Domain Reflectometry (BOTDR) and a Brillouin Optical Time Domain Analyzer (BOTDA). For this purpose, the signal level of the microwave generator is varied and its effects on the BGS are investigated. In the setups, 20 km of conventional single mode fiber is used in both cases, and the laser wavelengths are selected around 1550 nm. To achieve the best results, a microwave generator signal level between 5 dBm and 15 dBm can be used for the BOTDA and BOTDR setups.
Keywords: Microwave signal level, Brillouin gain spectrum, BOTDA, BOTDR.
8431 Assisted Prediction of Hypertension Based on Heart Rate Variability and Improved Residual Networks
Authors: Yong Zhao, Jian He, Cheng Zhang
Abstract:
Cardiovascular disease resulting from hypertension poses a significant threat to human health, and early detection of hypertension can potentially save numerous lives. Traditional methods for detecting hypertension require specialized equipment and are often incapable of capturing continuous blood pressure fluctuations. To address this issue, this study starts by analyzing the principle of heart rate variability (HRV) and introduces the use of sliding window and power spectral density (PSD) techniques to analyze both the temporal and frequency domain features of HRV. Subsequently, a hypertension prediction network that relies on HRV is proposed, combining ResNet, attention mechanisms, and a multi-layer perceptron. The network leverages a modified ResNet18 to extract frequency domain features, while employing an attention mechanism to integrate temporal domain features, thus enabling auxiliary hypertension prediction through the multi-layer perceptron. The proposed network is trained and tested using the publicly available SHAREE dataset from PhysioNet. The results demonstrate that the network achieves a high prediction accuracy of 92.06% for hypertension, surpassing traditional models such as K-Nearest Neighbor (KNN), Bayes, logistic regression, and a traditional Convolutional Neural Network (CNN).
Keywords: Feature extraction, heart rate variability, hypertension, residual networks.
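The following is a minimal PyTorch skeleton of the kind of architecture the abstract outlines: a single-channel ResNet18 branch for PSD input, a self-attention layer over windowed time-domain HRV features, and an MLP head. Every layer size, input shape and pooling choice here is an assumption for illustration, not the authors' configuration.

```python
import torch
import torch.nn as nn
from torchvision.models import resnet18

class HRVHypertensionNet(nn.Module):
    """Sketch: ResNet18 on PSD input, attention over time-domain HRV features, MLP head."""
    def __init__(self, n_time_feats=8, hidden=64):
        super().__init__()
        backbone = resnet18(weights=None)
        backbone.conv1 = nn.Conv2d(1, 64, kernel_size=7, stride=2, padding=3, bias=False)  # single-channel PSD input
        backbone.fc = nn.Identity()
        self.freq_branch = backbone                                    # -> 512-d frequency embedding
        self.attn = nn.MultiheadAttention(embed_dim=n_time_feats, num_heads=1, batch_first=True)
        self.head = nn.Sequential(nn.Linear(512 + n_time_feats, hidden), nn.ReLU(), nn.Linear(hidden, 2))

    def forward(self, psd_img, time_feats):
        f = self.freq_branch(psd_img)                                  # (B, 512)
        t, _ = self.attn(time_feats, time_feats, time_feats)           # self-attention over sliding windows
        t = t.mean(dim=1)                                              # (B, n_time_feats)
        return self.head(torch.cat([f, t], dim=1))                     # hypertension logits

model = HRVHypertensionNet()
logits = model(torch.randn(4, 1, 64, 64), torch.randn(4, 10, 8))       # (batch, windows, time features)
print(logits.shape)  # torch.Size([4, 2])
```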
8430 Computer Software Applicable in Rehabilitation, Cardiology and Molecular Biology
Authors: P. Kowalska, P. Gabka, K. Kamieniarz, M. Kamieniarz, W. Stryla, P. Guzik, T. Krauze
Abstract:
We have developed a computer program consisting of six subtests assessing children's hand dexterity, applicable in rehabilitation medicine. We have carried out a normative study on a representative sample of 285 children aged from 7 to 15 (mean age 11.3) and have proposed clinical standards for three age groups (7-9, 9-11, 12-15 years). We have shown the statistical significance of differences among the corresponding mean task completion times. We have also found a strong correlation between the task completion time and the age of the subjects, and we have performed test-retest reliability checks in a sample of 84 children, giving high values of the Pearson coefficients for the dominant and non-dominant hand in the range 0.74
Keywords: Biomedical data base processing, computer software, hand dexterity, heart rate and blood pressure variability.
8429 A Study on Multi-Agent Behavior in a Soccer Game Domain
Authors: S. R. Mohd Shukri, M. K. Mohd Shaukhi
Abstract:
Many games have been developed that simulate soccer. Many of these games have been designed with highly realistic features to attract more users. Many have also incorporated better artificial intelligence (AI), similar to that in a real soccer game. One of the challenging issues in a soccer game is the cooperation, coordination and negotiation among distributed agents in a multi-agent system. This paper focuses on the incorporation of multi-agent techniques in a soccer game domain. The better the cooperation of a multi-agent team, the more intelligent the game will be. Thus, past studies of the robotic soccer game were examined because of their better multi-agent system implementations. From this study, a better approach and technique of multi-agent behavior could be selected to improve the authors' 2D online soccer game.
Keywords: Multi-agent, robotic intelligence, role assignment, formation.
8428 An Investigation on Electric Field Distribution around 380 kV Transmission Line for Various Pylon Models
Authors: C. F. Kumru, C. Kocatepe, O. Arikan
Abstract:
In this study, electric field distribution analyses for three pylon models are carried out with Finite Element Method (FEM) based software. Analyses are performed in both the stationary and time domains to observe instantaneous values along with the effective ones. Considering the results of the study, different line geometries considerably affect the magnitude and distribution of the electric field although the line voltages are the same. Furthermore, it is observed that the maximum instantaneous electric field values obtained in the time-domain analysis are considerably higher than the effective values obtained in the stationary mode. In consequence, electric field distribution analyses should be made individually for each line model, and the exposure limit values or distances to residential buildings should be defined according to the results obtained.
Keywords: Electric field, energy transmission line, finite element method, pylon.
8427 Differentiation of Heart Rate Time Series from Electroencephalogram and Noise
Authors: V. I. Thajudin Ahamed, P. Dhanasekaran, Paul Joseph K.
Abstract:
Analysis of heart rate variability (HRV) has become a popular non-invasive tool for assessing the activities of the autonomic nervous system. Most of the methods were borrowed from techniques used for time series analysis. Currently used methods include time domain, frequency domain, geometrical and fractal methods. A new technique, which searches for pattern repeatability in a time series, is proposed for quantifying heart rate (HR) time series. This set of indices, termed the pattern repeatability measure and the pattern repeatability ratio, is able to distinguish HR data clearly from noise and electroencephalogram (EEG) data. The results of the analysis using these measures give an insight into the fundamental difference between the composition of HR time series and that of EEG and noise.
Keywords: Approximate entropy, heart rate variability, noise, pattern repeatability, sample entropy.
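For comparison with the regularity measures named in the keywords, here is a minimal sketch of sample entropy applied to a noisy and a regular synthetic series; the authors' pattern repeatability measure and ratio are their own indices and are not reproduced here.

```python
import numpy as np

def sample_entropy(x, m=2, r_factor=0.2):
    """Sample entropy: -ln of the ratio of (m+1)-length to m-length template matches,
    with tolerance r = r_factor * std(x). Lower values indicate a more regular series."""
    x = np.asarray(x, dtype=float)
    r = r_factor * np.std(x)

    def match_count(length):
        templates = np.array([x[i:i + length] for i in range(len(x) - length)])
        dist = np.max(np.abs(templates[:, None, :] - templates[None, :, :]), axis=2)
        return (np.sum(dist <= r) - len(templates)) / 2   # exclude self-matches, count each pair once

    return -np.log(match_count(m + 1) / match_count(m))

rng = np.random.default_rng(5)
noisy = 0.8 + 0.05 * rng.standard_normal(200)          # irregular, noise-like series
regular = 0.8 + 0.05 * np.sin(np.arange(200) / 5.0)    # regular, sine-like series
print(sample_entropy(noisy), sample_entropy(regular))  # the irregular series gives the larger value
```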
8426 Identification of Training Topics for the Improvement of the Relevant Cognitive Skills of Technical Operators in the Railway Domain
Authors: Giulio Nisoli, Jonas Brüngger, Karin Hostettler, Nicole Stoller, Katrin Fischer
Abstract:
Technical operators in the railway domain are experts responsible for the supervisory control of the railway power grid as well as of the railway tunnels. The technical systems used to master these demanding tasks are constantly increasing in their degree of automation. It therefore becomes difficult for technical operators to maintain control over the technical systems and the processes of their job. In particular, the operators must have the necessary experience and knowledge to deal with a malfunction or unexpected event. For this reason, it is of growing importance that the skills relevant to the execution of the job are maintained and further developed beyond the basic training they receive, in which they are educated in technical knowledge and the work with guidelines. Training methods aimed at improving the cognitive skills needed by technical operators are still missing and must be developed. The goals of the present study were to identify the relevant cognitive skills of technical operators in the railway domain and to define which topics should be addressed by the training of these skills. Observational interviews were conducted in order to identify the main tasks and the organization of the work of technical operators, as well as the technical systems used for the execution of their job. Based on this analysis, the most demanding tasks of technical operators could be identified and described. The cognitive skills involved in the execution of these tasks are those which need to be trained. In order to identify and analyze these cognitive skills, a cognitive task analysis (CTA) was developed. CTA specifically aims at identifying the cognitive skills that employees apply when performing their own tasks. The identified cognitive skills of technical operators were summarized and grouped into training topics. For every training topic, specific goals were defined. The goals concern the three main categories to be trained in every training topic: knowledge, skills and attitude. Based on the results of this study, it is possible to develop specific training methods to train the relevant cognitive skills of the technical operators.
Keywords: Cognitive skills, cognitive task analysis, technical operators in the railway domain, training topics.
8425 Visual Hull with Imprecise Input
Authors: Peng He
Abstract:
Imprecision is a long-standing problem in CAD design and in high-accuracy image-based reconstruction applications. The visual hull, which is the closed silhouette-equivalent shape of the objects of interest, is an important concept in image-based reconstruction. We extend the domain-theoretic framework, which is a robust, imprecision-capturing geometric model, to analyze the imprecision in the output shape when the input vertices are given with imprecision. Under this framework, we show an efficient algorithm to generate the 2D partial visual hull, which represents the exact information of the visual hull under only basic imprecision assumptions. We also show how the visual-hull-from-polyhedra problem can be efficiently solved in the context of imprecise input.
Keywords: Geometric domain, computer vision, computational geometry, visual hull, image-based reconstruction, imprecise input, CAD object.
8424 Watermark Bit Rate in Diverse Signal Domains
Authors: Nedeljko Cvejic, Tapio Sepp
Abstract:
A study of the obtainable watermark data rate for information hiding algorithms is presented in this paper. As the perceptual entropy for wideband monophonic audio signals is in the range of four to five bits per sample, a significant amount of additional information can be inserted into the signal without causing any perceptual distortion. Experimental results show that transform domain watermark embedding considerably outperforms watermark embedding in the time domain, and that signal decompositions with a high transform coding gain, such as the wavelet transform, are the most suitable for high data rate information hiding.
Keywords: Digital watermarking, information hiding, audio watermarking, watermark data rate.
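To make the transform-domain embedding idea concrete, here is a minimal sketch of quantization-based bit embedding in the approximation coefficients of a discrete wavelet transform, using the PyWavelets package (assumed available); the step size, wavelet and embedding positions are illustrative choices, not those evaluated in the paper.

```python
import numpy as np
import pywt

def embed_bits(signal, bits, step=0.05, wavelet="db4", level=3):
    """Sketch of quantization-based embedding in the wavelet domain: approximation
    coefficients are quantized to even or odd multiples of `step` to carry one bit each."""
    coeffs = pywt.wavedec(signal, wavelet, mode="periodization", level=level)
    approx = coeffs[0]
    for i, bit in enumerate(bits):
        q = np.round(approx[i] / step)
        if int(q) % 2 != bit:
            q += 1
        approx[i] = q * step
    coeffs[0] = approx
    return pywt.waverec(coeffs, wavelet, mode="periodization")

def extract_bits(signal, n_bits, step=0.05, wavelet="db4", level=3):
    approx = pywt.wavedec(signal, wavelet, mode="periodization", level=level)[0]
    return [int(np.round(approx[i] / step)) % 2 for i in range(n_bits)]

rng = np.random.default_rng(6)
audio = 0.1 * rng.standard_normal(4096)        # stand-in for a mono audio block
bits = [1, 0, 1, 1, 0, 0, 1, 0]
watermarked = embed_bits(audio, bits)
print(extract_bits(watermarked, len(bits)))    # recovers the embedded bits
```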
8423 Improving Similarity Search Using Clustered Data
Authors: Deokho Kim, Wonwoo Lee, Jaewoong Lee, Teresa Ng, Gun-Ill Lee, Jiwon Jeong
Abstract:
This paper presents a method for improving object search accuracy using a deep learning model. A major limitation in providing accurate similarity with deep learning is the requirement of a huge amount of data for training pairwise similarity scores (metrics), which is impractical to collect. Thus, similarity scores are usually trained with a relatively small dataset that comes from a different domain, causing limited accuracy in measuring similarity. For this reason, this paper proposes a deep learning model that can be trained with a significantly smaller amount of data: clustered data, in which each cluster contains a set of visually similar images. In order to measure the similarity distance with the proposed method, visual features of two images are extracted from intermediate layers of a convolutional neural network with various pooling methods, and the network is trained with pairwise similarity scores defined as zero for images in the same cluster. The proposed method outperforms the state-of-the-art object similarity scoring techniques in the evaluation of finding exact items. The proposed method achieves 86.5% accuracy, compared to the 59.9% accuracy of the state-of-the-art technique. That is, an exact item can be found among four retrieved images with an accuracy of 86.5%, and the remaining retrieved images are likely to be similar products. Therefore, the proposed method can greatly reduce the amount of training data, by an order of magnitude, as well as provide a reliable similarity metric.
Keywords: Visual search, deep learning, convolutional neural network, machine learning.
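As a minimal sketch of measuring visual similarity with features taken from a CNN, the following extracts pooled ResNet18 embeddings with torchvision and ranks catalog items by cosine similarity; the pooling variants, the training on clustered data, and the evaluation protocol described above are not reproduced, and the random tensors stand in for real images.

```python
import torch
import torch.nn.functional as F
from torchvision.models import resnet18

backbone = resnet18(weights=None)
backbone.fc = torch.nn.Identity()        # keep the 512-d pooled feature from the last conv block
backbone.eval()

@torch.no_grad()
def embed(images):
    """L2-normalized embeddings so that cosine similarity is a dot product."""
    return F.normalize(backbone(images), dim=1)

query = torch.randn(1, 3, 224, 224)       # stand-in for a query image tensor
catalog = torch.randn(8, 3, 224, 224)     # stand-in for catalog images
scores = embed(query) @ embed(catalog).T  # cosine similarity to each catalog item
print(scores.argsort(descending=True))    # indices ranked from most to least similar
```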
8422 A High Performance Technique in Harmonic Omitting Based on Predictive Current Control of a Shunt Active Power Filter
Authors: K. G. Firouzjah, A. Sheikholeslami
Abstract:
The correct operation of common active filters depends on the accuracy of identifying the system distortion. Also, using a suitable method for current injection and reactive power compensation leads to increased filter performance. For this reason, this paper presents a method based on predictive current control theory for shunt active filter applications. The harmonics of the load current are identified by applying the o–d–q reference frame to the load current and eliminating the DC part of the d–q components. The remaining components are then delivered to the predictive current controller as a three-phase reference current using the inverse Park transformation. The system is modeled in the discrete time domain. The proposed method has been tested using a MATLAB model for a nonlinear load (with a Total Harmonic Distortion of 20%). The simulation results indicate that the proposed filter leads to a sinusoidal current (THD = 0.15%) flowing through the source. In addition, the results show that the filter tracks the reference current accurately.
Keywords: Active filter, predictive current control, low pass filter, harmonic omitting, o–d–q reference frame.
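To illustrate the d–q harmonic identification step described above, here is a minimal sketch: in the synchronous frame the fundamental becomes a DC component, so subtracting a one-cycle moving average and transforming back isolates a harmonic reference current. The waveform, harmonic content and filtering choice are assumptions for demonstration, and the predictive current controller itself is not shown.

```python
import numpy as np

def abc_to_dq0(ia, ib, ic, theta):
    """Park transformation of three-phase currents into the synchronous d-q-0 frame."""
    d = (2 / 3) * (ia * np.cos(theta) + ib * np.cos(theta - 2 * np.pi / 3) + ic * np.cos(theta + 2 * np.pi / 3))
    q = -(2 / 3) * (ia * np.sin(theta) + ib * np.sin(theta - 2 * np.pi / 3) + ic * np.sin(theta + 2 * np.pi / 3))
    z = (1 / 3) * (ia + ib + ic)
    return d, q, z

def dq0_to_abc(d, q, z, theta):
    """Inverse Park transformation back to phase quantities."""
    ia = d * np.cos(theta) - q * np.sin(theta) + z
    ib = d * np.cos(theta - 2 * np.pi / 3) - q * np.sin(theta - 2 * np.pi / 3) + z
    ic = d * np.cos(theta + 2 * np.pi / 3) - q * np.sin(theta + 2 * np.pi / 3) + z
    return ia, ib, ic

def moving_average(x, window):
    return np.convolve(x, np.ones(window) / window, mode="same")

f0, fs = 50.0, 10_000.0
t = np.arange(0.0, 0.2, 1.0 / fs)
theta = 2 * np.pi * f0 * t
phases = [0.0, -2 * np.pi / 3, 2 * np.pi / 3]
# Distorted load current: fundamental plus a 5th harmonic on each phase
ia, ib, ic = (np.cos(theta + p) + 0.2 * np.cos(5 * (theta + p)) for p in phases)

d, q, z = abc_to_dq0(ia, ib, ic, theta)
window = int(fs / f0)                                  # one fundamental cycle: its average isolates the DC part
ha, hb, hc = dq0_to_abc(d - moving_average(d, window), q - moving_average(q, window), z, theta)
rms = np.std(ha[window:-window])                       # ignore moving-average edge effects
print(f"harmonic reference RMS = {rms:.3f} (expected approx. {0.2 / np.sqrt(2):.3f})")
```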
8421 Towards Design of Context-Aware Sensor Grid Framework for Agriculture
Authors: Aqeel-ur-Rehman, Zubair A. Shaikh
Abstract:
This paper presents a context-aware sensor grid framework for agriculture and its design challenges. The use of sensor networks in the domain of agriculture is not new. However, due to the unavailability of any common framework, solutions developed in this domain are location, environment and problem dependent. In view of the need for a common framework for agriculture, a context-aware sensor grid framework is proposed. It will be helpful in developing solutions for the majority of problems related to irrigation, pesticide spraying, fertilizer use, and regular monitoring of plots and yield, due to its capability of adjusting to location and environment. The proposed framework is composed of a three-layer architecture comprising a context-aware application layer, a grid middleware layer and a sensor network layer.
Keywords: Agriculture, context-awareness, grid computing, sensor grid.
8420 A Novel Iterative Approach for Phase Noise Cancellation in Multi-Carrier Code Division Multiple Access (MC-CDMA) Systems
Authors: Joumana Farah, François Marx, Clovis Francis
Abstract:
The aim of this paper is to emphasize and alleviate the effect of phase noise due to imperfect local oscillators on the performance of a Multi-Carrier CDMA system. After the cancellation of the Common Phase Error (CPE), an iterative approach is introduced which iteratively estimates the Inter-Carrier Interference (ICI) components in the frequency domain and cancels their contribution in the time domain. Simulations are conducted in order to investigate the achievable performance for several parameters, such as the spreading factor, the modulation order, the phase noise power and the transmission signal-to-noise ratio.
Keywords: Inter-carrier interference, Multi-Carrier Code Division Multiple Access, Orthogonal Frequency Division Multiplexing, phase noise.
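As an illustration of the common phase error cancellation step that precedes the iterative ICI stage described above, here is a minimal sketch of pilot-based CPE estimation and correction on a toy multicarrier symbol; the pilot positions, modulation and noise level are assumptions, and the iterative ICI estimation loop itself is not reproduced.

```python
import numpy as np

rng = np.random.default_rng(7)
n_sc = 64
pilots = np.arange(0, n_sc, 8)                                          # assumed pilot subcarrier positions
tx = rng.choice([1 + 1j, 1 - 1j, -1 + 1j, -1 - 1j], n_sc) / np.sqrt(2)  # QPSK symbols on the subcarriers

cpe = np.exp(1j * 0.3)                                                  # common phase error from the oscillator
noise = 0.05 * (rng.standard_normal(n_sc) + 1j * rng.standard_normal(n_sc))
rx = cpe * tx + noise                                                   # received symbols (ICI neglected here)

# CPE estimate: average rotation of the received pilots relative to the known pilot symbols
cpe_hat = np.angle(np.sum(rx[pilots] * np.conj(tx[pilots])))
corrected = rx * np.exp(-1j * cpe_hat)
print(f"true CPE = 0.300 rad, estimated CPE = {cpe_hat:.3f} rad")
```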