Search results for: information processing model
10452 Data Compression in Ultrasonic Network Communication via Sparse Signal Processing
Authors: Beata Zima, Octavio A. Márquez Reyes, Masoud Mohammadgholiha, Jochen Moll, Luca De Marchi
Abstract:
This paper presents an approach that uses compressed sensing for signal encoding and information transfer within a guided wave sensor network composed of specially designed frequency steerable acoustic transducers (FSATs). Wave propagation in a damaged plate was simulated using the commercial FEM-based software COMSOL. Guided waves were excited by FSATs, characterized by the special shape of their electrodes and modeled using PIC255 piezoelectric material. This electrode shape allows the wave energy to be focused in a particular direction, according to the frequency components of the actuation signal, which enlarges the monitored area. The process begins when an FSAT detects and records a reflection from damage in the structure; this signal is then encoded and prepared for transmission using a combined approach based on compressed sensing matching pursuit and quadrature amplitude modulation (QAM). After the signal has been encoded in binary form, the information is transmitted between the nodes of the network. The message reaches the last node, where it is finally decoded and processed for damage detection and localization. The main aim of the investigation is to determine the location of the detected damage using the reconstructed signals. The study demonstrates that the steerable capabilities of FSATs not only facilitate damage detection but also allow the damage information to be transmitted toward a chosen area of the investigated structure in a specific direction.
Keywords: Data compression, ultrasonic communication, guided waves, FEM analysis.
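The abstract describes sparse recovery of the damage-reflection signal from compressed measurements; the following minimal sketch of orthogonal matching pursuit (a common matching pursuit variant in compressed sensing) illustrates how such a signal can be reconstructed from a random measurement matrix. It is an illustrative example only, not the authors' implementation; all variable names and sizes are assumptions.

```python
import numpy as np

def omp(A, y, sparsity):
    """Recover a sparse vector x from y = A @ x by orthogonal matching pursuit.
    A: (m, n) sensing matrix, y: (m,) measurements, sparsity: number of atoms to select."""
    residual = y.copy()
    support = []
    x_hat = np.zeros(A.shape[1])
    for _ in range(sparsity):
        # pick the atom most correlated with the current residual
        idx = int(np.argmax(np.abs(A.T @ residual)))
        if idx not in support:
            support.append(idx)
        # least-squares fit on the selected support
        coeffs, *_ = np.linalg.lstsq(A[:, support], y, rcond=None)
        residual = y - A[:, support] @ coeffs
    x_hat[support] = coeffs
    return x_hat

# toy demo: a 10-sparse signal compressed to 80 random measurements
rng = np.random.default_rng(0)
n, m, k = 256, 80, 10
x = np.zeros(n); x[rng.choice(n, k, replace=False)] = rng.standard_normal(k)
A = rng.standard_normal((m, n)) / np.sqrt(m)
x_rec = omp(A, A @ x, k)
print("max reconstruction error:", np.max(np.abs(x_rec - x)))
```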
10451 2D and 3D Unsteady Simulation of the Heat Transfer in the Sample during Heat Treatment by Moving Heat Source
Authors: Z. Veselý, M. Honner, J. Mach
Abstract:
The aim of this work is to establish 2D and 3D models of the direct unsteady task of sample heat treatment by a moving heat source, employing a computer model based on the finite element method. A complex boundary condition on the heat-loaded sample surface is the essential feature of the task. The computer model describes heat treatment of the sample as the heat source moves over its surface. The 2D task of the sample cross section is taken as the basic model, and possibilities of extending it from 2D to 3D are discussed. The effect of adding the third model dimension on the temperature distribution in the sample is shown, and the influence of various model parameters on the sample temperatures is compared. The influence of heat source motion on the depth of material heat treatment is shown for several movement velocities. The presented computer model is prepared for use in laser treatment of machine parts.
Keywords: Computer simulation, unsteady model, heat treatment, complex boundary condition, moving heat source.
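For orientation, the quasi-steady temperature field around a moving point heat source on a thick body can be approximated analytically by the classical Rosenthal solution; the sketch below evaluates it with numpy. This is only an illustrative reference case, not the paper's FEM model, which additionally handles the complex surface boundary condition; the material values and the formula choice are assumptions.

```python
import numpy as np

def rosenthal_3d(x, y, z, q, v, k, alpha, T0=20.0):
    """Quasi-steady Rosenthal solution for a point source moving at speed v along x
    on a semi-infinite body (x is the moving coordinate, negative behind the source).
    q: absorbed power [W], k: conductivity [W/mK], alpha: diffusivity [m^2/s]."""
    R = np.sqrt(x**2 + y**2 + z**2)        # distance from the source
    R = np.maximum(R, 1e-6)                # avoid the singularity at the source point
    return T0 + q / (2.0 * np.pi * k * R) * np.exp(-v * (x + R) / (2.0 * alpha))

# temperature 2 mm behind the source for assumed steel-like properties
print(rosenthal_3d(x=-2e-3, y=0.0, z=0.0, q=800.0, v=0.01, k=45.0, alpha=1.2e-5))
```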
10450 Why Traditional Technology Acceptance Models Won't Work for Future Information Technologies?
Authors: Carsten Röcker
Abstract:
This paper illustrates why existing technology acceptance models are of only limited use for predicting and explaining the adoption of future information and communication technologies. It starts with a general overview of technology adoption processes and presents several theories of the acceptance and adoption of traditional information technologies. This is followed by an overview of recent developments in the area of information and communication technologies. Based on the arguments elaborated in these sections, it is shown why the factors used to predict adoption in existing systems will not be sufficient for explaining the adoption of future information and communication technologies.
Keywords: Technology Diffusion, Technology Acceptance Models, Ambient Intelligence, Ubiquitous and Pervasive Computing.
10449 Comparison of Processing Conditions for Plasticized PVC and PVB
Authors: Michael Tupý, Jaroslav Císař, Pavel Mokrejš, Dagmar Měřínská, Alice Tesaříková-Svobodová
Abstract:
It is a worldwide problem that waste PVB is not recycled and is widely stored in landfills, even though PVB has chemical properties similar to those of PVC and both of these polymers are plasticized. Therefore, a study of the thermal properties of plasticized PVC and of recycled PVB obtained from windshields is carried out. This work was done in order to find non-degrading processing conditions applicable to both polymers. The tested PVC contained 38% of the plasticizer diisononyl phthalate (DINP), and the PVB was plasticized with 28% of triethylene glycol bis(2-ethylhexanoate) (3GO). The thermal and thermo-oxidative decomposition of both vinyl polymers are compared by calorimetric analysis and by tensile strength analysis.
Keywords: Poly(vinyl chloride), Poly(vinyl butyral), Recycling, Reprocessing, Thermal analysis, Decomposition.
10448 Environmental Management of the Tanning Industry's Supply Chain: An Integration Model from Lean Supply Chain, Green Supply Chain, Cleaner Production and ISO 14001:2004
Authors: N. Clavijo Buriticá, L. M. Correa López and J. R. Sánchez Rodríguez
Abstract:
The environmental impact caused by industries is an issue that, over the last 20 years, has become very important in social, economic and political terms in Colombia. In particular, the tannery process is extremely polluting because of ineffective treatments and regulations applied to effluent dumping and atmospheric emissions. Considering that, this investigation proposes a management model based on the integration of Lean Supply Chain, Green Supply Chain, Cleaner Production and ISO 14001:2004 that prioritizes the strategic components of the organizations. As a result, a management model will be obtained that provides a strategic perspective through a systemic approach to the tanning process. This will be achieved through the use of multicriteria decision tools, along with Quality Function Deployment and fuzzy logic. The strategic approach embodied in the management model, through the alignment of Lean Supply Chain, Green Supply Chain, Cleaner Production and ISO 14001:2004, is an integrated perspective that allows the tactical and operative elements to be framed gradually through the correct setting of the information flow, improving the decision-making process. In this way, small and medium enterprises (SMEs) could improve their productivity and competitiveness and, as added value, minimize their environmental impact. This improvement is expected to be monitored through a dashboard that helps the organization measure its performance during the implementation of the model in its productive process.
Keywords: Integration, environmental impact, management, systemic organization.
10447 Adaptive Gaussian Mixture Model for Skin Color Segmentation
Authors: Reza Hassanpour, Asadollah Shahbahrami, Stephan Wong
Abstract:
Skin color based tracking techniques often assume a static skin color model obtained either from an offline set of library images or from the first few frames of a video stream. Such models can perform poorly in the presence of changing lighting or imaging conditions. We propose an adaptive skin color model based on the Gaussian mixture model to handle these changing conditions. Initial estimates of the number and weights of the skin color clusters are obtained using a modified form of the general expectation-maximization algorithm. The model adapts to changes in imaging conditions and refines its parameters dynamically using spatial and temporal constraints. Experimental results show that the method can be used effectively for tracking hand and face regions.
Keywords: Face detection, Segmentation, Tracking, Gaussian Mixture Model, Adaptation.
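A minimal sketch of the underlying idea, using scikit-learn's GaussianMixture: fit a mixture to skin-colored pixels, classify new pixels by likelihood, and periodically refit on recently accepted pixels to mimic the adaptation step. This is not the authors' algorithm (their EM modification and spatial/temporal constraints are omitted); the color space, threshold and component count are assumptions.

```python
import numpy as np
from sklearn.mixture import GaussianMixture

def fit_skin_model(skin_pixels, n_components=3):
    """skin_pixels: (N, 2) array of chrominance values (e.g. Cb, Cr) from labeled skin."""
    return GaussianMixture(n_components=n_components, covariance_type="full").fit(skin_pixels)

def skin_mask(model, pixels, log_lik_threshold=-8.0):
    """Boolean mask of pixels whose log-likelihood under the model is high enough."""
    return model.score_samples(pixels) > log_lik_threshold

# toy usage: adapt the model every frame using the pixels it just accepted
rng = np.random.default_rng(1)
model = fit_skin_model(rng.normal([110, 150], 5, size=(500, 2)))   # initial frame samples
for _ in range(10):                                                 # subsequent frames
    frame_pixels = rng.normal([112, 148], 6, size=(1000, 2))        # simulated lighting drift
    mask = skin_mask(model, frame_pixels)
    if mask.sum() > 50:                                             # refit only with enough evidence
        model = fit_skin_model(frame_pixels[mask])
print("accepted pixels in last frame:", int(mask.sum()))
```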
10446 Semantic Enhanced Social Media Sentiments for Stock Market Prediction
Authors: K. Nirmala Devi, V. Murali Bhaskaran
Abstract:
Traditional document representation for classification follows the Bag of Words (BoW) approach to represent term weights. This conventional method uses the Vector Space Model (VSM) to exploit the statistical information of terms in the documents, but it fails to capture the semantic information as well as the order of the terms present in the documents. The phrase-based approach preserves the order of the terms in the documents but still ignores the semantics behind the words. Therefore, a semantic concept based approach is used in this paper to enhance the semantics by incorporating ontology information. In this paper a novel method is proposed to forecast the intraday directional movement of the stock market price based on sentiments from Twitter and Moneycontrol news articles. Stock market forecasting is a very difficult and highly complicated task because it is affected by many factors such as economic conditions, political events and investor sentiment. Stock market series are generally dynamic, nonparametric, noisy and chaotic in nature. Sentiment analysis combined with the wisdom of crowds can automatically compute the collective intelligence of future performance in many areas such as stock markets, box office sales and election outcomes. The proposed method utilizes collective sentiments to predict the directional movements of the stock price. Using the Granger causality test, the collective sentiments from the above social media sources are shown to have strong predictive power for the up/down directional movement of the stock price.
Keywords: Bag of Words, Collective Sentiments, Ontology, Semantic relations, Sentiments, Social media, Stock Prediction, Twitter, Vector Space Model and wisdom of crowds.
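The abstract's final claim rests on the Granger causality test; the snippet below shows how such a test is typically run with statsmodels on a two-column series of daily sentiment scores and stock returns. It is a generic illustration, not the authors' pipeline; the data, lag choice and significance level are assumptions.

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.stattools import grangercausalitytests

rng = np.random.default_rng(7)
n = 300
sentiment = rng.standard_normal(n)                        # assumed daily sentiment score
returns = 0.4 * np.roll(sentiment, 1) + rng.standard_normal(n) * 0.5
returns[0] = 0.0                                           # sentiment leads returns by one day

# statsmodels tests whether the SECOND column Granger-causes the FIRST
data = pd.DataFrame({"returns": returns, "sentiment": sentiment})
results = grangercausalitytests(data[["returns", "sentiment"]], maxlag=3, verbose=False)

for lag, res in results.items():
    p = res[0]["ssr_ftest"][1]
    print(f"lag {lag}: p-value = {p:.4f}", "-> predictive" if p < 0.05 else "")
```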
10445 Survey on Image Mining Using Genetic Algorithm
Authors: Jyoti Dua
Abstract:
One image is worth more than a thousand words, and images, if analyzed, can reveal useful information. Low-level image processing deals with the extraction of specific features from a single image. The question then arises: what technique should be used to extract patterns from very large and detailed image databases? The answer is image mining. Image mining deals with the extraction of image data relationships, implicit knowledge, and other patterns from collections of images or image databases; it is an extension of data mining. In this paper, we not only scrutinize the current techniques of image mining but also present a new technique for mining images using a genetic algorithm.
Keywords: Image Mining, Data Mining, Genetic Algorithm.
10444 Design of Polyetheretherketone Fixation Plates for Fractured Distal Femur
Authors: Abhishek Soni, Bhagat Singh
Abstract:
In the present study, a methodology is proposed to treat a fracture in the distal part of the femur bone. Initially, a bone model was developed using computed tomography scan data of the fractured bone. This information was then used to create a polyether ether ketone (PEEK) implant for the fractured bone. The damaged bone and implant models were assembled, and the assembled model was analyzed for stress distribution; the deformation developed was also measured. It was observed that the stress and deformation developed were not appreciable, which shows that the aforementioned procedure can be suitably adopted for the treatment of a fractured distal femur.
Keywords: Distal femur, fixation plates, PEEK, reverse engineering.
10443 External Effects on Dynamic Competitive Model of Domestic Airline and High Speed Rail
Authors: Shih-Ching Lo, Yu-Ping Liao
Abstract:
Socio-economic variables strongly influence transportation demand. Discrete choice models take socio-economic variables into account to study travelers' mode choice and demand; however, calibrating a discrete choice model requires extensive questionnaire surveys. Therefore, an aggregative model is proposed. Historical data on passenger volumes for high speed rail and domestic civil aviation are employed to calibrate and validate the model. In this study, models with different socio-economic variables, namely oil price, GDP per capita, CPI and economic growth rate, are compared. The results show that the model with the oil price performs better than the models with the other socio-economic variables.
Keywords: forecasting, passenger volume, dynamic competitive model, socio-economic variables, oil price.
10442 Sensor Network Based Emergency Response and Navigation Support Architecture
Authors: Dilusha Weeraddana, Ashanie Gunathillake, Samiru Gayan
Abstract:
In an emergency, combining wireless sensor network data with knowledge gathered from various other information sources and navigation algorithms could help guide people safely to a building exit while avoiding risky areas. This paper presents an emergency response and navigation support architecture for data gathering, knowledge manipulation, and navigational support in an emergency situation. In the normal state, the system monitors the environment. When an emergency event is detected, the system sends messages to first responders and immediately separates the risky areas from the safe areas in order to establish escape paths. The main functionalities of the system include gathering data from a wireless sensor network deployed in a multi-story indoor environment, processing it with information available in a knowledge base, and sharing the decisions made with first responders and people in the building. The proposed architecture acts to reduce the risk of losing human lives by evacuating people much faster and with the least congestion in an emergency environment.
Keywords: Emergency response, Firefighters, Navigation, Wireless sensor network.
10441 Improving the Quality of Transport Management Services with Fuzzy Signatures
Authors: Csaba I. Hencz, István Á. Harmati
Abstract:
Nowadays the significance of road transport is gradually increasing. All transport companies work in the same external environment, where the speed of transport is limited by traffic rules. The main objective is to accelerate the speed of service, and this depends only on the individual abilities of the managing members. These operational control units make decisions quickly, in a typically experiential and/or intuitive way, so supporting these decisions is an important task. Our goal is to create a decision support model based on fuzzy signatures that can assist the work of operational management automatically. If the model parameters are set properly, the management of transport could become more economical and efficient.
Keywords: Freight transport, decision support, information handling, fuzzy methods.
10440 Profitability Assessment of Granite Aggregate Production and the Development of a Profit Assessment Model
Authors: Melodi Mbuyi Mata, Blessing Olamide Taiwo, Afolabi Ayodele David
Abstract:
The purpose of this research is to create empirical models for assessing the profitability of granite aggregate production in aggregate quarries in Akure, Ondo State. In addition, an artificial neural network (ANN) model and multivariate prediction models for granite profitability were developed in the study. A formal survey questionnaire was used to collect data; the data extracted from the case study mine include granite marketing operations, royalty, production costs, and mine production information. The following methods were used to achieve the goal of this study: descriptive statistics, MATLAB 2017, and SPSS 16.0 software for analyzing and modeling the data collected from granite traders in the study areas. The prediction accuracy of the ANN and multivariate regression models was compared using the coefficient of determination (R2), root mean square error (RMSE), and mean square error (MSE). The model evaluation indices revealed that the ANN model, with its lower prediction error, was suitable for predicting the generated profit in a typical quarry. More quarries in Nigeria's southwest region and other geopolitical zones should be considered to improve the ANN prediction accuracy.
Keywords: National development, granite, profitability assessment, ANN models.
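The comparison of the ANN and regression models relies on R2, RMSE and MSE; the short snippet below shows how these indices are typically computed with scikit-learn for two competing sets of predictions. The arrays are made-up placeholders, not the study's data.

```python
import numpy as np
from sklearn.metrics import r2_score, mean_squared_error

y_true = np.array([120.0, 95.0, 150.0, 110.0, 130.0])      # observed profit (placeholder)
y_ann  = np.array([118.0, 97.0, 146.0, 112.0, 128.0])      # ANN predictions (placeholder)
y_reg  = np.array([110.0, 105.0, 140.0, 120.0, 125.0])     # regression predictions (placeholder)

for name, y_pred in [("ANN", y_ann), ("Multivariate regression", y_reg)]:
    mse = mean_squared_error(y_true, y_pred)
    print(f"{name}: R2={r2_score(y_true, y_pred):.3f}  MSE={mse:.2f}  RMSE={np.sqrt(mse):.2f}")
```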
10439 Time Series Forecasting Using Various Deep Learning Models
Authors: Jimeng Shi, Mahek Jain, Giri Narasimhan
Abstract:
Time series forecasting (TSF) is used to predict the target variables at a future time point based on learning from previous time points. To keep the problem tractable, learning methods use data from a fixed-length window in the past as an explicit input. In this paper, we study how the performance of predictive models changes as a function of different look-back window sizes and different amounts of time to predict into the future. We also consider the performance of the recent attention-based transformer models, which have had good success in the image processing and natural language processing domains. In all, we compare four different deep learning methods (Recurrent Neural Network (RNN), Long Short-Term Memory (LSTM), Gated Recurrent Units (GRU), and Transformer) along with a baseline method. The dataset we used is the hourly Beijing Air Quality Dataset from the website of the University of California, Irvine (UCI), which includes a multivariate time series of many factors measured on an hourly basis for a period of 5 years (2010-14). For each model, we also report on the relationship between the performance and the look-back window size and the number of predicted time points into the future. Our experiments suggest that Transformer models have the best performance, with the lowest Mean Absolute Errors (MAE = 14.599, 23.273) and Root Mean Square Errors (RMSE = 23.573, 38.131), for most of our single-step and multi-step predictions. The best look-back window size for predicting 1 hour into the future appears to be one day, while 2 or 4 days perform best for predicting 3 hours into the future.
Keywords: Air quality prediction, deep learning algorithms, time series forecasting, look-back window.
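The paper's central experimental variable is the look-back window; the helper below shows one common way to turn an hourly multivariate series into (window, horizon) training pairs with numpy. It is a generic sketch, not the authors' preprocessing code; the shapes and the single-target assumption are mine.

```python
import numpy as np

def make_windows(series, lookback, horizon, target_col=0):
    """series: (T, F) array of hourly measurements.
    Returns X of shape (N, lookback, F) and y of shape (N,) holding the target
    value `horizon` steps after the end of each window."""
    X, y = [], []
    for start in range(len(series) - lookback - horizon + 1):
        X.append(series[start:start + lookback])
        y.append(series[start + lookback + horizon - 1, target_col])
    return np.asarray(X), np.asarray(y)

# toy usage: 24-hour look-back (one day), predict 1 hour ahead
T, F = 1000, 6                                   # placeholder length and feature count
data = np.random.default_rng(0).standard_normal((T, F))
X, y = make_windows(data, lookback=24, horizon=1)
print(X.shape, y.shape)                          # (976, 24, 6) (976,)
```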
10438 Efficient Method for ECG Compression Using Two Dimensional Multiwavelet Transform
Authors: Morteza Moazami-Goudarzi, Mohammad H. Moradi, Ali Taheri
Abstract:
In this paper we introduce an effective ECG compression algorithm based on the two-dimensional multiwavelet transform. Multiwavelets offer simultaneous orthogonality, symmetry and short support, which is not possible with scalar two-channel wavelet systems. These features are known to be important in signal processing, so multiwavelets offer the possibility of superior performance for image processing applications. The SPIHT algorithm has achieved notable success in still image coding, and we therefore apply it to the 2-D multiwavelet transform of 2-D arranged ECG signals. Experiments on selected records of ECG from the MIT-BIH arrhythmia database revealed that the proposed algorithm is significantly more efficient than previously proposed ECG compression schemes.
Keywords: ECG signal compression, multi-rate processing, 2-D Multiwavelet, Prefiltering.
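To make the "2-D arranged ECG" idea concrete, the sketch below cuts a 1-D ECG record into beats of equal length, stacks them into a matrix, applies a 2-D wavelet decomposition with PyWavelets, and keeps only the largest coefficients. A scalar wavelet and simple hard thresholding stand in here for the paper's multiwavelet transform and SPIHT coder; the signal, wavelet and threshold choices are assumptions.

```python
import numpy as np
import pywt

def ecg_to_matrix(signal, beat_len):
    """Arrange a 1-D ECG signal into a 2-D matrix, one fixed-length beat per row."""
    n_beats = len(signal) // beat_len
    return signal[:n_beats * beat_len].reshape(n_beats, beat_len)

def compress(matrix, wavelet="db4", level=3, keep_ratio=0.05):
    """2-D wavelet transform, keep the largest `keep_ratio` of coefficients, reconstruct."""
    coeffs = pywt.wavedec2(matrix, wavelet, level=level)
    arr, slices = pywt.coeffs_to_array(coeffs)
    thresh = np.quantile(np.abs(arr), 1.0 - keep_ratio)      # hard threshold
    arr = np.where(np.abs(arr) >= thresh, arr, 0.0)
    return pywt.waverec2(pywt.array_to_coeffs(arr, slices, output_format="wavedec2"), wavelet)

ecg = np.sin(np.linspace(0, 200 * np.pi, 20000)) + 0.05 * np.random.randn(20000)  # synthetic ECG
m = ecg_to_matrix(ecg, beat_len=200)
rec = compress(m)[:m.shape[0], :m.shape[1]]
prd = 100 * np.linalg.norm(m - rec) / np.linalg.norm(m)      # percent RMS difference
print(f"PRD = {prd:.2f} %")
```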
10437 Analysis of Control by Flattening of the Welded Tubes
Authors: Hannachi Med Tahar, H. Djebaili, B. Daheche
Abstract:
In this work, we describe the flattening of welded tubes and its experimental application. The test was carried out at the national company for processing flat products and producing tubes. Usually, the final products (tubes) undergo a series of non-destructive weld inspections, online and offline, as well as destructive mechanical tests (bending, flattening, flaring, etc.). For this purpose, and in order to implement the flattening test, which applies to the forming of round tubes into other shapes, four sections of welded tubes were taken: draft tubes (before hot drawing) and finished tubes (after hot drawing and annealing). It was also noted that, to be reported as sound, the flattened tubes must not show any crack or tear. The test is considered failed if it reveals a lack of ductility of the metal.
Keywords: Flattening, destructive testing, tube drafts, finished tube, Castem 2001.
10436 Prioritising the TQM Enablers and IT Resources in the ICT Industry: An AHP Approach
Authors: Suby Khanam, Jamshed Siddiqui, Faisal Talib
Abstract:
Total Quality Management (TQM) is a managerial approach that improves the competitiveness of an industry, while information technology (IT) is introduced alongside TQM, with the support of quality experts, to handle the technical issues involved in fulfilling customers' requirements. The present paper aims to use the Analytic Hierarchy Process (AHP) methodology to prioritise and rank the hierarchy levels of TQM enablers and IT resources together for their successful implementation in the information and communication technology (ICT) industry. A total of 17 TQM enablers (nine) and IT resources (eight) were identified, partitioned into three categories, and prioritised using the AHP approach. The findings indicate that the 17 sub-criteria can be grouped into three main categories, namely organizing, tools and techniques, and culture and people. Further, out of the 17 sub-criteria, three sub-criteria, top management commitment and support, total employee involvement, and continuous improvement, received the highest priority, whereas three sub-criteria, structural equation modelling, culture change, and customer satisfaction, received the lowest priority. The results suggest a hierarchy model for the ICT industry to prioritise the enablers and resources as well as to improve TQM and IT performance. The paper also has managerial implications, suggesting that managers in the ICT industry implement TQM and IT together in their organizations to obtain maximum benefits and to make the best use of available resources. At the end, conclusions, limitations and the future scope of the study are presented.
Keywords: Analytic Hierarchy Process, Information Technology, Information and Communication Technology, Prioritization, Total Quality Management.
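The ranking in this study comes from AHP pairwise comparisons; the snippet below shows the standard computation of priority weights (principal eigenvector) and the consistency ratio for a small pairwise comparison matrix with numpy. The matrix values are invented for illustration, not the study's judgments, and the random-index table is truncated to small matrix sizes.

```python
import numpy as np

RI = {1: 0.0, 2: 0.0, 3: 0.58, 4: 0.90, 5: 1.12}   # Saaty's random consistency index

def ahp_weights(pairwise):
    """Priority weights from the principal eigenvector of a pairwise comparison matrix."""
    vals, vecs = np.linalg.eig(pairwise)
    k = np.argmax(vals.real)
    w = np.abs(vecs[:, k].real)
    w /= w.sum()
    lam_max = vals[k].real
    n = pairwise.shape[0]
    ci = (lam_max - n) / (n - 1)                    # consistency index
    cr = ci / RI[n]                                 # consistency ratio (should be < 0.10)
    return w, cr

# illustrative 3x3 comparison of the three main categories
A = np.array([[1.0, 3.0, 5.0],
              [1/3, 1.0, 2.0],
              [1/5, 1/2, 1.0]])
w, cr = ahp_weights(A)
print("weights:", np.round(w, 3), " CR:", round(cr, 3))
```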
10435 Analytical Model to Predict the Shear Capacity of Reinforced Concrete Beams Externally Strengthened with CFRP Composites Conditions
Authors: Rajai Al-Rousan
Abstract:
This paper presents a proposed analytical model for predicting the shear strength of reinforced concrete beams strengthened with CFRP composites as external reinforcement. The proposed analytical model can predict the shear contribution of the CFRP composites of RC beams with an acceptable coefficient of correlation with the tested results. Compared with the published well-known models (the ACI, Triantafillou, and Colotti models), the ACI model had the widest range, 0.16 to 10.08, for the ratio between the tested and predicted ultimate shears at failure, while the Triantafillou model gave an acceptable range of 0.27 to 2.78 for this ratio. The best prediction of the ultimate shear capacity (the ratio between the tested and predicted values) was observed with the Colotti model, with a range of 0.20 to 1.78. Thus, the contribution of the CFRP composites as external reinforcement can be predicted with high accuracy using the proposed analytical model.
Keywords: Predicting, shear capacity, reinforced concrete, beams, strengthened, externally, CFRP composites.
10434 Bit Model Based Key Management Scheme for Secure Group Communication
Authors: R. Varalakshmi
Abstract:
For the last decade, researchers have focused their interest on multicast group key management frameworks, where the central research challenge is secure and efficient group key distribution. The present paper describes a bit model based secure multicast group key distribution scheme using the most popular absolute encoder output type code, the Gray code. The focus is twofold. The first part deals with reducing the computation complexity, which is achieved in our scheme by performing fewer multiplication operations during the key updating process. To optimize the number of multiplications, we use an O(1) time algorithm for multiplying two N-bit binary numbers on an N x N bit-model of a reconfigurable mesh. The second part aims at reducing the amount of information stored in the group center and the group members while performing the key update operation. A comparative analysis illustrating the performance of various key distribution schemes is presented, and it is observed that the proposed algorithm reduces the computation and storage complexity significantly. The proposed algorithm is suitable for high performance computing environments.
Keywords: Multicast Group key distribution, Bit model, Integer Multiplications, reconfigurable mesh, optimal algorithm, Gray Code, Computation Complexity, Storage Complexity.
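Since the scheme builds on Gray codes, the two standard conversions between binary and Gray code are shown below as a small sketch; this is background illustration only, not the key-update protocol itself.

```python
def binary_to_gray(n: int) -> int:
    """Reflected binary (Gray) code: adjacent integers differ in exactly one bit."""
    return n ^ (n >> 1)

def gray_to_binary(g: int) -> int:
    """Invert the Gray code by folding successively shifted copies back in."""
    n = g
    g >>= 1
    while g:
        n ^= g
        g >>= 1
    return n

# quick check: the mapping is a bijection and neighbours differ in one bit
codes = [binary_to_gray(i) for i in range(8)]
assert all(gray_to_binary(c) == i for i, c in enumerate(codes))
assert all(bin(codes[i] ^ codes[i + 1]).count("1") == 1 for i in range(7))
print([format(c, "03b") for c in codes])   # ['000', '001', '011', '010', '110', '111', '101', '100']
```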
10433 Discovering the Dimension of Abstractness: Structure-Based Model that Learns New Categories and Categorizes on Different Levels of Abstraction
Authors: Georgi I. Petkov, Ivan I. Vankov, Yolina A. Petrova
Abstract:
A structure-based model of category learning and categorization at different levels of abstraction is presented. The model compares different structures and expresses their similarity implicitly in the form of mappings. Based on this similarity, the model either categorizes targets as members of categories that it already has or creates new categories. The model is novel in using two threshold parameters to evaluate the structural correspondence. If the similarity between two structures exceeds the higher threshold, a new sub-ordinate category is created; conversely, if the similarity does not exceed the higher threshold but does exceed the lower one, the model creates a new category at a higher level of abstraction.
Keywords: Analogy-making, categorization, learning of categories, abstraction, hierarchical structure.
10432 Assessment of Path Loss Prediction Models for Wireless Propagation Channels at L-Band Frequency over Different Micro-Cellular Environments of Ekiti State, Southwestern Nigeria
Authors: C. I. Abiodun, S. O. Azi, J. S. Ojo, P. Akinyemi
Abstract:
The design of accurate and reliable mobile communication systems depends largely on the suitability of path loss prediction methods and their adaptability to the various environments of interest. In this research, results on the adaptability of radio channel behavior are presented based on practical measurements carried out in the 1800 MHz frequency band. The measurements were carried out in typical urban, suburban and rural environments in Ekiti State, in the southwestern part of Nigeria. A total of seven base stations of the MTN GSM service located in the studied environments were monitored. Path loss and break-point distances were deduced from the measured received signal strength (RSS), and a practical path loss model is proposed based on the deduced break-point distances. The proposed two-slope model, a regression line and four existing path loss models were compared with the measured path loss values. The standard deviation of each model with respect to the measured path loss was estimated for each base station. The proposed model and the regression line exhibited the lowest standard deviations, followed by the Cost231-Hata model, when compared with the Erceg, Ericsson and SUI models. Generally, the proposed two-slope model shows the closest agreement with the measured values, with mean error values of 2 to 6 dB. These results show that either the proposed two-slope model or the Cost231-Hata model may be used to predict path loss values for mobile micro-cell coverage in the considered environments. Information from this work will be useful for the link design of microwave band wireless access systems in the region.
Keywords: Break-point distances, path loss models, path loss exponent, received signal strength.
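A minimal sketch of the kind of two-slope (break-point) log-distance model the abstract refers to, fitted to measured path loss with numpy least squares: one path loss exponent before the break point and another after it. The synthetic data, reference distance and break point are assumptions, not the paper's measurements.

```python
import numpy as np

def two_slope_model(d, pl0, n1, n2, d0, d_bp):
    """Path loss [dB] at distance d [m] with exponents n1 (d <= d_bp) and n2 (d > d_bp)."""
    d = np.asarray(d, dtype=float)
    pl_bp = pl0 + 10 * n1 * np.log10(d_bp / d0)
    return np.where(d <= d_bp,
                    pl0 + 10 * n1 * np.log10(d / d0),
                    pl_bp + 10 * n2 * np.log10(d / d_bp))

def fit_exponents(d, pl, pl0, d0, d_bp):
    """Least-squares fit of n1 and n2 on the two distance segments separately."""
    near, far = d <= d_bp, d > d_bp
    n1 = np.polyfit(10 * np.log10(d[near] / d0), pl[near] - pl0, 1)[0]
    pl_bp = pl0 + 10 * n1 * np.log10(d_bp / d0)
    n2 = np.polyfit(10 * np.log10(d[far] / d_bp), pl[far] - pl_bp, 1)[0]
    return n1, n2

# synthetic drive-test data around an assumed 300 m break point
rng = np.random.default_rng(3)
d = np.linspace(50, 2000, 200)
pl = two_slope_model(d, pl0=70.0, n1=2.2, n2=3.8, d0=50.0, d_bp=300.0) + rng.normal(0, 3, d.size)
print("fitted exponents:", fit_exponents(d, pl, pl0=70.0, d0=50.0, d_bp=300.0))
```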
10431 A Mean–Variance–Skewness Portfolio Optimization Model
Authors: Kostas Metaxiotis
Abstract:
Portfolio optimization is one of the most important topics in finance. This paper proposes a mean–variance–skewness (MVS) portfolio optimization model. Traditionally, the portfolio optimization problem is solved using the mean–variance (MV) framework. In this study, we formulate the proposed model as a three-objective optimization problem, where the portfolio's expected return and skewness are maximized and the portfolio risk is minimized. For solving the proposed three-objective portfolio optimization model we apply an adapted version of the non-dominated sorting genetic algorithm (NSGA-II). Finally, we use a real dataset from the FTSE-100 for validating the proposed model.
Keywords: Evolutionary algorithms, portfolio optimization, skewness, stock selection.
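The three objectives of the MVS formulation can be written directly from the asset returns: portfolio mean w·μ, variance wᵀΣw, and skewness from the co-skewness tensor. The sketch below computes them with numpy for a random weight vector; it only illustrates the objective functions, not the authors' NSGA-II implementation, and the return data are simulated.

```python
import numpy as np

def mvs_objectives(weights, returns):
    """returns: (T, N) matrix of asset returns, weights: (N,) summing to 1.
    Gives (expected return, variance, standardized skewness) of the portfolio."""
    mu = returns.mean(axis=0)
    centered = returns - mu
    cov = np.cov(returns, rowvar=False)
    # co-skewness tensor M3[i, j, k] = E[(r_i - mu_i)(r_j - mu_j)(r_k - mu_k)]
    m3 = np.einsum("ti,tj,tk->ijk", centered, centered, centered) / returns.shape[0]
    mean_p = weights @ mu
    var_p = weights @ cov @ weights
    skew_p = np.einsum("i,j,k,ijk->", weights, weights, weights, m3) / var_p ** 1.5
    return mean_p, var_p, skew_p

rng = np.random.default_rng(11)
rets = rng.normal(0.0005, 0.01, size=(500, 5))        # 500 days, 5 assets (simulated)
w = rng.random(5); w /= w.sum()                       # a random feasible portfolio
print(mvs_objectives(w, rets))
```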
10430 The Gasoil Hydrofining Kinetics Constants Identification
Authors: C. Patrascioiu, V. Matei, N. Nicolae
Abstract:
The paper describes the experiments and the calculation of the kinetic parameters of gasoil hydrofining. Experimental results are presented for gasoil hydrofining over a Mo catalyst promoted with Ni on an alumina support. The authors have adapted a kinetic model of gasoil hydrofining, and using this model together with the experimental data they have calculated the parameters of the model. The numerical calculation is based on minimizing the difference between the experimental sulfur concentration and the kinetic model estimate.
Keywords: Hydrofining, kinetic, modeling, optimization.
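As an illustration of the parameter identification step, the sketch below fits an assumed n-th order power-law hydrodesulfurization model to synthetic inlet/outlet sulfur concentrations by minimizing the squared residuals with scipy. The rate expression, LHSV values and data are assumptions for demonstration, not the authors' adapted kinetic model.

```python
import numpy as np
from scipy.optimize import least_squares

def outlet_sulfur(params, c_in, lhsv):
    """Assumed n-th order plug-flow model: C_out = [C_in^(1-n) + (n-1) k / LHSV]^(1/(1-n))."""
    k, n = params
    return (c_in ** (1.0 - n) + (n - 1.0) * k / lhsv) ** (1.0 / (1.0 - n))

def residuals(params, c_in, lhsv, c_out_meas):
    return outlet_sulfur(params, c_in, lhsv) - c_out_meas

# synthetic "experiments": one feed, several space velocities, true k=2.0, n=1.5
c_in = 1.2                                    # wt% sulfur in feed (assumed)
lhsv = np.array([0.5, 1.0, 1.5, 2.0, 3.0])    # 1/h
c_out = outlet_sulfur([2.0, 1.5], c_in, lhsv) * (1 + 0.02 * np.random.default_rng(5).standard_normal(5))

fit = least_squares(residuals, x0=[1.0, 1.2], args=(c_in, lhsv, c_out), bounds=([0.01, 1.01], [50, 3]))
print("estimated k, n:", np.round(fit.x, 3))
```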
10429 The Principle Probabilities of Space-Distance Resolution for a Monostatic Radar and Realization in Cylindrical Array
Authors: Anatoly D. Pluzhnikov, Elena N. Pribludova, Alexander G. Ryndyk
Abstract:
In conjunction with the problem of target selection against a clutter background, an analysis is made of the influence of the scanning rate on the spatial-temporal signal structure, the generalized multivariate correlation function and the quality of the resolution as the pulse repetition frequency increases. The possibility of space-distance resolution of objects, which is conditioned by the range-to-angle conversion at an increased scanning rate, is substantiated. Calculations for a real cylindrical array at a high scanning rate are presented. The high scanning rate yields a signal-to-noise improvement of the order of 10 dB for the space-time signal processing.
Keywords: Antenna pattern, array, signal processing, spatial resolution.
10428 The Recreation Technique Model from the Perspective of Environmental Quality Elements
Authors: G. Gradinaru, S. Olteanu
Abstract:
Improvements in the quality of environmental elements could increase the recreational opportunities in a certain area (destination). The recreation demand technique focuses on the choice of certain destinations for recreational purposes. The basic exchange taken into consideration is the one between the satisfaction gained from staying in an area and the value, expressed in money and time, allocated to it. The number of tourists in the area, the duration of their stay and the money spent, including transportation, provide information on how individuals rank the place or certain aspects of the area (such as the quality of the environmental elements). For the statistical analysis of the environmental benefits offered by an area through the recreation demand technique, the following stages are suggested: characterization of the reference area based on the statistical variables considered; and estimation of the environmental benefit by comparing the reference area with other similar areas (having the same environmental characteristics) from the perspective of the statistical variables considered. The comparison model within the recreation technique faces a series of difficulties, which concern the choice of the reference area and the correct conversion of time into money.
Keywords: Comparison in recreation technique, quality of the environmental elements, statistical analysis model.
10427 Reduced Dynamic Time Warping for Handwriting Recognition Based on Multidimensional Time Series of a Novel Pen Device
Authors: Muzaffar Bashir, Jürgen Kempf
Abstract:
The purpose of this paper is to present a dynamic time warping technique which significantly reduces the data processing time and memory size for multidimensional time series sampled by the biometric smart pen device BiSP. The acquisition device is a novel ballpoint pen equipped with a diversity of sensors for monitoring the kinematics and dynamics of handwriting movement. The DTW algorithm has been applied to the time series analysis of five different sensor channels providing pressure, acceleration and tilt data of the pen generated during handwriting on a paper pad. However, the standard DTW has processing time and memory space problems which limit its practical use for online handwriting recognition. To address this problem, DTW has been applied to the sum of the five sensor signals after an adequate down-sampling of the data. Preliminary results have shown that processing time and memory size can be significantly reduced without any deterioration in single character and word recognition performance. Furthermore, excellent recognition accuracy was achieved, which is mainly due to the reduced dynamic time warping (RDTW) technique and the novel pen device BiSP.
Keywords: Biometric character recognition, biometric person authentication, biometric smart pen BiSP, dynamic time warping DTW, online handwriting recognition, multidimensional time series.
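A minimal dynamic programming implementation of DTW, together with the two reductions the abstract describes, summing the sensor channels and down-sampling before alignment, is sketched below with numpy. The decimation factor and the toy signals are assumptions, not BiSP data or the authors' RDTW code.

```python
import numpy as np

def dtw_distance(a, b):
    """Classic O(len(a)*len(b)) dynamic time warping distance between two 1-D sequences."""
    n, m = len(a), len(b)
    D = np.full((n + 1, m + 1), np.inf)
    D[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(a[i - 1] - b[j - 1])
            D[i, j] = cost + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
    return D[n, m]

def reduce_channels(signals, factor=4):
    """Sum the sensor channels and decimate by `factor` to shrink the DTW cost matrix."""
    summed = np.sum(signals, axis=0)
    return summed[::factor]

# toy usage: two 5-channel recordings of the same writing
rng = np.random.default_rng(2)
rec1 = rng.standard_normal((5, 400))
rec2 = rec1 + 0.1 * rng.standard_normal((5, 400))            # a noisy copy
print("RDTW distance:", round(dtw_distance(reduce_channels(rec1), reduce_channels(rec2)), 2))
```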
10426 Relationships between Information Transparency, Corporate Governance and D&O Insurance
Authors: Shu-Lin Lin, Ching-Chien Yang
Abstract:
This study examines the influence of information transparency and corporate governance on decisions to purchase directors and officers liability (D&O) insurance. The results show that companies with greater information transparency have a significant demand for D&O insurance. Greater transparency in voluntary disclosures is significantly and positively associated with the demand for insurance, indicating that increasing the degree of information disclosure reduces information asymmetry for insurers, which stimulates their willingness to provide greater protection. Analysis of the insured and uninsured subsamples indicates that uninsured companies have superior corporate governance compared to insured companies. Although insured companies tend to have weaker corporate governance structures, they appoint Big 4 audit firms or industry experts to compensate for this weakness. The empirical results indicate that purchasing D&O insurance can strengthen external corporate governance and increase companies' willingness to voluntarily provide more transparent information.
Keywords: Directors and officers liability (D&O) insurance, information transparency, corporate governance, Big 4.
10425 Comparing the Durability of Saudi Silica Sands for Use in Foundry Processing
Authors: Mahdi Alsagour, Sam Ramrattan
Abstract:
This paper investigates sands from the Kingdom of Saudi Arabia (KSA) for potential use in the global metal casting industry. Four types of sand were selected for study: two of the sand systems investigated are natural sands from the KSA, the third is a heat-processed synthetic sand, and the last is a commercially available US silica sand that is used as a control in the study. The purpose of this study is to define the durability of the four sand systems selected for foundry usage. Additionally, chemical analysis of the sand systems is presented before and after exposure to elevated temperature. The results show that the Saudi silica sands are durable and can be used in foundry processing.
Keywords: Alternative molding media, foundry sand, reclamation, silica sand, specialty sand.
10424 Progressive AAM Based Robust Face Alignment
Authors: Daehwan Kim, Jaemin Kim, Seongwon Cho, Yongsuk Jang, Sun-Tae Chung, Boo-Gyoun Kim
Abstract:
AAM has been successfully applied to face alignment, but its performance is very sensitive to the initial values. If the initial values are far from the global optimum, there is a good chance that AAM-based face alignment will converge to a local minimum. In this paper, we propose a progressive AAM-based face alignment algorithm which first finds the feature parameter vector fitting the inner facial feature points of the face and then localizes the feature points of the whole face using this information. The proposed algorithm exploits the fact that the feature points of the inner part of the face are less variant and less affected by the background surrounding the face than those of the outer part (such as the chin contour). The proposed algorithm consists of two stages: a modeling and relation derivation stage and a fitting stage. The modeling and relation derivation stage first constructs two AAM models, the inner face AAM model and the whole face AAM model, and then derives the relation matrix between the inner face AAM parameter vector and the whole face AAM parameter vector. In the fitting stage, the proposed algorithm aligns the face progressively in two phases. In the first phase, it finds the feature parameter vector fitting the inner face AAM model to a new input face image; in the second phase, it localizes the whole set of facial feature points of the new input image based on the whole face AAM model, starting from an initial parameter vector estimated from the inner feature parameter vector obtained in the first phase and the relation matrix obtained in the first stage. Experiments verify that the proposed progressive AAM-based face alignment algorithm is more robust with respect to pose, illumination, and face background than the conventional basic AAM-based face alignment algorithm.
Keywords: Face Alignment, AAM, facial feature detection, model matching.
10423 OCR for Script Identification of Hindi (Devnagari) Numerals using Feature Sub Selection by Means of End-Point with Neuro-Memetic Model
Authors: Banashree N. P., R. Vasanta
Abstract:
Recognition of Indian language scripts is a challenging problem. In optical character recognition (OCR), a character or symbol to be recognized can be a machine-printed or handwritten character/numeral. There are several approaches that deal with the problem of recognizing numerals/characters, depending on the type of feature extracted and the way the features are extracted. This paper proposes a recognition scheme for handwritten Hindi (Devnagari) numerals, one of the most widely used numeral scripts in the Indian subcontinent. Our work is focused on a global feature extraction approach using end-point information, which is extracted from images of isolated numerals. These feature vectors are fed to a neuro-memetic model [18] that has been trained to recognize Hindi numerals. The prototype of the system has been tested on a variety of numeral images. In the proposed scheme, data sets are fed to the neuro-memetic algorithm, which identifies the rule with the highest fitness value (nearly 100%); the template associated with this rule is the identified numeral. Experimental results show that the recognition rate is 92-97%, compared with other models.
Keywords: OCR, Global Feature, End-Points, Neuro-Memetic model.
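The end-point features the abstract relies on can be illustrated with a small sketch: skeletonize a binary numeral image and mark skeleton pixels that have exactly one 8-connected neighbour as end points. This is a generic illustration using scikit-image and scipy, not the authors' feature extractor; the test image is synthetic.

```python
import numpy as np
from scipy.ndimage import convolve
from skimage.morphology import skeletonize

def end_points(binary_img):
    """Return the (row, col) coordinates of skeleton end points of a binary numeral image."""
    skel = skeletonize(binary_img > 0)
    kernel = np.ones((3, 3), dtype=int)
    kernel[1, 1] = 0                                  # count only the 8 neighbours
    neighbours = convolve(skel.astype(int), kernel, mode="constant")
    return np.argwhere(skel & (neighbours == 1))      # skeleton pixels with a single neighbour

# synthetic "1"-like stroke: a vertical bar has two end points (top and bottom)
img = np.zeros((32, 32), dtype=np.uint8)
img[4:28, 15:18] = 1
print(end_points(img))
```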