Search results for: training algorithms
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 2375

455 Information Technologies in Human Resources Management - Selected Examples

Authors: A. Karasek

Abstract:

The rapid growth of Information Technologies (IT) has had a huge influence on enterprises and has contributed to their increasingly extensive use of IT. Information Technologies have to a large extent determined the processes taking place in an enterprise; what is more, IT development has brought the need to adopt a brand new approach to human resources management in an enterprise. The use of IT in human resource management (HRM) is of high importance due to the growing role of information and information technologies. The aim of this paper is to evaluate the use of information technologies in human resources management in enterprises. These practices are presented in the following areas: recruitment and selection, development and training, employee assessment, motivation, talent management, and personnel services. The results of the survey conducted show a diversity of solutions applied in particular areas of human resource management. Further development in this area should be expected in the future, as well as integration of individual HRM areas, growth of mobile-enabled HR processes, and their transfer into the cloud. The IT solutions applied in HRM presented here are highly innovative, which is of great significance due to their possible implementation in other enterprises.

Keywords: E-HR, human resources management, HRM practices, HRMS, information technologies.

454 Enhancing Multi-Frame Images Using Self-Delaying Dynamic Networks

Authors: Lewis E. Hibell, Honghai Liu, David J. Brown

Abstract:

This paper presents the use of a newly created network structure known as a Self-Delaying Dynamic Network (SDN) to create a high-resolution image from a set of time-stepped input frames. SDNs are non-recurrent temporal neural networks which can process time-sampled data; they can store input data over a lifecycle and feature dynamic, logic-based connections between layers. Several low-resolution images and one high-resolution image of a scene were presented to the SDN during training by a Genetic Algorithm. The SDN was trained to process the input frames so as to recreate the high-resolution image. The trained SDN was then used to enhance a number of unseen, noisy image sets. The quality of the high-resolution images produced by the SDN is compared to that of high-resolution images generated using bi-cubic interpolation, and the SDN-produced images prove superior in several respects.

Keywords: Image Enhancement, Neural Networks, Multi-Frame.

453 The Appraisal of Construction Sites Productivity: In Kendall’s Concordance

Authors: Abdulkadir Abu Lawal

Abstract:

Owing to the dearth of reliable cardinal (numerical) data, linked phenomena in productivity indices, such as operational costs and company turnover, could not be investigated; such data would not give insight into the root of productivity problems at individual sites. Ordinal ranking by the professionals most directly involved with construction sites was therefore used for Kendall's concordance analysis. Responses gathered from independent architects, builders/engineers, and quantity surveyors were analyzed. The responses addressed factors that affect site productivity, categorized as head office factors, resource management effectiveness factors, motivational factors, and training/skill development factors. It was found that productivity is low and has to be improved in order to facilitate Nigeria's efforts to bridge its infrastructure deficit. The significance of this work is underlined by a Kendall's coefficient of concordance of 0.78, and remedial measures must be emphasized to stimulate better productivity. Further detailed study could apply fuzzy logic analysis to a wider Delphi survey.
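For reference, Kendall's coefficient of concordance can be computed directly from a matrix of ordinal rankings, as in the sketch below; the three raters and five factors are invented placeholders, not the survey's actual respondents or factor list.

```python
import numpy as np

def kendalls_w(ranks: np.ndarray) -> float:
    """Kendall's coefficient of concordance.

    ranks: (m, n) array, m raters each ranking the same n items 1..n.
    Returns W in [0, 1]; 1 means perfect agreement among raters.
    """
    m, n = ranks.shape
    rank_sums = ranks.sum(axis=0)                    # column totals R_j
    s = ((rank_sums - rank_sums.mean()) ** 2).sum()  # squared deviations
    return 12.0 * s / (m ** 2 * (n ** 3 - n))

# Three hypothetical raters ranking five productivity factors.
ranks = np.array([[1, 2, 3, 4, 5],
                  [2, 1, 3, 4, 5],
                  [1, 3, 2, 4, 5]])
print(f"W = {kendalls_w(ranks):.2f}")  # values near the paper's 0.78 indicate strong agreement
```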

Keywords: Factors, Kendall’s coefficient of concordance, magnitude of agreement, percentage magnitude of dichotomy, ranking variables.

452 Holistic Face Recognition using Multivariate Approximation, Genetic Algorithms and AdaBoost Classifier: Preliminary Results

Authors: C. Villegas-Quezada, J. Climent

Abstract:

Several works on facial recognition have dealt with methods which identify isolated characteristics of the face or with templates which encompass several regions of it. In this paper, a new technique which approaches the problem holistically, dispensing with the need to identify geometrical characteristics or regions of the face, is introduced. The characterization of a face is achieved by randomly sampling selected attributes of the pixels of its image. From this information a data set is constructed, corresponding to the values of low frequencies, gradient, entropy, and several other pixel characteristics of the image, generating a set of p variables. The multivariate data set is then approximated with polynomials minimizing the data fitting error in the minimax sense (L∞ norm). The use of a Genetic Algorithm (GA) makes it possible to circumvent the problem of dimensionality inherent to higher-degree polynomial approximations. The GA yields the degree and the coefficient values of the polynomials approximating the image of a face. The system is trained by finding, through a resampling process, a family of characteristic polynomials in several variables (pixel characteristics) for each face Fi in the database. A face F is recognized by finding its characteristic polynomials and using an AdaBoost classifier to compare F's polynomials to each of the Fi's polynomials; the winner is the polynomial family closest to F's, corresponding to the target face in the database.

Keywords: AdaBoost Classifier, Holistic Face Recognition, Minimax Multivariate Approximation, Genetic Algorithm.

451 A 10 Giga VPN Accelerator Board for Trust Channel Security System

Authors: Ki Hyun Kim, Jang-Hee Yoo, Kyo Il Chung

Abstract:

This paper proposes a VPN Accelerator Board (VPN-AB), a virtual private network (VPN) accelerator designed for a trust channel security system (TCSS). TCSS supports a secure communication channel between security nodes on the Internet. It furnishes authentication, confidentiality, integrity, and access control to security nodes transmitting data packets with the IPsec protocol. TCSS consists of an internet key exchange block, a security association block, and an IPsec engine block. The internet key exchange block negotiates the cryptographic algorithm and key used in the IPsec engine block. The security association block sets up and manages security association information. The IPsec engine block processes IPsec packets and provides the networking functions for communication. The IPsec engine block should be implemented in hardware with in-line mode processing for high-speed IPsec handling. Our VPN-AB is implemented with a high-speed security processor that supports many cryptographic algorithms and in-line mode operation. We set up a small TCSS communication environment and measured the performance of the VPN-AB in it. The experimental results show that the VPN-AB achieves a maximum throughput of 15.645 Gbps with the IPsec protocol configured in 3DES-HMAC-MD5 tunnel mode.
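As a rough software illustration of the integrity half of the 3DES-HMAC-MD5 configuration used in the throughput test, the sketch below authenticates an ESP-style payload with HMAC-MD5-96 from the Python standard library; the 3DES encryption step, IPsec framing, and key exchange are all omitted, and the key and payload are made-up values.

```python
import hmac
import hashlib
import os

# Hypothetical 16-byte authentication key; real IPsec keys come from IKE.
auth_key = os.urandom(16)
payload = b"example ESP payload"

# Sender: append a truncated HMAC-MD5 tag (IPsec uses HMAC-MD5-96,
# i.e. the first 12 bytes of the 16-byte digest).
tag = hmac.new(auth_key, payload, hashlib.md5).digest()[:12]
packet = payload + tag

# Receiver: recompute the tag and compare in constant time.
received_payload, received_tag = packet[:-12], packet[-12:]
expected = hmac.new(auth_key, received_payload, hashlib.md5).digest()[:12]
print("authentic:", hmac.compare_digest(received_tag, expected))
```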

Keywords: TCSS (Trust Channel Security System), VPN (Virtual Private Network), IPsec, SSL, security processor, security communication.

450 Traffic Forecasting for Open Radio Access Networks Virtualized Network Functions in 5G Networks

Authors: Khalid Ali, Manar Jammal

Abstract:

In order to meet the stringent latency and reliability requirements of the upcoming 5G networks, Open Radio Access Networks (O-RAN) have been proposed. The virtualization of O-RAN has allowed it to be treated as a Network Function Virtualization (NFV) architecture, while its components are considered Virtualized Network Functions (VNFs). Hence, intelligent Machine Learning (ML) based solutions can be utilized to apply different resource management and allocation techniques to O-RAN. However, intelligently allocating resources for O-RAN VNFs can prove challenging due to the dynamicity of traffic in mobile networks. Network providers need to dynamically scale the allocated resources in response to the incoming traffic. Elastically allocating resources can provide a higher level of flexibility in the network, in addition to reducing the OPerational EXpenditure (OPEX) and increasing resource utilization. Most existing elastic solutions are reactive in nature, despite the fact that proactive approaches are more agile, since they scale instances ahead of time by predicting the incoming traffic. In this work, we propose and evaluate traffic forecasting models based on ML algorithms that predict future O-RAN traffic from previous traffic data. Detailed analysis of the traffic data was carried out to validate the quality and applicability of the traffic dataset. Two ML models were proposed and evaluated based on their prediction capabilities.
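A minimal sketch of one of the forecasting approaches named in the keywords, fitting a statsmodels ARIMA model to synthetic traffic; the order (2, 1, 2) and the sinusoidal series are stand-ins, not the paper's fitted model or dataset.

```python
import numpy as np
from statsmodels.tsa.arima.model import ARIMA

rng = np.random.default_rng(0)
# Synthetic "traffic" series: daily cycle plus noise (stand-in for O-RAN data).
t = np.arange(500)
traffic = 100 + 20 * np.sin(2 * np.pi * t / 24) + rng.normal(0, 3, t.size)

train, test = traffic[:480], traffic[480:]
model = ARIMA(train, order=(2, 1, 2)).fit()   # (p, d, q) chosen arbitrarily here
forecast = model.forecast(steps=len(test))

mae = np.abs(forecast - test).mean()
print(f"20-step forecast MAE: {mae:.2f}")
```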

Keywords: O-RAN, traffic forecasting, NFV, ARIMA, LSTM, elasticity.

449 Deep Reinforcement Learning for Optimal Decision-making in Supply Chains

Authors: Nitin Singh, Meng Ling, Talha Ahmed, Tianxia Zhao, Reinier van de Pol

Abstract:

We propose the use of Reinforcement Learning (RL) as a viable alternative for optimizing supply chain management, particularly in scenarios with stochasticity in product demands. RL's adaptability to changing conditions and its demonstrated success in diverse fields of sequential decision-making make it a promising candidate for addressing supply chain problems. We investigate the impact of demand fluctuations in a multi-product supply chain system and develop RL agents with learned, generalizable policies. We provide experimentation details for training the RL agents and a statistical analysis of the results. We study the generalization ability of the RL agents under different demand uncertainty scenarios and observe superior performance compared to agents trained with fixed demand curves. The proposed methodology has the potential to lead to cost reduction and increased profit for companies dealing with frequent inventory movement between supply and demand nodes.
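To make the setting concrete, here is a toy single-product inventory environment with stochastic demand in the reset/step form RL agents are trained against; the cost coefficients, demand distribution, and baseline policy are invented for illustration and are not the authors' system.

```python
import numpy as np

class InventoryEnv:
    """Toy single-product inventory control with stochastic demand."""

    def __init__(self, capacity=100, hold_cost=1.0, stockout_cost=5.0, seed=0):
        self.capacity = capacity
        self.hold_cost = hold_cost
        self.stockout_cost = stockout_cost
        self.rng = np.random.default_rng(seed)
        self.stock = 0

    def reset(self):
        self.stock = self.capacity // 2
        return self.stock

    def step(self, order_qty):
        self.stock = min(self.stock + order_qty, self.capacity)
        demand = self.rng.poisson(20)              # stochastic demand
        sold = min(self.stock, demand)
        self.stock -= sold
        # Negative cost as reward: holding cost plus penalty for unmet demand.
        reward = -(self.hold_cost * self.stock + self.stockout_cost * (demand - sold))
        return self.stock, reward

# A fixed order-up-to policy as a baseline an RL agent would need to beat.
env = InventoryEnv()
state, total = env.reset(), 0.0
for _ in range(365):
    state, r = env.step(max(0, 40 - state))       # order up to 40 units
    total += r
print(f"annual cost of baseline policy: {-total:.0f}")
```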

Keywords: Inventory Management, Reinforcement Learning, Supply Chain Optimization, Uncertainty.

448 The Change in Management Accounting from an Institutional and Contingency Perspective: A Case Study for a Romanian Company

Authors: Gabriel Jinga, Madalina Dumitru

Abstract:

The objective of this paper is to present the process of change in management accounting in Romania, a former communist country in Eastern Europe. In order to explain this process, we use contingency and institutional theories. We focus on the following directions: the presentation of the scientific context and motivation of this research, and the case study. We present the state of the art of the process of change in management accounting from international and national perspectives, and we describe the evolution of management accounting in Romania in the context of economic and political changes. An important moment was the fall of communism in 1989, which represents a starting point for a new economic environment and for new management accounting. Accordingly, we developed a case study which presents this evolution. The conclusion of our research is that the changes in the management accounting system of the company analysed occurred at the same time as the institutionalisation of certain elements (e.g. degree of competition, training and competencies in management accounting). The management accounting system was shaped by the contingencies specific to this company (e.g. environment, industry, strategy).

Keywords: Management accounting, change, Romania, contingency and institutional theory.

447 Interactive Shadow Play Animation System

Authors: Bo Wan, Xiu Wen, Lingling An, Xiaoling Ding

Abstract:

The paper describes a Chinese shadow play animation system based on Kinect. Users, without any professional training, can personally manipulate the shadow characters to complete a shadow play performance with their body movements and, if they wish, obtain a video of the performance by giving the record command to our system. In our system, Kinect is responsible for capturing human movement and voice command data. A gesture recognition module is used to control the changes of the shadow play scenes. After packaging the data from Kinect and the recognition result from the gesture recognition module, VRPN transmits them to the server side. Finally, the server side uses this information to control the motion of the shadow characters and the video recording. This system not only achieves human-computer interaction, but also realizes interaction between people. It offers users an entertaining experience and is easy to operate for all ages. More importantly, by building on the traditions of Chinese shadow play, the application contributes to the preservation of this art form.

Keywords: Gesture recognition, Kinect, shadow play animation, VRPN.

446 A Microcontroller Implementation of Constrained Model Predictive Control

Authors: Amira Kheriji Abbes, Faouzi Bouani, Mekki Ksouri

Abstract:

Model Predictive Control (MPC) is an established control technique in a wide range of process industries. The reason for this success is its ability to handle multivariable systems and systems having input, output or state constraints. Nevertheless, compared to the PID controller, implementation of MPC in miniaturized devices such as Field Programmable Gate Arrays (FPGAs) and microcontrollers has historically been very limited, due to its implementation complexity and computation time requirements. At the same time, such embedded technologies have become an enabler for future manufacturing enterprises as well as a transformer of organizations and markets. In this work, we take advantage of recent advances in this area to deploy one of the most studied and applied control techniques in industrial engineering. We propose an efficient firmware for the implementation of constrained MPC on an STM32 microcontroller using the interior point method. A performance study shows good execution speed and low computational burden. These results encourage the development of predictive control algorithms to be programmed into standard industrial processes. A PID anti-windup controller was also implemented on the STM32 in order to make a performance comparison with the MPC. The main features of the proposed constrained MPC framework are illustrated through two examples.
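The sketch below illustrates the kind of computation such firmware performs at each sampling instant: a finite-horizon constrained MPC problem for a toy plant, solved here with a general-purpose optimizer rather than the authors' interior point routine; the plant matrices, horizon, and input bounds are illustrative assumptions.

```python
import numpy as np
from scipy.optimize import minimize

# Toy discrete-time double integrator: x = [position, velocity].
A = np.array([[1.0, 0.1], [0.0, 1.0]])
B = np.array([[0.005], [0.1]])
N = 10                      # prediction horizon
x0 = np.array([1.0, 0.0])   # initial state, to be driven to the origin

def cost(u_seq):
    x, j = x0.copy(), 0.0
    for u in u_seq:
        x = A @ x + B.flatten() * u
        j += x @ x + 0.1 * u * u        # quadratic state and input penalty
    return j

# Input constraint |u| <= 0.5, handled as bounds on the decision vector.
res = minimize(cost, np.zeros(N), bounds=[(-0.5, 0.5)] * N, method="SLSQP")
u_now = res.x[0]            # receding horizon: apply only the first input
print(f"first control move: {u_now:.4f}")
```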

Keywords: Embedded software, microcontroller, constrained Model Predictive Control, interior point method, PID anti-windup, Keil tool, C/Cµ language.

445 Radial Basis Surrogate Model Integrated to Evolutionary Algorithm for Solving Computation Intensive Black-Box Problems

Authors: Abdulbaset Saad, Adel Younis, Zuomin Dong

Abstract:

For design optimization with high-dimensional, expensive problems, an effective and efficient optimization methodology is desired. This work proposes a series of modifications to the Differential Evolution (DE) algorithm for solving computation-intensive black-box problems. The proposed methodology, called Radial Basis Function Meta-Model Assisted Differential Evolution (RBF-DE), is a global optimization algorithm based on meta-modeling techniques. The Radial Basis Function (RBF) model is used as a surrogate to approximate the expensive objective function, while DE employs a mechanism to dynamically select the best-performing combination of parameters, such as differential rate, crossover probability, and population size. The proposed algorithm is tested on benchmark functions and real-life practical applications. The test results demonstrate that the proposed algorithm is promising and performs well compared to other optimization algorithms, converging to acceptable solutions in terms of accuracy, number of evaluations, and time needed to converge.
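A compact sketch of the surrogate-assisted idea (not the authors' RBF-DE code): fit a SciPy RBF interpolator to a small budget of expensive evaluations, then let differential evolution search the cheap surrogate. The test function and sample budget are arbitrary.

```python
import numpy as np
from scipy.interpolate import RBFInterpolator
from scipy.optimize import differential_evolution

def expensive(x):
    """Stand-in for a costly black-box objective (2-D sphere function)."""
    return float(np.sum(np.asarray(x) ** 2))

# 1) Sample the expensive function a limited number of times.
rng = np.random.default_rng(1)
X = rng.uniform(-5, 5, size=(40, 2))
y = np.array([expensive(x) for x in X])

# 2) Build a cheap radial basis surrogate of the objective.
surrogate = RBFInterpolator(X, y, kernel="thin_plate_spline")

# 3) Run DE on the surrogate instead of the expensive function.
res = differential_evolution(lambda x: float(surrogate(x.reshape(1, -1))[0]),
                             bounds=[(-5, 5), (-5, 5)], seed=1)
print("surrogate optimum:", res.x, "true value there:", expensive(res.x))
```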

Keywords: Differential evolution, engineering design, expensive computations, meta-modeling, radial basis function, optimization.

444 Threshold Concepts in TESOL: A Thematic Analysis of Disciplinary Guiding Principles

Authors: Neil Morgan

Abstract:

The notion of Threshold Concepts has offered a fertile new perspective on the transformative effects of mastery of particular concepts on student understanding of subject matter and their developing identities as inductees into disciplinary discourse communities. Only by successfully traversing essential knowledge thresholds can neophytes achieve the more sophisticated understandings of subject matter possessed by mature members of a discipline. This paper uses thematic analysis of disciplinary guiding principles to identify nine candidate Threshold Concepts that appear to underpin effective TESOL practice. The relationship between these candidate TESOL Threshold Concepts, TESOL principles, and TESOL instructional techniques appears to be amenable to a schematic representation based on superordinate categories of TESOL practitioner concern and, as such, offers an alternative to the view of Threshold Concepts as a privileged subset of disciplinary core concepts. The paper concludes by exploring the potential of a Threshold Concepts framework to productively inform TESOL initial teacher education (ITE) and in-service education and training (INSET).

Keywords: TESOL, threshold concepts, TESOL principles, TESOL ITE/INSET, community of practice.

443 Motion Prediction and Motion Vector Cost Reduction during Fast Block Motion Estimation in MCTF

Authors: Karunakar A K, Manohara Pai M M

Abstract:

In the 3D wavelet video coding framework, temporal filtering is done along the trajectory of motion using Motion Compensated Temporal Filtering (MCTF). Hence, a computationally efficient motion estimation technique is essential for MCTF. In this paper, a predictive technique is proposed to reduce the computational complexity of the MCTF framework by exploiting the high correlation among the frames in a Group of Pictures (GOP). The proposed technique applies the coarse and fine searches of any fast block-based motion estimation algorithm only to the first pair of frames in a GOP. The generated motion vectors are supplied to the subsequent frames, even at subsequent temporal levels, and only a fine search is carried out around those predicted motion vectors. Hence, the coarse search is skipped for all motion estimation in a GOP except for the first pair of frames. The technique has been tested with different fast block-based motion estimation algorithms over standard test sequences using MC-EZBC, a state-of-the-art scalable video coder. The simulation results reveal a substantial reduction (20.75% to 38.24%) in the number of search points during motion estimation, without compromising the quality of the reconstructed video compared to non-predictive techniques. Since the motion vectors of every frame pair in a GOP except the first lie within ±1 of the motion vectors of the previous pair, the number of bits required for motion vectors is also reduced by 50%.
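A simplified sketch of the prediction step follows: a wide SAD search for the first frame pair, then only a ±1 refinement around the inherited position for later frames; the block size, search range, and random frames are illustrative, and the actual MCTF lifting structure is not reproduced.

```python
import numpy as np

def sad(block, ref, y, x):
    h, w = block.shape
    return np.abs(block - ref[y:y + h, x:x + w]).sum()

def search(block, ref, cy, cx, radius):
    """Return the best matching block position within +/-radius of (cy, cx)."""
    best, best_pos = np.inf, (cy, cx)
    for dy in range(-radius, radius + 1):
        for dx in range(-radius, radius + 1):
            y, x = cy + dy, cx + dx
            if 0 <= y <= ref.shape[0] - block.shape[0] and \
               0 <= x <= ref.shape[1] - block.shape[1]:
                cost = sad(block, ref, y, x)
                if cost < best:
                    best, best_pos = cost, (y, x)
    return best_pos

rng = np.random.default_rng(0)
frames = rng.integers(0, 256, (4, 64, 64)).astype(float)
by, bx, bs = 16, 16, 8
block = frames[1][by:by + bs, bx:bx + bs]

# First frame pair of the GOP: full-range search (coarse + fine combined here).
pos = search(block, frames[0], by, bx, radius=7)
# Later pairs: only a +/-1 fine search around the inherited position.
for ref in frames[2:]:
    pos = search(block, ref, pos[0], pos[1], radius=1)
print("final matched block position:", pos)
```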

Keywords: Motion Compensated Temporal Filtering, predictive motion estimation, lifted wavelet transform, motion vector.

442 Implementation of a Multimodal Biometrics Recognition System with Combined Palm Print and Iris Features

Authors: Rabab M. Ramadan, Elaraby A. Elgallad

Abstract:

With extensive application, the performance of unimodal biometric systems has to face a diversity of problems such as signal and background noise, distortion, and environmental differences. Therefore, multimodal biometric systems have been proposed to solve these problems. This paper introduces a bimodal biometric recognition system based on features extracted from the human palm print and iris. Palm print biometrics is a fairly new, evolving technology that identifies people by their palm features. The iris is a strong competitor, together with the face and fingerprints, for presence in multimodal recognition systems. In this research, we introduce an algorithm that combines the palm print and iris features extracted using a texture-based descriptor, the Scale Invariant Feature Transform (SIFT). Since the feature sets are non-homogeneous, as features of different biometric modalities are used, these features are concatenated to form a single feature vector. Particle swarm optimization (PSO) is used as a feature selection technique to reduce the dimensionality of the feature vector. The proposed algorithm is applied to the Indian Institute of Technology Delhi (IITD) database and its performance is compared with various iris recognition algorithms found in the literature.

Keywords: Iris recognition, particle swarm optimization, feature extraction, feature selection, palm print, scale invariant feature transform.

441 Incidence of Disasters and Coping Mechanism among Farming Households in South West Nigeria

Authors: Fawehinmi Olabisi Alaba, Adeniyi O. R.

Abstract:

Farming households face many disasters, which contribute to endemic poverty; anticipated increases in extreme weather events will exacerbate this. Primary data were collected from farming households using a multi-stage random sampling technique. The results of the analysis show that the majority of the respondents (69.9%) are male, with mean household size, years of formal education, and age of 5±1.14, 6±3.41, and 51.06±10.43, respectively. The most commonly experienced type of disaster (48.9%) is flooding. The major coping mechanism adopted is seeking support from family and friends. Age, education, experience, access to extension agents, and mitigation control methods contribute significantly to vulnerability to disaster. The major adaptation method (62.3%) is the construction of drainage.

The study revealed that the coping mechanisms employed may become less effective as increasingly fragile livelihood systems struggle to withstand disaster shocks. Thus, there is a need to train farmers on adaptation measures that mitigate the shocks from disasters.

Keywords: Adaptation, Disasters, Flooding, Vulnerability.

440 A Novel Approach for Coin Identification using Eigenvalues of Covariance Matrix, Hough Transform and Raster Scan Algorithms

Authors: J. Prakash, K. Rajesh

Abstract:

In this paper we present a new method for coin identification. The proposed method adopts a hybrid scheme using eigenvalues of the covariance matrix, the Circular Hough Transform (CHT), and Bresenham's circle algorithm. The statistical and geometrical properties of the small and large eigenvalues of the covariance matrix of a set of edge pixels over a connected region of support are explored for the purpose of circular object detection. A sparse matrix technique is used to perform the CHT; since sparse matrices squeeze out zero elements and contain only a small number of non-zero elements, they provide an advantage in matrix storage space and computational time. A neighborhood suppression scheme is used to find the valid Hough peaks. The accurate position of the circumference pixels is identified using a raster scan algorithm, which exploits geometrical symmetry. After finding the circular objects, the proposed method uses the texture on the surface of the coins, called textons, which are unique properties of coins and refer to the fundamental micro-structures in generic natural images. The method has been tested on several real-world images, including coin and non-coin images, and its performance is also evaluated based on its noise-withstanding capability.
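The circle detection stage can be prototyped with OpenCV's built-in circular Hough transform, as sketched below; the parameter values and image path are placeholders rather than the paper's tuned settings, and the eigenvalue and raster scan stages are omitted.

```python
import cv2
import numpy as np

img = cv2.imread("coins.jpg", cv2.IMREAD_GRAYSCALE)  # hypothetical input image
img = cv2.medianBlur(img, 5)                         # suppress noise before the CHT

circles = cv2.HoughCircles(img, cv2.HOUGH_GRADIENT, dp=1, minDist=40,
                           param1=100, param2=30, minRadius=15, maxRadius=80)

if circles is not None:
    for x, y, r in np.round(circles[0]).astype(int):
        print(f"coin candidate at ({x}, {y}), radius {r}px")
```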

Keywords: Circular Hough Transform, Coin detection, Covariance matrix, Eigenvalues, Raster scan Algorithm, Texton.

439 Production Planning for Animal Food Industry under Demand Uncertainty

Authors: Pirom Thangchitpianpol, Suttipong Jumroonrut

Abstract:

This research investigates the distribution of demand for animal food and the optimal amount of food production at minimum cost. The data consist of customer purchase orders for laying hen food, the price of laying hen food, the cost per unit of food inventory, and costs incurred when the food is out of stock, such as fines, overtime, and urgent material purchases. They were collected from January 1990 to December 2013 from a factory in Nakhon Ratchasima province. The collected data are analyzed in order to explore the distribution of the monthly food demand for laying hens and the inventory rate per unit. The results are used in a stochastic linear programming model for aggregate planning, from which the optimal production, or minimum cost, can be obtained. Programming algorithms in MATLAB and tools in Linprog software are used to obtain the solution; the fitted distribution of food demand for laying hens and random numbers drawn from it are used in the model. The study shows that the monthly food demand for laying hens follows a normal distribution, from which the average monthly production amounts (unit: 30 kg) from January to December were derived. The minimum average total cost over 12 months is Baht 62,329,181.77; the production plan can therefore reduce cost by 14.64% compared with the actual cost.
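A minimal deterministic slice of such a model can be written with scipy.optimize.linprog: choose monthly production to meet demand at minimum production-plus-holding cost. The costs, demands, and capacity below are invented; the paper's full model is stochastic and solved with MATLAB's Linprog tools.

```python
import numpy as np
from scipy.optimize import linprog

months = 6
prod_cost = np.full(months, 10.0)      # cost per unit produced (made-up)
hold_cost = np.full(months, 1.5)       # cost per unit carried to next month
demand = np.array([90, 110, 95, 120, 100, 105], dtype=float)

# Decision variables: [production p_1..p_T, inventory i_1..i_T].
c = np.concatenate([prod_cost, hold_cost])

# Inventory balance: i_t - i_{t-1} - p_t = -d_t  (with i_0 = 0).
A_eq = np.zeros((months, 2 * months))
for t in range(months):
    A_eq[t, t] = -1.0                  # -p_t
    A_eq[t, months + t] = 1.0          # +i_t
    if t > 0:
        A_eq[t, months + t - 1] = -1.0 # -i_{t-1}
b_eq = -demand

res = linprog(c, A_eq=A_eq, b_eq=b_eq, bounds=[(0, 150)] * (2 * months))
print("monthly production:", np.round(res.x[:months], 1))
print("total cost:", round(res.fun, 2))
```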

Keywords: Animal food, Stochastic linear programming, Production planning, Demand Uncertainty.

438 Accurate Visualization of Graphs of Functions of Two Real Variables

Authors: Zeitoun D. G., Thierry Dana-Picard

Abstract:

The study of a real function of two real variables can be supported by visualization using a Computer Algebra System (CAS). One limitation of such systems stems from the algorithms implemented, which yield continuous approximations of the given function by interpolation; this often masks discontinuities of the function and can produce strange plots that are not compatible with the mathematics. In recent years, point-based geometry has gained increasing attention as an alternative surface representation, both for efficient rendering and for flexible geometry processing of complex surfaces. In this paper we present the different artifacts created by mesh surfaces near discontinuities and propose a point-based method that controls and reduces these artifacts. A least squares penalty method for automatic generation of a mesh that controls the behavior of the chosen function is presented. The special feature of this method is its ability to improve the accuracy of the surface visualization near a set of interior points where the function may be discontinuous. The method is formulated as a minimax problem, and the non-uniform mesh is generated using an iterative algorithm. Results show that for large, poorly conditioned matrices, the new algorithm gives more accurate results than the classical preconditioned conjugate gradient algorithm.

Keywords: Function singularities, mesh generation, point allocation, visualization, collocation least squares method, Augmented Lagrangian method, Uzawa's algorithm, preconditioned conjugate gradient method.

437 Land Use Change Detection Using Remote Sensing and GIS

Authors: Naser Ahmadi Sani, Karim Solaimani, Lida Razaghnia, Jalal Zandi

Abstract:

In recent decades, rapid and inappropriate changes in land use have been associated with consequences such as natural resource degradation and environmental pollution. Detecting changes in land use is one of the tools for natural resource management and for assessing changes in ecosystems. The aim of this research is to study the land-use changes in the Haraz basin, an area of 677,000 hectares, over a 15-year period (1996 to 2011) using LANDSAT data. The quality of the images was first evaluated, and various enhancement methods for creating synthetic bands were used in the analysis. Separate training sites were selected for each image, and the images of each period were then classified into 9 classes using the supervised classification method with the maximum likelihood algorithm. Finally, the changes were extracted in a GIS environment. The results showed that these changes are an alarming signal for the future status of the Haraz basin: 27% of the area has changed, involving the conversion of rangeland to bare land and dry farming, and of dense forest to sparse forest, horticulture, farmland, and residential areas.
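For Gaussian class models, the maximum likelihood decision rule used in such supervised classification reduces to comparing class log-likelihoods, as sketched below with made-up spectral training pixels rather than the study's LANDSAT data.

```python
import numpy as np

rng = np.random.default_rng(0)
# Made-up training pixels: 3 spectral bands, 2 classes (e.g. forest vs. bare land).
classes = {
    "forest": rng.normal([30, 60, 25], 5, size=(200, 3)),
    "bare":   rng.normal([80, 70, 90], 8, size=(200, 3)),
}

# Fit one multivariate Gaussian per class from its training pixels.
params = {}
for name, X in classes.items():
    mu = X.mean(axis=0)
    cov = np.cov(X, rowvar=False)
    params[name] = (mu, np.linalg.inv(cov), np.log(np.linalg.det(cov)))

def ml_classify(pixel):
    """Assign the class maximizing the Gaussian log-likelihood."""
    def loglik(name):
        mu, cov_inv, logdet = params[name]
        d = pixel - mu
        return -0.5 * (logdet + d @ cov_inv @ d)
    return max(params, key=loglik)

print(ml_classify(np.array([32, 58, 27])))  # -> forest
```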

Keywords: HARAZ Basin, Change Detection, Land-use, Satellite Data.

436 Deep Reinforcement Learning Approach for Trading Automation in the Stock Market

Authors: Taylan Kabbani, Ekrem Duman

Abstract:

Deep Reinforcement Learning (DRL) algorithms can scale to previously intractable problems. The automation of profit generation in the stock market is possible using DRL by combining the financial asset price "prediction" step and the portfolio "allocation" step in one unified process, producing fully autonomous systems capable of interacting with their environment to make optimal decisions through trial and error. This work presents a DRL model that generates profitable trades in the stock market, effectively overcoming the limitations of supervised learning approaches. We formulate the trading problem as a Partially Observed Markov Decision Process (POMDP) model, considering the constraints imposed by the stock market, such as liquidity and transaction costs. We then solve the formulated POMDP problem using the Twin Delayed Deep Deterministic Policy Gradient (TD3) algorithm and achieve a 2.68 Sharpe ratio on the test dataset. From the point of view of stock market forecasting and intelligent decision-making mechanisms, this paper demonstrates the superiority of DRL over other types of machine learning in financial markets and proves its credibility and advantages for strategic decision-making.
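For reference, the reported evaluation metric can be computed from a series of per-period returns as follows; the return series is synthetic, and the 252-trading-day annualization factor is a common convention rather than a detail taken from the paper.

```python
import numpy as np

def sharpe_ratio(returns, risk_free=0.0, periods_per_year=252):
    """Annualized Sharpe ratio of a series of per-period returns."""
    excess = np.asarray(returns) - risk_free / periods_per_year
    return np.sqrt(periods_per_year) * excess.mean() / excess.std(ddof=1)

rng = np.random.default_rng(42)
daily_returns = rng.normal(0.001, 0.01, 252)   # synthetic daily P&L
print(f"Sharpe ratio: {sharpe_ratio(daily_returns):.2f}")
```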

Keywords: Autonomous agent, deep reinforcement learning, MDP, sentiment analysis, stock market, technical indicators, twin delayed deep deterministic policy gradient.

435 Analysis and Classification of HIV-1 Sub-Type Viruses by AR Model through Artificial Neural Networks

Authors: O. Yavuz, L. Ozyilmaz

Abstract:

The HIV-1 genome is highly heterogeneous, and because of this variation its features span a wide range. For this reason, the infectivity of the virus changes depending on the chemokine receptors it uses: R5 HIV viruses use the CCR5 coreceptor, X4 viruses use CXCR4, and R5X4 viruses can utilize both coreceptors. Recently, in bioinformatics, the classification of R5X4 viruses from HIV-1 genome data has been studied. In this study, R5X4-type HIV viruses were classified using an Auto-Regressive (AR) model through Artificial Neural Networks (ANNs). The statistical data of R5X4, R5, and X4 viruses were analyzed using signal processing methods and ANNs. Accessible residues of these virus sequences were obtained and modeled by an AR model, since the number of residues is large and differs from sequence to sequence. Finally, the pre-processed data were used to evolve various ANN structures for identifying R5X4 viruses, and ROC analysis was applied to the ANNs to show their real performance. The results indicate that R5X4 viruses are successfully classified, with high sensitivity and specificity values in both training and testing ROC analyses for the RBF network, which gives the best performance among the ANN structures.
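The feature extraction step can be imitated in a few lines: fit an AR model to each numeric sequence and use its coefficients as a fixed-length feature vector, then evaluate with ROC analysis. The sequences and labels below are random placeholders rather than HIV genome data, and a logistic regression stands in for the paper's neural networks.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

def ar_coeffs(x, order=4):
    """Least-squares AR(p) coefficients of a 1-D sequence."""
    X = np.column_stack([x[i:len(x) - order + i] for i in range(order)])
    y = x[order:]
    return np.linalg.lstsq(X, y, rcond=None)[0]

rng = np.random.default_rng(0)
# Placeholder "sequences" of varying length, mapped to equal-length AR features.
seqs = [rng.normal(size=rng.integers(80, 200)) for _ in range(100)]
labels = rng.integers(0, 2, size=100)
features = np.array([ar_coeffs(s) for s in seqs])

clf = LogisticRegression().fit(features, labels)
auc = roc_auc_score(labels, clf.predict_proba(features)[:, 1])
print(f"training AUC: {auc:.2f}")   # ~0.5 expected on random labels
```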

Keywords: Auto-Regressive Model, HIV, Neural Networks, ROC Analysis.

434 NonStationary CMA for Decision Feedback Equalization of Markovian Time Varying Channels

Authors: S. Cherif, M. Turki-Hadj Alouane

Abstract:

In this paper, we propose a modified version of the Constant Modulus Algorithm (CMA) tailored for the blind Decision Feedback Equalizer (DFE) of first-order Markovian time-varying channels. The proposed NonStationary CMA (NSCMA) is designed so that it explicitly takes into account the Markovian structure of the channel nonstationarity. Hence, unlike the classical CMA, the NSCMA is not blind with respect to the channel time variations. This greatly helps the equalizer in the case of realistic channels and avoids frequent transmission of training sequences. The paper develops a theoretical analysis of the steady-state performance of the CMA and the NSCMA for DFEs in a time-varying context, and approximate expressions for the mean square errors are derived. We prove that in the steady state the NSCMA exhibits better performance than the classical CMA, and these new results are confirmed by simulation. Through an experimental study, we demonstrate that the Bit Error Rate (BER) is reduced by the NSCMA-DFE, and that the BER improvement achieved by the NSCMA-DFE grows with the severity of the channel time variations.
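For contrast with the proposed variant, the classical CMA update that the NSCMA modifies is only a few lines; below is the standard textbook form for a linear feed-forward equalizer with a made-up channel and step size, not the NSCMA or the DFE structure analyzed in the paper.

```python
import numpy as np

rng = np.random.default_rng(0)
n, taps, mu = 20000, 11, 5e-4

symbols = rng.choice([-1.0, 1.0], size=n)            # BPSK source
channel = np.array([1.0, 0.4, -0.2])                 # made-up static channel
x = np.convolve(symbols, channel)[:n] + 0.01 * rng.normal(size=n)

R2 = 1.0                                             # E|s|^4 / E|s|^2 for BPSK
w = np.zeros(taps)
w[taps // 2] = 1.0                                   # center-spike initialization

for k in range(taps, n):
    u = x[k - taps:k][::-1]                          # regressor (most recent first)
    y = w @ u                                        # equalizer output
    e = y * (y * y - R2)                             # CMA error term
    w -= mu * e * u                                  # stochastic gradient step

print("final tap weights:", np.round(w, 3))
```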

Keywords: Time varying channel, Markov model, Blind DFE, CMA, NSCMA.

433 Investigation of the Possibility to Prepare Supervised Classification Map of Gully Erosion by RS and GIS

Authors: Ali Mohammadi Torkashvand, Hamid Reza Alipour

Abstract:

This study investigates the possibility of producing a gully erosion map by supervised classification of satellite images (ETM+) in two land types, mountainous and plain. These land types were part of the Varamin plain, Tehran province, and the Roodbar sub-basin, Guilan province, as the plain and mountain land types, respectively. The positions of 652 and 124 ground control points were recorded by GPS in the mountain and plain land types, respectively, and soil gully erosion, land uses, and plant covers were investigated at these points. Based on the ground control points and auxiliary points, training points for gully erosion and other surface features were introduced into the software (ILWIS 3.3 Academic). The supervised classification map of gully erosion was prepared by the maximum likelihood method, and the overall accuracy of this map was then computed. The results showed that reliable supervised classification of gully erosion is not feasible, although more studies are needed before generalizing this result to other mountainous regions. Also, as land uses and other surface features increase in the plain physiography, classification accuracy decreases.

Keywords: Supervised classification, Gully erosion, Map.

432 Transformation Building of Micro- Entrepreneurs: A Conceptual Model

Authors: Abu Bakar Sedek Abdul Jamak, Saridan Abu Bakar, Zulkipli Ghazali, Roselind Wan

Abstract:

The majority of micro-entrepreneurs in Malaysia operate very small-scale business activities such as food stalls, burger stalls, night market hawking, grocery stores, construction, rubber and oil palm smallholdings, and other agro-based services and activities. Why are they venturing into entrepreneurship - is it for survival, out of interest, or due to encouragement and assistance from the local government? Why are some micro-entrepreneurs lagging behind in entrepreneurship, and what do they need to rectify this situation so that they are able to progress further? Furthermore, what skills should micro-entrepreneurs develop to transform their ventures into successful micro-enterprises and, eventually, small and medium-sized enterprises (SMEs)? This paper proposes a 7-step approach that can serve as a basis for identifying critical entrepreneurial success factors, enabling policy makers, practitioners, consultants, training managers, and other agencies to develop tools to assist micro-business owners. The paper also highlights the experience of one successful company in Malaysia that transformed from a micro-enterprise into a large organization in less than 10 years.

Keywords: Entrepreneurship, Micro-entrepreneurs, Transformation, Customers.

431 Three Tier Indoor Localization System for Digital Forensics

Authors: Dennis L. Owuor, Okuthe P. Kogeda, Johnson I. Agbinya

Abstract:

Mobile localization has attracted a great deal of attention recently due to the introduction of wireless networks. Although several localization algorithms and systems have been implemented and discussed in the literature, very few researchers have exploited the gap that exists between indoor localization, tracking, external storage of location information, and outdoor localization for the purpose of digital forensics during and after a disaster. The contribution of this paper lies in the implementation of a robust system that is capable of locating and tracking mobile device users and storing location information in the cloud, for both indoor and partially outdoor environments; the system can be used during a disaster to track and locate mobile phone users. The developed system is a mobile application built on Android, Hypertext Preprocessor (PHP), Cascading Style Sheets (CSS), JavaScript, and MATLAB. Using the waterfall model of software development, we implemented a three-level system that is able to track, locate, and store mobile device information in a secure cloud database on an almost real-time basis. The outcome of the study showed that the developed system is efficient with regard to tracking and locating mobile devices. The system is also flexible, i.e. it can be used in any building with few adjustments, and it is accurate in locating and tracking mobile devices both indoors and outdoors.

Keywords: Indoor localization, waterfall, digital forensics, tracking and cloud.

430 Ethical Perspectives on Implementation of Computer Aided Design Curriculum in Architecture in Nigeria: A Case Study of Chukwuemeka Odumegwu Ojukwu University, Uli

Authors: Kelechi E. Ezeji

Abstract:

The use of Computer Aided Design (CAD) technologies has become pervasive in the Architecture, Engineering and Construction (AEC) industry. This has led to their inclusion as an important part of the training module in the curriculum of architecture schools in Nigeria. This paper examines the ethical questions that arise in the implementation of the CAD content of the curriculum for architectural education. Using existing literature, it begins this scrutiny with the propriety of including CAD in the education of the architect and the obligations of the different stakeholders in the implementation process. It also examines the questions raised by the negative use of computing technologies, as well as the perceived negative influence of CAD use on design creativity. Survey methodology was employed to gather data from the Department of Architecture, Chukwuemeka Odumegwu Ojukwu University, Uli, which has been used as a case study of how the issues raised are being addressed. The paper draws conclusions on what will make for successful ethical implementation.

Keywords: Computer aided design, curriculum, education, ethics.

429 Neural Network Based Icing Identification and Fault Tolerant Control of an A340 Aircraft

Authors: F. Caliskan

Abstract:

This paper presents Neural Network (NN) identification of icing parameters in an A340 aircraft and a reconfiguration technique that keeps the aircraft's performance close to its performance prior to icing. Five aircraft parameters are assumed to be considerably affected by icing; the icing parameters are located in the system matrix, and the physical locations of the icing are assumed to be the right and left wings. The off-line training for identifying the clear and iced dynamics is based on the Levenberg-Marquardt backpropagation algorithm. The reconfiguration is based on the technique known as the control mixer approach, or pseudo-inverse technique, which generates a new control input vector such that the aircraft dynamics are not much affected by icing. In the simulations, the longitudinal and lateral dynamics of an Airbus A340 aircraft model are considered, and the stability derivatives affected by icing are identified. The simulation results show successful NN identification of the icing parameters and reconfigured flight dynamics with performance similar to that before icing; in other words, the destabilizing icing effect is compensated.
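The control mixer reduces to a pseudo-inverse computation: find a new input vector so that the iced control effectiveness matrix reproduces the nominal control action. A NumPy illustration with invented matrices follows; the actual A340 derivatives are identified by the NN and are not reproduced here.

```python
import numpy as np

# Invented control effectiveness matrices (columns: two generic control inputs).
B_nominal = np.array([[0.8, 0.1],
                      [0.2, 0.9]])
B_iced = 0.7 * B_nominal              # icing assumed to degrade effectiveness

u_nominal = np.array([0.5, -0.3])     # command computed for the clean aircraft

# Control mixer: choose u_new so that B_iced @ u_new ~= B_nominal @ u_nominal.
u_new = np.linalg.pinv(B_iced) @ (B_nominal @ u_nominal)
print("reconfigured input:", u_new)   # here simply u_nominal / 0.7
```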

Keywords: Aircraft icing, stability derivatives, neural network identification, reconfiguration.

428 Distances over Incomplete Diabetes and Breast Cancer Data Based on Bhattacharyya Distance

Authors: Loai AbdAllah, Mahmoud Kaiyal

Abstract:

Missing values in real-world datasets are a common problem. Many algorithms have been developed to deal with this problem, most of which replace the missing values with a fixed value computed from the observed values. In our work, we used a distance function based on the Bhattacharyya distance, which measures the similarity of two probability distributions, to measure the distance between objects with missing values. The proposed distance distinguishes between known and unknown values: the distance between two known values is the Mahalanobis distance, while if one of them is missing, the distance is computed based on the distribution of the known values for the coordinate that contains the missing value. This method was integrated with Wikaya, a digital health company developing a platform that helps improve the prevention of chronic diseases such as diabetes and cancer. For Wikaya's recommendation system to work, distances between users need to be measured, and since there are missing values in the collected data, a distance function between incomplete user profiles had to be developed. To evaluate the accuracy of the proposed distance function in reflecting the actual similarity between objects when some of them contain missing values, we integrated it within the framework of the k-nearest-neighbors (kNN) classifier, since its computation is based only on the similarity between objects. To validate this, we ran the algorithm over the diabetes and breast cancer datasets, standard benchmark datasets from the UCI repository. Our experiments show that the kNN classifier using our proposed distance function outperforms kNN using other existing methods.
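A rough per-coordinate sketch of the idea is given below under strong simplifying assumptions (an independent univariate Gaussian per coordinate, which reduces the Mahalanobis term to a scaled absolute difference, and a fixed penalty when both values are missing); the paper's actual construction over full distributions may differ.

```python
import numpy as np

def missing_aware_distance(a, b, mu, sigma):
    """Distance between possibly-incomplete vectors a and b (NaN = missing).

    mu, sigma: per-coordinate mean/std estimated from the observed data.
    Known-known pairs use a normalized (Mahalanobis-style) difference;
    pairs with a missing side fall back on the coordinate's distribution.
    """
    total = 0.0
    for ai, bi, m, s in zip(a, b, mu, sigma):
        if not np.isnan(ai) and not np.isnan(bi):
            total += abs(ai - bi) / s            # both values known
        elif np.isnan(ai) and np.isnan(bi):
            total += 1.0                         # both missing: fixed penalty
        else:
            known = bi if np.isnan(ai) else ai   # one known, one missing:
            total += abs(known - m) / s          # distance to the distribution
    return total

X = np.array([[1.0, 2.0, np.nan],
              [1.5, np.nan, 3.0]])
mu, sigma = np.nanmean(X, axis=0), np.nanstd(X, axis=0) + 1e-9
print(missing_aware_distance(X[0], X[1], mu, sigma))
```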

Keywords: Missing values, distance metric, Bhattacharyya distance.

427 Intelligent Recognition of Diabetes Disease via FCM Based Attribute Weighting

Authors: Kemal Polat

Abstract:

In this paper, an attribute weighting method called fuzzy C-means clustering based attribute weighting (FCMAW) is used for classification of a diabetes disease dataset. The aims of this study are to reduce the variance within attributes of the diabetes dataset and to improve the classification accuracy of classifier algorithms by transforming non-linearly separable datasets into linearly separable ones. The Pima Indians Diabetes dataset has two classes, comprising normal subjects (500 instances) and diabetes subjects (268 instances). Fuzzy C-means clustering is an improved version of the K-means clustering method and is one of the most used clustering methods in data mining and machine learning applications. In this study, as the first stage, fuzzy C-means clustering is used to find the centers of the attributes in the Pima Indians diabetes dataset, and the dataset is then weighted according to the ratios of the attribute means to their centers. Secondly, after the weighting process, classifier algorithms including the support vector machine (SVM) and k-NN (k-nearest neighbor) classifiers are used to classify the weighted Pima Indians diabetes dataset. Experimental results show that the proposed attribute weighting method (FCMAW) obtains very promising results in the classification of the Pima Indians diabetes dataset.
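Under the description above (weight each attribute by the ratio of its mean to its fuzzy cluster center), a compact sketch follows; the FCM loop is a plain textbook implementation, clustering each attribute independently and picking the center nearest the attribute mean are our simplifying assumptions, and the toy data stand in for the Pima Indians dataset.

```python
import numpy as np

def fcm_centers(x, c=2, m=2.0, iters=100, seed=0):
    """1-D fuzzy C-means: return the c cluster centers of vector x."""
    rng = np.random.default_rng(seed)
    u = rng.dirichlet(np.ones(c), size=len(x))        # fuzzy memberships
    for _ in range(iters):
        um = u ** m
        centers = um.T @ x / um.sum(axis=0)           # weighted center update
        d = np.abs(x[:, None] - centers[None, :]) + 1e-12
        p = 2.0 / (m - 1.0)
        u = (d ** -p) / (d ** -p).sum(axis=1, keepdims=True)  # u_ik update
    return centers

def fcmaw(X):
    """Weight each attribute by mean / nearest-to-mean FCM center."""
    Xw = X.astype(float).copy()
    for j in range(X.shape[1]):
        col = X[:, j].astype(float)
        centers = fcm_centers(col)
        center = centers[np.argmin(np.abs(centers - col.mean()))]
        Xw[:, j] = col * (col.mean() / center)
    return Xw

rng = np.random.default_rng(1)
X = rng.normal([5, 50], [1, 10], size=(768, 2))   # toy 2-attribute data
print(fcmaw(X)[:3])
```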

Keywords: Fuzzy C-means clustering, Fuzzy C-means clustering based attribute weighting, Pima Indians diabetes dataset, SVM.

426 Feature Selection with Kohonen Self Organizing Classification Algorithm

Authors: Francesco Maiorana

Abstract:

In this paper, a one-dimensional Self-Organizing Map (SOM) algorithm to perform feature selection is presented. The algorithm is based on a first classification of the input dataset in a similarity space. From this classification, a set of positive and negative features is computed for each class, and this set of features is selected as the result of the procedure. The procedure is evaluated on an in-house dataset from a Knowledge Discovery from Text (KDT) application and on a set of publicly available datasets used in international feature selection competitions. These datasets come from KDT applications, drug discovery, and other applications. Knowledge of the correct classification, available for the training and validation datasets, is used to optimize the parameters for positive and negative feature extraction. The process becomes feasible for large and sparse datasets, such as those obtained in KDT applications, by using compression techniques to store the similarity matrix together with speed-up techniques for the Kohonen algorithm that take advantage of the sparsity of the input matrix. These improvements, combined with grid computing, make it feasible to apply the methodology to massive datasets.
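A minimal one-dimensional Kohonen SOM training loop is sketched below to fix the idea; it maps rows of a dataset onto a line of units, the first stage of the procedure described above. The neighborhood schedule and sizes are arbitrary, and the positive/negative feature scoring step is not reproduced.

```python
import numpy as np

def train_som_1d(X, n_units=10, epochs=30, lr0=0.5, radius0=3.0, seed=0):
    """Train a one-dimensional SOM; returns the (n_units, n_features) codebook."""
    rng = np.random.default_rng(seed)
    W = X[rng.choice(len(X), n_units, replace=False)].astype(float)
    for epoch in range(epochs):
        lr = lr0 * (1 - epoch / epochs)                   # decaying learning rate
        radius = max(radius0 * (1 - epoch / epochs), 0.5) # shrinking neighborhood
        for x in X[rng.permutation(len(X))]:
            bmu = np.argmin(((W - x) ** 2).sum(axis=1))   # best matching unit
            dist = np.abs(np.arange(n_units) - bmu)       # grid distance to BMU
            h = np.exp(-(dist ** 2) / (2 * radius ** 2))  # neighborhood kernel
            W += lr * h[:, None] * (x - W)
    return W

rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0, 1, (100, 5)), rng.normal(4, 1, (100, 5))])
codebook = train_som_1d(X)
clusters = np.argmin(((X[:, None, :] - codebook[None]) ** 2).sum(-1), axis=1)
print(np.bincount(clusters, minlength=10))   # documents per SOM unit
```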

Keywords: Clustering algorithm, Data mining, Feature selection, Grid, Kohonen Self Organizing Map.
