Search results for: management algorithm
11784 A Multifactorial Algorithm to Automate Screening of Drug-Induced Liver Injury Cases in Clinical and Post-Marketing Settings
Authors: Osman Turkoglu, Alvin Estilo, Ritu Gupta, Liliam Pineda-Salgado, Rajesh Pandey
Abstract:
Background: Hepatotoxicity can be linked to a variety of clinical symptoms and histopathological signs, posing a great challenge in the surveillance of suspected drug-induced liver injury (DILI) cases in the safety database. Additionally, the majority of such cases are rare, idiosyncratic, highly unpredictable, and tend to demonstrate unique individual susceptibility; these qualities, in turn, make the pharmacovigilance monitoring process tedious and time-consuming. Objective: Develop a multifactorial algorithm to assist pharmacovigilance physicians in identifying high-risk hepatotoxicity cases associated with DILI in the sponsor’s safety database (Argus). Methods: Multifactorial selection criteria were established using Structured Query Language (SQL) and the TIBCO Spotfire® visualization tool, via a combination of word fragments, wildcard strings, and mathematical constructs, based on Hy’s law criteria and the pattern of injury (R-value). These criteria excluded non-eligible cases from monthly line listings mined from the Argus safety database. The capabilities and limitations of these criteria were verified by comparing a manual review of all monthly cases with the system-generated monthly listings over six months. Results: On average, over a period of six months, the algorithm accurately identified 92% of DILI cases meeting the established criteria. The automated process easily compared liver enzyme elevations with baseline values, reducing the screening time to under 15 minutes, as opposed to the multiple hours spent on a cognitively laborious manual process. Limitations of the algorithm include its inability to identify cases associated with non-standard laboratory tests or naming conventions, and/or incomplete or incorrectly entered laboratory values. Conclusions: The newly developed multifactorial algorithm proved to be extremely useful in detecting potential DILI cases, while heightening the vigilance of the drug safety department. Additionally, the application of this algorithm may be useful in identifying a potential signal for DILI in drugs not yet known to cause liver injury (e.g., drugs in the initial phases of development). This algorithm also carries the potential for universal application, due to its product-agnostic data and keyword mining features. Plans for the tool include developing it into a fully automated application, thereby completely eliminating the manual screening process.
Keywords: automation, drug-induced liver injury, pharmacovigilance, post-marketing
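As an illustration of the screening criteria the abstract refers to, the sketch below computes the R-value and applies a biochemical Hy's law check in Python. It is a minimal stand-in, not the authors' SQL/Spotfire implementation: the field names are hypothetical, and the thresholds are the commonly cited ones (R ≥ 5 hepatocellular, R ≤ 2 cholestatic; ALT ≥ 3×ULN with bilirubin ≥ 2×ULN and without marked ALP elevation).

```python
# Minimal sketch of a rule-based DILI screen (illustrative field names and
# standard Hy's law / R-value thresholds; not the paper's SQL implementation).

def r_value(alt, alt_uln, alp, alp_uln):
    """Pattern-of-injury R-value: (ALT/ULN) / (ALP/ULN)."""
    return (alt / alt_uln) / (alp / alp_uln)

def injury_pattern(r):
    if r >= 5:
        return "hepatocellular"
    if r <= 2:
        return "cholestatic"
    return "mixed"

def meets_hys_law(alt, alt_uln, bili, bili_uln, alp, alp_uln):
    """Biochemical Hy's law screen: ALT >= 3x ULN and bilirubin >= 2x ULN,
    without marked ALP elevation (>= 2x ULN)."""
    return alt >= 3 * alt_uln and bili >= 2 * bili_uln and alp < 2 * alp_uln

def screen_case(case):
    """Flag a case dictionary pulled from a safety-database line listing."""
    r = r_value(case["alt"], case["alt_uln"], case["alp"], case["alp_uln"])
    flagged = meets_hys_law(case["alt"], case["alt_uln"],
                            case["bili"], case["bili_uln"],
                            case["alp"], case["alp_uln"])
    return {"case_id": case["case_id"], "r_value": round(r, 2),
            "pattern": injury_pattern(r), "potential_hys_law": flagged}

if __name__ == "__main__":
    example = {"case_id": "X-001", "alt": 320, "alt_uln": 40,
               "bili": 3.1, "bili_uln": 1.2, "alp": 150, "alp_uln": 120}
    print(screen_case(example))
```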
11783 Compressed Sensing of Fetal Electrocardiogram Signals Based on Joint Block Multi-Orthogonal Least Squares Algorithm
Authors: Xiang Jianhong, Wang Cong, Wang Linyu
Abstract:
With the rise of medical IoT technologies, wireless body area networks (WBANs) can collect fetal electrocardiogram (FECG) signals to support telemedicine analysis. A compressed sensing (CS)-based WBAN system can avoid sampling a large amount of redundant information and reduce the complexity and computing time of data processing, but existing algorithms have poor signal compression and reconstruction performance. In this paper, a joint block multi-orthogonal least squares (JBMOLS) algorithm is proposed. We apply the FECG signal to the joint block sparse model (JBSM), and a comparative study of sparse transformations and measurement matrices is carried out. An FECG signal compression and transmission mode based on the Rbio5.5 wavelet, a Bernoulli measurement matrix, and the JBMOLS algorithm is proposed to improve the compression and reconstruction performance of FECG signals in CS-based WBANs. Experimental results show that the compression ratio (CR) achievable for accurate reconstruction in this transmission mode is increased by nearly 10%, and runtime is reduced by about 30%.
Keywords: telemedicine, fetal ECG, compressed sensing, joint sparse reconstruction, block sparse signal
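The JBMOLS algorithm is the paper's contribution and is not reproduced here; the sketch below only illustrates the standard compressed-sensing pipeline it builds on: a Bernoulli measurement matrix compressing a synthetic sparse signal that stands in for a sparsified FECG block, recovered with plain orthogonal matching pursuit. Signal sizes and sparsity are assumptions chosen for the example.

```python
# Minimal compressed-sensing sketch: a Bernoulli measurement matrix plus plain
# orthogonal matching pursuit (OMP). This is the standard pipeline the paper
# builds on, not the JBMOLS algorithm itself; all sizes are illustrative.
import numpy as np

rng = np.random.default_rng(0)
n, m, k = 256, 96, 8                     # signal length, measurements, sparsity

# Synthetic k-sparse signal (stand-in for a sparsified FECG block)
x = np.zeros(n)
support = rng.choice(n, k, replace=False)
x[support] = rng.standard_normal(k)

# Bernoulli (+1/-1) measurement matrix, scaled by 1/sqrt(m)
Phi = rng.choice([-1.0, 1.0], size=(m, n)) / np.sqrt(m)
y = Phi @ x                              # compressed measurements sent by the WBAN node

def omp(Phi, y, k):
    """Recover a k-sparse signal from y = Phi @ x by greedy atom selection."""
    residual, idx = y.copy(), []
    for _ in range(k):
        j = int(np.argmax(np.abs(Phi.T @ residual)))   # most correlated column
        if j not in idx:
            idx.append(j)
        coef, *_ = np.linalg.lstsq(Phi[:, idx], y, rcond=None)
        residual = y - Phi[:, idx] @ coef
    x_hat = np.zeros(Phi.shape[1])
    x_hat[idx] = coef
    return x_hat

x_hat = omp(Phi, y, k)
print("relative reconstruction error:", np.linalg.norm(x - x_hat) / np.linalg.norm(x))
```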
11782 Design and Implementation of DC-DC Converter with Inc-Cond Algorithm
Authors: Mustafa Engin Başoğlu, Bekir Çakır
Abstract:
The most important component affecting the efficiency of photovoltaic power systems is the solar panel. The efficiency of these systems is significantly limited by the low efficiency of the solar panel itself. Therefore, solar panels should be operated at the maximum power point through a power converter. In this study, a boost converter with maximum power point tracking (MPPT) has been designed and implemented using the incremental conductance (Inc-Cond) algorithm with direct duty control. Furthermore, it is shown that the boost converter with MPPT operation fails when a low load resistance is connected.
Keywords: boost converter, incremental conductance (Inc-Cond), MPPT, solar panel
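As context for the Inc-Cond rule described above, the following minimal Python sketch shows one step of the duty-cycle update: at the maximum power point dI/dV equals -I/V, to the left of it dI/dV > -I/V, and with direct duty control of a boost converter the panel voltage is raised by lowering the duty cycle. The step size, limits, and sign convention are illustrative assumptions, not the authors' hardware settings.

```python
# Minimal sketch of the incremental-conductance (Inc-Cond) MPPT rule with
# direct duty control of a boost converter. Step size and the sign convention
# (raising the duty cycle lowers the PV-side voltage) are illustrative assumptions.

def inc_cond_step(v, i, v_prev, i_prev, duty, step=0.005):
    """Return the updated duty cycle from two successive (V, I) panel samples."""
    dv, di = v - v_prev, i - i_prev
    if dv == 0:
        if di == 0:
            return duty                      # no change: already at the MPP
        duty += -step if di > 0 else step    # irradiance changed; move along the curve
    else:
        dG = di / dv                         # incremental conductance dI/dV
        G = -i / v                           # negative instantaneous conductance
        if abs(dG - G) < 1e-6:
            return duty                      # dI/dV == -I/V  ->  at the MPP
        # dI/dV > -I/V means we are left of the MPP: raise V (lower duty)
        duty += -step if dG > G else step
    return min(max(duty, 0.05), 0.95)        # keep the duty cycle in a safe range

# Toy usage with two consecutive samples from the panel
print(inc_cond_step(v=17.8, i=5.1, v_prev=18.0, i_prev=5.0, duty=0.40))
```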
11781 Stochastic Simulation of Random Numbers Using Linear Congruential Method
Authors: Melvin Ballera, Aldrich Olivar, Mary Soriano
Abstract:
Digital computers nowadays must provide a utility capable of generating random numbers. Usually, computer-generated random numbers are not truly random: given predefined values such as the starting point (seed) and range, the sequence is almost predictable. Random numbers have many applications, such as business simulation, manufacturing, the services domain, the entertainment sector, and other areas, which makes it worthwhile to design a method that yields less predictable random numbers. A stochastic simulation using the linear congruential algorithm shows that as the seed and range increase, the numbers produced or selected by the computer become more unique. If this is implemented in an environment where random numbers are heavily needed, the reliability of the random numbers can be ensured.
Keywords: stochastic simulation, random numbers, linear congruential algorithm, pseudorandomness
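The linear congruential method itself is compactly expressed as X_{n+1} = (aX_n + c) mod m. The sketch below is a minimal Python illustration; the multiplier, increment, and modulus are the classic Numerical Recipes constants and are assumptions, since the paper's exact parameter choices are not given in the abstract.

```python
# Minimal linear congruential generator (LCG) sketch. The modulus, multiplier,
# and increment below are the classic Numerical Recipes constants, used only
# as an illustration; the paper's exact parameter choices are not reproduced.

def lcg(seed, a=1664525, c=1013904223, m=2**32):
    """Yield an endless stream of pseudorandom integers X_{n+1} = (a*X_n + c) mod m."""
    x = seed
    while True:
        x = (a * x + c) % m
        yield x

def randint(seed, low, high, count):
    """Map the raw LCG stream onto the range [low, high]."""
    gen = lcg(seed)
    span = high - low + 1
    return [low + next(gen) % span for _ in range(count)]

print(randint(seed=42, low=1, high=100, count=10))
```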
11780 Design and Implementation of a Cross-Network Security Management System
Authors: Zhiyong Shan, Preethi Santhanam, Vinod Namboodiri, Rajiv Bagai
Abstract:
In recent years, emerging network worms and attacks have had distributed characteristics and can spread globally in a very short time. Security management that crosses networks to jointly defend against network-wide attacks and improve the efficiency of security administration is urgently needed. We propose a hierarchical distributed network security management system (HD-NSMS), which can integrate security management across multiple networks. First, we describe the system's macrostructure and microstructure; then we discuss three key problems in building HD-NSMS: the device model, the alert mechanism, and the emergency response mechanism; lastly, we describe the implementation of HD-NSMS. The paper is valuable for implementing an NSMS in that it derives from a practical network security management system (NSMS).
Keywords: network security management, device organization, emergency response, cross-network
11779 Hardware Implementation on Field Programmable Gate Array of Two-Stage Algorithm for Rough Set Reduct Generation
Authors: Tomasz Grzes, Maciej Kopczynski, Jaroslaw Stepaniuk
Abstract:
The rough sets theory developed by Prof. Z. Pawlak is one of the tools that can be used in intelligent systems for data analysis and processing. Banking, medicine, image recognition, and security are among the possible fields of utilization. In all these fields, the amount of collected data is increasing quickly, but with the increase of data, computation speed becomes the critical factor. Data reduction is one of the solutions to this problem. Removing the redundancy in rough sets can be achieved with the reduct. A lot of algorithms for generating the reduct have been developed, but most of them are only software implementations and therefore have many limitations. A microprocessor uses a fixed word length and consumes a lot of time for both fetching and processing instructions and data; consequently, software-based implementations are relatively slow. Hardware systems do not have these limitations and can process data faster than software. A reduct is a subset of the condition attributes that provides the discernibility of the objects. For a given decision table there can be more than one reduct. The core is the set of all indispensable condition attributes; none of its elements can be removed without affecting the classification power of all condition attributes. Moreover, every reduct contains all the attributes from the core. In this paper, the hardware implementation of a two-stage greedy algorithm to find one reduct is presented. The decision table is used as the input. The output of the algorithm is the superreduct, which is the reduct with some additional removable attributes. The first stage of the algorithm calculates the core using the discernibility matrix. The second stage generates the superreduct by enriching the core with the most common attributes, i.e., attributes that are more frequent in the decision table. The algorithm described above has two disadvantages: i) it generates the superreduct instead of the reduct, and ii) the additional first stage may be unnecessary if the core is empty. However, for systems focused on fast computation of the reduct, the first disadvantage is not a key problem. The core calculation can be achieved with a combinational logic block and thus adds relatively little time to the whole process. The algorithm presented in this paper was implemented in a Field Programmable Gate Array (FPGA) as a digital device consisting of blocks that process the data in a single step. Calculating the core is done by comparators connected to a block called a 'singleton detector', which detects whether the input word contains only a single 'one'. Calculating the number of occurrences of an attribute is performed in a combinational block made up of a cascade of adders. The superreduct generation process is iterative and thus needs a sequential circuit for controlling the calculations. For research purposes, the algorithm was also implemented in the C language and run on a PC. The execution times of the reduct calculation in hardware and software were compared. The results show an increase in the speed of data processing.
Keywords: data reduction, digital systems design, field programmable gate array (FPGA), reduct, rough set
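To make the two-stage idea concrete, the following Python sketch restates it in software: the core is taken from the singleton entries of the discernibility matrix, and the superreduct is grown greedily until every discernibility entry is covered. The toy decision table is invented for the example, and the greedy step here picks the attribute most frequent among the still-uncovered entries, which is a common variant of the enrichment rule the abstract describes; it is not the FPGA design itself.

```python
# Minimal software sketch of the two-stage reduct idea (discernibility-matrix
# core, then greedy enrichment). This is an illustrative Python re-statement,
# not the FPGA design; the toy decision table below is invented for the example.
from collections import Counter

# rows: values of condition attributes a0..a2 plus the decision (last column)
TABLE = [
    (1, 0, 1, "yes"),
    (1, 1, 0, "no"),
    (0, 0, 1, "yes"),
    (1, 0, 0, "no"),
]
N_COND = 3

def discernibility_entries(table, n_cond):
    """Attribute sets that discern each pair of objects with different decisions."""
    entries = []
    for i in range(len(table)):
        for j in range(i + 1, len(table)):
            if table[i][-1] != table[j][-1]:
                diff = {a for a in range(n_cond) if table[i][a] != table[j][a]}
                if diff:
                    entries.append(diff)
    return entries

def core_and_superreduct(table, n_cond):
    entries = discernibility_entries(table, n_cond)
    core = {next(iter(e)) for e in entries if len(e) == 1}   # "singleton detector"
    reduct = set(core)
    uncovered = [e for e in entries if not (e & reduct)]
    while uncovered:                                         # greedy enrichment
        counts = Counter(a for e in uncovered for a in e)
        reduct.add(counts.most_common(1)[0][0])              # most frequent attribute
        uncovered = [e for e in uncovered if not (e & reduct)]
    return core, reduct

core, superreduct = core_and_superreduct(TABLE, N_COND)
print("core:", core, "superreduct:", superreduct)
```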
11778 Troubleshooting Petroleum Equipment Based on Wireless Sensors Based on Bayesian Algorithm
Authors: Vahid Bayrami Rad
Abstract:
In this research, common methods and techniques have been investigated with a focus on intelligent fault-finding and monitoring systems in the oil industry. Remote and intelligent control methods are considered a necessity for implementing various operations in the oil industry, and benefiting from the knowledge extracted from the countless data generated, with the help of data mining algorithms, is a way to speed up the operational process for monitoring and troubleshooting in today's big oil companies. Therefore, by comparing data mining algorithms and examining their efficiency, structure, and behavior under different conditions, the proposed Bayesian algorithm, which uses data clustering and analysis together with data evaluation via a colored Petri net, provides an applicable and dynamic model from the point of view of reliability and response time. By using this method, it is possible to achieve a dynamic and consistent model of the remote control system, prevent the occurrence of leakage in oil pipelines and refineries, and reduce costs and human and financial errors. The statistical data obtained from the evaluation process show an increase in reliability, availability, and speed compared to previous methods.
Keywords: wireless sensors, petroleum equipment troubleshooting, Bayesian algorithm, colored Petri net, RapidMiner, data mining, reliability
11777 Breast Cancer Risk is Predicted Using Fuzzy Logic in MATLAB Environment
Authors: S. Valarmathi, P. B. Harathi, R. Sridhar, S. Balasubramanian
Abstract:
The use of machine learning tools in medical diagnosis is increasing due to the improved effectiveness of classification and recognition systems in helping medical experts diagnose breast cancer. In this study, ID3 chooses the splitting attribute with the highest information gain, where gain is defined as the difference between the entropy before the split and the weighted entropy after the split. It is applied to age, location, taluk, stage, year, period, marital status, treatment, heredity, sex, and habitat against the classes Very Serious (VS), Very Serious Moderate (VSM), Serious (S), and Not Serious (NS) to calculate the information gain. The ranked histogram gives the gain of each field for the breast cancer data. Doctors use TNM staging, which decides the risk level of breast cancer and is an important decision-making input in fuzzy logic for perception-based measurement. The spatial risk area (taluk) of breast cancer is calculated. The results clearly show that Coimbatore (North and South) is at higher risk for breast cancer than other areas at the 20% criterion. The weighted value of each taluk was compared with the criterion value and integrated with Map Object to visualize the results. The ID3 algorithm shows the high breast cancer risk regions in the study area. The study has outlined, discussed, and applied the algorithms and techniques/methods adopted through a soft computing methodology, such as the ID3 algorithm, for prognostic decision making on the seriousness of breast cancer.
Keywords: ID3 algorithm, breast cancer, fuzzy logic, MATLAB
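For reference, the information-gain computation that drives ID3's choice of splitting attribute can be sketched in a few lines of Python. The tiny (stage, habitat, seriousness) table below is invented purely to illustrate the calculation and is not the study's data.

```python
# Minimal sketch of the information-gain computation used by ID3 to pick a
# splitting attribute. The toy (stage, habitat, seriousness) dataset is invented.
import math
from collections import Counter

def entropy(labels):
    total = len(labels)
    return -sum((c / total) * math.log2(c / total) for c in Counter(labels).values())

def information_gain(rows, attr_index, label_index=-1):
    """Gain = entropy before the split minus weighted entropy after it."""
    labels = [r[label_index] for r in rows]
    before = entropy(labels)
    after = 0.0
    for value in {r[attr_index] for r in rows}:
        subset = [r[label_index] for r in rows if r[attr_index] == value]
        after += len(subset) / len(rows) * entropy(subset)
    return before - after

# columns: (stage, habitat, seriousness)
data = [
    ("II", "urban", "S"), ("III", "rural", "VS"), ("I", "urban", "NS"),
    ("III", "urban", "VS"), ("II", "rural", "S"), ("I", "rural", "NS"),
]
print("gain(stage)  =", round(information_gain(data, 0), 3))
print("gain(habitat)=", round(information_gain(data, 1), 3))
```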
11776 Particle Filter Supported with the Neural Network for Aircraft Tracking Based on Kernel and Active Contour
Authors: Mohammad Izadkhah, Mojtaba Hoseini, Alireza Khalili Tehrani
Abstract:
In this paper, we present a new method for tracking flying targets in color video sequences based on contour and kernel information. The aim of this work is to overcome the problem of losing the target under changing light, large displacements, changing speed, and occlusion. The proposed method consists of three steps: estimating the target location with a particle filter, segmenting the target region using a neural network, and finding the exact contour with a greedy snake algorithm. In the proposed method, we use both region and contour information to create the target candidate model, and this model is dynamically updated during tracking. To avoid the accumulation of errors during updating, the target region is given to a perceptron neural network to separate the target from the background. Its output is then used for the exact calculation of the size and center of the target, and also as the initial contour for the greedy snake algorithm to find the exact target edge. The proposed algorithm has been tested on a database containing many challenges, such as the high speed and agility of aircraft, background clutter, occlusions, camera movement, and so on. The experimental results show that the use of the neural network increases the accuracy of tracking and segmentation.
Keywords: video tracking, particle filter, greedy snake, neural network
11775 Neural Network Based Fluctuation Frequency Control in PV-Diesel Hybrid Power System
Authors: Heri Suryoatmojo, Adi Kurniawan, Feby A. Pamuji, Nursalim, Syaffaruddin, Herbert Innah
Abstract:
Photovoltaic (PV) systems hybridized with diesel systems are widely utilized for electrification in remote areas. PV output power fluctuates due to uncertain temperature and solar irradiance conditions. When the penetration of PV power is large, the reliability of the power utility is disturbed, with a serious impact on the frequency stability of the system. Therefore, designing a robust frequency controller for a PV-diesel hybrid power system is very important. This paper proposes a new frequency control method for hybrid PV-diesel systems based on an artificial neural network (ANN). This method can minimize the frequency deviation without smoothing the PV output power, which is controlled by a maximum power point tracking (MPPT) method. The neural network controller considers the average irradiance, the change of irradiance, and the frequency deviation. To show the effectiveness of the proposed algorithm, the addition of a battery as an energy storage system is also presented. To validate the proposed method, the results of the proposed system are compared with those of a similar system using MPPT only. The simulation results show that the proposed method is able to suppress the frequency deviation to smaller values than the system using MPPT only.
Keywords: energy storage system, frequency deviation, hybrid power generation, neural network algorithm
11774 Optimal Design of Submersible Permanent Magnet Linear Synchronous Motor Based Design of Experiment and Genetic Algorithm
Authors: Xiao Zhang, Wensheng Xiao, Junguo Cui, Hongmin Wang
Abstract:
Submersible permanent magnet linear synchronous motors (SPMLSMs) are electromagnetic devices that can directly drive a plunger pump to extract crude oil. These motors have been gradually applied in oil fields due to their high thrust force density and high efficiency. Since the force performance closely depends on the concrete structural parameters, seven different structural parameters are investigated in detail. This paper presents an optimum design of an SPMLSM to minimize the detent force and maximize the thrust by using design of experiment (DOE) and a genetic algorithm (GA). The three significant structural parameters (air-gap length, slot width, pole-arc coefficient) are screened from the seven parameters used in this research using a 2⁷ 1/16 fractional factorial design (FFD) to investigate their significant effects on the force performance. Response surface methodology (RSM) is well adapted to build analytical models of thrust and detent force with constraints on the corresponding significant parameters, and it enables the objective functions to be easily created. GA is performed as a searching tool to find the Pareto-optimal solutions. Finite element analysis shows that the proposed PMLSM has merits in improving the thrust and reducing the detent force dramatically.
Keywords: optimization, force performance, design of experiment (DOE), genetic algorithm (GA)
11773 Requirement Analysis for Emergency Management Software
Authors: Tomáš Ludík, Jiří Barta, Sabina Chytilová, Josef Navrátil
Abstract:
Emergency management is a discipline of dealing with and avoiding risks. Appropriate emergency management software allows better management of these risks and has a direct influence on reducing potential negative impacts. Although there are several emergency management software products in the Czech Republic, they cover user requirements from the emergency management field only partially. Therefore, the paper focuses on the issues of requirement analysis within the development of emergency management software. An analysis of the current state describes the basic features and properties of user requirements for software development, as well as basic methods and approaches for gathering these requirements. Then, the paper presents more specific mechanisms for requirement analysis based on the chosen software development approach: structured, object-oriented, or agile. Based on this experience, a new methodology for requirement analysis is designed. The methodology describes how to map user requirements comprehensively in the field of emergency management and thus reduce misunderstanding between the software analyst and the emergency manager. The proposed methodology was consulted with a fire brigade department and has also been applied in the requirements analysis for their current emergency management software. The proposed methodology has a general character and can also be used in other specific areas during requirement analysis.
Keywords: emergency software, methodology, requirement analysis, stakeholders, use case diagram, user stories
11772 Wind Power Forecasting Using Echo State Networks Optimized by Big Bang-Big Crunch Algorithm
Authors: Amir Hossein Hejazi, Nima Amjady
Abstract:
In recent years, due to environmental issues, traditional energy sources have increasingly been replaced by renewable ones. Wind energy, as the fastest growing renewable energy, accounts for a considerable share of electricity markets. With this fast growth of wind energy worldwide, owners and operators of wind farms, transmission system operators, and energy traders need reliable and secure forecasts of wind energy production. In this paper, a new forecasting strategy is proposed for short-term wind power prediction based on Echo State Networks (ESN). The forecast engine utilizes a state-of-the-art training process, including a dynamical reservoir with a high capability to learn the complex dynamics of wind power or wind vector signals. The study becomes more interesting by incorporating the prediction of wind direction into the forecast strategy. The Big Bang-Big Crunch (BB-BC) evolutionary optimization algorithm is adopted for adjusting the free parameters of the ESN-based forecaster. The proposed method is tested on real-world hourly data to show the efficiency of the forecasting engine in predicting both the wind vector and the wind power output of aggregated wind power production.
Keywords: wind power forecasting, echo state network, big bang-big crunch, evolutionary optimization algorithm
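As a rough illustration of the optimizer, the sketch below runs the Big Bang-Big Crunch loop on a stand-in objective: random candidates are contracted to a fitness-weighted center of mass and re-scattered with a spread that shrinks each iteration. In the paper the objective would be the validation error of the ESN forecaster, which is not reproduced here; the population size, bounds, and stand-in cost function are assumptions.

```python
# Minimal sketch of the Big Bang-Big Crunch (BB-BC) search loop, here tuning a
# toy 3-parameter vector against a stand-in objective; in the paper the
# objective would be the ESN forecaster's validation error (not reproduced).
import numpy as np

rng = np.random.default_rng(1)

def objective(p):
    """Stand-in cost (sphere function); replace with the ESN forecast error."""
    return float(np.sum((p - np.array([0.3, -1.2, 2.0])) ** 2))

def bbbc(n_pop=40, n_dim=3, n_iter=60, low=-5.0, high=5.0):
    pop = rng.uniform(low, high, size=(n_pop, n_dim))        # initial "big bang"
    best, best_cost = None, np.inf
    for k in range(1, n_iter + 1):
        costs = np.array([objective(p) for p in pop])
        if costs.min() < best_cost:
            best_cost, best = costs.min(), pop[costs.argmin()].copy()
        # Big crunch: center of mass weighted by inverse cost
        w = 1.0 / (costs + 1e-12)
        center = (w[:, None] * pop).sum(axis=0) / w.sum()
        # Big bang: scatter new candidates around the center, shrinking with k
        spread = (high - low) * rng.standard_normal((n_pop, n_dim)) / k
        pop = np.clip(center + spread, low, high)
    return best, best_cost

best, cost = bbbc()
print("best parameters:", np.round(best, 3), "cost:", round(cost, 6))
```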
11771 Filtering Intrusion Detection Alarms Using Ant Clustering Approach
Authors: Ghodhbani Salah, Jemili Farah
Abstract:
With the growth of cyber attacks, information safety has become an important issue all over the world. Many firms rely on security technologies such as intrusion detection systems (IDSs) to manage information technology security risks. IDSs are considered to be the last line of defense to secure a network and play a very important role in detecting a large number of attacks. However, the main problem with today’s most popular commercial IDSs is that they generate a high volume of alerts and a huge number of false positives. This drawback has become the main motivation for many research papers in the IDS area. Hence, in this paper we present a data mining technique to assist network administrators in analyzing and reducing the false positive alarms produced by an IDS and in increasing detection accuracy. Our data mining technique is an unsupervised clustering method based on a hybrid ant algorithm. This algorithm discovers clusters of intruder behavior without prior knowledge of the number of classes; we then apply the K-means algorithm to improve the convergence of the ant clustering. Experimental results on a real dataset show that our proposed approach is efficient, with a high detection rate and a low false alarm rate.
Keywords: intrusion detection system, alarm filtering, ANT class, ant clustering, intruders’ behaviors, false alarms
11770 Managing Configuration Management in Different Types of Organizations
Authors: Dilek Bilgiç
Abstract:
Configuration Management (CM) is a discipline that assures consistency between product information and reality all along the product lifecycle. Although the extensive benefits of this discipline, such as its direct impact on increasing return on investment and reducing lifecycle costs, are realized by most organizations, it is worth evaluating whether CM functions might be successfully implemented in some organized anarchies. This paper investigates how to manage ambiguity in CM processes as an opportunity within an environment that has different types of complexities and choice arenas. It does not explain how to establish a configuration management organization in a company; more specifically, it analyzes how to apply configuration management processes when different types of streams exist. From planning to audit, all CM functions may provide different organizational learning opportunities when applied with the right leadership methods.
Keywords: configuration management, leadership, organizational analysis, organized anarchy, CM process, organizational learning, organizational maturity, configuration status accounting, leading innovation, change management
11769 Acoustic Echo Cancellation Using Different Adaptive Algorithms
Authors: Hamid Sharif, Nazish Saleem Abbas, Muhammad Haris Jamil
Abstract:
An adaptive filter is a filter that self-adjusts its transfer function according to an optimization algorithm driven by an error signal. Because of the complexity of the optimization algorithms, most adaptive filters are digital filters. Adaptive filtering constitutes one of the core technologies in digital signal processing and finds numerous application areas in science as well as in industry, including adaptive noise cancellation and echo cancellation. Acoustic echo is a common occurrence in today’s telecommunication systems. The signal interference caused by acoustic echo is distracting to both users and causes a reduction in the quality of the communication. In this paper, we review different adaptive filtering techniques for reducing this unwanted echo and examine the behavior of algorithms such as Least Mean Square (LMS), Normalized Least Mean Square (NLMS), Variable Step-Size Least Mean Square (VSLMS), Variable Step-Size Normalized Least Mean Square (VSNLMS), the New Varying Step Size LMS algorithm (NVSSLMS), and Recursive Least Squares (RLS), with the aim of increasing communication quality.
Keywords: adaptive acoustic, echo cancellation, LMS algorithm, adaptive filter, normalized least mean square (NLMS), variable step-size least mean square (VSLMS)
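As an example of the adaptive filters compared in the paper, the sketch below implements the NLMS update on synthetic signals: the far-end signal passes through an assumed echo path, and an adaptive FIR filter learns that path so the echo can be subtracted from the microphone signal. The filter length, step size, and synthetic echo path are illustrative assumptions.

```python
# Minimal NLMS echo-cancellation sketch on synthetic signals: the far-end signal
# x passes through an assumed room impulse response to create the echo, and an
# adaptive FIR filter learns that path so the echo can be subtracted.
import numpy as np

rng = np.random.default_rng(0)
n, taps = 20000, 64

x = rng.standard_normal(n)                       # far-end (loudspeaker) signal
room = rng.standard_normal(taps) * np.exp(-np.arange(taps) / 10.0)
echo = np.convolve(x, room)[:n]                  # echo picked up by the microphone
d = echo + 0.01 * rng.standard_normal(n)         # microphone signal (echo + noise)

def nlms(x, d, taps, mu=0.5, eps=1e-6):
    """Normalized LMS: w <- w + mu * e * x_vec / (||x_vec||^2 + eps)."""
    w = np.zeros(taps)
    e = np.zeros(len(x))
    for i in range(taps, len(x)):
        x_vec = x[i - taps:i][::-1]              # most recent samples first
        y = w @ x_vec                            # estimated echo
        e[i] = d[i] - y                          # error = residual echo
        w += mu * e[i] * x_vec / (x_vec @ x_vec + eps)
    return w, e

w, e = nlms(x, d, taps)
erle = 10 * np.log10(np.mean(d[-2000:] ** 2) / np.mean(e[-2000:] ** 2))
print(f"echo return loss enhancement over the last samples: {erle:.1f} dB")
```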
11768 Assessing Knowledge Management Impacts: Challenges, Limits and Base for a New Framework
Authors: Patrick Mbassegue, Mickael Gardoni
Abstract:
In a market environment centered more and more on services and the digital economy, knowledge management becomes a framework that can help organizations create value and improve their overall performance. Given the need to allocate scarce resources optimally, managers are interested in demonstrating the added value generated by knowledge management projects. One of the challenges faced by organizations is the difficulty of measuring the impacts and concrete results of knowledge management initiatives. The present article concerns the measurement of concrete results coming from knowledge management projects based on the balanced scorecard model. One of the goals is to underline what can be done based on this model, but also to highlight its associated limits. The article is structured in five parts: (1) knowledge management projects and organizational impacts; (2) a framework and a methodology to measure organizational impacts; (3) an application illustrated in two case studies; (4) limits of the proposed framework; (5) the proposal of a new framework to measure organizational impacts.
Keywords: knowledge management, project, balanced scorecard, impacts
11767 Design of an Augmented Automatic Choosing Control with Constrained Input by Lyapunov Functions Using Gradient Optimization Automatic Choosing Functions
Authors: Toshinori Nawata
Abstract:
In this paper, a nonlinear feedback control called augmented automatic choosing control (AACC) for a class of nonlinear systems with constrained input is presented. When designing the control, a constant term that arises from the linearization of a given nonlinear system is treated as a coefficient of stable zero dynamics. Parameters of the control are suboptimally selected by maximizing the stable region in the sense of Lyapunov with the aid of a genetic algorithm. This approach is applied to a field excitation control problem of a power system to demonstrate the effectiveness of the AACC. Simulation results show that the new controller can improve performance remarkably well.
Keywords: augmented automatic choosing control, nonlinear control, genetic algorithm, zero dynamics
11766 High Resolution Image Generation Algorithm for Archaeology Drawings
Authors: Xiaolin Zeng, Lei Cheng, Zhirong Li, Xueping Liu
Abstract:
Aiming at the problems of low accuracy and susceptibility to cultural relic diseases in the generation of high-resolution archaeology drawings by current image generation algorithms, an archaeology drawing generation algorithm based on a conditional generative adversarial network is proposed. An attention mechanism is added to the high-resolution image generation network, which serves as the backbone network; this enhances the line feature extraction capability and improves the accuracy of line drawing generation. A dual-branch parallel architecture consisting of two backbone networks is implemented, where the semantic translation branch extracts semantic features from orthophotographs of cultural relics, and the gradient screening branch extracts effective gradient features. Finally, the fusion fine-tuning module combines these two types of features to achieve the generation of high-quality and high-resolution archaeology drawings. Experimental results on a self-constructed archaeology drawings dataset of grotto temple statues show that the proposed algorithm outperforms current mainstream image generation algorithms in terms of pixel accuracy (PA), structural similarity (SSIM), and peak signal-to-noise ratio (PSNR), and can be used to assist in producing archaeology drawings.
Keywords: archaeology drawings, digital heritage, image generation, deep learning
11765 Sustainability Fitting into Supply Chain
Authors: Menoka Bal, David Bryde
Abstract:
Sustainability in the supply chain has become a topic of great interest and is linked to the assumption that the more sustainable a supply chain is, the better it can perform. The aim of this paper is to identify the different key aspects of sustainable supply chain management. This paper also identifies the practices required to fulfill the demands of sustainability and, therefore, to contribute to improved sustainability performance. As part of this, the authors identify how these different practices can be implemented to achieve sustainability in the supply chain. The paper is conceptual in nature and identifies some of the key categories that are of high importance for the sustainable management of supply chains. These key categories are: Managing the Supply Chain Risk, Improving the Supply Chain Performance, Managing the Supply Chain Value, Making the Supply Chain Leaner, and Managing the Supply Chain Relationship. Through in-depth analysis, this paper aims to develop a theory of the integrated management process that is most appropriate for sustainability assessment in the supply chain.
Keywords: sustainability, risk management, value management, project performance, supply chain management
11764 Theoretical Paradigms for Total Quality Environmental Management (TQEM)
Authors: Mohammad Hossein Khasmafkan Nezam, Nader Chavoshi Boroujeni, Mohamad Reza Veshaghi
Abstract:
Quality management is dominated by rational paradigms for the measurement and management of quality, but these paradigms start to ‘break down’ when faced with the inherent complexity of managing quality in intensely competitive, changing environments. In this article, the various theoretical paradigms employed to manage quality are reviewed, and the advantages and limitations of these paradigms are highlighted. A major implication of this review is that, when faced with complexity, an ideological commitment to any single strategy paradigm for total quality environmental management is ineffective. We suggest that as complexity increases, and as we envisage intensely competitive, changing environments, there will be a greater need to consider a multi-paradigm integrationist view of strategy for TQEM.
Keywords: total quality management (TQM), total quality environmental management (TQEM), ideologies (philosophy), theoretical paradigms
11763 Analysis, Design, and Implementation of Quality Management System for KSA Software Company
Authors: Omar Said Almushyt
Abstract:
Quality management, in all countries all over the world, has recently become necessary to face challenges among companies. Software companies in KSA suffer from two problems, namely low customer satisfaction and low product quality. Implementation of quality management in a software company can solve these problems by improving the quality of products and enhancing customer satisfaction, which will make the company competitive. Introducing a quality management system through system analysis, followed by system design, and finally implementing that system can achieve these goals. Results of the present work showed that the proposed method can increase product quality by 10% and customer satisfaction by 20%.
Keywords: quality, management, software, information engineering
11762 Automatic Threshold Search for Heat Map Based Feature Selection: A Cancer Dataset Analysis
Authors: Carlos Huertas, Reyes Juarez-Ramirez
Abstract:
Public health is one of the most critical issues today; therefore, there is great interest in improving technologies in the area of disease detection. With machine learning and feature selection, it has been possible to aid the diagnosis of several diseases such as cancer. In this work, we present an extension to the Heat Map Based Feature Selection algorithm; this modification allows automatic threshold parameter selection, which helps to improve generalization performance on high-dimensional data such as mass spectrometry. We have performed a comparative analysis using multiple cancer datasets against the well-known Recursive Feature Elimination algorithm and our original proposal; the results show improved classification performance that is very competitive with current techniques.
Keywords: biomarker discovery, cancer, feature selection, mass spectrometry
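The exact Heat Map Based Feature Selection extension is not reproduced here; the sketch below only illustrates the general idea of automatic threshold selection for a score-ranked feature filter, using cross-validated accuracy to choose among candidate thresholds. A public breast cancer dataset stands in for the mass spectrometry data used in the paper.

```python
# Generic sketch of automatic threshold selection for a score-ranked feature
# filter (not the exact Heat Map Based Feature Selection algorithm). A public
# cancer dataset stands in for the mass-spectrometry data used in the paper.
import numpy as np
from sklearn.datasets import load_breast_cancer
from sklearn.feature_selection import f_classif
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

X, y = load_breast_cancer(return_X_y=True)
scores, _ = f_classif(X, y)                      # univariate relevance scores

best = (None, -np.inf, 0)
for quantile in np.linspace(0.1, 0.9, 9):        # candidate score thresholds
    threshold = np.quantile(scores, quantile)
    keep = scores >= threshold
    model = make_pipeline(StandardScaler(), LogisticRegression(max_iter=5000))
    acc = cross_val_score(model, X[:, keep], y, cv=5).mean()
    if acc > best[1]:
        best = (threshold, acc, int(keep.sum()))

threshold, acc, n_kept = best
print(f"selected threshold={threshold:.2f}, features kept={n_kept}, CV accuracy={acc:.3f}")
```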
11761 Management Trainee Program
Authors: Ambreen Amir Ali
Abstract:
In today’s dynamic environment, it has become crucial to have a comprehensive management trainee program to hire the future leaders of an organization. It has been shown that fresh graduates mostly join organizations because of the institution, but later on they leave the organization because of their immediate manager or supervisor. The concepts of coaching and mentoring in talent management systems are very important, because mentors are those who can advise, facilitate, help, and support new entrants in advancing their careers. When it comes to the talent hunt, one point needs to be highlighted: management trainees (MTs) are the raw talent for the organization, and it is the responsibility of employers to nourish, polish, and develop them so that they can enthusiastically take on senior leadership roles.
Keywords: management trainee, retention, leadership roles, coaching
11760 Maintenance Management Practice for Building
Authors: Harold Jideofor Nnachetam
Abstract:
Maintenance management in Nigerian polytechnics faces many issues due to poor service delivery, inadequate finance, poor maintenance planning, and maintenance backlogs. The purpose of this study is to improve the conventional method practices, which tend to be ineffective in Nigerian polytechnics. The case study was conducted with eight polytechnics in Nigeria. The polytechnics were selected based on their conventional method practices and major problems, their attempts to implement computerized technology, and the willingness of staff to share their experiences. All feedback from respondents, gathered through semi-structured interviews, was recorded on video camera and transcribed verbatim. The overall findings of this research indicated poor service delivery, inadequate finance, poor maintenance planning, and maintenance backlogs. There is also a need to overcome the low manpower competencies in maintenance management practices that existed at all eight polytechnics. In addition, the study found that the polytechnics still use conventional maintenance management processes in managing building facility condition. As a result, the maintenance management staff were not able to improve maintenance management performance at the polytechnics. The findings are intended to be used to improve maintenance management practices at Nigerian polytechnics in order to provide high-quality building facilities with safe and healthy environments.
Keywords: maintenance management, conventional method, maintenance management system, Nigeria polytechnic
11759 Task Scheduling on Parallel System Using Genetic Algorithm
Authors: Jasbir Singh Gill, Baljit Singh
Abstract:
Scheduling and mapping an application task graph onto multiprocessor parallel systems is considered one of the most crucial and critical NP-complete problems. Many genetic algorithms have been proposed to solve such problems. In this paper, two genetic-approach-based algorithms have been designed and developed, with and without task duplication. The proposed algorithms work with two fitness functions. The first fitness function, i.e., task fitness, is used to minimize the total finish time of the schedule (schedule length), while the second fitness function, i.e., process fitness, is concerned with allocating the tasks to the most efficient processor from the list of available processors (load balance). The proposed genetic-based algorithms have been experimentally implemented and evaluated against other state-of-the-art, popular, and widely used algorithms.
Keywords: parallel computing, task scheduling, task duplication, genetic algorithm
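As a simplified illustration of the genetic approach, the sketch below evolves a task-to-processor assignment that minimizes the schedule length (makespan). For brevity it assumes independent tasks with invented costs and a single fitness function, unlike the paper's task-graph formulation with two fitness functions and optional duplication.

```python
# Minimal genetic-algorithm sketch for mapping tasks to processors. For brevity
# it schedules independent tasks (no precedence edges or duplication, unlike the
# paper) and uses makespan as the single fitness; the task costs are invented.
import random

random.seed(7)
TASK_COST = [4, 7, 3, 9, 2, 6, 5, 8, 1, 4]       # execution time of each task
N_PROC, POP, GENS = 3, 40, 150

def makespan(assign):
    loads = [0] * N_PROC
    for task, proc in enumerate(assign):
        loads[proc] += TASK_COST[task]
    return max(loads)                            # schedule length to minimize

def crossover(a, b):
    cut = random.randrange(1, len(a))
    return a[:cut] + b[cut:]

def mutate(assign, rate=0.1):
    return [random.randrange(N_PROC) if random.random() < rate else p for p in assign]

population = [[random.randrange(N_PROC) for _ in TASK_COST] for _ in range(POP)]
for _ in range(GENS):
    population.sort(key=makespan)                # fitter (shorter) schedules first
    survivors = population[:POP // 2]
    children = [mutate(crossover(random.choice(survivors), random.choice(survivors)))
                for _ in range(POP - len(survivors))]
    population = survivors + children

best = min(population, key=makespan)
print("task -> processor:", best, "makespan:", makespan(best))
```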
11758 Factors Affecting the Effectiveness of Management Creativity Using Theory Planned Behavior
Authors: Basheer Ahmad Al-Alwan, Ali Ratib Al-Awamreh, Badar Saif Alhatmi
Abstract:
The success of organizations in today's rapidly changing business landscape hinges greatly on the effectiveness of management creativity. This research aimed to uncover the elements influencing the effectiveness of management creativity by employing the Theory of Planned Behavior. The study's findings indicate that two significant predictors of management creativity effectiveness are one's attitude towards it and the subjective norms within the organization. Such results are important for organizations and leaders who want to increase management creativity. The attitudes of subordinates towards management creativity should be positive if managers wish to cultivate management creativity among their employees, and the organizational culture must also be one that enhances and supports creative thinking. Managers should also make available all the requisite resources and support required for the implementation of creative ideas and let employees participate in decision-making processes in order to increase their sense of control over their creative activities. This research contributes to the literature on managerial creativity by presenting empirical evidence about the effectiveness of managerial creativity and the strategies aimed at increasing the level of creativity in organizations.
Keywords: management creativity, attitudes, subjective norms, perceived behavioral control
11757 Improvement Image Summarization using Image Processing and Particle swarm optimization Algorithm
Authors: Hooman Torabifard
Abstract:
In the last few years, with the progress of technology, computers, and artificial intelligence, and their entry into all kinds of scientific and industrial fields, human lifestyles have changed and, in general, the way humans live has seen many changes and developments. Some of these changes have occurred in the context of digital images and image processing and are still continuing. However, besides all the benefits, there have been disadvantages. One of these disadvantages is the multiplicity of images with high volume and data; the focus of this paper is on improving and developing a method for summarizing these images and enhancing the productivity of working with them. The general method used for this purpose consists of a set of techniques based on data obtained from image processing and on the particle swarm optimization (PSO) algorithm. In the remainder of this paper, the method used is elaborated in detail.
Keywords: image summarization, particle swarm optimization, image threshold, image processing
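As one concrete example of combining image-processing data with PSO, the sketch below uses a particle swarm to search for a grayscale threshold that maximizes Otsu's between-class variance on a synthetic image. The paper's full summarization pipeline is not reproduced; the image, swarm size, and PSO coefficients are assumptions.

```python
# Minimal sketch of particle swarm optimization searching for an image threshold
# that maximizes Otsu's between-class variance; the "image" is synthetic, and
# the swarm parameters are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(3)
image = np.concatenate([rng.normal(60, 12, 4000), rng.normal(170, 15, 6000)])
image = np.clip(image, 0, 255)

def between_class_variance(t):
    fg, bg = image[image >= t], image[image < t]
    if len(fg) == 0 or len(bg) == 0:
        return 0.0
    w1, w2 = len(bg) / len(image), len(fg) / len(image)
    return w1 * w2 * (bg.mean() - fg.mean()) ** 2

n_particles, iters = 20, 60
pos = rng.uniform(0, 255, n_particles)           # candidate thresholds
vel = np.zeros(n_particles)
pbest = pos.copy()
pbest_val = np.array([between_class_variance(p) for p in pos])
gbest = pbest[pbest_val.argmax()]

for _ in range(iters):
    r1, r2 = rng.random(n_particles), rng.random(n_particles)
    vel = 0.7 * vel + 1.5 * r1 * (pbest - pos) + 1.5 * r2 * (gbest - pos)
    pos = np.clip(pos + vel, 0, 255)
    vals = np.array([between_class_variance(p) for p in pos])
    improved = vals > pbest_val
    pbest[improved], pbest_val[improved] = pos[improved], vals[improved]
    gbest = pbest[pbest_val.argmax()]

print("PSO threshold:", round(float(gbest), 1))
```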
11756 Fuzzy Time Series Forecasting Based on Fuzzy Logical Relationships, PSO Technique, and Automatic Clustering Algorithm
Authors: A. K. M. Kamrul Islam, Abdelhamid Bouchachia, Suang Cang, Hongnian Yu
Abstract:
Forecasting models have a great impact in terms of prediction and will continue to do so in the future. Although many forecasting models have been studied in recent years, most researchers focus on different forecasting methods based on fuzzy time series to solve forecasting problems. A forecasting model's accuracy depends heavily on two factors: the length of the intervals in the universe of discourse and the content of the forecast rules. Moreover, a hybrid forecasting method can be a more effective and efficient way to improve forecasts than an individual forecasting model. There are different hybrid forecasting models that combine fuzzy time series with evolutionary algorithms, but their performance is not quite satisfactory. In this paper, we propose a hybrid forecasting model that deals with first-order as well as high-order fuzzy time series and uses particle swarm optimization to improve forecast accuracy. The proposed method uses the historical enrollments of the University of Alabama as the dataset in the forecasting process. First, we use an automatic clustering algorithm to calculate appropriate intervals for the historical enrollments. Then, particle swarm optimization and fuzzy time series are combined, which yields better forecasting accuracy than other existing forecasting models.
Keywords: fuzzy time series (FTS), particle swarm optimization, clustering algorithm, hybrid forecasting model
11755 New Technologies in Corporate Finance Management in the Digital Economy: Case of Kyrgyzstan
Authors: Marat Kozhomberdiev
Abstract:
The research will investigate the modern corporate finance management technologies currently used in the era of digitalization of the global economy and the degree to which financial institutions in Kyrgyzstan are utilizing these new technologies in the field of corporate finance management. The main purpose of the research is to reveal the role of financial management technologies such as joint service centers, intercompany banks, and specialized payment centers in a third-world country. In particular, the applicability of automated corporate finance management systems such as enterprise resource planning (ERP) and treasury management systems (TMS) will be analyzed. Moreover, the research will investigate the role of cloud accounting systems in corporate finance management in Kyrgyz banks and whether they have any impact on improving corporate finance management. The study will collect data by surveying three banks in Kyrgyzstan, namely Mol-Bulak, RSK, and KICB. The banks were chosen based on their ownership: a state bank, a private bank with local authorized capital, and a private bank with international capital. Regression analysis will be utilized to reveal the correlation between the ownership of a bank and its use of new financial management technologies. The research will provide policy recommendations to both private and state banks on developing strategies for switching to and utilizing modern corporate finance management technologies in their daily operations.
Keywords: digital economy, corporate finance, digital environment, digital technologies, cloud technologies, financial management