Search results for: performance prediction
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 14252

13682 Studying the Influence of Logistics on Organizational Performance through a Supply Chain Strategy: Case Study in Goldiran Electronics Co.

Authors: Ali Hajiesmaeili, Mehdi Rahimi, Ehsan Jaberi, Amir Abbas Hosseini

Abstract:

The purpose of this study is to investigate the influence of logistics performance on organizational performance, covering both marketing and financial aspects; to show the financial impact of selecting marketing and logistics priorities aligned with the firm's supply chain type; and to give practitioners an advance identification of their priorities, their type of supply chain participation, and the best combination of strategies and resources in this regard. The original model's questionnaire was used to gather the experts' data, and SPSS and AMOS Ver. 22 were used to analyze the gathered data. Confirmatory factor analysis (CFA) was also used to test whether a relationship exists between the observed variables and their underlying latent constructs. Supply chain strategy implementation leads to improved logistics performance, and marketing performance is affected as well. Logistics service providers should focus on enhancing supply chain performance, since logistics performance is considered a basis for evaluating supply chain management strategy; consequently, the performance of the organization is enhanced. This is the first study in Iran to analyze the relationship between logistics and organizational performance in home appliances and home entertainment companies.
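
As a concrete illustration of the CFA/SEM workflow described above, the sketch below fits a small measurement and structural model in Python with the semopy package. The indicator names and the CSV file are hypothetical stand-ins, since the abstract does not list the questionnaire items, and the study itself used SPSS and AMOS.

```python
import pandas as pd
from semopy import Model

# Hypothetical latent constructs and indicator items (not the study's actual questionnaire).
desc = """
Logistics =~ log1 + log2 + log3
Marketing =~ mkt1 + mkt2 + mkt3
Financial =~ fin1 + fin2
Marketing ~ Logistics
Financial ~ Logistics + Marketing
"""

data = pd.read_csv("survey_responses.csv")   # hypothetical file of Likert-scale answers
model = Model(desc)
model.fit(data)
print(model.inspect())                       # factor loadings, path coefficients, p-values
```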

Keywords: logistics, organizational, performance, supply chain, strategy

Procedia PDF Downloads 643
13681 Corporate Governance Attributes and Financial Performance in Malaysian Listed Companies

Authors: Idris Adamu Alhaji, Wan Fauziahbt Wan Yusoff

Abstract:

This study was conducted to identify the relationship between corporate governance attributes and firm performance. Various studies have been carried out, mostly in developed countries, to identify this relationship. Since the value creation of corporate governance can be measured through firm performance, corporate governance acts as a mechanism to align management's goals with those of the stakeholders, especially to increase firm performance. Despite extensive study of corporate governance, the reported relationship between corporate governance attributes and firm performance remains inconsistent. Therefore, the aim of this paper is to identify the relationship between corporate governance attributes and firm performance. Five corporate governance elements were used as independent variables: independent directors, board size, audit committee, leadership structure and board meetings. The dependent variables are two firm performance measurements: return on equity (ROE) and earnings per share (EPS). The study uses a quantitative approach whereby data were gathered from secondary sources, namely the companies' annual reports, online journals, etc. The study revealed a significant relationship between corporate governance attributes and firm performance, showing that good corporate governance practice influences firm performance. Finally, it is hoped that this study provides a picture of the current corporate governance scenario in Malaysia that can be used to enhance the development of corporate governance in the country.

Keywords: corporate governance, return on equity, earnings per share, financial performance

Procedia PDF Downloads 462
13680 Cooperative Coevolution for Neuro-Evolution of Feed Forward Networks for Time Series Prediction Using Hidden Neuron Connections

Authors: Ravneil Nand

Abstract:

Cooperative coevolution uses problem decomposition methods to solve a larger problem by breaking it down into a number of smaller sub-problems. Different problem decomposition methods have their own strengths and limitations, depending on the neural network used and the application problem. In this paper, we introduce a new problem decomposition method known as Hidden-Neuron Level decomposition (HNL). The HNL method is compared with established problem decomposition methods on time series prediction. The results show that the proposed approach improves the results on some benchmark data sets when compared to the standalone method and is competitive with methods from the literature.
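
The abstract does not give the decomposition details; the sketch below illustrates the general idea of hidden-neuron level decomposition under the assumption that each sub-population encodes all connections attached to one hidden neuron (its incoming weights, bias and outgoing weight), coevolved in a round-robin fashion. The network size, variation rule and data are illustrative only.

```python
import numpy as np

rng = np.random.default_rng(0)

# Tiny feed-forward network: n_in inputs, n_hidden hidden neurons, one output
n_in, n_hidden = 4, 6

def forward(x, subcomponents):
    # Each sub-component holds every weight attached to one hidden neuron:
    # its incoming weights, its bias, and its outgoing weight to the output.
    h = np.array([np.tanh(x @ s[:n_in] + s[n_in]) for s in subcomponents])
    out_w = np.array([s[n_in + 1] for s in subcomponents])
    return h @ out_w

def fitness(subcomponents, X, y):
    preds = np.array([forward(x, subcomponents) for x in X])
    return -np.mean((preds - y) ** 2)          # negative MSE: higher is better

# Toy regression data standing in for an embedded time series
X = rng.normal(size=(50, n_in))
y = np.sin(X.sum(axis=1))

# One sub-population per hidden neuron (hidden-neuron level decomposition)
pop_size, genes = 20, n_in + 2
pops = [rng.normal(size=(pop_size, genes)) for _ in range(n_hidden)]
best = [p[0].copy() for p in pops]             # current best individual of each sub-population

for generation in range(30):
    for i in range(n_hidden):                  # round-robin cooperative coevolution
        for j in range(pop_size):
            candidate = best.copy()
            candidate[i] = pops[i][j]
            if fitness(candidate, X, y) > fitness(best, X, y):
                best[i] = pops[i][j].copy()
        # crude variation step: resample the sub-population around its best member
        pops[i] = best[i] + 0.1 * rng.normal(size=(pop_size, genes))

print("final MSE:", -fitness(best, X, y))
```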

Keywords: cooperative coevolution, feed forward network, problem decomposition, neuron, synapse

Procedia PDF Downloads 324
13679 Comparison of the Distillation Curve Obtained Experimentally with the Curve Extrapolated by a Commercial Simulator

Authors: Lívia B. Meirelles, Erika C. A. N. Chrisman, Flávia B. de Andrade, Lilian C. M. de Oliveira

Abstract:

True Boiling Point (TBP) distillation is one of the most common experimental techniques for determining petroleum properties. The TBP curve provides information about the performance of the petroleum in terms of its cuts, but the experiment takes several days to perform. Simulation software can determine these properties much faster by calculating the distillation curve from only limited information about the crude oil. In order to evaluate the accuracy of distillation curve prediction, eight points of the TBP curve and the specific gravity curve (348 K and 523 K) were inserted into the HYSYS Oil Manager, and the extended curve was evaluated up to 748 K. The methods were able to predict the curve with errors of 0.6%-9.2% (software vs. ASTM) and 0.2%-5.1% (software vs. Spaltrohr).
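
As a minimal illustration of how the reported error ranges can be computed, the sketch below compares an experimental TBP curve with a simulator-extrapolated one point by point; the cut points and temperatures are made-up values, not the study's data.

```python
import numpy as np

# Hypothetical TBP data: volume percent distilled vs. boiling temperature (K)
vol_pct = np.array([5, 10, 20, 30, 40, 50, 60, 70])           # experimental cut points
T_exp = np.array([348, 372, 405, 433, 460, 487, 512, 523.0])  # measured temperatures (illustrative)
T_sim = np.array([350, 370, 401, 437, 458, 490, 515, 521.0])  # simulator-extended curve (illustrative)

# Point-by-point relative deviation, the kind of figure behind the 0.2%-9.2% error ranges
rel_err = np.abs(T_sim - T_exp) / T_exp * 100
for v, e in zip(vol_pct, rel_err):
    print(f"{v:3d}% distilled: {e:.2f}% deviation")
print(f"error range: {rel_err.min():.2f}%-{rel_err.max():.2f}%")
```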

Keywords: distillation curve, petroleum distillation, simulation, true boiling point curve

Procedia PDF Downloads 432
13678 The Power of Training and Development and Compensation and Rewards in Sustaining SMEs’ Performance

Authors: Mohd Fitri Mansor, Noor Hidayah Abu, Hussen Nasir

Abstract:

Human capital is one of the most valuable assets of an organization for sustaining organizational performance and achieving both employee and employer objectives. The aim of the study is to examine the power of two human resource practices (training and development, and compensation and rewards) in sustaining SMEs' performance. The objectives of the current study are to examine the relationship of training and development, as well as compensation and rewards, with the sustainability of Malaysian SMEs' performance, and to identify which variable contributes most strongly to that sustainability. The results from 80 Malaysian SME owners show that both variables, training and development and compensation and rewards, contribute significantly to the sustainability of SME performance, with training and development being the strongest contributor. The study raises SME owners' knowledge and awareness of the importance and power of human resource practices in sustaining their organizations' performance.

Keywords: training and development, compensation and rewards, sustainability, SME’s performance

Procedia PDF Downloads 476
13677 Numerical Prediction of Entropy Generation in Heat Exchangers

Authors: Nadia Allouache

Abstract:

The second-law concept is important for optimizing energy losses in heat exchangers. The present study is devoted to the numerical prediction of entropy generation due to heat transfer and friction in a double-tube heat exchanger partly or fully filled with a porous medium. The goal of this work is to find the optimal conditions that minimize entropy generation. For this purpose, numerical modeling based on the control volume method is used to describe the flow and heat transfer phenomena in the fluid and the porous medium. The effects of the porous layer thickness, its permeability, and the effective thermal conductivity have been investigated. Unexpectedly, the fully porous heat exchanger yields lower entropy generation than the partly porous or fluid-only cases, even though friction increases the entropy generation.
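
The abstract does not state its governing equations; a commonly used local entropy generation formulation for convective flow through a porous layer (a Bejan-type expression, given here as an assumption about the form used rather than the authors' own equation) is

S'''_{gen} = \frac{k_{eff}}{T^{2}}\,(\nabla T)^{2} + \frac{\mu}{T}\,\Phi + \frac{\mu}{K\,T}\,\lvert \mathbf{u} \rvert^{2}

where the first term is the heat-transfer contribution, the second is the viscous-friction contribution with dissipation function Φ, and the third is the Darcy dissipation in a porous layer of permeability K; integrating this quantity over the exchanger volume gives the total entropy generation to be minimized.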

Keywords: heat exchangers, porous medium, second law approach, turbulent flow

Procedia PDF Downloads 290
13676 Entropy Risk Factor Model of Exchange Rate Prediction

Authors: Darrol Stanley, Levan Efremidze, Jannie Rossouw

Abstract:

We investigate the predictability of the USD/ZAR (South African rand) exchange rate with sample entropy analytics for the period 2004-2015. We calculate sample entropy based on the daily data of the exchange rate and conduct an empirical implementation of several market timing rules based on these entropy signals. The dynamic investment portfolio based on entropy signals produces better risk-adjusted performance than a buy-and-hold strategy. The returns are estimated on the portfolio values in U.S. dollars. These results are preliminary and do not yet account for reasonable transaction costs, although these are very small in currency markets.
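
For readers unfamiliar with the signal, the sketch below computes sample entropy for a daily series in Python; the embedding dimension, tolerance and random-walk data are illustrative assumptions, not the study's USD/ZAR series or trading rules.

```python
import numpy as np

def sample_entropy(series, m=2, r_factor=0.2):
    """Sample entropy of a 1-D series (a common formulation; parameters are illustrative)."""
    x = np.asarray(series, dtype=float)
    r = r_factor * x.std()

    def count_matches(length):
        templates = np.array([x[i:i + length] for i in range(len(x) - length)])
        count = 0
        for i in range(len(templates)):
            dist = np.max(np.abs(templates - templates[i]), axis=1)   # Chebyshev distance
            count += np.sum(dist <= r) - 1                            # exclude self-match
        return count

    B, A = count_matches(m), count_matches(m + 1)
    return -np.log(A / B) if A > 0 and B > 0 else np.inf

# Toy daily exchange-rate series (a synthetic random walk, not real USD/ZAR data)
rng = np.random.default_rng(1)
rates = np.cumsum(rng.normal(0, 0.01, 500)) + 7.0
print("sample entropy:", sample_entropy(rates))
```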

Keywords: currency trading, entropy, market timing, risk factor model

Procedia PDF Downloads 260
13675 Process Modeling of Electric Discharge Machining of Inconel 825 Using Artificial Neural Network

Authors: Himanshu Payal, Sachin Maheshwari, Pushpendra S. Bharti

Abstract:

Electrical discharge machining (EDM), a non-conventional machining process, finds wide application for shaping difficult-to-cut alloys. Process modeling of EDM is required to exploit the process to the fullest, but it is a challenging task owing to the involvement of many electrical and non-electrical parameters. This work is an attempt to model the EDM process using an artificial neural network (ANN). Experiments were carried out on a die-sinking EDM machine with Inconel 825 as the work material. ANN modeling was performed using the experimental data, and the prediction ability of the trained network was verified experimentally. Results indicate that the ANN can predict the values of the performance measures of EDM satisfactorily.
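
A minimal sketch of the kind of ANN regression described, using scikit-learn's MLPRegressor; the machining parameters, their ranges and the synthetic material-removal-rate response are assumptions for illustration, not the study's experimental design.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline
from sklearn.model_selection import train_test_split

# Hypothetical EDM runs: [current (A), pulse-on time (us), pulse-off time (us), voltage (V)]
rng = np.random.default_rng(0)
X = rng.uniform([5, 50, 10, 30], [25, 200, 60, 90], size=(60, 4))
mrr = 0.02 * X[:, 0] * X[:, 1] / (X[:, 2] + X[:, 3])   # synthetic material removal rate

X_train, X_test, y_train, y_test = train_test_split(X, mrr, random_state=0)
model = make_pipeline(StandardScaler(),
                      MLPRegressor(hidden_layer_sizes=(10,), max_iter=5000, random_state=0))
model.fit(X_train, y_train)
print("R^2 on held-out runs:", model.score(X_test, y_test))
```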

Keywords: artificial neural network, EDM, metal removal rate, modeling, surface roughness

Procedia PDF Downloads 406
13674 Using Happening Performance in Vocabulary Teaching

Authors: Mustafa Gultekin

Abstract:

It is believed that drama can be used in language classes to create a positive atmosphere in which students use the target language in an interactive way, and drama has therefore been used extensively in many language-class settings. Although happening is generally known as a performance art of the theatre, this kind of performance is not widely known in the language teaching field. It can therefore be an innovative idea to use happening in language classes and thereby create a positive environment in which students use the language interactively. A happening can be defined as an art performance that puts emphasis on interaction with the audience. Because of this interactive feature, happening can also be used in language classes to motivate students to use the language in an interactive environment. The present study aims to explain how a happening performance can be applied to a learning environment to teach English vocabulary. In line with this purpose, a learning environment was designed for a vocabulary presentation lesson. At the end of the performance, students were asked to compare the traditional way of teaching with the happening performance in terms of effectiveness. It was found that the happening performance provided the students with a more creative and interactive environment in which to use the language. Therefore, happening can be used in language classrooms as an innovative educational tool.

Keywords: English, happening, language learning, vocabulary teaching

Procedia PDF Downloads 361
13673 Predicting Data Center Resource Usage Using Quantile Regression to Conserve Energy While Fulfilling the Service Level Agreement

Authors: Ahmed I. Alutabi, Naghmeh Dezhabad, Sudhakar Ganti

Abstract:

Data centers have been growing in size and demand continuously over the last two decades. Planning for the deployment of resources has been shallow and has always resorted to over-provisioning: data center operators try to maximize the availability of their services by allocating multiples of the needed resources. One resource that has been wasted, with little thought, is energy. In recent years, programmable resource allocation has paved the way for more efficient and robust data centers. In this work, we examine the predictability of resource usage in a data center environment. We use a number of models that cover a wide spectrum of machine learning categories, and we then establish a framework to guarantee the client service level agreement (SLA). Our results show that using prediction can cut energy loss by up to 55%.
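
One way to realise SLA-aware prediction is to forecast an upper quantile of demand rather than its mean, so that provisioned capacity covers most hours. The sketch below is a minimal illustration with gradient-boosted quantile regression; the abstract does not specify which models or quantiles were used, and the lagged-usage features and data here are synthetic.

```python
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor

# Hypothetical hourly CPU-demand history; features are three lagged values, the target is
# the next hour's demand. Real inputs would come from the data center's monitoring system.
rng = np.random.default_rng(0)
usage = 50 + 20 * np.sin(np.arange(2000) * 2 * np.pi / 24) + rng.normal(0, 5, 2000)
X = np.column_stack([usage[i:-(3 - i)] for i in range(3)])
y = usage[3:]

# Forecasting an upper quantile (here the 95th percentile) leaves headroom so that the
# provisioned capacity violates the SLA only rarely.
model = GradientBoostingRegressor(loss="quantile", alpha=0.95)
model.fit(X[:-200], y[:-200])

provision = model.predict(X[-200:])
coverage = np.mean(provision >= y[-200:])
print(f"hours where provisioned capacity covered actual demand: {coverage:.1%}")
```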

Keywords: machine learning, artificial intelligence, prediction, data center, resource allocation, green computing

Procedia PDF Downloads 103
13672 Big Data: Appearance and Disappearance

Authors: James Moir

Abstract:

The mainstay of Big Data is prediction, in that it allows practitioners, researchers, and policy analysts to predict trends based upon the analysis of large and varied sources of data, ranging from changing social and political opinions to patterns in crime and consumer behaviour. Big Data has therefore shifted the criterion of success in science from causal explanation to predictive modelling and simulation. Nineteenth-century science sought to capture phenomena and to show their appearance through causal mechanisms, while twentieth-century science attempted to save the appearances and relinquish causal explanations. Now twenty-first-century science, in the form of Big Data, is concerned with the prediction of appearances and nothing more. However, this pulls social science back in the direction of a rule- or law-governed model of reality and away from a consideration of the internal nature of rules in relation to various practices. In effect, Big Data offers us no more than a world of surface appearance, and in doing so it makes any context-specific conceptual sensitivity disappear.

Keywords: big data, appearance, disappearance, surface, epistemology

Procedia PDF Downloads 413
13671 Fuzzy Analytic Hierarchy Process for Determination of Supply Chain Performance Evaluation Criteria

Authors: Ibrahim Cil, Onur Kurtcu, H. Ibrahim Demir, Furkan Yener, Yusuf S. Turkan, Muharrem Unver, Ramazan Evren

Abstract:

The fuzzy Analytic Hierarchy Process (fuzzy AHP) is a decision-making method obtained by integrating the classical AHP with fuzzy set theory. In this study, the production planning, inventory management and purchasing processes of a system were analysed, and decision-makers were asked to determine the performance criteria of each area. The current work processes were analysed by various decision-makers, who compared the criteria pairwise by assigning scores on the 1-9 scale. The criteria were ranked by their weights using the fuzzy AHP approach, and the top three performance criteria of each department were determined. After that, the performance criteria of the supply chain comprising the three departments were determined: the processes of each department were compared by the decision-makers in order to build the supply chain performance system and obtain its criteria. According to the results, the criteria determined with fuzzy AHP will be used in the supply chain performance system in the future.
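
A minimal sketch of one common fuzzy AHP weighting scheme (Buckley's fuzzy geometric mean with triangular fuzzy numbers and centroid defuzzification), given as an illustration only; the abstract does not state which fuzzy AHP variant was used, and the pairwise judgements below are invented.

```python
import numpy as np

# Triangular fuzzy pairwise-comparison matrix for three criteria (invented judgements on the
# 1-9 scale); each entry is (l, m, u) and the matrix is reciprocal.
M = np.array([
    [(1, 1, 1),       (2, 3, 4),     (4, 5, 6)],
    [(1/4, 1/3, 1/2), (1, 1, 1),     (1, 2, 3)],
    [(1/6, 1/5, 1/4), (1/3, 1/2, 1), (1, 1, 1)],
])  # shape: (criteria, criteria, 3)
n = M.shape[0]

# Buckley's fuzzy geometric mean, computed component-wise on (l, m, u)
r = np.prod(M, axis=1) ** (1 / n)      # fuzzy geometric mean of each row
total = r.sum(axis=0)                  # fuzzy sum over all criteria
w_fuzzy = r / total[::-1]              # fuzzy division: (l/u_sum, m/m_sum, u/l_sum)

# Centroid defuzzification and normalisation give crisp criterion weights
w = w_fuzzy.mean(axis=1)
w /= w.sum()
print("criterion weights:", np.round(w, 3))
```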

Keywords: AHP, fuzzy, performance evaluation, supply chain

Procedia PDF Downloads 337
13670 Performance in Police Organizations: Approaches from the Literature Review

Authors: Felipe Haleyson Ribeiro dos Santos, Edson Ronaldo Guarido Filho

Abstract:

This article aims to review the literature on performance in police organizations. To that end, the inOrdinatio method was adopted, which defines the procedure for selecting and classifying articles. The search was carried out in databases and resulted in a total of 619 documents, which were catalogued and classified with the support of the Mendeley software. The theoretical scope intended here is to identify how performance in police organizations has been studied. After deepening the analysis and focusing on management, it was possible to classify the articles into three levels: individual, organizational, and institutional. However, to the best of our knowledge, no studies were found that addressed the performance relationship between the levels, which can be seen as a suggestion for further research.

Keywords: police management, performance, management, multi-level

Procedia PDF Downloads 103
13669 Prediction of Childbearing Orientations According to Couples' Sexual Review Component

Authors: Razieh Rezaeekalantari

Abstract:

Objective: The purpose of this study was to investigate the prediction of childbearing orientations in terms of the components of couples' sexual review. Methods: This was a descriptive correlational study. The population consisted of 500 couples referring to the Sari Health Center, of whom 215 were selected randomly using the Krejcie and Morgan sample size table. For data collection, the childbearing orientations scale and the Multidimensional Sexual Self-Concept Questionnaire were used. Results: For data analysis, means and standard deviations were used, and the research hypothesis was analyzed with correlation and regression as inferential statistics. Conclusion: The findings indicate that there is no significant relationship between the tendency toward childbearing and the predictive value of sexual review (reported as r = 0.84, sig = 219.19; significance criterion P < 0.05). So, with 95% confidence, we conclude that there is no meaningful relationship between sexual orientation and the tendency toward childbearing.

Keywords: couples referring, health center, sexual review component, parenting orientations

Procedia PDF Downloads 214
13668 On Performance of Cache Replacement Schemes in NDN-IoT

Authors: Rasool Sadeghi, Sayed Mahdi Faghih Imani, Negar Najafi

Abstract:

The inherent features of Named Data Networking (NDN) provide a robust solution for the Internet of Things (IoT). NDN-IoT has therefore emerged as a combined architecture which exploits the benefits of NDN for interconnecting the heterogeneous objects in IoT. In NDN-IoT, caching schemes play a key role in improving network performance. In this paper, we consider the effectiveness of cache replacement schemes in NDN-IoT scenarios. We investigate the impact on average delay, average hop count, and average interest retransmission when the replacement schemes are Least Frequently Used (LFU), Least Recently Used (LRU), First-In-First-Out (FIFO) and Random. The simulation results demonstrate that LFU and LRU present stable performance when the cache size changes. Moreover, network performance improves when the number of consumers increases.
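
To make the compared policies concrete, the sketch below implements an LRU content store of the kind an NDN-IoT router might use (LFU and FIFO differ only in the eviction rule); the content names and capacity are illustrative, and this is not the ndnSIM implementation.

```python
from collections import OrderedDict

class LRUCache:
    """Least-Recently-Used content store, a stand-in for an NDN router cache."""
    def __init__(self, capacity):
        self.capacity = capacity
        self.store = OrderedDict()          # content name -> content object

    def get(self, name):
        if name not in self.store:
            return None                     # cache miss: the interest is forwarded upstream
        self.store.move_to_end(name)        # mark as most recently used
        return self.store[name]

    def put(self, name, content):
        if name in self.store:
            self.store.move_to_end(name)
        self.store[name] = content
        if len(self.store) > self.capacity:
            self.store.popitem(last=False)  # evict the least recently used item

cache = LRUCache(capacity=2)
cache.put("/sensor/temp/1", "22.5C")
cache.put("/sensor/temp/2", "23.1C")
cache.get("/sensor/temp/1")                 # touch item 1
cache.put("/sensor/temp/3", "21.8C")        # evicts /sensor/temp/2
print(list(cache.store))
```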

Keywords: NDN-IoT, cache replacement, performance, ndnSIM

Procedia PDF Downloads 356
13667 Sorghum Grains Grading for Food, Feed, and Fuel Using NIR Spectroscopy

Authors: Irsa Ejaz, Siyang He, Wei Li, Naiyue Hu, Chaochen Tang, Songbo Li, Meng Li, Boubacar Diallo, Guanghui Xie, Kang Yu

Abstract:

Background: Near-infrared spectroscopy (NIR) is a non-destructive, fast, and low-cost method to measure the grain quality of different cereals. Previously reported NIR model calibrations using whole-grain spectra had moderate accuracy, and improved predictions have been reported for spectra of whole grains when compared with spectra collected from flour samples. However, the feasibility of determining the critical biochemicals related to the classification of food, feed, and fuel products has not been adequately investigated. Objectives: To evaluate the feasibility of using NIRS, and the influence of four sample types (whole grains, flours, hulled grain flours, and hull-less grain flours), for the prediction of chemical components, in order to improve grain sorting efficiency for human food, animal feed, and biofuel. Methods: NIR was applied in this study to determine eight biochemicals in four types of sorghum samples: hulled grain flours, hull-less grain flours, whole grains, and grain flours. A total of 20 sorghum grain hybrids were selected from two locations in China. Based on the NIR spectra and the wet-chemically measured biochemical data, partial least squares regression (PLSR) was used to construct the prediction models. Results: The results showed that sorghum grain morphology and sample format affected the prediction of biochemicals. Using NIR data of grain flours generally improved the prediction compared with the use of NIR data of whole grains. In addition, using the spectra of whole grains enabled comparable predictions, which is recommended when a non-destructive and rapid analysis is required. Compared with the hulled grain flours, hull-less grain flours allowed improved NIR predictions for tannin, cellulose, and hemicellulose. Conclusion: The established PLSR models could enable food, feed, and fuel producers to efficiently evaluate a large number of samples by predicting the required biochemical components in sorghum grains without destruction.
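
A minimal sketch of the PLSR calibration step with scikit-learn; the spectra, the tannin response and the number of latent components are synthetic placeholders, since the wet-chemistry data are not given in the abstract.

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import cross_val_predict
from sklearn.metrics import r2_score

# Hypothetical data: NIR spectra (rows = samples, columns = wavelengths) and one measured
# biochemical component (e.g., tannin content). All values are synthetic placeholders.
rng = np.random.default_rng(0)
spectra = rng.normal(size=(80, 700))                 # 80 grain samples x 700 wavelengths
tannin = spectra[:, 120] * 0.8 + spectra[:, 300] * 0.3 + rng.normal(0, 0.1, 80)

pls = PLSRegression(n_components=10)
pred = cross_val_predict(pls, spectra, tannin, cv=5).ravel()
print("cross-validated R^2:", round(r2_score(tannin, pred), 3))
```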

Keywords: FT-NIR, sorghum grains, biochemical composition, food, feed, fuel, PLSR

Procedia PDF Downloads 60
13666 Cardiovascular Disease Prediction Using Machine Learning Approaches

Authors: P. Halder, A. Zaman

Abstract:

It is estimated that heart disease accounts for one in ten deaths worldwide. According to the World Health Organization, heart disease is among the leading causes of death in the United States, and cardiovascular diseases (CVDs) account for one in four U.S. deaths according to the Centers for Disease Control and Prevention (CDC). According to statistics, women are more likely than men to die from heart disease as a result of strokes, and a 50% increase in men's mortality was reported by the World Health Organization in 2009. The consequences of cardiovascular disease are severe. The causes of heart disease include diabetes, high blood pressure, high cholesterol, abnormal pulse rates, etc. Machine learning (ML) can be used to make predictions and support decisions in the healthcare industry, and scientists have therefore turned to modern technologies such as machine learning and data mining to predict diseases. In this work, disease prediction is based on four algorithms; compared with the other algorithms, AdaBoost is much more accurate.
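
A minimal sketch of the AdaBoost classification step with scikit-learn; the synthetic dataset stands in for the heart-disease records, whose source and feature set are not specified in the abstract.

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import AdaBoostClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

# Synthetic stand-in for a heart-disease dataset; in practice the features would be
# clinical variables such as age, blood pressure, cholesterol and pulse rate.
X, y = make_classification(n_samples=500, n_features=13, n_informative=6, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, stratify=y, random_state=0)

model = AdaBoostClassifier(n_estimators=200, random_state=0).fit(X_train, y_train)
print("test accuracy:", accuracy_score(y_test, model.predict(X_test)))
```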

Keywords: heart disease, cardiovascular disease, coronary artery disease, feature selection, random forest, AdaBoost, SVM, decision tree

Procedia PDF Downloads 146
13665 Middle-Level Management Involvement in Strategy Process, and Organizational Performance

Authors: Mazyar Taghavi

Abstract:

This research examines middle-level managers' involvement in the strategy process in 15 manufacturing and service companies in Iran. We considered two dominant theoretical arguments for expecting a positive association. According to the first, involvement improves organizational performance by improving the quality of strategic decisions. According to the second, middle managers contribute to increased levels of performance through strategic consensus among themselves. Results indicate that involvement in the strategy process is related to organizational performance, and that involvement is associated with consensus (i.e., strategic understanding and commitment) among middle-level managers. However, the findings indicate that consensus is not related to organizational performance.

Keywords: middle-level management, strategy process, organizational performance, strategy consensus

Procedia PDF Downloads 431
13664 Prediction of Sepsis Illness from Patients Vital Signs Using Long Short-Term Memory Network and Dynamic Analysis

Authors: Marcio Freire Cruz, Naoaki Ono, Shigehiko Kanaya, Carlos Arthur Mattos Teixeira Cavalcante

Abstract:

The systems that record patient care information, known as Electronic Medical Records (EMRs), and those that monitor patients' vital signs, such as heart rate, body temperature, and blood pressure, have been extremely valuable for the effectiveness of patient treatment. Several studies have used data from EMRs and patients' vital signs to predict illnesses. Among them, we highlight those that intend to predict, classify, or at least identify patterns of sepsis in patients under vital sign monitoring. Sepsis is an organic dysfunction caused by a dysregulated patient response to an infection, and it affects millions of people worldwide; early detection of sepsis is expected to provide a significant improvement in its treatment. Preceding works usually combined medical, statistical, mathematical and computational models to develop early detection methods, obtaining higher accuracies while using the smallest number of variables. Among other techniques, there are studies using survival analysis, expert systems, machine learning and deep learning that reached good results. In our research, patients are modeled as points moving each hour in an n-dimensional space, where n is the number of vital signs (variables). These points can reach a sepsis target point after some time. For now, the sepsis target point was calculated using the median of all patients' variables at sepsis onset. From these points, we calculate for each hour the position vector, the first derivative (velocity vector) and the second derivative (acceleration vector) of the variables to evaluate their behavior, and we construct a prediction model based on a Long Short-Term Memory (LSTM) network, including these derivatives as explanatory variables. The accuracy of the prediction 6 hours before sepsis onset reached 83.24% when considering only the vital signs; by including the position, velocity, and acceleration vectors, we obtained 94.96%. The data are being collected from the Medical Information Mart for Intensive Care (MIMIC) database, a public database that contains vital signs, laboratory test results, observations, notes, and so on, from more than 60,000 patients.
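
A minimal sketch of the feature construction and model described above: hourly vitals are augmented with their first and second finite differences (velocity and acceleration) and fed to an LSTM classifier. The array shapes, network size and labels are illustrative assumptions, not the MIMIC-based setup of the study.

```python
import numpy as np
from tensorflow import keras

# Hypothetical hourly vital signs: (patients, hours, vitals). Real inputs would come from MIMIC.
rng = np.random.default_rng(0)
vitals = rng.normal(size=(200, 24, 6))

# Position (the vitals), first derivative (hour-to-hour velocity) and second derivative
# (acceleration), concatenated as explanatory variables, as the abstract describes.
velocity = np.diff(vitals, axis=1, prepend=vitals[:, :1])
acceleration = np.diff(velocity, axis=1, prepend=velocity[:, :1])
X = np.concatenate([vitals, velocity, acceleration], axis=-1)   # shape (200, 24, 18)
y = rng.integers(0, 2, size=200)                                # toy labels: sepsis within 6 hours

model = keras.Sequential([
    keras.layers.Input(shape=X.shape[1:]),
    keras.layers.LSTM(32),
    keras.layers.Dense(1, activation="sigmoid"),
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
model.fit(X, y, epochs=3, batch_size=32, verbose=0)
print("loss, accuracy on the toy data:", model.evaluate(X, y, verbose=0))
```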

Keywords: dynamic analysis, long short-term memory, prediction, sepsis

Procedia PDF Downloads 116
13663 Residual Analysis and Ground Motion Prediction Equation Ranking Metrics for Western Balkan Strong Motion Database

Authors: Manuela Villani, Anila Xhahysa, Christopher Brooks, Marco Pagani

Abstract:

The geological structure of the Western Balkans is strongly affected by the collision between the Adria microplate and the southwestern Eurasian margin, resulting in a considerably active seismic region. The NATO-supported Harmonization of Seismic Hazard Maps in the Western Balkan Countries Project (BSHAP) (2007-2011, 2012-2015) supported the preparation of new seismic hazard maps of the Western Balkans; however, when the seismic hazard models later produced by these countries at the national scale are inspected, significant differences in design PGA values are observed at the borders, for instance, northern Albania-Montenegro and southern Albania-Greece. Considering that the catalogues were unified and the seismic sources were defined within the BSHAP framework, the differences evidently arise from the selection of Ground Motion Prediction Equations (GMPEs), which is generally the component with the highest impact on seismic hazard assessment. At the time of the project, a modest database was available, namely 672 three-component records, whereas nowadays this strong motion database has increased considerably, up to 20,939 records with Mw ranging from 3.7 to 7 and epicentral distances from 0.47 km to 490 km. Statistical analysis of the strong motion database showed a lack of recordings in the moderate-to-large magnitude and short-distance ranges; therefore, there is a need to re-evaluate the GMPEs in light of the recently updated database and the new generations of ground motion models (GMMs). In some cases, it was observed that some events were more extensively documented in one database than the other; for example, the 1979 Montenegro earthquake has a considerably larger number of records in the BSHAP analogue strong motion database than in ESM23. Therefore, the strong motion flat-file provided by the BSHAP project was merged with the ESM23 database for the polygon studied in this project. After performing the preliminary residual analysis, the candidate GMPEs were identified. This was done using the GMPE performance metrics available within the SMT in the OpenQuake platform; the likelihood model and Euclidean Distance-Based Ranking (EDR) were used. Finally, a GMPE logic tree was selected for this study, and, following the selection of candidate GMPEs, model weights were assigned using the average sample log-likelihood approach of Scherbaum.
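
As an illustration of one of the ranking metrics mentioned, the sketch below computes the average sample log-likelihood (LLH) score of Scherbaum et al. (2009) for two candidate GMPEs; lower scores indicate a better fit. The ground-motion values and sigmas are invented, and in practice the computation is carried out through the OpenQuake SMT rather than by hand.

```python
import numpy as np
from scipy.stats import norm

def llh_score(ln_observed, ln_predicted_mean, sigma):
    """Average negative log2-likelihood (Scherbaum et al., 2009); lower scores rank better.
    Inputs: natural-log observed ground motions, the GMPE's mean prediction, its total sigma."""
    pdf = norm.pdf(ln_observed, loc=ln_predicted_mean, scale=sigma)
    return -np.mean(np.log2(pdf))

# Illustrative values only: observed PGA (g) against two candidate GMPE predictions
ln_obs = np.log(np.array([0.12, 0.08, 0.30, 0.05, 0.21]))
gmpe_a = {"mean": np.log(np.array([0.10, 0.09, 0.25, 0.06, 0.18])), "sigma": 0.6}
gmpe_b = {"mean": np.log(np.array([0.20, 0.04, 0.45, 0.02, 0.35])), "sigma": 0.7}

for name, g in {"GMPE A": gmpe_a, "GMPE B": gmpe_b}.items():
    print(name, "LLH =", round(llh_score(ln_obs, g["mean"], g["sigma"]), 3))
```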

Keywords: residual analysis, GMPE, Western Balkans, strong motion, OpenQuake

Procedia PDF Downloads 73
13662 Personalized Infectious Disease Risk Prediction System: A Knowledge Model

Authors: Retno A. Vinarti, Lucy M. Hederman

Abstract:

This research describes a knowledge model for a system which gives personalized alerts to users about infectious disease risks in the context of weather, location and time. The knowledge model is based on established epidemiological concepts augmented by information gleaned from infection-related data repositories. Existing disease risk prediction research focuses more on utilizing raw historical data and yields seasonal patterns of infectious disease risk emergence. This research incorporates both data and epidemiological concepts gathered from the Atlas of Human Infectious Diseases (AHID) and the Centers for Disease Control and Prevention (CDC) as the basic reasoning for infectious disease risk prediction. Using the CommonKADS methodology, the disease risk prediction task is modelled as an assignment-type synthesis task, proceeding from knowledge identification through specification and refinement to implementation. First, knowledge is gathered from AHID, primarily from the epidemiology and risk group chapters for each infectious disease. The result of this stage is five major elements (Person, Infectious Disease, Weather, Location and Time) and their properties. At the knowledge specification stage, the initial tree model of each element and the detailed relationships are produced. This research also includes a validation step as part of knowledge refinement: on the basis that the best model is formed using the most common features, Frequency-based Selection (FBS) is applied. The portion of the infectious disease risk model relating to Person comes out strongest, with Location next and Weather weaker. For the Person attribute, Age is the strongest, Activity and Habits are moderate, and Blood Type is the weakest. For the Location attribute, the General category (e.g., continent, region, country, island) comes out much stronger than the Specific category (i.e., terrain features). For the Weather attribute, the Less Precise category (i.e., season) comes out stronger than the Precise category (i.e., exact temperature or humidity intervals). However, given that some infectious diseases are significantly more serious than others, a frequency-based metric may not be appropriate. Future work will incorporate epidemiological measurements of disease seriousness (e.g., odds ratio, hazard ratio and fatality rate) into the validation metrics. This research is limited to modelling existing knowledge about epidemiology and chain-of-infection concepts. A further step, verification in the knowledge refinement stage, might cause some minor changes to the shape of the tree.

Keywords: epidemiology, knowledge modelling, infectious disease, prediction, risk

Procedia PDF Downloads 237
13661 Verification of Satellite and Observation Measurements to Build Solar Energy Projects in North Africa

Authors: Samy A. Khalil, U. Ali Rahoma

Abstract:

Measurements of solar radiation and satellite data have been routinely utilized to estimate solar energy; however, the temporal coverage of satellite data has some limits. Reanalysis, also known as "retrospective analysis" of the atmosphere's parameters, is produced by fusing the output of NWP (Numerical Weather Prediction) models with observation data from a variety of sources, including ground, satellite, ship, and aircraft observations. The result is a comprehensive record of the parameters affecting weather and climate. The effectiveness of the ERA-5 reanalysis dataset for North Africa was evaluated against high-quality surface measurements using statistical analysis, estimating the distribution of global solar radiation (GSR) over five chosen areas in North Africa during the ten-year period from 2011 to 2020. To investigate seasonal changes in dataset performance, a seasonal statistical analysis was conducted, which showed a considerable difference in errors throughout the year. Altering the temporal resolution of the data used for comparison changes the performance of the dataset: monthly mean values indicate better performance, but data accuracy is degraded. Solar resource assessment and power estimation using the ERA-5 solar radiation data are discussed. The average values of the mean bias error (MBE), root mean square error (RMSE) and mean absolute error (MAE) of the reanalysis solar radiation data vary from 0.079 to 0.222, 0.055 to 0.178, and 0.0145 to 0.198, respectively, over the study period. The correlation coefficient (R²) varies from 0.93 to 0.99 over the study period. The objective of this research is to provide a reliable representation of solar radiation to aid the use of solar energy in all sectors.
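
For reference, the verification statistics quoted above can be computed as in the sketch below; the daily GSR values are illustrative placeholders, not the measured or ERA-5 data, and R² is taken here as the squared correlation coefficient.

```python
import numpy as np

def verification_stats(observed, reanalysis):
    """MBE, RMSE, MAE and R^2 between ground-measured and reanalysis global solar radiation."""
    obs, rea = np.asarray(observed, float), np.asarray(reanalysis, float)
    mbe = np.mean(rea - obs)
    rmse = np.sqrt(np.mean((rea - obs) ** 2))
    mae = np.mean(np.abs(rea - obs))
    r2 = np.corrcoef(obs, rea)[0, 1] ** 2
    return mbe, rmse, mae, r2

# Illustrative daily GSR values (kWh/m^2), not actual station or ERA-5 data
obs = np.array([5.1, 6.3, 4.8, 7.0, 6.1, 5.5])
era5 = np.array([5.3, 6.0, 5.0, 6.8, 6.4, 5.2])
print([round(v, 3) for v in verification_stats(obs, era5)])
```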

Keywords: solar energy, ERA-5 analysis data, global solar radiation, North Africa

Procedia PDF Downloads 93
13660 A Fuzzy Structural Equation Model for Development of a Safety Performance Index Assessment Tool in Construction Sites

Authors: Murat Gunduz, Mustafa Ozdemir

Abstract:

In this research, a framework is to be proposed to model safety performance on construction sites. Determinants of safety performance are to be defined through an extensive literature review, and a multidimensional safety performance model is to be developed. In this context, a questionnaire is to be administered to construction companies with active sites. The questionnaire data, which include linguistic terms, are then to be defuzzified into crisp numbers using fuzzy set theory, which provides strong and significant instruments for measuring ambiguity and the opportunity to meaningfully represent concepts expressed in natural language. The validity of the proposed safety performance model and the relationships between the determinants of safety performance are to be analyzed using structural equation modeling (SEM), a powerful multivariate analysis technique that makes the evaluation of latent structures possible. After validation of the model, a software-based safety performance index assessment tool is to be proposed, built on the empirically validated theoretical model.
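
As an illustration of the defuzzification step, the sketch below maps linguistic questionnaire answers to triangular fuzzy numbers and takes their centroids; the linguistic scale and its fuzzy numbers are assumptions, not the study's instrument.

```python
# Triangular fuzzy numbers (l, m, u) for a five-point linguistic scale (assumed values).
LIKERT_TFN = {
    "very low":  (0.0, 0.0, 0.25),
    "low":       (0.0, 0.25, 0.5),
    "medium":    (0.25, 0.5, 0.75),
    "high":      (0.5, 0.75, 1.0),
    "very high": (0.75, 1.0, 1.0),
}

def defuzzify(label):
    l, m, u = LIKERT_TFN[label]
    return (l + m + u) / 3          # centroid of a triangular fuzzy number

answers = ["high", "very high", "medium", "high"]   # one respondent's linguistic answers
crisp = [round(defuzzify(a), 3) for a in answers]
print("crisp scores fed into the SEM analysis:", crisp)
```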

Keywords: fuzzy set theory, safety performance assessment, safety index, structural equation modeling (SEM), construction sites

Procedia PDF Downloads 515
13659 Effects of Employees’ Training Program on the Performance of Small Scale Enterprises in Oyo State

Authors: Itiola Kehinde Adeniran

Abstract:

The study examined the effect of employees' training on the performance of small scale enterprises in Oyo State. A structured questionnaire was used to collect data from 150 respondents selected through a purposive sampling method. Linear regression was used, with the aid of the Statistical Package for the Social Sciences (SPSS) version 20, to analyze the data collected and examine the effect of the independent variable, employees' training, on the dependent variable, the performance (profit) of small scale enterprises. The result revealed that employees' training has a significant effect on the performance of small scale enterprises, and it was concluded that the predictor variable (training) explains 55.5% of the variance in enterprise performance (profitability). Therefore, the paper recommends that all small scale enterprises in Nigeria embrace manpower training and development in order to improve employees' performance, leading to organizational profitability.

Keywords: training, employee performance, small scale enterprise, organizational profitability

Procedia PDF Downloads 379
13658 A Strategic Performance Control System for Municipal Organization

Authors: Emin Gundogar, Aysegul Yilmaz

Abstract:

Strategic performance control is a significant procedure in management, and there are various methods to improve it. This study introduces an information system developed to score performance for municipal management. The application of the system is clarified with examples of municipal processes.

Keywords: management information system, municipal management, performance control

Procedia PDF Downloads 467
13657 Surface Roughness Prediction Using Numerical Scheme and Adaptive Control

Authors: Michael K.O. Ayomoh, Khaled A. Abou-El-Hossein, Sameh F.M. Ghobashy

Abstract:

This paper proposes a numerical modelling scheme for surface roughness prediction. The approach is premised on a 3D difference analysis method enhanced with a feedback control loop in which a set of adaptive weights is generated. The surface roughness values utilized in this paper were adapted from [1], whose experiments were carried out on S55C high carbon steel; a comparison was further carried out between the proposed technique and those utilized in [1]. The experimental design has three cutting parameters, namely depth of cut, feed rate and cutting speed, with a sample space of twenty-seven experiments. The simulation trials conducted using MATLAB software fall into two sub-classes: first, prediction of the surface roughness readings for the non-boundary cutting combinations (NBCC) with the aid of the known surface roughness readings of the boundary cutting combinations (BCC); second, use of the predicted NBCC outputs to recover the surface roughness readings for the BCC. The simulation trial for the NBCC attained a state of total stability in the 7th iteration, i.e., a point where the actual and desired roughness readings are equal and the error is driven to zero by the set of dynamic weights generated in each subsequent simulation trial. A comparative study among the three methods showed that the proposed difference analysis technique with adaptive weights from feedback control produced a more accurate output than the abductive and regression analysis techniques presented in [1].
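
The exact update rule is not given in the abstract; the sketch below illustrates the feedback idea in its simplest form, where a prediction formed as a weighted combination of known boundary readings is corrected iteratively until the error vanishes. The roughness values, gain and update rule are illustrative assumptions, not the authors' 3D difference analysis scheme.

```python
import numpy as np

boundary_Ra = np.array([1.8, 2.4, 3.1])     # known boundary roughness readings (um), illustrative
target_Ra = 2.6                              # reading to be recovered (um), illustrative

weights = np.full(3, 1 / 3)                  # initial equal weights
gain = 0.1                                   # feedback gain

for step in range(200):
    predicted = weights @ boundary_Ra        # weighted combination of the known readings
    error = target_Ra - predicted
    if abs(error) < 1e-6:                    # error driven to (numerically) zero
        break
    # adapt the weights from the prediction error (a simple gradient-style correction)
    weights += gain * error * boundary_Ra / np.sum(boundary_Ra ** 2)

print(f"converged after {step} iterations, predicted Ra = {weights @ boundary_Ra:.4f} um")
```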

Keywords: difference analysis, surface roughness, mesh analysis, feedback control, adaptive weight, boundary element

Procedia PDF Downloads 615
13656 The Design of a Vehicle Traffic Flow Prediction Model for a Gauteng Freeway Based on an Ensemble of Multi-Layer Perceptron

Authors: Tebogo Emma Makaba, Barnabas Ndlovu Gatsheni

Abstract:

The cities of Johannesburg and Pretoria, both located in the Gauteng province, are separated by a distance of 58 km. The traffic queues on the Ben Schoeman freeway, which connects these two cities, can stretch for almost 1.5 km. Vehicle traffic congestion impacts negatively on business and on commuters' quality of life. The goal of this paper is to identify variables that influence the flow of traffic and to design a vehicle traffic prediction model which will predict the traffic flow pattern in advance. The model will enable motorists to make appropriate travel decisions ahead of time. The data used were collected by Mikro's Traffic Monitoring (MTM). A Multi-Layer Perceptron (MLP) was used on its own to construct the model, and the MLP was also combined with the Bagging ensemble method to train on the data. The cross-validation method was used for evaluating the models, and the results obtained from the techniques were compared using predictive and prediction costs. The cost was computed using a combination of the loss matrix and the confusion matrix. The prediction models designed show that the status of the traffic flow on the freeway can be predicted using the following parameters: travel time, average speed, traffic volume and day of month. The implication of this work is that commuters will be able to spend less time travelling on the route and more time with their families, and the logistics industry will save more than twice what it is currently spending.
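
A minimal sketch of the modelling step described above, wrapping a multi-layer perceptron in a bagging ensemble and comparing both with cross-validation; the synthetic data stand in for the MTM freeway measurements, and the network size and number of bagged estimators are assumptions.

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import BaggingClassifier
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import cross_val_score

# Synthetic stand-in for the freeway data: four features playing the role of travel time,
# average speed, traffic volume and day of month; the class is the traffic-flow status.
X, y = make_classification(n_samples=600, n_features=4, n_informative=4, n_redundant=0,
                           n_classes=3, n_clusters_per_class=1, random_state=0)

mlp = MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000, random_state=0)
bagged_mlp = BaggingClassifier(mlp, n_estimators=10, random_state=0)  # MLP as base estimator

print("MLP alone :", cross_val_score(mlp, X, y, cv=5).mean())
print("Bagged MLP:", cross_val_score(bagged_mlp, X, y, cv=5).mean())
```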

Keywords: bagging ensemble methods, confusion matrix, multi-layer perceptron, vehicle traffic flow

Procedia PDF Downloads 337
13655 Springback Prediction for Sheet Metal Cold Stamping Using Convolutional Neural Networks

Authors: Lei Zhu, Nan Li

Abstract:

Cold stamping has been widely applied in the automotive industry for the mass production of a great range of automotive panels. Predicting the springback to ensure the dimensional accuracy of the cold-stamped components is a critical step. The main approaches for the prediction and compensation of springback in cold stamping include running Finite Element (FE) simulations and conducting experiments, which require forming-process expertise and can be time-consuming and expensive for the design of cold stamping tools. Machine learning technologies have been proven and successfully applied in learning complex system behaviours from representative samples, and they exhibit promising potential as supporting design tools for metal forming technologies. This study, for the first time, presents a novel application of a Convolutional Neural Network (CNN) based surrogate model to predict the springback fields for variable U-shape cold bending geometries. A dataset is created based on the U-shape cold bending geometries and the corresponding FE simulation results, and it is then used to train the CNN surrogate model. The results show that the surrogate model can achieve near-indistinguishable full-field predictions in real time when compared with the FE simulation results. The application of CNNs in efficient springback prediction can be adopted in industrial settings to aid both conceptual and final component design for designers without manufacturing knowledge.
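
A minimal sketch of a fully convolutional surrogate of the kind described, mapping a 2-D description of the bending geometry to a full-field springback prediction; the grid size, layer widths and random training pairs are placeholders for the FE-generated dataset, not the authors' architecture.

```python
import numpy as np
from tensorflow import keras
from tensorflow.keras import layers

# Hypothetical training pairs: a 2-D field describing the U-shape tool geometry as input and
# the FE-simulated springback displacement field as the target (both 64x64 grids here).
rng = np.random.default_rng(0)
geometry = rng.normal(size=(100, 64, 64, 1))
springback_field = rng.normal(size=(100, 64, 64, 1))

# A small fully convolutional network keeps the spatial dimensions, so the output is a
# full-field springback prediction rather than a single scalar.
model = keras.Sequential([
    layers.Input(shape=(64, 64, 1)),
    layers.Conv2D(16, 3, padding="same", activation="relu"),
    layers.Conv2D(32, 3, padding="same", activation="relu"),
    layers.Conv2D(16, 3, padding="same", activation="relu"),
    layers.Conv2D(1, 1, padding="same"),
])
model.compile(optimizer="adam", loss="mse")
model.fit(geometry, springback_field, epochs=2, batch_size=16, verbose=0)
print("predicted field shape:", model.predict(geometry[:1], verbose=0).shape)
```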

Keywords: springback, cold stamping, convolutional neural networks, machine learning

Procedia PDF Downloads 139
13654 Developing an HSE-Financial Indicator Model in the Oil Industry

Authors: Reza Safari, Ali Rajabzadeh Ghatari, Raheleh Hossseinzadeh Mahabadi

Abstract:

In the present world, there are different pressures on firms, such as competition, legislation and social pressures. These pressures force firms to pursue "survival" as their primary goal and then growth. One of the main factors that helps firms reach their goals is proper financial performance, which a firm should therefore monitor. Financial performance is affected by many factors. This research seeks to clarify which financial performance indicators are most important given the environmental situation of a firm, and what their priorities are. To do so, environmental indicators were specified as presented in the OECD Key Environmental Indicators 2008, along with financial performance indicators such as profitability, liquidity, gearing and investor ratios. At this stage, the influences were elicited through questionnaires. The resulting data were analyzed using the PROMETHEE technique, and an expert system was designed using the decision matrices extracted from that analysis. This expert system suggests the suitable financial performance indicators and their ranking, given the environmental situation expressed through the environmental indicator weights.
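
As an illustration of the PROMETHEE step, the sketch below ranks a few financial performance indicators with PROMETHEE II net outranking flows under environmental criteria; the indicator names, scores, weights and preference function are invented for illustration only.

```python
import numpy as np

indicators = ["profitability", "liquidity", "gearing", "investor ratios"]
# Rows = indicators (alternatives), columns = environmental criteria scores (higher is better)
scores = np.array([[7, 5, 6],
                   [6, 6, 4],
                   [4, 7, 5],
                   [5, 4, 7]], dtype=float)
weights = np.array([0.5, 0.3, 0.2])          # assumed environmental indicator weights

def preference(d):
    return (d > 0).astype(float)             # usual criterion: any positive difference is preferred

n = len(indicators)
pi = np.zeros((n, n))                        # aggregated pairwise preference indices
for a in range(n):
    for b in range(n):
        if a != b:
            pi[a, b] = np.sum(weights * preference(scores[a] - scores[b]))

phi_plus = pi.sum(axis=1) / (n - 1)          # positive outranking flow
phi_minus = pi.sum(axis=0) / (n - 1)         # negative outranking flow
net_flow = phi_plus - phi_minus              # PROMETHEE II net flow used for ranking

for name, phi in sorted(zip(indicators, net_flow), key=lambda t: -t[1]):
    print(f"{name:15s} net flow = {phi:+.3f}")
```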

Keywords: environment indicators, financial performance indicators, promethee, expert system

Procedia PDF Downloads 432
13653 Development of Imprinting and Replica Molding of Soft Mold Curved Surface

Authors: Yung-Jin Weng, Chia-Chi Chang, Chun-Yu Tsai

Abstract:

This paper focuses on the imprinting and replica molding of quasi-grey-scale curved-surface microstructure soft molds. A magnetic photocuring forming system is first developed and built independently, and the magnetic curved-surface microstructure soft mold is then created. The magnetic performance of the magnetic curved surface at different heights is tested and recorded, and, through experimentation and simulation, the magnetic curved-surface microstructure soft mold is used in the study of quasi-grey-scale curved-surface microstructure imprinting and replica molding. The experimental results show that, under different surface curvatures and voltage control conditions, different quasi-grey-scale array microstructures take shape. In addition, this paper investigates the imprinting and replica molding of photoresist compounded with magnetic powder in order to discuss the forming performance of magnetic photoresist; finally, the experimental results are compared with the simulation to obtain more accurate predictions. This research is expected to provide a microstructure component preparation technology with heterogeneity and controllability, and it constitutes a valid manufacturing method for shaping quasi-grey-scale microstructures.

Keywords: soft mold, magnetic, microstructure, curved surface

Procedia PDF Downloads 320