Search results for: linear statistical model
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 21278

20768 Optimization of Bills Assignment to Different Skill-Levels of Data Entry Operators in a Business Process Outsourcing Industry

Authors: M. S. Maglasang, S. O. Palacio, L. P. Ogdoc

Abstract:

Business Process Outsourcing (BPO) has been one of the fastest-growing and emerging industries in the Philippines today. Unlike most contact service centers, more popularly known as "call centers", the primary outsourced service here is performing audits of the global clients' logistics. As a service industry, manpower is considered the most important yet most expensive resource in the company. Because of this, there is a need to maximize human resources so that people are utilized effectively and efficiently. The main purpose of the study is to optimize the current manpower through effective distribution and assignment of the different types of bills to the different skill levels of data entry operators. The assignment model parameters include the average observed time matrix gathered through a time study, which incorporates the learning curve concept. Subsequently, a simulation model was built to replicate the arrival rate of demand, including the different batches and types of bills per day. Next, a mathematical linear programming model was formulated; its objective is to minimize the direct labor cost per bill by allocating the different types of bills to the different skill levels of operators. Finally, a hypothesis test was done to validate the model by comparing the actual and simulated results. The analysis of results revealed low utilization of effective capacity because of the failure to treat the product mix, skill mix, and simulated demand as model parameters. Moreover, failure to consider the effects of the learning curve leads to overestimation of labor needs. From the current 107 operators, the proposed model gives a result of 79 operators, an increase in the utilization of effective capacity of 14.94%. It is recommended that the excess 28 operators be reallocated to other areas of the department. Finally, a manpower capacity planning model is also recommended to support management's decisions on what to do when the current capacity reaches its limit under the expected increase in demand.
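
A minimal sketch of the kind of assignment LP described, assuming invented time, wage, demand, and capacity figures (the study's own parameters came from its time study): x[i, j] is the number of bills of type j handled by skill level i, and the objective is the daily direct labor cost.

```python
# A toy bill-assignment LP sketch; all figures are assumed for illustration.
# x[i, j] = number of bills of type j assigned to skill level i.
import numpy as np
from scipy.optimize import linprog

time_per_bill = np.array([[4.0, 6.0],   # minutes per bill, senior operators
                          [5.5, 9.0]])  # minutes per bill, junior operators
wage_per_min = np.array([0.50, 0.30])   # direct labor cost rate per level
demand = np.array([300, 120])           # bills of each type arriving per day
capacity = np.array([960, 1440])        # operator-minutes available per level

n_levels, n_types = time_per_bill.shape
c = (wage_per_min[:, None] * time_per_bill).ravel()   # cost per assigned bill

# Demand: for each bill type j, sum_i x[i, j] == demand[j].
A_eq = np.zeros((n_types, n_levels * n_types))
for j in range(n_types):
    A_eq[j, j::n_types] = 1.0
# Capacity: for each level i, sum_j time[i, j] * x[i, j] <= capacity[i].
A_ub = np.zeros((n_levels, n_levels * n_types))
for i in range(n_levels):
    A_ub[i, i * n_types:(i + 1) * n_types] = time_per_bill[i]

res = linprog(c, A_ub=A_ub, b_ub=capacity, A_eq=A_eq, b_eq=demand,
              bounds=(0, None), method="highs")
print(res.x.reshape(n_levels, n_types), res.fun)  # assignment and daily cost
```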

Keywords: optimization modelling, linear programming, simulation, time and motion study, capacity planning

Procedia PDF Downloads 501
20767 The Per Capita Income, Energy Production, and Environmental Degradation: A Comprehensive Assessment of the Existence of the Environmental Kuznets Curve Hypothesis in Bangladesh

Authors: Ashique Mahmud, MD. Ataul Gani Osmani, Shoria Sharmin

Abstract:

In the first quarter of the twenty-first century, the most substantial global concern is environmental contamination, and it has gained the prioritization of both the national and international community. Keeping this crucial fact in mind, this study applied different statistical and econometric methods to identify whether the gross national income of the country has a significant impact on electricity production from nonrenewable sources and on different air pollutants such as carbon dioxide, nitrous oxide, and methane emissions. In addition, the primary objective of this research was to analyze whether the environmental Kuznets curve hypothesis holds for the examined variables. After analyzing different statistical properties of the variables, this study came to the conclusion that the environmental Kuznets curve hypothesis holds for gross national income and carbon dioxide emissions in Bangladesh in both the short run and the long run. This conclusion is based on the findings of ordinary least squares estimations, ARDL bounds tests, short-run causality analysis, the error correction model, and other pre-diagnostic and post-diagnostic tests employed in the structural model. Moreover, this study argues that the trajectory of gross national income and carbon dioxide emissions is in its initial stage of development and will increase up to an optimal peak. The compositional effect will then force emissions to decrease, and environmental quality will be restored in the long run.
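
As an illustration of the EKC's reduced functional form (not the authors' data or full ARDL analysis), one can regress emissions on income and squared income; an inverted U requires a positive linear and a negative quadratic coefficient. A sketch on synthetic series:

```python
# EKC reduced form on synthetic data: CO2 ~ b0 + b1*GNI + b2*GNI^2.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
gni = np.linspace(400, 2600, 48)                      # per-capita GNI series
co2 = 0.9 * gni - 1.6e-4 * gni**2 + rng.normal(0, 30, gni.size)

X = sm.add_constant(np.column_stack([gni, gni**2]))
fit = sm.OLS(co2, X).fit()
_, b1, b2 = fit.params
if b1 > 0 and b2 < 0:                                 # inverted-U signature
    print("EKC turning point at GNI =", -b1 / (2 * b2))
```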

Keywords: environmental Kuznets curve hypothesis, carbon dioxide emission in Bangladesh, gross national income in Bangladesh, autoregressive distributed lag model, Granger causality, error correction model

Procedia PDF Downloads 135
20766 Simulation of the Gamma-Ray Attenuation Coefficient for Some Common Shielding Materials Using a Monte Carlo Program

Authors: Cherief Houria, Fouka Mourad

Abstract:

In this work, the simulation of radiation attenuation in a photon detector is carried out for different common shielding materials using a Monte Carlo program called PTM. The aim of the study is to investigate the effect of atomic weight and shielding-material thickness on the ability to attenuate gamma radiation. The linear attenuation coefficients of aluminum (Al), iron (Fe), and lead (Pb) were evaluated at a photon energy of 661.7 keV, considered to be emitted from a standard Cs-137 radioactive point source. Experimental measurements were performed for the three materials to obtain these linear attenuation coefficients, using a NaI(Tl) gamma scintillation detector. Our results have been compared with simulated linear attenuation coefficients from the XCOM database and the Geant4 code, and they agree well with both sets of simulation data.
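
The Beer–Lambert relation I/I0 = exp(-μx) underlies such simulations. A toy Monte Carlo in the same spirit (not the PTM program itself), with assumed μ values at 661.7 keV, samples exponential free paths and compares against the analytic transmission:

```python
# Toy narrow-beam Monte Carlo: photons survive a slab of thickness x if their
# exponentially distributed free path exceeds x; compare to exp(-mu * x).
import numpy as np

rng = np.random.default_rng(42)
mu = {"Al": 0.20, "Fe": 0.57, "Pb": 1.25}   # assumed mu (cm^-1) at 661.7 keV
thickness_cm = 2.0
n_photons = 1_000_000

for material, coeff in mu.items():
    paths = rng.exponential(scale=1.0 / coeff, size=n_photons)
    mc = np.count_nonzero(paths > thickness_cm) / n_photons
    print(f"{material}: MC {mc:.4f}  analytic {np.exp(-coeff * thickness_cm):.4f}")
```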

Keywords: gamma photon, Monte Carlo program, radiation attenuation, shielding material, linear attenuation coefficient

Procedia PDF Downloads 190
20765 A Fully Automated New-Fangled VESTAL to Label Vertebrae and Intervertebral Discs

Authors: R. Srinivas, K. V. Ramana

Abstract:

This paper presents a novel method called VESTAL to label vertebrae and intervertebral discs. Each vertebra has certain statistical feature properties. To label vertebrae and discs, a new equation modeling the path of the spinal cord is derived using statistical properties of the spinal canal, and VESTAL uses this equation for labeling. For each vertebra and intervertebral disc, both the posterior and interior width and height are measured. The calculated values are compared with real values measured using vernier calipers, and the comparison produced accurate results with 95% efficiency. VESTAL was applied to 350 MR images from 50 patients and obtained 100% accuracy in labeling.
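
The paper's exact path equation is not reproduced here; purely as a hedged stand-in, a low-order polynomial fitted to per-slice spinal-canal centroids illustrates the general idea of modeling the cord's path from statistical properties of the canal:

```python
# Hypothetical (z, y) centroids of the spinal canal, one per axial slice,
# fitted with a cubic to approximate the cord's path.
import numpy as np

z = np.array([0, 10, 20, 30, 40, 50, 60], dtype=float)
y = np.array([102, 100, 97, 96, 98, 103, 110], dtype=float)

path = np.poly1d(np.polyfit(z, y, deg=3))   # cubic path through the canal
print(path(35.0))                           # interpolated canal position
```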

Keywords: spine, vertebrae, intervertebral disc, labeling, statistics, texture, disc

Procedia PDF Downloads 354
20764 Solving Fuzzy Multi-Objective Linear Programming Problems with Fuzzy Decision Variables

Authors: Mahnaz Hosseinzadeh, Aliyeh Kazemi

Abstract:

In this paper, a method is proposed for solving fuzzy multi-objective linear programming problems (FMOLPP) with a fuzzy right-hand side and fuzzy decision variables. To illustrate the proposed method, it is applied to the problem of selecting suppliers for an automotive parts producer in Iran, in order to find the optimal order quantities allocated to each supplier considering the conflicting objectives. Finally, the obtained results are discussed.
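
A minimal sketch of the triangular fuzzy numbers involved, with centroid defuzzification as one common ranking choice (the paper's specific fuzzy ranking may differ); all data are invented:

```python
# Triangular fuzzy number with centroid defuzzification (one ranking choice).
from dataclasses import dataclass

@dataclass
class TriangularFuzzy:
    low: float    # smallest plausible value
    mode: float   # most plausible value
    high: float   # largest plausible value

    def centroid(self) -> float:
        # Centroid of a triangular membership function.
        return (self.low + self.mode + self.high) / 3.0

demand = TriangularFuzzy(90, 100, 120)   # fuzzy right-hand side (invented)
order_a = TriangularFuzzy(40, 50, 55)    # fuzzy decision variable (invented)
print(demand.centroid(), order_a.centroid())
```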

Keywords: fuzzy multi-objective linear programming problems, triangular fuzzy numbers, fuzzy ranking, supplier selection problem

Procedia PDF Downloads 369
20763 Low Power, Highly Linear, Wideband LNA in Wireless SOC

Authors: Amir Mahdavi

Abstract:

In this paper, a highly linear CMOS low noise amplifier (LNA) for ultra-wideband (UWB) applications is proposed. The proposed LNA uses a linearization technique to improve the second- and third-order intercept points (IIP2 and IIP3). The linearity is improved by removing the common-mode part of all intermodulation components from the cascade topology current, with the biasing current optimized using symmetrical and asymmetrical biasing circuits. Simulation results show that the maximum gain and noise figure are 6.9 dB and 3.03–4.1 dB, respectively, over the 3.1–10.6 GHz band. The power consumption of the LNA core and the IIP3 are 2.64 mW and +4.9 dBm, respectively. Wideband input impedance matching is obtained by employing a degenerating inductor (|S11| < -9.1 dB). The proposed UWB LNA is implemented in 0.18 μm CMOS technology.

Keywords: highly linear LNA, low-power LNA, optimal bias techniques

Procedia PDF Downloads 271
20762 Effect of Variable Fluxes on Optimal Flux Distribution in a Metabolic Network

Authors: Ehsan Motamedian

Abstract:

Finding all optimal flux distributions of a metabolic model is an important challenge in systems biology. In this paper, a new algorithm is introduced to identify all alternate optimal solutions of a large-scale metabolic network. The algorithm reduces the model to decrease the computations needed for finding optimal solutions. The algorithm was implemented on the Escherichia coli metabolic model to find all optimal solutions for lactate and acetate production. There were more optimal flux distributions when acetate production was optimized. The model was reduced from 1076 to 80 variable fluxes for lactate, while it was reduced to 91 variable fluxes for acetate. These 11 additional variable fluxes resulted in about three times more optimal flux distributions. The variable fluxes came from 12 different metabolic pathways, and most of them belonged to the nucleotide salvage and extracellular transport pathways.
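
The notion of "variable fluxes" can be illustrated with flux variability analysis on a toy network: fix the optimal objective value, then minimize and maximize each flux; fluxes whose range stays open are the variable ones. This sketch uses an invented four-reaction stoichiometry, not the E. coli model:

```python
# Toy flux variability analysis: v1 makes A; v2 and v3 both convert A -> B;
# v4 exports B. At the optimum of v4, the split between v2 and v3 stays free.
import numpy as np
from scipy.optimize import linprog

S = np.array([[1, -1, -1, 0],            # metabolite A balance (S v = 0)
              [0, 1, 1, -1]])            # metabolite B balance
bounds = [(0, 10)] * 4
c_obj = np.array([0, 0, 0, -1.0])        # maximize v4 == minimize -v4

opt = linprog(c_obj, A_eq=S, b_eq=np.zeros(2), bounds=bounds, method="highs")
variable = []
for j in range(4):
    e = np.zeros(4)
    e[j] = 1.0
    fix = (np.vstack([S, c_obj]), np.append(np.zeros(2), opt.fun))
    lo = linprog(e, A_eq=fix[0], b_eq=fix[1], bounds=bounds, method="highs").fun
    hi = -linprog(-e, A_eq=fix[0], b_eq=fix[1], bounds=bounds, method="highs").fun
    if hi - lo > 1e-9:
        variable.append(j)
print("variable fluxes:", variable)      # -> [1, 2] (the parallel A->B pair)
```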

Keywords: flux variability, metabolic network, mixed-integer linear programming, multiple optimal solutions

Procedia PDF Downloads 422
20761 Solving Linear Systems Involved in Convex Programming Problems

Authors: Yixun Shi

Abstract:

Many interior point methods for convex programming solve an (n+m)×(n+m) linear system in each iteration. Many implementations solve this system by considering an equivalent m×m system (4), as listed in the paper, so the job is reduced to solving system (4). However, system (4) has to be solved exactly, since otherwise the error would be passed entirely onto the last m equations of the original system. Often the Cholesky factorization is computed to obtain the exact solution of (4). One Cholesky factorization has to be done in every iteration, resulting in high computational costs. In this paper, two iterative methods for solving linear systems using vector division are combined and embedded into interior point methods. Instead of computing one Cholesky factorization in each iteration, the method requires only one Cholesky factorization in the entire procedure, thus significantly reducing the amount of computation needed for solving the problem. Based on this, a hybrid algorithm for solving convex programming problems is proposed.
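
The cost contrast the abstract draws can be seen with SciPy's Cholesky helpers: factor once, then reuse the factor across iterations. This is a simplified sketch (in a real interior point method the matrix itself changes between iterations, which is exactly what the paper's vector-division iterations are meant to handle):

```python
# Factor once, solve many: the saving the paper targets, in miniature.
import numpy as np
from scipy.linalg import cho_factor, cho_solve

rng = np.random.default_rng(1)
M = rng.normal(size=(200, 200))
A = M @ M.T + 200 * np.eye(200)     # symmetric positive definite m x m

factor = cho_factor(A)              # one Cholesky factorization
for k in range(5):                  # reused across subsequent iterations
    b_k = rng.normal(size=200)      # iteration-dependent right-hand side
    x_k = cho_solve(factor, b_k)
    print(np.linalg.norm(A @ x_k - b_k))
```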

Keywords: convex programming, interior point method, linear systems, vector division

Procedia PDF Downloads 391
20760 Numerical Simulation of Kangimi Reservoir Sedimentation, Kaduna State, Nigeria

Authors: Abdurrasheed Sa'id, Abubakar Isma'il, Waheed Alayande

Abstract:

This study focused on carrying out numerical simulations of Kangimi reservoir sedimentation. Different numerical sediment transport models were reviewed, and GSTARS3 was selected. The model was developed using the 1977 data and calibrated by simulating the 2012 profile and sediment deposition, with the results compared against the 2012 hydrographic survey of the National Water Resources Institute (NWRI). The model was validated by simulating the 2016 deposition and comparing the results with NWRI estimates. The performance of the proposed model was also tested using statistical parameters such as the mean square error (MSE), mean absolute percentage error (MAPE), and coefficient of determination (R²), with values of 1.32 m, 0.17%, and 0.914, respectively, which show strong agreement. After calibration, validation, and performance testing, the model was used to simulate the 2032 and 2062 profiles and deposition. The results showed that by 2032 the reservoir will be silted by 25.34 MCM, or 43.3% of the design capacity, and by 60.7% of the capacity by the year 2062. A number of sedimentation mitigation measures were recommended.
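
For reference, the three fit statistics quoted above are computed as follows, here on invented observed/simulated profile values rather than the NWRI survey data:

```python
# MSE, MAPE, and R^2 between observed and simulated values (invented data).
import numpy as np

obs = np.array([12.1, 11.4, 10.9, 10.2, 9.8])   # observed bed elevations (m)
sim = np.array([12.3, 11.2, 10.8, 10.5, 9.6])   # simulated bed elevations (m)

mse = np.mean((obs - sim) ** 2)
mape = np.mean(np.abs((obs - sim) / obs)) * 100.0
r2 = 1.0 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2)
print(f"MSE={mse:.3f}, MAPE={mape:.2f}%, R^2={r2:.3f}")
```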

Keywords: NWRI (National Water Resources Institute), sedimentation, GSTARS3, model

Procedia PDF Downloads 210
20759 Real-Time Classification of Hemodynamic Response by Functional Near-Infrared Spectroscopy Using an Adaptive Estimation of General Linear Model Coefficients

Authors: Sahar Jahani, Meryem Ayse Yucel, David Boas, Seyed Kamaledin Setarehdan

Abstract:

Near-infrared spectroscopy allows monitoring of oxy- and deoxy-hemoglobin concentration changes associated with the hemodynamic response function (HRF). The HRF is usually affected by natural physiological hemodynamics (systemic interference), which occurs in all body tissues, including brain tissue; this makes HRF extraction a very challenging task. In this study, we used a Kalman filter based on a general linear model (GLM) of brain activity to determine the proportion of systemic interference in the brain hemodynamics. The performance of the proposed algorithm is evaluated in terms of the peak-to-peak error (Ep), mean square error (MSE), and Pearson's correlation coefficient (R²) between the estimated and simulated hemodynamic responses. The technique is also capable of real-time estimation of single-trial functional activations, as it was applied to classify finger tapping versus the resting state. An average real-time classification accuracy of 74% over 11 subjects demonstrates the feasibility of developing an effective functional near-infrared spectroscopy based brain-computer interface (fNIRS-BCI).
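
A hedged sketch of adaptive GLM coefficient estimation with a Kalman filter under a random-walk state model, in the spirit of the approach described; the design matrix and noise levels are illustrative, not the study's fNIRS regressors:

```python
# Random-walk Kalman filter tracking GLM coefficients (illustrative setup).
import numpy as np

rng = np.random.default_rng(0)
n, p = 500, 3
X = rng.normal(size=(n, p))          # GLM regressors (e.g., HRF + drift terms)
beta_true = np.array([1.0, -0.5, 0.2])
y = X @ beta_true + rng.normal(0, 0.1, n)

beta = np.zeros(p)                   # state estimate of the GLM coefficients
P = np.eye(p)                        # state covariance
Q = 1e-6 * np.eye(p)                 # random-walk process noise (assumed)
r = 0.1 ** 2                         # measurement noise variance (assumed)

for t in range(n):
    P = P + Q                        # predict step
    h = X[t]
    k = P @ h / (h @ P @ h + r)      # Kalman gain
    beta = beta + k * (y[t] - h @ beta)
    P = P - np.outer(k, h) @ P
print(beta)                          # should approach beta_true
```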

Keywords: hemodynamic response function, functional near-infrared spectroscopy, adaptive filter, Kalman filter

Procedia PDF Downloads 142
20758 Statistical Modelling of Maximum Temperature in Rwanda Using Extreme Value Analysis

Authors: Emmanuel Iyamuremye, Edouard Singirankabo, Alexis Habineza, Yunvirusaba Nelson

Abstract:

Temperature is one of the most important climatic factors for crop production. However, extreme temperatures cause droughts, heat spells, and cold spells that have various consequences for human life, agriculture, and the environment in general. It is necessary to provide reliable information related to such incidents and the probability of such extreme events occurring. In the 21st century, the world faces a huge number of threats, especially from climate change due to global warming and environmental degradation. The rise in temperature has a direct effect on the decrease in rainfall, which in turn affects crop growth and development and decreases crop yield and quality. Countries that are heavily dependent on agriculture tend to suffer the most and need to take preventive steps to overcome these challenges. The main objective of this study is to model the statistical behaviour of extreme maximum temperature values in Rwanda. To achieve this objective, daily temperature data spanning the period from January 2000 to December 2017, recorded at nine weather stations and collected from the Rwanda Meteorological Agency, were used. Two methods, namely the block maxima (BM) method and the peaks-over-threshold (POT) method, were applied to model and analyse extreme temperatures. Model parameters were estimated, and extreme temperature return periods and confidence intervals were predicted. The model fit suggests the Gumbel and Beta distributions to be the most appropriate models for the annual maxima of daily temperature. The results show that the temperature will continue to increase, as shown by the estimated return levels.
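
The block maxima step can be sketched with SciPy: fit a generalized extreme value (GEV) distribution to annual maxima and read off return levels as upper quantiles. The data below are synthetic, not the Rwandan station records:

```python
# Fit a GEV to annual maxima and compute return levels (synthetic data).
import numpy as np
from scipy.stats import genextreme

rng = np.random.default_rng(7)
annual_max = 30 + 2.5 * rng.gumbel(size=18)   # 18 years of annual maxima (°C)

shape, loc, scale = genextreme.fit(annual_max)
for T in (10, 25, 50):                        # return periods in years
    level = genextreme.ppf(1 - 1 / T, shape, loc=loc, scale=scale)
    print(f"{T}-year return level: {level:.1f} °C")
```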

Keywords: climate change, global warming, extreme value theory, Rwanda, temperature, generalised extreme value distribution, generalised Pareto distribution

Procedia PDF Downloads 162
20757 The Impact of Model Specification Decisions on Teacher Value-Added Effectiveness: Choosing the Correct Predictors

Authors: Ismail Aslantas

Abstract:

Value-added models (VAMs), statistical methods for evaluating the effectiveness of teachers and schools based on student achievement growth, have attracted decision-makers' and researchers' attention over the last decades. As a result of this attention, many studies have been conducted in recent years to discuss these statistical models from different aspects. This research focused on the importance of contextual variables in VAM estimation; it was therefore undertaken to examine the extent to which value-added effectiveness estimates for teachers can be affected by the use of contextual predictors. Using longitudinal data over three years from an international school context, value-added teacher effectiveness was estimated by ordinary least squares value-added models, and the effectiveness of the teachers was examined. The longitudinal dataset in this study consisted of three major sources: students' attainment scores over up to three years and their characteristics, teacher background information, and school characteristics. A total of 1,027 teachers and their 35,355 eighth-grade students were examined to understand the impact of model specification on value-added teacher effectiveness evaluation. Models were built with a stepwise selection method, adding a predictor at each step, then removing it and adding another on a subsequent step, and changes in model fit were evaluated by reviewing changes in R² values. Cohen's effect size statistics were also employed to determine the degree of the relationship between teacher characteristics and their effectiveness. Overall, the results indicated that the prior attainment score is the most powerful predictor of the current attainment score: 47.1 percent of the variation in grade 8 math scores can be explained by the prior attainment score in grade 7. The research findings raise issues to be considered in VAM implementations for teacher evaluations and offer suggestions to researchers and practitioners.
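
A toy version of the specification comparison described, on synthetic data: fit OLS value-added models with successively richer predictor sets and compare the R² values:

```python
# Compare R^2 of value-added models as predictors enter (synthetic data).
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(3)
n = 2000
prior = rng.normal(50, 10, n)            # grade 7 attainment
ses = rng.normal(0, 1, n)                # student background proxy
score8 = 0.7 * prior + 2.0 * ses + rng.normal(0, 8, n)

for name, cols in [("prior only", [prior]), ("prior + SES", [prior, ses])]:
    X = sm.add_constant(np.column_stack(cols))
    print(name, "R^2 =", round(sm.OLS(score8, X).fit().rsquared, 3))
```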

Keywords: model specification, teacher effectiveness, teacher performance evaluation, value-added model

Procedia PDF Downloads 118
20756 Three-Dimensional Numerical Investigation for Reinforced Concrete Slabs with Opening

Authors: Abdelrahman Elsehsah, Hany Madkour, Khalid Farah

Abstract:

This article presents a 3-D modified non-linear elastic model in the strain space. The Helmholtz free energy function is introduced, with the existence of a dissipation potential surface in the space of thermodynamic conjugate forces. The constitutive equation and the damage evolution law are derived as well. The modified damage model has been used to capture the nonlinear behavior of reinforced concrete (RC) slabs with an opening. A parametric study was carried out to investigate the impact of different factors on the behavior of RC slabs: the opening area, the opening shape, the location of the opening, and the thickness of the slab. The numerical results have been compared with experimental data from the literature. Finally, the model showed its ability to be applied to the structural analysis of RC slabs.

Keywords: damage mechanics, 3-D numerical analysis, RC, slab with opening

Procedia PDF Downloads 162
20755 Mathematical Modeling of District Cooling Systems

Authors: Dana Alghool, Tarek ElMekkawy, Mohamed Haouari, Adel Elomari

Abstract:

District cooling systems have captured the attention of many researchers recently due to the enormous benefits offered by such systems in comparison with traditional cooling technologies. They are considered a major component of urban cities due to the significant reduction in energy consumption they provide. This paper aims to find the optimal design and operation of district cooling systems by developing a mixed integer linear programming model that minimizes the annual total system cost while satisfying the end-user cooling demand. The proposed model is tested with different cooling demand scenarios; only the results of the very high cooling demand scenario are presented in this paper. A sensitivity analysis on different parameters of the model was also performed.
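
A toy version of such a design-and-operation MILP (invented costs, capacities, and demand; the paper's model is far more detailed): choose an integer number of chillers and per-period deliveries to meet demand at minimum annualized cost.

```python
# Toy district-cooling MILP: size the chiller plant and meet period demands.
import pulp

demand = [900, 1400, 2100]     # cooling load per period (kW), invented
cap = 800                      # capacity per chiller (kW), invented
capex = 50_000                 # annualized cost per installed chiller
opex = [30, 35, 40]            # cost per kW delivered, per period

m = pulp.LpProblem("district_cooling", pulp.LpMinimize)
n = pulp.LpVariable("chillers", lowBound=0, cat="Integer")
q = [pulp.LpVariable(f"q_{t}", lowBound=0) for t in range(3)]

m += capex * n + pulp.lpSum(opex[t] * q[t] for t in range(3))  # total cost
for t in range(3):
    m += q[t] >= demand[t]     # satisfy end-user cooling demand
    m += q[t] <= cap * n       # respect installed capacity

m.solve(pulp.PULP_CBC_CMD(msg=False))
print(int(n.value()), [v.value() for v in q])
```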

Keywords: annual cooling demand, compression chiller, mathematical modeling, district cooling systems, optimization

Procedia PDF Downloads 188
20754 Vendor Selection and Supply Quotas Determination by Using Revised Weighting Method and Multi-Objective Programming Methods

Authors: Tunjo Perič, Marin Fatović

Abstract:

In this paper, a new methodology for vendor selection and supply quota determination (VSSQD) is proposed. The VSSQD problem is solved by a model that combines the revised weighting method, for determining the objective function coefficients, with a multiple objective linear programming (MOLP) method based on cooperative game theory. The criteria used for VSSQD are (1) the purchase costs and (2) the quality of the product supplied by individual vendors. The proposed methodology is tested on the example of flour purchasing for a bakery with two decision makers.
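
A hedged sketch of the weighted-sum step: in the paper the revised weighting method derives the objective weights, whereas here they are set by hand, and the cost/quality data are invented. x_i is the supply quota of vendor i:

```python
# Weighted-sum LP for two vendors; weights and data are invented.
from scipy.optimize import linprog

cost = [1.00, 0.90]        # price per kg of flour
quality = [0.95, 0.70]     # quality score, higher is better
w_cost, w_quality = 0.6, 0.4

# Minimize weighted cost minus weighted quality over the quota split.
c = [w_cost * cost[i] - w_quality * quality[i] for i in range(2)]
res = linprog(c, A_eq=[[1, 1]], b_eq=[1], bounds=[(0.2, 0.8)] * 2,
              method="highs")
print(res.x)               # supply quotas summing to 1
```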

Keywords: cooperative game theory, multiple objective linear programming, revised weighting method, vendor selection

Procedia PDF Downloads 342
20753 Estimating the Value of Statistical Life under the Subsidization and Cultural Effects

Authors: Mohammad A. Alolayan, John S. Evans, James K. Hammitt

Abstract:

The value of statistical life (VSL) has been estimated for a Middle Eastern country with a highly subsidized economy. In this study, in-person interviews were conducted on a stratified random sample to estimate the value of mortality risk. Double-bounded dichotomous choice questions followed by an open-ended question were used in the interviews to investigate respondents' willingness to pay for mortality risk reduction. High willingness to pay was found to be associated with high income and education, and females were found to have a lower willingness to pay than males. The estimated value of statistical life is larger than those estimated for Western countries, where taxation systems exist. This estimate provides decision makers in an Eastern country with a baseline for monetizing the health benefits of proposed policies or programs. The value of statistical life for other countries in the region can also be extrapolated from this estimate by using the benefit transfer method.

Keywords: mortality, risk, VSL, willingness-to-pay

Procedia PDF Downloads 305
20752 Simple Rheological Method to Estimate the Branch Structures of Polyethylene under Reactive Modification

Authors: Mahdi Golriz

Abstract:

The aim of this work is to show that the change in molecular structure of linear low-density polyethylene (LLDPE) during peroxide modification can be detected by a simple rheological method. For this purpose, a commercial grade of LLDPE (ExxonMobil™ LL4004EL) was reacted with different doses of dicumyl peroxide (DCP). The samples were analyzed by size-exclusion chromatography coupled with a light scattering detector. The dynamic shear oscillatory measurements showed a deviation of the δ–|G*| curve from that of the linear LLDPE, which can be attributed to the presence of long-chain branching (LCB). Using a simple method based on melt rheology, the transformations in molecular architecture induced in an originally linear low-density polyethylene during the early stages of reactive modification were identified. Reasonable and consistent estimates are obtained concerning the degree of LCB and the volume fractions of the various molecular species produced in the peroxide modification of LLDPE.

Keywords: linear low-density polyethylene, peroxide modification, long-chain branching, rheological method

Procedia PDF Downloads 142
20751 Application of Statistical Linearized Models for Investigations of Digital Dynamic Pulse-Frequency Control Systems

Authors: B. H. Aitchanov, Sh. K. Aitchanova, O. A. Baimuratov

Abstract:

This paper is focused on dynamic pulse-frequency modulation (DPFM) control systems. Currently, control laws based on DPFM signals are widely used in the direct digital control subsystems introduced into automated control systems for technological processes. Statistical analysis of automatic control systems reduces to the construction of functional relationships between the statistical characteristics of the error processes and those of the input processes. Structural and dynamic Volterra models of digital pulse-frequency control systems can be used to develop methods for generating these dependencies, which differ in accuracy, in the amount of information required about the statistical characteristics of the input processes, and in the computational labor intensity of their use.

Keywords: digital dynamic pulse-frequency control systems, dynamic pulse-frequency modulation, control object, discrete filter, impulse device, microcontroller

Procedia PDF Downloads 477
20750 Exploring Syntactic and Semantic Features for Text-Based Authorship Attribution

Authors: Haiyan Wu, Ying Liu, Shaoyun Shi

Abstract:

Authorship attribution is the task of extracting features to identify the authors of anonymous documents. Many previous works on authorship attribution focus on statistical style features (e.g., sentence/word length) and content features (e.g., frequent words, n-grams). Modeling these features by regression or transparent machine learning methods gives a portrait of the authors' writing style, but such methods do not capture syntactic (e.g., dependency relationships) or semantic (e.g., topic) information. In recent years, some researchers have modeled syntactic trees or latent semantic information by neural networks; however, few works combine them. Moreover, predictions by neural networks are difficult to explain, and explainability is vital in authorship attribution tasks. In this paper, we not only utilize statistical style and content features but also take advantage of both syntactic and semantic features. Unlike an end-to-end neural model, our method proceeds in two steps: feature selection and prediction. An attentive n-gram network is utilized to select useful features, and logistic regression is applied to give predictions and an understandable representation of writing style. Experiments show that our extracted features improve on the state-of-the-art methods on three benchmark datasets.
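
A rough sketch of the two-step spirit of the method, interpretable features followed by logistic regression; plain TF-IDF n-grams stand in here for the paper's attentive n-gram network, and the two-document corpus is a toy:

```python
# Two-step flavor: interpretable n-gram features, then logistic regression.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

docs = ["the ship sailed at dawn ...", "whereupon the court adjourned ..."]
authors = ["author_a", "author_b"]        # two candidate authors (toy corpus)

clf = make_pipeline(
    TfidfVectorizer(ngram_range=(1, 2)),  # word uni/bigram style features
    LogisticRegression(max_iter=1000),    # transparent, inspectable weights
)
clf.fit(docs, authors)
print(clf.predict(["the court sailed at dawn ..."]))
```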

Keywords: authorship attribution, attention mechanism, syntactic feature, feature extraction

Procedia PDF Downloads 123
20749 A Probabilistic Theory of the Buy-Low and Sell-High for Algorithmic Trading

Authors: Peter Shi

Abstract:

Algorithmic trading is a rapidly expanding domain within quantitative finance, constituting a substantial portion of trading volume in the US financial market. The demand for rigorous and robust mathematical theories underpinning these trading algorithms is ever-growing. In this study, the author establishes a new stock market model that integrates the efficient market hypothesis and statistical arbitrage. The model, for the first time, finds probabilistic relations between the rational price and the market price in terms of the conditional expectation. The theory consequently leads to a mathematical justification of the old market adage: buy low and sell high. The thresholds for "low" and "high" are precisely derived using a max-min operation on Bayes' error. This explicit connection harmonizes the efficient market hypothesis and statistical arbitrage, demonstrating their compatibility in explaining market dynamics, and represents a pioneering contribution to quantitative finance. The study culminates in comprehensive numerical tests using historical market data, affirming that the buy-low and sell-high algorithm derived from this theory significantly outperforms the general market over the long term in four out of six distinct market environments.
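
For flavor only, a toy threshold strategy on synthetic prices; the paper derives its "low"/"high" thresholds from a max-min operation on Bayes' error, whereas the fixed moving-average bands below are purely illustrative:

```python
# Fixed-band buy-low/sell-high on synthetic prices (illustration only).
import numpy as np

rng = np.random.default_rng(11)
price = 100 * np.exp(np.cumsum(rng.normal(0, 0.01, 1000)))   # synthetic walk
ma = np.convolve(price, np.ones(20) / 20, mode="same")       # 20-step mean

cash, shares = 1000.0, 0.0
for p, m in zip(price[20:], ma[20:]):
    if shares == 0 and p < 0.98 * m:      # "low": buy below the band
        shares, cash = cash / p, 0.0
    elif shares > 0 and p > 1.02 * m:     # "high": sell above the band
        cash, shares = shares * p, 0.0
print(cash + shares * price[-1])          # final portfolio value
```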

Keywords: efficient market hypothesis, behavioral finance, Bayes' decision, algorithmic trading, risk control, stock market

Procedia PDF Downloads 61
20748 Shock Compressibility of Iron Alloys Calculated in the Framework of Quantum-Statistical Models

Authors: Maxim A. Kadatskiy, Konstantin V. Khishchenko

Abstract:

Iron alloys are widespread components in various types of structural materials that are exposed to intense thermal and mechanical loads. Various quantum-statistical cell models with the self-consistent field approximation can be used to predict the behavior of these materials under extreme conditions; the higher the temperature and density of the matter, the more valid the application of these models. Results of Hugoniot calculations for iron alloys in the framework of three quantum-statistical models (the Thomas–Fermi model, the Thomas–Fermi model with quantum and exchange corrections, and the Hartree–Fock–Slater model) are presented. The results of the quantum-statistical calculations are compared with results from other reliable models and available experimental data. Good agreement between the calculations and the experimental data is revealed at terapascal pressures. Advantages and disadvantages of this approach are shown.

Keywords: alloy, Hugoniot, iron, terapascal pressure

Procedia PDF Downloads 331
20747 Day of the Week Patterns and the Financial Trends' Role: Evidence from the Greek Stock Market during the Euro Era

Authors: Nikolaos Konstantopoulos, Aristeidis Samitas, Vasileiou Evangelos

Abstract:

The purpose of this study is to examine whether financial trends influence not only stock markets' returns but also their anomalies. We chose to study the day-of-the-week (DOW) effect in the Greek stock market during the Euro period (2002-12), because during this period there were no significant structural changes and there were long-term financial trends. Moreover, in order to avoid possible methodological counterarguments that usually arise in the literature, we apply several linear (OLS) and nonlinear (GARCH family) models to our sample, reaching the conclusion that the TGARCH model fits our sample better than any other. Our results suggest that in the Greek stock market there is a long-term predisposition for positive/negative returns depending on the weekday; however, the statistical significance is influenced by the financial trend. This influence may be the reason why there are conflicting findings in the literature over time. Finally, combining the DOW empirical findings from 1985-2012, we may assume that in the Greek case there is a tendency for a long-lived turn-of-the-week effect.
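
The standard first-pass DOW regression (before moving to GARCH-family models) regresses returns on weekday dummies; a sketch on synthetic returns with a weak Monday effect:

```python
# Returns on weekday dummies (no intercept: one dummy per day), synthetic.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(5)
n = 750
weekday = np.arange(n) % 5                  # 0=Mon .. 4=Fri
returns = rng.normal(0.0002, 0.012, n) - 0.001 * (weekday == 0)  # Monday dip

D = np.column_stack([(weekday == d).astype(float) for d in range(5)])
fit = sm.OLS(returns, D).fit()
print(fit.params)                           # mean return per weekday
print(fit.pvalues)                          # which days are significant
```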

Keywords: day of the week effect, GARCH family models, Athens stock exchange, economic growth, crisis

Procedia PDF Downloads 397
20746 Robustness Analysis of the Carbon and Nitrogen Co-Metabolism Model of Mucor mucedo

Authors: Nahid Banihashemi

Abstract:

An important emerging area of the life sciences is systems biology, which involves understanding the integrated behavior of large numbers of components interacting via non-linear reaction terms. A centrally important problem in this area is understanding the co-metabolism of protein and carbohydrate, as it has been clearly demonstrated that the ratio of these metabolites in the diet is a major determinant of obesity and related chronic disease. In this regard, we have considered a systems biology model for the co-metabolism of carbon and nitrogen in colonies of the fungus Mucor mucedo. Oscillations are an important diagnostic of the underlying dynamical processes of this model. The maintenance of specific patterns of oscillation and its relation to the robustness of the system are the central issues targeted in this paper. To this end, parametric sensitivity analysis has been considered as the theoretical approach for analyzing the robustness of the model. As a result, the parameters of the model that produce the largest sensitivities have been identified. Furthermore, the largest changes that can be made in each parameter of the model without losing the oscillations in biomass production have been computed. The results are obtained from an implementation of parametric sensitivity analysis in Matlab.
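
A generic finite-difference parametric sensitivity computation on an oscillating system; a Lotka–Volterra toy stands in for the actual carbon/nitrogen co-metabolism ODEs, which are not reproduced here:

```python
# Finite-difference sensitivity of oscillation amplitude to a parameter.
import numpy as np
from scipy.integrate import solve_ivp

def rhs(t, y, a, b):                      # Lotka-Volterra stand-in model
    x1, x2 = y
    return [a * x1 - x1 * x2, x1 * x2 - b * x2]

def peak_amplitude(a, b):
    sol = solve_ivp(rhs, (0, 60), [1.0, 0.5], args=(a, b),
                    max_step=0.05, rtol=1e-8)
    return sol.y[0].max()

a0, b0, h = 1.0, 0.8, 1e-3
sens_a = (peak_amplitude(a0 + h, b0) - peak_amplitude(a0 - h, b0)) / (2 * h)
print("d(amplitude)/da ~", sens_a)        # large value => sensitive parameter
```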

Keywords: systems biology, parametric sensitivity analysis, robustness, carbon and nitrogen co-metabolism, Mucor mucedo

Procedia PDF Downloads 315
20745 Analytical Slope Stability Analysis Based on the Statistical Characterization of Soil Shear Strength

Authors: Bernardo C. P. Albuquerque, Darym J. F. Campos

Abstract:

Increasing our ability to solve complex engineering problems is directly related to the processing capacity of computers, which allow numerical algorithms to be run quickly and accurately. Besides the increasing interest in numerical simulations, probabilistic approaches are also of great importance, and statistical tools have shown their relevance to the modelling of practical engineering problems. In general, statistical approaches to such problems assume that the random variables involved follow a normal distribution. This assumption tends to produce incorrect results when skewed data are present, since normal distributions are symmetric about their means. Thus, in order to visualize and quantify this aspect, nine statistical distributions (symmetric and skewed) have been considered to model a hypothetical slope stability problem. The data modeled are the friction angle of a superficial soil in Brasilia, Brazil. Despite its apparent universality, the normal distribution did not qualify as the best fit. In the present effort, data obtained from consolidated-drained triaxial tests and saturated direct shear tests have been modeled and used to analytically derive the probability density function (PDF) of the safety factor of a hypothetical slope based on the Mohr-Coulomb rupture criterion. Based on this analysis, it is possible to explicitly derive the failure probability considering the friction angle as a random variable, and to compare the stability analysis when the friction angle is modelled as a Dagum distribution (the distribution that presented the best fit to the histogram) and as a normal distribution. This comparison leads to relevant differences when analyzed in light of risk management.
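
A Monte Carlo analogue of the comparison described, for an infinite cohesionless slope where FS = tan φ / tan β under Mohr–Coulomb; the distribution parameters are invented, and a skew-normal stands in for the paper's Dagum fit:

```python
# P(FS < 1) for an infinite cohesionless slope, FS = tan(phi) / tan(beta).
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
n = 200_000
slope_deg = 28.0                          # hypothetical slope angle

def p_failure(phi_deg):
    fs = np.tan(np.radians(phi_deg)) / np.tan(np.radians(slope_deg))
    return np.mean(fs < 1.0)

phi_norm = stats.norm.rvs(32.0, 3.0, size=n, random_state=rng)
phi_skew = stats.skewnorm.rvs(-4, loc=35.0, scale=4.0, size=n, random_state=rng)
print(p_failure(phi_norm), p_failure(phi_skew))   # symmetric vs. skewed fit
```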

Keywords: statistical slope stability analysis, skew distributions, probability of failure, functions of random variables

Procedia PDF Downloads 324
20744 Impact of Crises on Official Statistics: Environmental Statistics at Statistical Centre for the Cooperation Council for the Arab Countries of the Gulf during the COVID-19 Pandemic: A Case Study

Authors: Ibtihaj Al-Siyabi

Abstract:

The COVID-19 crisis posed enormous challenges to statistical providers. While official statistics were disrupted by the pandemic and the related containment measures, there was a growing and pressing need for real-time data and statistics to inform decisions. This paper gives an account of how the pandemic impacted the operations of national statistical offices (NSOs) in general, in terms of data collection and the methods used, and of the main challenges they encountered, based on international surveys. It highlights the performance of the Statistical Centre for the Cooperation Council for the Arab Countries of the Gulf (GCC-STAT) and its responsiveness to the pandemic, placing special emphasis on environmental statistics. The paper concludes by confirming the GCC-STAT's resilience and success in facing the challenges.

Keywords: NSO, COVID-19, statistics, crisis, pandemic

Procedia PDF Downloads 116
20743 Harnessing Entrepreneurial Opportunities for National Security

Authors: Itiola Kehinde Adeniran

Abstract:

This paper investigated the influence of harnessing entrepreneurial opportunities on national security in Nigeria, with a specific focus on the security situation of the post-amnesty programmes of the Federal Government in Ondo State. A self-administered structured questionnaire was employed to collect data from one hundred and twenty participants selected through purposive sampling. Inferential statistics were used to analyze the data; specifically, ordinary least squares linear regression was employed with the aid of the Statistical Package for the Social Sciences (SPSS) version 20 in order to determine the influence of the independent variable (entrepreneurial opportunities) on the dependent variable (national security). The results showed that business opportunities have a significant influence on the rate of criminal activities. The study also revealed that entrepreneurial opportunity creation and discovery, together with a model for how these opportunities could be effectively and efficiently utilized, jointly predict better national security, accounting for 69% of the variance in crime rate reduction. The paper therefore recommended that citizens be encouraged to develop an interest in skill-based activities in order to change their mindset towards self-employment, which can motivate them to identify entrepreneurial opportunities.

Keywords: entrepreneurship, entrepreneurial opportunities, national security, unemployment

Procedia PDF Downloads 315
20742 Credit Card Fraud Detection with Ensemble Model: A Meta-Heuristic Approach

Authors: Gong Zhilin, Jing Yang, Jian Yin

Abstract:

The purpose of this paper is to develop a novel system for credit card fraud detection based on sequential modeling of data using hybrid deep learning models. The proposed model encapsulates five major phases: pre-processing, imbalanced-data handling, feature extraction, optimal feature selection, and fraud detection with an ensemble classifier. The collected raw data (input) are pre-processed to enhance their quality by handling missing data, noisy data, and null values. The pre-processed data are class-imbalanced in nature and are therefore handled with a K-means clustering-based SMOTE model. From the class-balanced data, the most relevant features are extracted: improved principal component analysis (PCA) features, statistical features (mean, median, standard deviation), and higher-order statistical features (skewness and kurtosis). Among the extracted features, the most relevant ones are selected with the Self-Improved Arithmetic Optimization Algorithm (SI-AOA), a conceptual improvement of the standard Arithmetic Optimization Algorithm. The deep learning models used are Long Short-Term Memory (LSTM), a Convolutional Neural Network (CNN), and an optimized Quantum Deep Neural Network (QDNN). The LSTM and CNN are trained on the selected optimal features, and their outcomes enter as input to the optimized QDNN, which provides the final detection outcome. Since the QDNN is the ultimate detector, its weight function is fine-tuned with the SI-AOA.
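
The imbalance-handling step can be sketched with imbalanced-learn; plain SMOTE is used below as a stand-in for the paper's K-means clustering-based SMOTE variant, on synthetic transaction features:

```python
# Rebalance a ~2% minority "fraud" class with SMOTE (synthetic features).
import numpy as np
from imblearn.over_sampling import SMOTE
from sklearn.datasets import make_classification

X, y = make_classification(n_samples=5000, n_features=10, weights=[0.98],
                           random_state=0)
print("before:", np.bincount(y))
X_res, y_res = SMOTE(random_state=0).fit_resample(X, y)
print("after: ", np.bincount(y_res))
```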

Keywords: credit card, data mining, fraud detection, money transactions

Procedia PDF Downloads 117
20741 A Hybrid Method for Determination of Effective Poles Using Clustering Dominant Pole Algorithm

Authors: Anuj Abraham, N. Pappa, Daniel Honc, Rahul Sharma

Abstract:

In this paper, an analysis of several model order reduction techniques is presented. A new hybrid algorithm for model order reduction of linear time-invariant systems is compared with the conventional techniques, namely balanced truncation, Hankel norm reduction, and the Dominant Pole Algorithm (DPA). The proposed hybrid algorithm, known as the Clustering Dominant Pole Algorithm (CDPA), is able to compute the full set of dominant poles and their cluster centers efficiently. The dominant poles of a transfer function are specific eigenvalues of the state-space matrix of the corresponding dynamical system. The effectiveness of this novel technique is shown through simulation results.
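
The two ingredients CDPA combines can be illustrated separately: compute system poles as eigenvalues of the state matrix, then cluster them in the complex plane; toy state-space data, not the paper's benchmark systems:

```python
# Poles as eigenvalues of A, then k-means clustering in the complex plane.
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(4)
A = np.diag([-1.0, -1.1, -5.0, -5.2, -20.0]) + 0.05 * rng.normal(size=(5, 5))
poles = np.linalg.eigvals(A)

pts = np.column_stack([poles.real, poles.imag])
labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(pts)
for k in range(3):
    print(f"cluster {k}:", poles[labels == k])
```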

Keywords: balanced truncation, clustering, dominant pole, Hankel norm, model reduction

Procedia PDF Downloads 589
20740 Electroencephalography Correlates of Memorability While Viewing Advertising Content

Authors: Victor N. Anisimov, Igor E. Serov, Ksenia M. Kolkova, Natalia V. Galkina

Abstract:

The problem of the memorability of advertising content is closely connected with the key issues of neuromarketing: the memorability of advertising content contributes to the marketing effectiveness of the promoted product. Significant directions in studying the phenomenon of memorability are the memorability of the brand (detected through the memorability of the logo) and the memorability of the product offer (detected through the memorization of dynamic audiovisual advertising content, i.e., a commercial). The aim of this work is to reveal the predictors of memorization of static and dynamic audiovisual stimuli (logos and commercials). An important goal of the research was to reveal differences in the psychophysiological correlates of memorability between static and dynamic audiovisual stimuli, as we assumed that static and dynamic images are perceived in different ways and may differ in the memorization process. Objective methods of recording psychophysiological parameters while watching static and dynamic audiovisual materials are well suited to this aim. Electroencephalography (EEG) was used to identify correlates of the memorability of various stimuli in the electrical activity of the cerebral cortex. All stimuli (in the static and dynamic groups separately) were divided into two groups, remembered and not remembered, based on the results of a questionnaire. The questionnaires were filled out by participants not immediately after viewing the stimuli but after a time interval (to detect stimuli retained through long-term memorization). Using statistical methods, we developed a classifier (statistical model) that predicts which group (remembered or not remembered) a stimulus falls into, based on its psychophysiological perception. The result of the statistical model was compared with the results of the questionnaire. Conclusions: predictors of the memorability of static and dynamic stimuli have been identified, which allows predicting which stimuli will have a higher probability of being remembered. A further development of this study will be the creation of a stimulus memory model capable of recognizing a stimulus as previously seen or new; thus, in the process of remembering the stimulus, it is planned to take into account the stimulus recognition factor, which is one of the most important tasks for neuromarketing.

Keywords: memory, commercials, neuromarketing, EEG, branding

Procedia PDF Downloads 239
20739 Portfolio Optimization with Reward-Risk Ratio Measure Based on the Mean Absolute Deviation

Authors: Wlodzimierz Ogryczak, Michal Przyluski, Tomasz Sliwinski

Abstract:

In problems of portfolio selection, the reward-risk ratio criterion is optimized to search for a risky portfolio offering the maximum increase of the mean return, in proportion to the increase of the risk measure, when compared to risk-free investments. In the classical model, following Markowitz, the risk is measured by the variance, thus representing Sharpe ratio optimization and leading to quadratic optimization problems. Several Linear Programming (LP) computable risk measures have been introduced and applied in portfolio optimization; in particular, the Mean Absolute Deviation (MAD) measure has been widely recognized. The reward-risk ratio optimization with the MAD measure can be transformed into an LP formulation with the number of constraints proportional to the number of scenarios and the number of variables proportional to the total of the number of scenarios and the number of instruments. This may lead to LP models with a huge number of variables and constraints in the case of real-life financial decisions based on several thousand scenarios, decreasing their computational efficiency and making them hardly solvable by general LP tools. We show that the computational efficiency can then be dramatically improved by an alternative model based on the inverse risk-reward ratio minimization and by taking advantage of LP duality. In the introduced LP model, the number of structural constraints is proportional to the number of instruments, so the simplex method's efficiency is not seriously affected by the number of scenarios, guaranteeing easy solvability. Moreover, we show that under a natural restriction on the target value, the MAD risk-reward ratio optimization is consistent with second-order stochastic dominance rules.
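
A sketch of the scenario-based MAD portfolio LP in its direct (non-ratio) form, minimizing the mean absolute deviation subject to a target mean return; the returns are synthetic, and the paper's inverse-ratio reformulation is not shown:

```python
# Scenario MAD LP: variables are n weights x and T deviations d, minimizing
# (1/T) * sum(d) with d_t >= |(R_t - mu) @ x|, mu @ x >= target, sum(x) = 1.
import numpy as np
from scipy.optimize import linprog

rng = np.random.default_rng(9)
T, n = 200, 4                                  # scenarios, instruments
R = rng.normal(0.005, 0.02, size=(T, n))       # scenario returns (synthetic)
mu = R.mean(axis=0)
target = 0.004

c = np.concatenate([np.zeros(n), np.full(T, 1.0 / T)])
Dev = R - mu                                   # centered scenario returns
A_ub = np.vstack([np.block([[Dev, -np.eye(T)], [-Dev, -np.eye(T)]]),
                  np.concatenate([-mu, np.zeros(T)])])
b_ub = np.append(np.zeros(2 * T), -target)
A_eq = np.concatenate([np.ones(n), np.zeros(T)])[None, :]

res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=[1.0],
              bounds=[(0, None)] * (n + T), method="highs")
print(res.x[:n])                               # optimal portfolio weights
```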

Keywords: portfolio optimization, reward-risk ratio, mean absolute deviation, linear programming

Procedia PDF Downloads 394