Search results for: Wiener model
14247 Recognition of Voice Commands of Mentor Robot in Noisy Environment Using Hidden Markov Model
Authors: Khenfer Koummich Fatma, Hendel Fatiha, Mesbahi Larbi
Abstract:
This paper presents an approach based on Hidden Markov Models (HMMs) using the HTK toolkit. The goal is to create a human-machine interface with a voice recognition system that allows an operator to teleoperate a mentor robot to execute specific tasks such as rotate, raise, close, etc. The system should take into account different levels of environmental noise. The approach has been applied to isolated words representing robot commands pronounced in two languages: French and Arabic. The recognition rate obtained on clean words is the same for both languages. However, the rates diverge when Gaussian white noise is added at a Signal-to-Noise Ratio (SNR) of 30 dB: in this case, the Arabic speech recognition rate is 69% and the French speech recognition rate is 80%. This can be explained by the phonetic context of each language when noise is added.
Keywords: Arabic speech recognition, Hidden Markov Model (HMM), HTK, noise, TIMIT, voice command
Procedia PDF Downloads 391
14246 Wolof Voice Response Recognition System: A Deep Learning Model for Wolof Audio Classification
Authors: Krishna Mohan Bathula, Fatou Bintou Loucoubar, FNU Kaleemunnisa, Christelle Scharff, Mark Anthony De Castro
Abstract:
Voice recognition algorithms such as automatic speech recognition and text-to-speech systems for African languages can play an important role in bridging the digital divide of Artificial Intelligence in Africa, contributing to the establishment of a fully inclusive information society. This paper proposes a deep learning model that classifies user responses as inputs for an interactive voice response system. A dataset of the Wolof-language words ‘yes’ and ‘no’ was collected as audio recordings. A two-stage data augmentation approach is adopted to enlarge the dataset to the size required by the deep neural network. Data preprocessing and feature engineering with Mel-Frequency Cepstral Coefficients (MFCCs) are implemented. Convolutional Neural Networks (CNNs) have proven very powerful in image classification and are promising for audio processing when sounds are transformed into spectra. For voice response classification, the recordings are transformed into sound-frequency feature spectra, and an image classification methodology is then applied using a deep CNN model. The inference model of this trained and reusable Wolof voice response recognition system can be integrated into many applications on both web and mobile platforms.
Keywords: automatic speech recognition, interactive voice response, voice response recognition, Wolof word classification
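The MFCC front end this abstract relies on can be sketched in plain NumPy; this is a generic textbook pipeline (framing, power spectrum, mel filterbank, DCT), not the authors' implementation, and all parameter values here are illustrative assumptions:

```python
import numpy as np

def mfcc(signal, sr=16000, n_fft=512, hop=256, n_mels=26, n_ceps=13):
    """Compute a basic MFCC matrix (frames x coefficients) from a 1-D signal."""
    # Frame the signal with a Hann window
    frames = []
    for start in range(0, len(signal) - n_fft + 1, hop):
        frames.append(signal[start:start + n_fft] * np.hanning(n_fft))
    frames = np.array(frames)
    # Power spectrum of each frame
    power = np.abs(np.fft.rfft(frames, n_fft)) ** 2
    # Triangular mel filterbank
    def hz_to_mel(f): return 2595 * np.log10(1 + f / 700)
    def mel_to_hz(m): return 700 * (10 ** (m / 2595) - 1)
    mel_pts = np.linspace(hz_to_mel(0), hz_to_mel(sr / 2), n_mels + 2)
    bins = np.floor((n_fft + 1) * mel_to_hz(mel_pts) / sr).astype(int)
    fbank = np.zeros((n_mels, n_fft // 2 + 1))
    for i in range(1, n_mels + 1):
        l, c, r = bins[i - 1], bins[i], bins[i + 1]
        fbank[i - 1, l:c] = (np.arange(l, c) - l) / max(c - l, 1)
        fbank[i - 1, c:r] = (r - np.arange(c, r)) / max(r - c, 1)
    log_mel = np.log(power @ fbank.T + 1e-10)
    # DCT-II decorrelates the filterbank energies; keep the first n_ceps
    n = np.arange(n_mels)
    dct = np.cos(np.pi * np.outer(np.arange(n_ceps), (2 * n + 1)) / (2 * n_mels))
    return log_mel @ dct.T
```

The resulting frames-by-coefficients matrix is what a CNN would consume as a 2-D "image" of the utterance.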
Procedia PDF Downloads 118
14245 Modelling Patient Condition-Based Demand for Managing Hospital Inventory
Authors: Esha Saha, Pradip Kumar Ray
Abstract:
A hospital inventory comprises a large number and great variety of items for the proper treatment and care of patients, such as pharmaceuticals, medical equipment, and surgical items. Improper management of these items, i.e. stockouts, may lead to delay in treatment or other fatal consequences, even the death of a patient. Hospitals therefore tend to overstock items to avoid the risk of stockout, which leads to unnecessary investment of money, difficulty in storage, more expiration and wastage, etc. In such a challenging environment, it is necessary for hospitals to follow an inventory policy that accounts for the stochasticity of demand in a hospital. Statistical analysis captures the correlation of patient condition, based on bed occupancy, with patient demand, which changes stochastically. Due to this dependency on bed occupancy, a Markov model is developed that maps changes in demand for hospital inventory to changes in patient condition, represented by movements among bed-occupancy states (acute care, rehabilitative, and long-care) during a patient's length of stay in the hospital. An inventory policy is developed for a hospital based on the fulfillment of patient demand, with the objective of minimizing the frequency and quantity of orders of inventoried items. The analytical structure of the model, based on probability calculations, is provided to show the optimal inventory-related decisions. A case study is illustrated for the development of a hospital inventory model based on patient demand for multiple inpatient pharmaceutical items. A sensitivity analysis is conducted to investigate the impact of inventory-related parameters on the developed optimal inventory policy. The developed model and solution approach may therefore help hospital managers and pharmacists manage hospital inventory under stochastic demand for inpatient pharmaceutical items.
Keywords: bed occupancy, hospital inventory, Markov model, patient condition, pharmaceutical items
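The core idea, demand driven by a Markov chain over bed-occupancy states, can be sketched as follows; the transition probabilities and per-state daily consumption below are illustrative assumptions, not the paper's estimated values:

```python
import numpy as np

# States: 0 = acute care, 1 = rehabilitative, 2 = long-care
# Transition probabilities and per-state daily demand are illustrative only.
P = np.array([[0.70, 0.25, 0.05],
              [0.10, 0.70, 0.20],
              [0.00, 0.05, 0.95]])
daily_demand = np.array([5.0, 2.0, 1.0])  # units of a drug per patient-day

def expected_demand(initial, days):
    """Expected total demand per patient over `days`, starting from state
    distribution `initial`, as the occupancy state evolves under P."""
    dist, total = np.asarray(initial, float), 0.0
    for _ in range(days):
        total += dist @ daily_demand   # expected consumption today
        dist = dist @ P                # occupancy state evolves
    return total
```

Summing such expectations over the admitted population gives the stochastic demand signal an (s, S)-type ordering policy would be built on.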
Procedia PDF Downloads 326
14244 Order Fulfilment Strategy in E-Commerce Warehouse Based on Simulation: Business Customers Case
Authors: Aurelija Burinskiene
Abstract:
This paper presents a study of an e-commerce warehouse. The study aims to improve order fulfillment by identifying the strategy with the best performance. A simulation model is proposed to reach this target. The model enables various scenario tests in an e-commerce warehouse, making it possible to find the best order fulfillment strategy. Using the simulation model, the authors investigated customers’ orders representing one month of online purchases. Experiments were designed to evaluate various order picking methods applicable to the fulfillment of customers’ orders. The research uses cost-component analysis and helps to identify the best possible order picking method, improving the overall performance of the e-commerce warehouse and the fulfillment service to customers. The results show that an order batching strategy is the most applicable because it brings distance savings of around 6.7 percent. This result can be improved to 8.34 percent by adding an assortment clustering action. Recommendations were therefore given to apply the method in future e-commerce warehouse operations.
Keywords: e-commerce, order, fulfilment, strategy, simulation
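The distance-saving effect of order batching can be illustrated with a toy simulation; this assumes a simplified single-aisle, out-and-back routing rule and synthetic orders, not the paper's warehouse layout or data:

```python
import random

def route_length(locations):
    """Out-and-back travel in a single aisle: walk to the farthest slot and return."""
    return 2 * max(locations)

def total_distance(orders, batch_size):
    """Total travel when orders are grouped into batches picked in one tour each."""
    total = 0
    for i in range(0, len(orders), batch_size):
        batch = orders[i:i + batch_size]
        merged = set().union(*batch)   # one tour visits all slots in the batch
        total += route_length(merged)
    return total

random.seed(42)
# 200 synthetic orders, each with 3 random pick slots along a 100-slot aisle
orders = [{random.randint(1, 100) for _ in range(3)} for _ in range(200)]
single = total_distance(orders, 1)     # discrete order picking
batched = total_distance(orders, 4)    # batches of 4 orders per tour
saving = 100 * (single - batched) / single
```

Batching always wins under this routing rule because one tour to the farthest slot replaces several; realistic layouts and congestion effects would temper the saving.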
Procedia PDF Downloads 151
14243 Reliability-Centered Maintenance Application for the Development of Maintenance Strategy for a Cement Plant
Authors: Nabil Hameed Al-Farsi
Abstract:
This study’s main goal is to develop a model and a maintenance strategy for a cement factory, the Arabian Cement Company, Rabigh Plant. The work depends on the Reliability-Centered Maintenance (RCM) approach to develop a strategy and maintenance schedule that increases the reliability of the production system components, thus ensuring continuous productivity. Cost-effective maintenance of the plant’s dependability performance is the key goal of reliability-based maintenance. The cement plant consists of 7 major process steps, and developing an RCM-based maintenance plan accordingly comprises 10 steps, from selecting units and data to applying and updating the model. The processing unit chosen for the analysis in this case is the calciner unit; the model is validated against Travancore Titanium Products Ltd (TTP) using the failure-data history acquired from that company's maintenance department. After applying the proposed model, the results of the maintenance simulation justified reconsidering the plant's existing scheduled maintenance policy. The results indicate the need for preventive maintenance for all Class A criticality equipment instead of planned maintenance, with breakdown maintenance for all other equipment depending on its criticality and an FMEA report. Consequently, the additional cost of preventive maintenance would be offset by the cost savings from breakdown maintenance for the remaining equipment.
Keywords: engineering, reliability, strategy, maintenance, failure modes, effects and criticality analysis (FMEA)
Procedia PDF Downloads 175
14242 Hydrological Evaluation of Satellite Precipitation Products Using IHACRES Rainfall-Runoff Model over a Basin in Iran
Authors: Mahmoud Zakeri Niri, Saber Moazami, Arman Abdollahipour, Hossein Ghalkhani
Abstract:
The objective of this research is the hydrological evaluation of four widely used satellite precipitation products, PERSIANN, TMPA-3B42V7, TMPA-3B42RT, and CMORPH, over the Zarinehrood basin in Iran. To this end, the daily streamflow of the Sarough-chay river of the Zarinehrood basin was first simulated using the IHACRES rainfall-runoff model with daily rain gauge and temperature data from 1988 to 2008. The model was then calibrated over two different periods by comparing the simulated discharge with that observed at hydrometric stations. To evaluate the performance of the satellite precipitation products in streamflow simulation, the calibrated model was validated using daily satellite rainfall estimates for the period 2003 to 2008. The results indicated that TMPA-3B42V7, with a CC of 0.69, RMSE of 5.93 mm/day, MAE of 4.76 mm/day, and RBias of -5.39%, simulates streamflow better than PERSIANN and CMORPH over the study area. It is noteworthy that in Iran the availability of ground-station data is very limited because of the sparse density of hydro-meteorological networks. Large spatial and temporal variability of precipitation and the lack of a reliable, extensive observing system are the most important challenges for rainfall analysis, flood prediction, and other hydrological applications in this country.
Keywords: hydrological evaluation, IHACRES, satellite precipitation product, streamflow simulation
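The four evaluation statistics named above (CC, RMSE, MAE, RBias) have standard definitions and can be computed in a few lines; this is a generic sketch of those metrics, not the authors' code:

```python
import numpy as np

def evaluate(sim, obs):
    """CC, RMSE, MAE and relative bias (%) between simulated and observed series."""
    sim, obs = np.asarray(sim, float), np.asarray(obs, float)
    cc = np.corrcoef(sim, obs)[0, 1]              # linear correlation coefficient
    rmse = np.sqrt(np.mean((sim - obs) ** 2))     # root-mean-square error
    mae = np.mean(np.abs(sim - obs))              # mean absolute error
    rbias = 100 * np.sum(sim - obs) / np.sum(obs) # relative bias in percent
    return {"CC": cc, "RMSE": rmse, "MAE": mae, "RBias": rbias}
```

A negative RBias, as reported for TMPA-3B42V7 (-5.39%), indicates a net underestimation of the observed streamflow.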
Procedia PDF Downloads 242
14241 Computational Fluid Dynamics Simulation and Comparison of Flow through Mechanical Heart Valve Using Newtonian and Non-Newtonian Fluid
Authors: D. Šedivý, S. Fialová
Abstract:
The main purpose of this study is to show the differences between numerical solutions of flow through an artificial heart valve using a Newtonian versus a non-Newtonian fluid. The simulation was carried out with a commercial computational fluid dynamics (CFD) package based on the finite-volume method. An aortic bileaflet heart valve (Sorin Bicarbon) was used as the pattern for the model of a real heart valve replacement. Computed tomography (CT) was used to obtain accurate parameters of the valve. The CT data were transferred into a commercial 3D design tool, where the model for CFD was built. The Carreau rheology model was applied for the non-Newtonian fluid, and physiological data of the cardiac cycle were used as boundary conditions. The outputs were the leaflet excursion from opening to closure and the fluid dynamics through the valve. This study also includes experimental measurement of the pressure fields in the vicinity of the valve to verify the numerical outputs. The results show a favorable comparison between the computational solutions of flow through the mechanical heart valve using Newtonian and non-Newtonian fluids.
Keywords: computational modeling, dynamic mesh, mechanical heart valve, non-Newtonian fluid
Procedia PDF Downloads 388
14240 Modeling Of The Random Impingement Erosion Due To The Impact Of The Solid Particles
Authors: Siamack A. Shirazi, Farzin Darihaki
Abstract:
Solid particles are found in many multiphase flows, including transport pipelines and pipe fittings. Such particles interact with the pipe material and cause erosion, which threatens the integrity of the system. Predicting the erosion rate is therefore an important factor in the design and monitoring of such systems. Mechanistic models can provide reliable predictions for many conditions while demanding relatively low computational cost. They utilize a representative particle trajectory to predict the impact characteristics of the majority of particle impacts that cause the maximum erosion rate in the domain. The erosion caused by particle impacts is due not only to direct impacts but also to random impingements. In the present study, an alternative model is introduced to describe erosion due to the random impingement of particles. The model provides a realistic trend for erosion with changes in particle size and particle Stokes number. It is examined against experimental data and CFD simulation results and shows better agreement with the data in comparison to the available models in the literature.
Keywords: erosion, mechanistic modeling, particles, multiphase flow, gas-liquid-solid
Procedia PDF Downloads 170
14239 Mathematical Modeling for Continuous Reactive Extrusion of Poly Lactic Acid Formation by Ring Opening Polymerization Considering Metal/Organic Catalyst and Alternative Energies
Authors: Satya P. Dubey, Hrushikesh A Abhyankar, Veronica Marchante, James L. Brighton, Björn Bergmann
Abstract:
Aims: To develop a mathematical model that simulates the ROP of PLA, taking into account the effect of alternative energies, to be implemented in a continuous reactive-extrusion production process for PLA. Introduction: The production of large amounts of waste is one of the major challenges of the present time, and polymers represent 70% of global waste. PLA has emerged as a promising polymer: it is a compostable, biodegradable thermoplastic made from renewable sources. However, the main limitation for the application of PLA is the traces of toxic metal catalyst in the final product. Thus, a safe and efficient production process needs to be developed to avoid potential hazards and toxicity. It has been found that alternative energy sources (laser, ultrasound, microwaves) could be a prominent option to facilitate the ROP of PLA via continuous reactive extrusion. This process may allow complete removal of the metal catalysts and facilitate less active organic catalysts. Methodology: Initial investigations were performed using the data available in the literature for the reaction mechanism of the ROP of PLA based on the conventional metal catalyst stannous octoate. A mathematical model has been developed considering significant parameters such as different initial concentration ratios of catalyst, co-catalyst, and impurity. The effects of temperature variation and alternative energies have been implemented in the model. Results: The mathematical model has been validated using data from the literature as well as actual experiments. Validation of the model including alternative energies is in progress, based on experimental data from the partners of the InnoREX project consortium. Conclusion: The model accurately reproduces the polymerisation reaction when alternative energy is applied. Alternative energies have a strong positive effect, increasing the conversion and molecular weight of the PLA. This model could be a very useful tool to complement the Ludovic® software in predicting the large-scale production process using reactive extrusion.
Keywords: polymer, poly-lactic acid (PLA), ring opening polymerization (ROP), metal-catalyst, bio-degradable, renewable source, alternative energy (AE)
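The shape of such a kinetic model can be sketched with a simple pseudo-first-order propagation law integrated by Euler's method; the rate constant, catalyst concentration, and equilibrium monomer concentration below are illustrative assumptions, not the paper's fitted kinetics:

```python
import numpy as np

def rop_conversion(kp=1.0, cat0=1e-3, M0=8.7, Meq=0.225, t_end=600.0, dt=0.1):
    """Monomer concentration under dM/dt = -kp*[cat]*(M - Meq), forward Euler.
    kp [L/(mol.s)], cat0 [mol/L], M0 and Meq [mol/L] are illustrative values."""
    steps = int(t_end / dt)
    M = np.empty(steps + 1)
    M[0] = M0
    for i in range(steps):
        # temperature or an alternative-energy source would enter through kp here
        M[i + 1] = M[i] - dt * kp * cat0 * (M[i] - Meq)
    conversion = (M0 - M) / M0
    return M, conversion
```

The monomer concentration decays monotonically toward the equilibrium value Meq, so conversion saturates below 100%, which is the qualitative behavior of a reversible ROP.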
Procedia PDF Downloads 363
14238 Additive Weibull Model Using Warranty Claim and Finite Element Analysis Fatigue Analysis
Authors: Kanchan Mondal, Dasharath Koulage, Dattatray Manerikar, Asmita Ghate
Abstract:
This paper presents an additive reliability model using warranty data and Finite Element Analysis (FEA) data. Warranty data for any product give insight into its underlying issues and are often used by reliability engineers to build prediction models that forecast the failure rate of parts. But there is one major limitation in using warranty data for prediction: warranty periods constitute only a small fraction of the total lifetime of a product, and most of the time they cover only the infant-mortality and useful-life zones of a bathtub curve. Predicting with warranty data alone in these cases does not generally provide results with the desired accuracy. The failure rate of a mechanical part is driven by random issues initially and by wear-out or usage-related issues at later stages of the lifetime. For better predictability of the failure rate, one needs to explore the failure-rate behavior in the wear-out zone of the bathtub curve. Due to cost and time constraints, it is not always possible to test samples to failure, but FEA fatigue analysis can provide the failure-rate behavior of a part well beyond the warranty period, in less time and at lower cost. In this work, the authors propose an Additive Weibull Model, which makes use of both warranty and FEA fatigue-analysis data for predicting failure rates. It involves modeling two data sets for a part: one with existing warranty claims and the other with fatigue-life data. Hazard-rate-based Weibull estimation is used for modeling the warranty data, whereas S-N-curve-based Weibull parameter estimation is used for the FEA data. The two separate Weibull models' parameters are estimated and combined to form the proposed Additive Weibull Model for prediction.
Keywords: bathtub curve, fatigue, FEA, reliability, warranty, Weibull
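The additive combination of two Weibull hazards is standard and can be sketched directly; the shape and scale parameters below are illustrative (a shallow-shape mode for early/random warranty failures plus a steep-shape mode for fatigue wear-out), not the paper's estimates:

```python
import numpy as np

def weibull_hazard(t, beta, eta):
    """Weibull hazard rate h(t) = (beta/eta) * (t/eta)**(beta - 1)."""
    return (beta / eta) * (t / eta) ** (beta - 1)

def additive_weibull_reliability(t, b1, e1, b2, e2):
    """Reliability when two independent Weibull failure modes act additively:
    H(t) = (t/e1)**b1 + (t/e2)**b2,  R(t) = exp(-H(t))."""
    H = (t / e1) ** b1 + (t / e2) ** b2
    return np.exp(-H)

t = np.linspace(1, 2000, 400)
# mode 1: early/random failures from warranty claims (shape < 1)
# mode 2: wear-out from FEA fatigue life (shape > 1) -- parameters illustrative
h = weibull_hazard(t, 0.8, 5000) + weibull_hazard(t, 3.5, 3000)
R = additive_weibull_reliability(t, 0.8, 5000, 3.5, 3000)
```

Summing the two hazards reproduces the bathtub shape: the first mode dominates early, the second takes over in the wear-out zone beyond the warranty period.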
Procedia PDF Downloads 74
14237 Heat and Mass Transfer Modelling of Industrial Sludge Drying at Different Pressures and Temperatures
Authors: L. Al Ahmad, C. Latrille, D. Hainos, D. Blanc, M. Clausse
Abstract:
A two-dimensional finite-volume axisymmetric model is developed to predict the simultaneous heat and mass transfers during the drying of industrial sludge. The simulations were run using COMSOL Multiphysics 3.5a, with the input parameters of the numerical model acquired from preliminary experimental work. The results permit establishing correlations describing the evolution of the various parameters as a function of the drying temperature and the sludge water content. The selection and coupling of the equations are validated against the drying kinetics acquired experimentally over a temperature range of 45-65 °C and an absolute pressure range of 200-1000 mbar. The model, incorporating the heat and mass transfer mechanisms at different operating conditions, yields simulated values of temperature and water content. The simulated results agree with the experimental values only at the first and last drying stages, where sludge shrinkage is insignificant. Simulated and experimental results show that sludge drying is favored at high temperatures and low pressure. As observed experimentally, the drying time is reduced by 68% for drying at 65 °C compared to 45 °C under 1 atm. At 65 °C, a 200-mbar absolute-pressure vacuum leads to an additional reduction in drying time estimated at 61%. However, the drying rate is underestimated in the intermediate stage. This underestimation could be improved in the model by considering the shrinkage phenomena that occur during sludge drying.
Keywords: industrial sludge drying, heat transfer, mass transfer, mathematical modelling
Procedia PDF Downloads 136
14236 Homeless Population Modeling and Trend Prediction Through Identifying Key Factors and Machine Learning
Authors: Shayla He
Abstract:
Background and Purpose: According to Chamie (2017), it is estimated that no fewer than 150 million people, or about 2 percent of the world’s population, are homeless. The homeless population in the United States has grown rapidly in the past four decades. In New York City, the sheltered homeless population increased from 12,830 in 1983 to 62,679 in 2020. Knowing the trend of the homeless population is crucial in helping states and cities make affordable-housing plans and other community-service plans ahead of time, to better prepare for the situation. This study utilized data from New York City, examined the key factors associated with homelessness, and developed systematic modeling to predict future homeless populations. Using the best model developed, named HP-RNN, an analysis was conducted of the homeless-population change during the months of 2020 and 2021, which were impacted by the COVID-19 pandemic. Moreover, HP-RNN was tested on data from Seattle. Methods: The methodology involves four phases in developing robust prediction methods. Phase 1 gathered and analyzed raw data on homeless populations and demographic conditions from five urban centers. Phase 2 identified the key factors that contribute to the rate of homelessness. In Phase 3, three models were built using Linear Regression, Random Forest, and a Recurrent Neural Network (RNN), respectively, to predict the future trend of the homeless population. Each model was trained and tuned on the dataset from New York City, with accuracy measured by Mean Squared Error (MSE). In Phase 4, the final phase, the best model from Phase 3 was evaluated using the data from Seattle, which was not part of the model training and tuning process. Results: Compared to the Linear Regression-based model used by HUD et al. (2019), HP-RNN significantly improved the prediction metrics, raising the Coefficient of Determination (R2) from -11.73 to 0.88 and reducing MSE by 99%. HP-RNN was then validated on the data from Seattle, WA, which showed a peak error of 14.5% between the actual and predicted counts. Finally, the modeling results were used to predict the trend during the COVID-19 pandemic. They show a good correlation between the actual and predicted homeless populations, with a peak error of less than 8.6%. Conclusions and Implications: This is the first work to apply an RNN to model time series of homelessness-related data. The model shows a close correlation between the actual and predicted homeless populations. There are two major implications of this result. First, the model can be used to predict the homeless population for the next several years, and the prediction can help states and cities plan ahead on affordable-housing allocation and other community services. Moreover, this prediction can serve as a reference for policymakers and legislators as they seek to make changes that may impact the factors closely associated with the future homeless-population trend.
Keywords: homeless, prediction, model, RNN
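The forward pass of the RNN family this abstract builds on can be sketched in NumPy; this is a generic vanilla RNN with untrained random weights, shown only to illustrate how a recurrent state carries the trend of a series — the HP-RNN architecture and its training are not described in the abstract and are not reproduced here:

```python
import numpy as np

def rnn_predict(series, Wx, Wh, Wo, bh, bo):
    """Run a vanilla RNN over a scalar series and emit a one-step-ahead forecast."""
    h = np.zeros(Wh.shape[0])
    for x in series:
        h = np.tanh(Wx * x + Wh @ h + bh)   # recurrent state summarizes history
    return float(Wo @ h + bo)               # linear readout of the final state

H = 8  # hidden-state size (illustrative)
rng = np.random.default_rng(0)
Wx, Wh = rng.normal(0, 0.3, H), rng.normal(0, 0.3, (H, H))
Wo, bh, bo = rng.normal(0, 0.3, H), np.zeros(H), 0.0
# untrained weights: this shows the forward pass only; fitting the weights
# (e.g. by backpropagation through time) is outside this sketch
y_next = rnn_predict([0.1, 0.2, 0.3, 0.4], Wx, Wh, Wo, bh, bo)
```

In practice such a model would be trained on monthly counts (e.g. the NYC sheltered-homeless series) to minimize MSE, as described in Phase 3.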
Procedia PDF Downloads 122
14235 Investigations of Bergy Bits and Ship Interactions in Extreme Waves Using Smoothed Particle Hydrodynamics
Authors: Mohammed Islam, Jungyong Wang, Dong Cheol Seo
Abstract:
Smoothed Particle Hydrodynamics (SPH) is a meshless, Lagrangian numerical method that has shown promise in accurately predicting the hydrodynamics of water-structure interactions in violent flow conditions. The main goal of this study is to build confidence in the versatility of an SPH-based tool, so it can be used as a complement to physical model testing and support research needs for the performance evaluation of ships and offshore platforms exposed to extreme and harsh environments. In the current endeavor, an open-source SPH-based tool was used and validated for modeling and predicting the hydrodynamic interactions of a 6-DOF ship and bergy bits. The study involved modeling a modern generic drillship and simplified bergy bits in floating and towing scenarios, in regular and irregular wave conditions. The predictions were validated using model-scale measurements of a moored ship towed at multiple oblique angles approaching a floating bergy bit in waves. Overall, this study provides a thorough comparison between the model-scale measurements and the predictions from the SPH tool in terms of performance and accuracy. The SPH-predicted ship motions and forces were mostly within ±5% of the measurements. The velocity and pressure distributions and wave characteristics over the free surface depict realistic interactions of the wave, the ship, and the bergy bit. This work identifies and presents several challenges in preparing the input file, particularly in defining the mass properties of complex geometry, the computational requirements, and the post-processing of the outcomes.
Keywords: SPH, ship and bergy bit, hydrodynamic interactions, model validation, physical model testing
Procedia PDF Downloads 133
14234 Structure Function and Violation of Scale Invariance in NCSM: Theory and Numerical Analysis
Authors: M. R. Bekli, N. Mebarki, I. Chadou
Abstract:
In this study, we focus on the structure functions and the violation of scale invariance in the context of the non-commutative standard model (NCSM). We find that this violation appears at first order in perturbation theory, and a non-commutative version of the DGLAP evolution equation is deduced. Numerical analysis and comparison with experimental data impose a new bound on the non-commutative parameter.
Keywords: NCSM, structure function, DGLAP equation, standard model
Procedia PDF Downloads 612
14233 Comparing Forecasting Performances of the Bass Diffusion Model and Time Series Methods for Sales of Electric Vehicles
Authors: Andreas Gohs, Reinhold Kosfeld
Abstract:
This study should be of interest to practitioners who want to predict precisely the sales numbers of vehicles equipped with an innovative propulsion technology, as well as to researchers interested in applied (regional) time series analysis. The study is based on the numbers of new registrations of pure electric and hybrid cars. Time series methods such as ARIMA are compared with the Bass diffusion model with respect to their forecasting performance for new registrations in Germany at the national and federal-state levels. In particular, it is investigated whether the additional information content of regional data increases forecasting accuracy at the national level when predictions for the federal states are aggregated. Estimated Bass-model parameters are reported for Germany and its sixteen federal states. While the focus of this research is on the German market, estimation results are also provided for selected European and other countries. Concerning Bass parameters and forecasting performance, the results differ considerably across Germany's federal states and the member states of the European Union, corresponding to differences across EU member states in the adoption of this innovative technology. Within the German market, adoption has proceeded further in southern Germany and lags in eastern Germany, except for Berlin.
Keywords: bass diffusion model, electric vehicles, forecasting performance, market diffusion
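The Bass diffusion model itself is fully standard: per-period adoptions follow n(t) = (p + q·N(t)/m)·(m − N(t)), with p the innovation coefficient, q the imitation coefficient, and m the market potential. A minimal discrete-time sketch (the parameter values are illustrative, not the paper's estimates for Germany):

```python
def bass_adoptions(p, q, m, periods):
    """Per-period new adoptions n(t) = (p + q*N/m) * (m - N), with N cumulative."""
    N, new = 0.0, []
    for _ in range(periods):
        n = (p + q * N / m) * (m - N)   # innovators + imitators among remainder
        new.append(n)
        N += n
    return new, N

# illustrative parameters: p = innovation, q = imitation, m = market potential
new, total = bass_adoptions(p=0.01, q=0.35, m=1_000_000, periods=30)
peak_period = max(range(len(new)), key=new.__getitem__)
```

With q > p the adoption curve is bell-shaped, peaking in an interior period; fitting (p, q, m) to each federal state's registration series and summing the forecasts is the regional-aggregation idea described above.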
Procedia PDF Downloads 170
14232 Management of Local Towns (Tambon) According to Philosophy of Sufficiency Economy
Authors: Wichian Sriprachan, Chutikarn Sriviboon
Abstract:
The objectives of this research were to study the management of local towns and to develop a better model of town management according to the Philosophy of Sufficiency Economy. The study combined qualitative research, field research, and documentary research. A total of 10 local towns, or tambons, of Supanburi province, Thailand, were selected for in-depth interviews. The findings revealed that the management of local towns according to the Philosophy of Sufficiency Economy was at a "good" level and that the management model has five basic guidelines: 1) the ability to manage budget information and keep it up to date, 2) the ability to make decisions according to democratic rules, 3) the ability to use a check-and-balance system, 4) the ability to control, follow up, and evaluate, and 5) the ability to allow the general public to participate. In addition, the findings revealed that human resource management according to the Philosophy of Sufficiency Economy includes obeying laws, using proper knowledge, and having integrity in five areas: planning, recruiting, selecting, training, and maintaining human resources.
Keywords: management, local town (Tambon), principles of sufficiency economy, marketing management
Procedia PDF Downloads 349
14231 Hybrid Model: An Integration of Machine Learning with Traditional Scorecards
Authors: Golnush Masghati-Amoli, Paul Chin
Abstract:
In recent years, with rapid increases in data availability and computing power, Machine Learning (ML) techniques have been adopted in a range of industries for their strong predictive capability. However, the use of Machine Learning in commercial banking has been limited by a particular challenge: numerous regulations require lenders to be able to explain their analytic models, not only to regulators but often to consumers. In other words, although Machine Learning techniques enable better prediction with a higher level of accuracy, they are adopted less frequently in commercial banking than in other industries, especially for scoring purposes. This is because Machine Learning techniques are often considered a black box and fail to provide information on why a certain risk score is given to a customer. To bridge this gap between the explainability and the performance of Machine Learning techniques, a Hybrid Model is developed at Dun and Bradstreet that blends Machine Learning algorithms with traditional approaches such as scorecards. The Hybrid Model maximizes the efficiency of traditional scorecards by merging their practical benefits, such as explainability and the ability to input domain knowledge, with the deep insights of Machine Learning techniques, which can uncover patterns scorecard approaches cannot. First, through the development of Machine Learning models, engineered features, latent variables, and feature interactions that demonstrate high information value in predicting customer risk are identified. These features are then employed to introduce the observed non-linear relationships between the explanatory and dependent variables into traditional scorecards. Moreover, instead of directly computing the Weight of Evidence (WoE) from good and bad data points, the Hybrid Model matches the score distribution generated by a Machine Learning algorithm, which yields an estimate of the WoE for each bin. This capability helps to build powerful scorecards from sparse cases, which cannot be achieved with traditional approaches. The proposed Hybrid Model is tested on different portfolios where a significant gap is observed between the performance of traditional scorecards and Machine Learning models. The analysis shows that the Hybrid Model can improve the performance of traditional scorecards by introducing non-linear relationships between explanatory and target variables from Machine Learning models into the scorecards. In some scenarios, the Hybrid Model is almost as predictive as the Machine Learning techniques while being as transparent as traditional scorecards. It is therefore concluded that, with the Hybrid Model, Machine Learning algorithms can be used in commercial banking without concern about the difficulty of explaining the models for regulatory purposes.
Keywords: machine learning algorithms, scorecard, commercial banking, consumer risk, feature engineering
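The Weight of Evidence that the Hybrid Model estimates has the standard scorecard definition, WoE = ln(%goods / %bads) per bin; a minimal sketch of the direct computation (the bin counts are illustrative, and the score-distribution matching step itself is Dun and Bradstreet's contribution and is not reproduced here):

```python
import numpy as np

def woe_by_bin(goods, bads):
    """Weight of Evidence per bin: ln(%goods / %bads), with a 0.5 floor on
    counts so that sparse bins with zero cases do not produce infinities."""
    goods, bads = np.asarray(goods, float), np.asarray(bads, float)
    pg = np.maximum(goods, 0.5) / goods.sum()   # share of all goods in each bin
    pb = np.maximum(bads, 0.5) / bads.sum()     # share of all bads in each bin
    return np.log(pg / pb)

# counts per score bin, from best to worst -- illustrative; the Hybrid Model
# replaces these direct counts with values matched to an ML score distribution
goods = [500, 300, 150, 40, 10]
bads  = [ 10,  30,  60, 80, 120]
woe = woe_by_bin(goods, bads)
```

Positive WoE marks good-heavy bins and negative WoE bad-heavy bins; when counts are too sparse for this direct estimate, matching an ML model's score distribution gives a smoother WoE per bin, as described above.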
Procedia PDF Downloads 137
14230 Modelling the Dynamics and Optimal Control Strategies of Terrorism within the Southern Borno State Nigeria
Authors: Lubem Matthew Kwaghkor
Abstract:
Terrorism, which remains one of the largest threats faced by nations and communities around the world, including Nigeria, is the calculated use of violence to create a general climate of fear in a population in order to attain particular goals, which might be political, religious, or economic. Several terrorist groups are currently active in Nigeria, leading to attacks on both civilian and military targets. Among these groups, Boko Haram is the deadliest, operating mainly in Borno State. The southern part of Borno State in north-eastern Nigeria has been plagued by terrorism, insurgency, and conflict for several years. Understanding the dynamics of terrorism is crucial for developing effective strategies to mitigate its impact on communities and to facilitate peace-building efforts. This research aims to develop a mathematical model that captures the dynamics of terrorism within the southern part of Borno State, Nigeria, with both government and local-community intervention strategies as control measures for combating terrorism. A compartmental model of five nonlinear differential equations is formulated. The model analysis shows that a feasible solution set of the model exists and is bounded. Stability analysis shows that both the terrorism-free equilibrium and the terrorism-endemic equilibrium are asymptotically stable, so the model is well posed and meaningful. Optimal control theory will be employed to identify the most effective strategy to prevent or minimize acts of terrorism. The research outcomes are expected to contribute to enhancing security and stability in southern Borno State while providing valuable insights for policymakers, security agencies, and researchers. This is ongoing research.
Keywords: modelling, terrorism, optimal control, susceptible, non-susceptible, community intervention
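A five-compartment model of this kind can be sketched with a forward-Euler integration; the compartment names, transfer terms, and rates below are hypothetical illustrations consistent with the keywords (susceptible, non-susceptible, community intervention), not the paper's actual equations:

```python
import numpy as np

# Hypothetical compartments -- the paper's exact equations are not given in
# the abstract. S: susceptible, N: non-susceptible (shielded by community
# intervention), T: active terrorists, C: captured, R: reformed.
def simulate(days=365, dt=0.1, beta=2e-4, u=0.01, c=0.04, r=0.02):
    y = np.array([900.0, 80.0, 15.0, 3.0, 2.0])   # initial S, N, T, C, R
    for _ in range(int(days / dt)):
        S, N, T, C, R = y
        recruit = beta * S * T                    # radicalisation by contact
        dy = np.array([-recruit - u * S,          # S: recruited or shielded
                       u * S,                     # N: community intervention
                       recruit - c * T,           # T: recruitment minus capture
                       c * T - r * C,             # C: government capture
                       r * C])                    # R: rehabilitation
        y = y + dt * dy
    return y

final = simulate()
```

Because every outflow from one compartment is an inflow to another, the total population is conserved, which is one way such a model's solution set stays bounded; the intervention rates u and c are the natural handles for the optimal-control step.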
Procedia PDF Downloads 26
14229 Business-to-Business Deals Based on a Co-Utile Collaboration Mechanism: Designing Trust Company of the Future
Authors: Riccardo Bonazzi, Michaël Poli, Abeba Nigussie Turi
Abstract:
This paper presents applied research on a new module for the financial administration and management industry: the Personalizable and Automated Checklists Integrator, Overseeing Legal Investigations (PACIOLI). It aims at designing the business model of the trust company of the future. By identifying the key stakeholders, we draw a general business process design for the industry. The business model focuses on disintermediating the traditional form of business through the new technological solutions of a software company based in Switzerland, thereby creating a new interactive platform. The key stakeholders of this platform are IT experts, legal experts, and the New Edge Trust Company (NATC). The mechanism we design and propose is of great importance in improving the efficiency of the financial business administration and management industry, and it also helps to foster the provision of high value-added services in the sector.
Keywords: new edge trust company, business model design, automated checklists, financial technology
Procedia PDF Downloads 376
14228 Demonstration of Land Use Changes Simulation Using Urban Climate Model
Authors: Barbara Vojvodikova, Katerina Jupova, Iva Ticha
Abstract:
Cities have always adapted their internal structure to the needs of society in the course of their historical evolution (for example, protective city walls lost their defensive function during the classicist era, became unnecessary, were demolished, and gave space to new features such as roads, museums, or parks). Today it is necessary to modify the internal structure of the city in order to minimize the impact of climate change on the environment of the population. This article discusses results of the Urban Climate model owned by VITO, obtained as part of a project under the European Union's Horizon 2020 grant agreement No 730004, Pan-European Urban Climate Services Climate-Fit city. The model was applied to changes in land use and land cover in cities in relation to urban heat islands (UHI). The task of the application was to evaluate possible land use change scenarios in connection with city requirements and ideas. Two pilot areas in the Czech Republic were selected: Ostrava and Hodonín. The paper demonstrates the application of the model to various possible future development scenarios and assesses their suitability or unsuitability depending on the resulting temperature increase. Cities preparing to reconstruct public space want to eliminate, as early as the assignment phase, proposals that would lead to increased temperature stress; if a type of design has been assessed as unsuitable, it can be restricted in the proposal phases. Therefore, especially when applying the model at the local level, at 1 m spatial resolution, it was necessary to show which types of proposals would create a significant heat island if implemented; such proposals are considered unsuitable. The model shows that a building itself can create a shaded place and thus contribute to the reduction of the UHI. If the protection of existing greenery is approached sensitively, new construction may not pose a significant problem. More massive interventions that reduce existing greenery create new heat island spaces.
Keywords: climate model, heat islands, Hodonin, land use changes, Ostrava
Procedia PDF Downloads 144
14227 Development of a 3D Model of Real Estate Properties in Fort Bonifacio, Taguig City, Philippines Using Geographic Information Systems
Authors: Lyka Selene Magnayi, Marcos Vinas, Roseanne Ramos
Abstract:
As the real estate industry continually grows in the Philippines, Geographic Information Systems (GIS) provide advantages in generating spatial databases for efficient delivery of information and services. The real estate sector not only provides qualitative data about real estate properties but also utilizes various spatial aspects of these properties for applications such as hazard mapping and assessment. In this study, a three-dimensional (3D) model and a spatial database of real estate properties in Fort Bonifacio, Taguig City are developed using GIS and SketchUp. Spatial datasets include political boundaries, buildings, the road network, a digital terrain model (DTM) derived from an Interferometric Synthetic Aperture Radar (IFSAR) image, Google Earth satellite imagery, and hazard maps. Multiple model layers were created based on property listings by a partner real estate company, including existing and future property buildings. Actual building dimensions, building facades, and building floorplans are incorporated in these 3D models for geovisualization. Hazard model layers are determined through spatial overlays, and different hazard scenarios are also presented in the models. Animated maps and walkthrough videos were created for company presentation and evaluation. Model evaluation was conducted through client surveys scoring the appropriateness, information content, and design of the 3D models. Survey results show very satisfactory ratings, with the highest average evaluation score being 9.21 out of 10. The output maps and videos obtained passing rates based on the criteria and standards set by the intended users of the partner real estate company. The methodologies presented in this study were found useful and offer remarkable advantages to the real estate industry. This work may be extended to automated mapping and the creation of online spatial databases for better storage of and access to real property listings, and to an interactive platform using web-based GIS.
Keywords: geovisualization, geographic information systems, GIS, real estate, spatial database, three-dimensional model
Procedia PDF Downloads 160
14226 Quasistationary States and Mean Field Model
Authors: Sergio Curilef, Boris Atenas
Abstract:
Systems with long-range interactions are very common in nature. They are observed from the atomic scale to the astronomical scale and exhibit anomalies such as inequivalence of ensembles, negative heat capacity, ergodicity breaking, nonequilibrium phase transitions, quasistationary states, and anomalous diffusion. These anomalies are exacerbated when special initial conditions are imposed; in particular, we use the so-called water-bag initial conditions, which stand for a uniform distribution. Several theoretical and practical implications are discussed here. A potential energy inspired by dipole-dipole interactions is proposed to build the dipole-type Hamiltonian mean-field model. As expected, the dynamics, obtained through molecular dynamics techniques, is novel and representative of the behavior of systems with long-range interactions. Two plateaus emerge sequentially before equilibrium is reached, corresponding to two different quasistationary states. The first plateau is a type of quasistationary state whose lifetime depends on a power law of N, and the second plateau appears to be a true quasistationary state as reported in the literature. The general behavior of the model according to its dynamics and thermodynamics is described. Using numerical simulation, we characterize the mean kinetic energy, the caloric curve, and the diffusion law through the mean square displacement. The present challenge is to characterize the distributions in phase space. Certainly, the equilibrium state is well characterized by the Gaussian distribution, but quasistationary states in general depart from any Gaussian function.
Keywords: dipole-type interactions, dynamics and thermodynamics, mean field model, quasistationary states
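The diffusion-law characterization via the mean square displacement can be sketched in a few lines; the random-walk trajectory below is only a stand-in for the model's molecular dynamics output, used to show how MSD grows with the lag time:

```python
import random

def mean_square_displacement(traj, lag):
    """Average squared displacement over all time origins (1D)."""
    n = len(traj) - lag
    return sum((traj[t + lag] - traj[t]) ** 2 for t in range(n)) / n

# stand-in trajectory: a 1D random walk with unit-variance steps,
# for which MSD(lag) is approximately equal to lag (normal diffusion)
random.seed(0)
x, pos = [0.0], 0.0
for _ in range(20_000):
    pos += random.gauss(0.0, 1.0)
    x.append(pos)
```

Fitting MSD(lag) against lag on a log-log scale would distinguish normal diffusion (slope 1) from the anomalous diffusion mentioned in the abstract.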
Procedia PDF Downloads 212
14225 The Choosing the Right Projects With Multi-Criteria Decision Making to Ensure the Sustainability of the Projects
Authors: Saniye Çeşmecioğlu
Abstract:
The importance of project sustainability and success has increased significantly as external environmental factors have proliferated and reduced project resilience. The primary way to forestall the failure of projects is to ensure their long-term viability through strategic project selection, that is, by creating a judicious project selection framework within the organization. Decision-makers require precise decision contexts (models) that conform to the company's business objectives and sustainability expectations during the project selection process. Establishing a rational model for project selection enables organizations to create a distinctive and objective framework for the selection process. Additionally, for the optimal implementation of this decision-making model, it is crucial to establish a Project Management Office (PMO) team and a Project Steering Committee within the organizational structure to oversee the framework. These teams enable updating project selection criteria and weights in response to changing conditions, ensure alignment with the company's business goals, and facilitate the selection of potentially viable projects. This paper presents a multi-criteria decision model for selecting project sustainability and project success criteria that ensure timely project completion and retention. The model was developed using MACBETH (Measuring Attractiveness by a Categorical Based Evaluation Technique) and was based on broadcasting companies' expectations. The results of this study provide a model that supports objectively selecting the appropriate projects, using project selection and sustainability criteria together with their respective weights. Additionally, the study offers suggestions that may prove helpful in future endeavors.
Keywords: project portfolio management, project selection, multi-criteria decision making, project sustainability and success criteria, MACBETH
Procedia PDF Downloads 64
14224 Teaching, Learning and Evaluation Enhancement of Information Communication Technology Education in Schools through Pedagogical and E-Learning Techniques in the Sri Lankan Context
Authors: M. G. N. A. S. Fernando
Abstract:
This study uses a research framework to improve the quality of ICT education and the Teaching, Learning and Assessment/Evaluation (TLA/TLE) process. It utilizes existing resources while improving the methodologies, pedagogical techniques, and e-learning approaches used in the secondary schools of Sri Lanka. The study was carried out in two phases. Phase I investigated the factors that affect the quality of ICT education. Based on the key factors from Phase I, Phase II focused on the design of an experimental application model with six activity levels. Each level in the activity model covers one or more levels of the revised Bloom's taxonomy. To further enhance the activity levels, other pedagogical techniques (activity-based learning, e-learning techniques, problem-solving activities, peer discussions, etc.) were incorporated into each level of the activity model as appropriate. The application model was validated by a panel of teachers, including a domain expert, and was also tested in the school environment. The validity of performance was established by testing six hypotheses, among other methods. The analysis shows that student performance on problem-solving activities increased by 19.5% due to the different treatment levels used. Compared to the existing process, it was also shown that the embedded techniques (a mixture of traditional and modern pedagogical methods and their applications) are more effective for the skills development of teachers and students.
Keywords: activity models, Bloom's taxonomy, ICT education, pedagogies
Procedia PDF Downloads 165
14223 Statistical and Analytical Comparison of GIS Overlay Modelings: An Appraisal on Groundwater Prospecting in Precambrian Metamorphics
Authors: Tapas Acharya, Monalisa Mitra
Abstract:
Overlay modeling is the most widely used conventional analysis for spatial decision support systems. It requires a set of themes with weights computed in various ways, which together produce an input for further integrated analysis. Despite its popularity, overlay modeling gives inconsistent and erroneous results for similar inputs when processed with different GIS overlay techniques. This study compares and analyses the differences in the outputs of different overlay methods on a GIS platform, using the same set of themes for the Precambrian metamorphics, with the aim of groundwater prospecting in Precambrian metamorphic rocks. The objective is to identify the most suitable overlay method for groundwater prospecting in older Precambrian metamorphics. Seven input thematic layers, namely slope, Digital Elevation Model (DEM), soil thickness, lineament intersection density, average groundwater table fluctuation, stream density, and lithology, were used in the fuzzy overlay, weighted overlay, and weighted sum overlay methods to yield suitable groundwater prospective zones. Spatial concurrence analysis with high-yielding wells of the study area and comparative statistical analysis of the outputs of the various overlay models using RStudio reveal that the weighted overlay model is the most efficient GIS overlay model for delineating groundwater prospecting zones in Precambrian metamorphic rocks.
Keywords: fuzzy overlay, GIS overlay model, groundwater prospecting, Precambrian metamorphics, weighted overlay, weighted sum overlay
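The divergence between overlay rules that the study investigates can be seen even on a toy example. The two themes, cell scores, and weights below are illustrative, not the study's seven layers; the point is that the three rules return different values for identical inputs:

```python
# Three overlay rules applied to the same toy 2x2 rasters (values
# already normalised to 0..1); themes and weights are illustrative.
themes = {
    "slope":     [[0.2, 0.8], [0.5, 0.9]],
    "lineament": [[0.6, 0.4], [0.7, 0.3]],
}
weights = {"slope": 0.6, "lineament": 0.4}

def weighted_overlay(themes, weights, scale=9):
    # reclassify each cell to an integer class, weight, sum, round
    out = [[0.0, 0.0], [0.0, 0.0]]
    for name, grid in themes.items():
        for i in range(2):
            for j in range(2):
                cls = round(grid[i][j] * scale)   # discretise first
                out[i][j] += weights[name] * cls
    return [[round(v) for v in row] for row in out]

def weighted_sum(themes, weights):
    # raw values times weights, no reclassification or rounding
    return [[sum(weights[n] * themes[n][i][j] for n in themes)
             for j in range(2)] for i in range(2)]

def fuzzy_and(themes):
    # fuzzy AND overlay: minimum membership across themes
    return [[min(themes[n][i][j] for n in themes)
             for j in range(2)] for i in range(2)]
```

For the top-left cell, weighted overlay yields class 3, weighted sum yields 0.36, and fuzzy AND yields 0.2: three different suitability surfaces from one input set, which is exactly why the study validates the methods against high-yielding wells.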
Procedia PDF Downloads 129
14222 Corpus-Based Model of Key Concepts Selection for the Master English Language Course "Government Relations"
Authors: Elena Pozdnyakova
Abstract:
“Government Relations” (GR) is a field of knowledge presently taught at the majority of universities around the globe. English, as the default language, can serve as the language of teaching, since the issues discussed are both global and national in character. However, the key concepts of this field and their word representations in English often do not coincide with those in other languages. International master's degree students abroad, as well as students taught the course in English at their national universities, face difficulties connected with correctly conceptualizing GR terminology in the British and American academic traditions. The study was carried out during the elaboration of the GR English language course (pilot research: 2013-2015) at Moscow State Institute of Foreign Relations (University), Russian Federation. Within this period, English language instructors designed and elaborated the three-semester GR course. Methodologically, the course design was based on an elaboration model with special focus on the conceptual elaboration sequence and the theoretical elaboration sequence. The course designers faced difficulties in concept selection and in the theoretical elaboration sequence. To improve the results and eliminate the problems with concept selection, a new, corpus-based approach was worked out. The computer-based tool WordSmith 6.0 was used to build a model of key concept selection. The corpus of GR English texts consisted of 1 million words (the study corpus). The approach was based on measuring effect size, i.e., the percent difference between the frequency of a word in the study corpus and its frequency in the reference corpus. The results obtained show significant improvement in the process of concept selection. The corpus-based model also facilitated the theoretical elaboration of teaching materials.
Keywords: corpus-based study, English as the default language, key concepts, measuring effect size, model of key concept selection
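The percent-difference effect size described above compares normalised word frequencies across corpora; a minimal sketch follows, in which the word counts and corpus sizes are assumed values for illustration, not figures from the study:

```python
def percent_diff(count_study, size_study, count_ref, size_ref):
    """Effect size as the percent difference of normalised
    frequencies (per million words) between a study corpus and
    a reference corpus."""
    f_study = count_study / size_study * 1_000_000
    f_ref = count_ref / size_ref * 1_000_000
    return (f_study - f_ref) / f_ref * 100

# assumed counts: 450 hits in a 1M-word GR study corpus versus
# 120 hits in a 10M-word general reference corpus
effect = percent_diff(450, 1_000_000, 120, 10_000_000)
```

A large positive value flags a word as disproportionately frequent in the study corpus, making it a candidate key concept for the course.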
Procedia PDF Downloads 307
14221 Soil Loss Assessment at Steep Slope: A Case Study at the Guthrie Corridor Expressway, Selangor, Malaysia
Authors: Rabiul Islam
Abstract:
The study assessed soil erosion at plot scale. The Universal Soil Loss Equation (USLE) erosion model and Geographic Information System (GIS) techniques were applied to eight plots in the Guthrie Corridor Expressway, Selangor, Malaysia. The USLE model estimates average soil loss by integrating several factors: the rainfall erosivity factor (R), the soil erodibility factor (K), the slope length and steepness factor (LS), the vegetation cover factor (C), and the conservation practice factor (P). Results show that four plots, namely NLDNM, NDNM, PLDM, and NDM, have very low rates of soil loss, with averages of 0.059, 0.106, 0.386, and 0.372 ton/ha/year, respectively. The NBNM, PLDNM, and NLDM plots had relatively higher rates of soil loss, averaging 0.678, 0.757, and 0.493 ton/ha/year. The NBM plot had the highest rate of soil loss, ranging from 0.842 ton/ha/year up to a maximum of 16.466 ton/ha/year. The NBM plot was located on bare land; hence its C factor (C = 0.15) was the highest.
Keywords: USLE model, GIS, Guthrie Corridor Expressway (GCE), Malaysia
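The USLE estimate is the simple product of the five factors listed above; a minimal sketch follows, where only C = 0.15 (the bare-land NBM plot) comes from the abstract and the remaining factor values are illustrative assumptions:

```python
def usle_soil_loss(R, K, LS, C, P):
    """USLE: A = R * K * LS * C * P, the average annual soil loss
    (ton/ha/year) as the product of rainfall erosivity, soil
    erodibility, slope length/steepness, cover, and practice."""
    return R * K * LS * C * P

# illustrative factor values; only C = 0.15 (the bare NBM plot)
# is reported in the abstract
A = usle_soil_loss(R=1000.0, K=0.05, LS=1.2, C=0.15, P=1.0)
```

Because the model is multiplicative, halving any single factor (e.g. improving cover so C drops) halves the predicted loss, which is why the bare NBM plot dominates the results.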
Procedia PDF Downloads 529
14220 A-Score, Distress Prediction Model with Earning Response during the Financial Crisis: Evidence from Emerging Market
Authors: Sumaira Ashraf, Elisabete G.S. Félix, Zélia Serrasqueiro
Abstract:
Traditional financial distress prediction models have performed well in predicting bankrupt and insolvent firms in developed markets. Previous studies focused in particular on the predictability of financial distress, financial failure, and bankruptcy of firms. This paper contributes to the literature by extending the definition of financial distress to include early warning signs related to quotation of face value, dividend/bonus declaration, annual general meetings, and listing fees. The study used five well-known distress prediction models to test whether they can predict early warning signs of financial distress. Results showed that the predictive ability of the models varies over time and decreases specifically for the sample with early warning signs of financial distress. Furthermore, the study examined differences in the predictive ability of the models with respect to the financial crisis. The results show that the predictive ability of the traditional financial distress prediction models decreases for firms with early warning signs of financial distress and during times of financial crisis. The study developed a new model comprising the significant variables from the five models and one new variable, earning response. This new model outperforms the old distress prediction models before, during, and after the financial crisis. It can thus be used by researchers, organizations, and other concerned parties to indicate early warning signs in emerging markets.
Keywords: financial distress, emerging market, prediction models, Z-Score, logit analysis, probit model
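Among the traditional models the keywords reference is the Z-Score; a minimal sketch of the original Altman (1968) specification follows, with the classic coefficients and conventional cut-offs, where the illustrative ratio values in the usage line are assumptions:

```python
def altman_z(wc_ta, re_ta, ebit_ta, mve_tl, sales_ta):
    """Altman (1968) Z-Score: Z = 1.2*X1 + 1.4*X2 + 3.3*X3
    + 0.6*X4 + 1.0*X5, where the Xs are working capital/TA,
    retained earnings/TA, EBIT/TA, market value of equity/total
    liabilities, and sales/TA."""
    return (1.2 * wc_ta + 1.4 * re_ta + 3.3 * ebit_ta
            + 0.6 * mve_tl + 1.0 * sales_ta)

def zone(z):
    # conventional cut-offs: distress below 1.81, safe above 2.99
    return "distress" if z < 1.81 else ("safe" if z > 2.99 else "grey")

# illustrative (assumed) ratios for a healthy firm
z = altman_z(0.2, 0.3, 0.15, 1.5, 1.1)
```

The paper's point is that such fixed-coefficient models lose accuracy on firms showing the extended early warning signs, motivating the re-estimated model with the earning response variable.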
Procedia PDF Downloads 245
14219 Financial Fraud Prediction for Russian Non-Public Firms Using Relational Data
Authors: Natalia Feruleva
Abstract:
The goal of this paper is to develop a fraud risk assessment model based on both relational and financial data and to test the impact of relationships between Russian non-public companies on the likelihood of financial fraud. Relationships here mean various linkages between companies, such as parent-subsidiary relationships and person-related relationships; these linkages may provide additional opportunities for committing fraud. Person-related relationships arise when firms share a director or when the director owns another firm. To measure the relationships, the number of companies owned by the CEO, the number of companies managed by the CEO, and the number of subsidiaries were calculated. Moreover, a dummy variable describing the existence of a parent company was included in the model. Control variables such as financial leverage and return on assets were also included because they describe motivating factors for fraud. To test the hypotheses about the influence of the chosen parameters on the likelihood of financial fraud, information was collected about person-related relationships between companies, the existence of parent companies and subsidiaries, profitability, and the level of debt. The resulting sample consists of 160 Russian non-public firms: 80 fraudsters and 80 non-fraudsters operating in 2006-2017. The dependent variable is dichotomous, taking the value 1 if the firm is engaged in financial crime and 0 otherwise. Employing a probit model, it was revealed that the number of companies owned or managed by the firm's CEO has a significant impact on the likelihood of financial fraud: the more companies are affiliated with the CEO, the higher the likelihood that the company will be involved in financial crime. The forecast accuracy of the model is about 80%. Thus, the model based on both relational and financial data gives a high level of forecast accuracy.
Keywords: financial fraud, fraud prediction, non-public companies, regression analysis, relational data
Procedia PDF Downloads 121
14218 Simulation to Detect Virtual Fractional Flow Reserve in Coronary Artery Idealized Models
Authors: Nabila Jaman, K. E. Hoque, S. Sawall, M. Ferdows
Abstract:
Coronary artery disease (CAD) is one of the most lethal cardiovascular diseases. Coronary artery stenosis and bifurcation angles closely interact in myocardial infarction. We use computer-aided design models coupled with computational hemodynamics (CHD) simulation to detect several types of coronary artery stenosis at different locations in idealized models, in order to identify the virtual fractional flow reserve (vFFR). The vFFR provides information about the severity of stenosis in the computational models. Another goal is to imitate a patient-specific computed tomography coronary angiography model when constructing our idealized models with different left anterior descending (LAD) and left circumflex (LCx) bifurcation angles. Further, we analyze whether the bifurcation angles have an impact on the creation of narrowing in the coronary arteries. The numerical simulation provides CHD parameters such as wall shear stress (WSS), velocity magnitude, and pressure gradient (PGD) that yield information on the stenosis condition in the computational domain.
Keywords: CAD, CHD, vFFR, bifurcation angles, coronary stenosis
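The vFFR itself is a simple ratio extracted from the simulated pressure field; a minimal sketch follows, where the mean pressure values are assumed and the 0.80 threshold is the commonly used clinical cut-off rather than anything stated in the abstract:

```python
def virtual_ffr(p_distal, p_proximal):
    """vFFR: ratio of mean pressure distal to the stenosis to mean
    proximal (aortic) pressure, taken from simulated CHD fields."""
    return p_distal / p_proximal

# illustrative simulated mean pressures in mmHg (assumed values)
vffr = virtual_ffr(68.0, 95.0)
significant = vffr < 0.80   # common cut-off for a severe stenosis
```

A tighter stenosis drops the distal pressure, lowering the vFFR; comparing vFFR across idealized geometries is how the effect of bifurcation angle on severity would be quantified.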
Procedia PDF Downloads 158