Search results for: daily probability model
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 19541

15341 Grid Architecture Model for Smart Grid

Authors: Nick Farid, Roghoyeh Salmeh

Abstract:

The planning and operation of the power grid are becoming much more complex because of the introduction of renewable energy resources, the digitalization of the electricity industry, and the coupled pursuit of efficiency and greener energy. These changes, along with new trends, make interactions between grid users and the other stakeholders more complex. This paper focuses on the main “physical” and “logical” interactions between grid users and grid stakeholders, from both power-system-equipment and information-management standpoints, and proposes a new interoperability model for Smart Grids.

Keywords: user interface, interoperability layers, grid architecture framework, smart grid

Procedia PDF Downloads 91
15340 A Cellular Automaton Model Examining the Effects of Oxygen, Hydrogen Ions, and Lactate on Early Tumour Growth

Authors: Maymona Al-Husari, Craig Murdoch, Steven Webb

Abstract:

Some tumours are known to exhibit an extracellular pH that is more acidic than the intracellular one, creating a 'reversed pH gradient' across the cell membrane, and this has been shown to affect their invasive and metastatic potential. Tumour hypoxia also plays an important role in tumour development and has been directly linked to both tumour morphology and aggressiveness. In this paper, we present a hybrid mathematical model of intracellular pH regulation that examines the effect of oxygen and pH on tumour growth and morphology. In particular, we investigate the impact of pH regulatory mechanisms on the cellular pH gradient and tumour morphology. Analysis of the model shows that: low activity of the Na+/H+ exchanger or a high rate of anaerobic glycolysis can give rise to a 'fingering' tumour morphology; and a high activity of the lactate/H+ symporter can result in a reversed transmembrane pH gradient across a large portion of the tumour mass. The reversed pH gradient is also spatially heterogeneous within the tumour, with a normal pH gradient observed within an intermediate growth layer, that is, the layer between the proliferative inner layer and the outermost layer of the tumour.
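
The cellular-automaton mechanics described above can be illustrated with a minimal sketch (not the paper's hybrid model, which also tracks hydrogen ions and lactate): cells on a grid divide into an empty neighbour where oxygen is plentiful and die where it falls below a hypoxic threshold. The thresholds and the uniform oxygen field are placeholder assumptions.

```python
import numpy as np

def step(cells, oxygen, o2_prolif=0.5, o2_death=0.1, rng=None):
    """One CA update: occupied sites divide into a random empty von Neumann
    neighbour where oxygen is high, and die where oxygen is hypoxic."""
    if rng is None:
        rng = np.random.default_rng(0)
    new = cells.copy()
    n, m = cells.shape
    for i in range(n):
        for j in range(m):
            if not cells[i, j]:
                continue
            if oxygen[i, j] < o2_death:
                new[i, j] = False                      # necrosis
            elif oxygen[i, j] > o2_prolif:
                empty = [(i + di, j + dj)
                         for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1))
                         if 0 <= i + di < n and 0 <= j + dj < m
                         and not cells[i + di, j + dj]]
                if empty:
                    new[empty[rng.integers(len(empty))]] = True  # daughter cell
    return new

cells = np.zeros((21, 21), dtype=bool)
cells[10, 10] = True                        # seed tumour cell
oxygen = np.full((21, 21), 1.0)             # well-oxygenated field (assumed)
for _ in range(5):
    cells = step(cells, oxygen)
```

Coupling a diffusion solver for oxygen, H+, and lactate to this update rule is what makes the paper's model "hybrid".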

Keywords: acidic pH, cellular automaton, tumour growth

Procedia PDF Downloads 327
15339 Factors Influencing Prevalence of HIV/AIDS among Men Who Have Sex with Men (MSM) Aged 18-24 Years in Mtwapa Town, Kilifi County, Kenya

Authors: Oscar Maina Irungu

Abstract:

Background: Men who have sex with men (MSM) in Mtwapa Town, Kilifi County are at high risk of HIV infection. Probability sample surveys to determine HIV prevalence among MSM in Mtwapa are needed to inform prevention and care services. Methods: In 2013, a cross-sectional survey was conducted among MSM aged 18-24 years using respondent-driven sampling (RDS) in Mtwapa. Consenting MSM were tested for HIV (fingerstick rapid test). Population-based prevalence and 95% confidence intervals (CI) were estimated using the RDS Analysis Tool (RDSAT). Results: Among 274 MSM, the median age was 20 years (IQR: 19-23 years). Fifty percent of MSM reported not selling sex, while 13.2% reported sex work as their “main occupation”, and another 28.4% reported selling sex in the past two months (but not as their main occupation). Overall HIV prevalence was 19.2% (CI: 12.2-23.6%). HIV prevalence was higher among MSM who reported sex work as their main occupation (28.3%, CI: 12.1-42.3%) or selling sex in the past two months (26.6%, CI: 17.2-35.7%) than among MSM who did not sell sex (11.6%, CI: 7.0-18.1%). Conclusion: HIV prevalence among MSM was higher than among Kilifi’s general population aged 15-64 years (8.8%; 2010 KAIS) and highest among male sex workers. Health programs need to address concerns and modify services to meet the needs of diverse subgroups of MSM. We recommend continued, periodic surveillance to monitor HIV prevalence among MSM in Mtwapa, and expansion to other areas in Kenya.

Keywords: power point, Kenya, homosexuality, sex

Procedia PDF Downloads 373
15338 Antecedents and Consequences of Organizational Intelligence in an R and D Organization

Authors: Akriti Srivastava, Soumi Awasthy

Abstract:

One of the disciplines that has provoked increased interest in the importance of intelligence is the management and organization development literature. Organizational intelligence (OI) is a key enabling force underlying many vital activities and processes that dominate organizational life. Hence, the factors that lead to organizational intelligence, and the outcomes of the whole process, need to be understood together with OI itself. The focus of this research was to uncover potential antecedents and consequences of organizational intelligence; thus a non-experimental explanatory survey research design was used, that is, a design in which variables are not manipulated and samples are not randomized. Data were collected with a questionnaire from 321 scientists from different laboratories of an R & D organization, of which 304 responses were found suitable for analysis. There were 194 males (age, M = 35.03, SD = 7.63) and 110 females (age, M = 34.34, SD = 8.44). This study tested a conceptual model linking antecedent variables (leadership and organizational culture) to organizational intelligence, followed by organizational innovational capability and organizational performance. Structural equation modeling techniques were used to analyze the hypothesized model. Before that, a confirmatory factor analysis of the organizational intelligence scale was conducted, which resulted in an insignificant model; an exploratory factor analysis was then performed, which yielded six factors for the organizational intelligence scale. This structure was used throughout the study. The final analysis revealed a relatively good fit of the data to the hypothesized model with certain modifications. Leadership and organizational culture emerged as significant antecedents of organizational intelligence.
Organizational innovational capability and organizational performance emerged as consequences of organizational intelligence, but organizational intelligence did not predict organizational performance via organizational innovational capability. In addition, a significant pathway emerged between leadership and organizational performance. The model offers a fresh and comprehensive view of organizational intelligence. In this study, prior studies in the related literature were reviewed to offer a basic framework of organizational intelligence. The study should benefit organizational intelligence scholarship, given its importance in the competitive environment.

Keywords: leadership, organizational culture, organizational intelligence, organizational innovational capability

Procedia PDF Downloads 336
15337 The Effect of Post-Acute Stroke Inpatient Rehabilitation under per Diem Payment: A Pilot Study

Authors: Chung-Yuan Wang, Kai-Chun Lee, Min-Hung Wang, Yu-Ren Chen, Hung-Sheng Lin, Sen-Shan Fan

Abstract:

Taiwan National Health Insurance (NHI) was launched in 1995 and is an important social welfare policy in Taiwan. Universal coverage under NHI is assured regardless of social and economic status. To regain better self-care performance, stroke patients receive in-patient and out-patient rehabilitation. Although NHI limits the rehabilitation frequency to once per day, the cost of rehabilitation has still increased rapidly. Through intensive rehabilitation during the post-stroke golden period, stroke patients might decrease their disability and shorten the rehabilitation period. Therefore, the aim of this study was to investigate the effect of intensive post-acute stroke rehabilitation in hospital under per diem payment. This study started on 2014/03/01. Stroke patients admitted to our hospital or medical center were referred to the study, and a neurologist assessed their modified Rankin Scale (mRS). Only patients with an mRS score between 2 and 4 were included. Patients with unclear consciousness, unstable medical condition, or an unclear stroke onset date, and those unwilling to undergo 3 weeks of in-patient intensive rehabilitation, were excluded. After the physiatrist’s systemic evaluation, the subjects received intensive rehabilitation programs at a frequency of three sessions per day. Physical therapy, occupational therapy, and speech/swallowing therapy were included in the programs according to the needs of the stroke patients. Activities of daily living performance (Barthel Index) and functional balance ability (Berg Balance Scale) were used to measure the training effect. From 3/1 to 5/31, thirteen subjects (five male and eight female) were included. Seven subjects were aged below 60 and three were aged over 70. Most of the subjects (seven) received intensive post-stroke rehabilitation for three weeks.
Three subjects dropped out of the program and went home after receiving only 7, 10, and 13 days of rehabilitation, respectively. Among these 13 subjects, nine improved in activities of daily living performance (Barthel Index score) and ten improved in functional balance ability (Berg Balance Scale). In our study, intensive post-acute stroke rehabilitation did help stroke patients promote their health: not only did their functional performance improve, but their self-confidence improved as well, and their families also reported better health status. Stroke rehabilitation under per diem payment has been noted in long-term care institutions in developed countries. Over 95% of the population in Taiwan is covered by the National Health Insurance system, but there is no national long-term care insurance system. Most stroke patients in Taiwan live with their families and continue their rehabilitation programs through out-patient departments. This pilot study revealed the effect of intensive post-acute stroke rehabilitation in hospital under per diem payment. As the number of subjects and the study period were limited, further study will be needed.

Keywords: rehabilitation, post-acute stroke, per diem payment, NHI

Procedia PDF Downloads 309
15336 Recognition of Voice Commands of Mentor Robot in Noisy Environment Using Hidden Markov Model

Authors: Khenfer Koummich Fatma, Hendel Fatiha, Mesbahi Larbi

Abstract:

This paper presents an approach based on Hidden Markov Models (HMMs) using the HTK toolkit. The goal is to create a human-machine interface with a voice recognition system that allows the operator to teleoperate a mentor robot to execute specific tasks such as rotate, raise, and close. This system should take into account different levels of environmental noise. The approach has been applied to isolated words representing the robot commands, pronounced in two languages: French and Arabic. The obtained recognition rate is the same for both languages, Arabic and French, on clean words. However, there is a slight difference in favor of the French speech when Gaussian white noise is added at a Signal-to-Noise Ratio (SNR) of 30 dB: in this case, the Arabic speech recognition rate is 69%, and the French speech recognition rate is 80%. This can be explained by the phonetic context of each language when noise is added.
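
Isolated-word HMM recognition of the kind HTK performs ultimately reduces to scoring each candidate word model with the forward algorithm and picking the highest likelihood. A minimal log-domain sketch in numpy (assuming the per-frame state log-likelihoods, e.g. from Gaussian mixtures over MFCCs, are already computed; this is illustrative, not HTK's implementation):

```python
import numpy as np

def log_forward(obs_loglik, log_pi, log_A):
    """Log-domain forward algorithm: returns log P(O | word model).
    obs_loglik[t, s] = log-likelihood of frame t under state s;
    log_pi and log_A are log initial and transition probabilities."""
    alpha = log_pi + obs_loglik[0]
    for t in range(1, obs_loglik.shape[0]):
        m = alpha.max()                 # shift for a stable logsumexp
        alpha = np.log(np.exp(alpha - m) @ np.exp(log_A)) + m + obs_loglik[t]
    m = alpha.max()
    return m + np.log(np.exp(alpha - m).sum())

# Recognition would then be an argmax over the per-word models' scores:
# best_word = max(models, key=lambda w: log_forward(frames_loglik[w], *models[w]))
```

With a single state and self-transition probability 1, the score collapses to the sum of the frame log-likelihoods, which is a convenient sanity check.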

Keywords: Arabic speech recognition, Hidden Markov Model (HMM), HTK, noise, TIMIT, voice command

Procedia PDF Downloads 375
15335 Wolof Voice Response Recognition System: A Deep Learning Model for Wolof Audio Classification

Authors: Krishna Mohan Bathula, Fatou Bintou Loucoubar, FNU Kaleemunnisa, Christelle Scharff, Mark Anthony De Castro

Abstract:

Voice recognition technologies such as automatic speech recognition and text-to-speech systems for African languages can play an important role in bridging the Artificial Intelligence digital divide in Africa, contributing to the establishment of a fully inclusive information society. This paper proposes a Deep Learning model that classifies user responses as inputs for an interactive voice response system. A dataset of the Wolof language words ‘yes’ and ‘no’ was collected as audio recordings. A two-stage data augmentation approach was adopted to enlarge the dataset to the size required by the deep neural network. Data preprocessing and feature engineering with Mel-Frequency Cepstral Coefficients were implemented. Convolutional Neural Networks (CNNs) have proven to be very powerful in image classification and are promising for audio processing when sounds are transformed into spectra. To perform voice response classification, the recordings were transformed into sound-frequency feature spectra, and image classification methodology was then applied using a deep CNN model. The inference model of this trained and reusable Wolof voice response recognition system can be integrated with many applications on both web and mobile platforms.
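
The feature-engineering step, turning a recording into the spectral "image" a CNN can classify, can be sketched as a plain-numpy log-mel spectrogram (parameter values such as the 16 kHz rate, 512-point FFT, and 40 mel bands are illustrative assumptions, not taken from the paper; MFCCs would add a DCT on top of this):

```python
import numpy as np

def hz_to_mel(f):
    return 2595.0 * np.log10(1.0 + f / 700.0)

def mel_to_hz(m):
    return 700.0 * (10.0 ** (m / 2595.0) - 1.0)

def log_mel_spectrogram(signal, sr=16000, n_fft=512, hop=160, n_mels=40):
    """Frame the signal, take magnitude FFTs, and pool them through a
    triangular mel filterbank -- the 2-D 'image' fed to the CNN."""
    frames = np.lib.stride_tricks.sliding_window_view(signal, n_fft)[::hop]
    spec = np.abs(np.fft.rfft(frames * np.hanning(n_fft), axis=1)) ** 2
    # triangular filters spaced evenly on the mel scale
    mels = np.linspace(hz_to_mel(0.0), hz_to_mel(sr / 2), n_mels + 2)
    bins = np.floor((n_fft + 1) * mel_to_hz(mels) / sr).astype(int)
    fb = np.zeros((n_mels, n_fft // 2 + 1))
    for i in range(n_mels):
        l, c, r = bins[i], bins[i + 1], bins[i + 2]
        fb[i, l:c] = (np.arange(l, c) - l) / max(c - l, 1)
        fb[i, c:r] = (r - np.arange(c, r)) / max(r - c, 1)
    return np.log(spec @ fb.T + 1e-10)   # epsilon avoids log(0)
```

Stacking these (frames × mel-bands) arrays per recording yields the input tensor for a standard image-style CNN.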

Keywords: automatic speech recognition, interactive voice response, voice response recognition, wolof word classification

Procedia PDF Downloads 108
15334 Order Fulfilment Strategy in E-Commerce Warehouse Based on Simulation: Business Customers Case

Authors: Aurelija Burinskiene

Abstract:

This paper presents a study of an e-commerce warehouse, aiming to improve order fulfillment activity by identifying the strategy with the best performance. A simulation model was proposed to reach this target. The model enables various scenario tests in an e-commerce warehouse, allowing the best order fulfillment strategy to be identified. Using the simulation model, the authors investigated customers’ orders representing online purchases over one month. Experiments were designed to evaluate various order picking methods applicable to the fulfillment of customers’ orders. The research uses cost component analysis and helps to identify the best possible order picking method, improving the overall performance of the e-commerce warehouse and the fulfillment service to customers. The results show that the order batching strategy is the most applicable because it brings distance savings of around 6.7 percent, which can be improved to 8.34 percent by an assortment clustering action. The method is therefore recommended for future e-commerce warehouse operations.
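
A toy version of the batching-versus-discrete-picking comparison, using a deliberately simple out-and-back distance proxy and made-up pick locations (the paper's simulation model is far richer, with cost components and real one-month order data):

```python
def trip_distance(picks):
    """Single-aisle abstraction: walk to the farthest pick location
    and return to the depot at position 0."""
    return 2 * max(picks)

orders = [[12, 40], [7, 33], [55, 3], [18, 22]]   # hypothetical pick positions

# Discrete (single-order) picking: one trip per order.
discrete = sum(trip_distance(o) for o in orders)

# Order batching: two orders combined into one picking trip.
batches = [orders[0] + orders[1], orders[2] + orders[3]]
batched = sum(trip_distance(b) for b in batches)

savings = (discrete - batched) / discrete   # fraction of travel saved
```

Batching saves distance because overlapping travel to nearby locations is shared across orders; assortment clustering further co-locates items often ordered together, shortening each trip.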

Keywords: e-commerce, order, fulfilment, strategy, simulation

Procedia PDF Downloads 148
15333 Reliability-Centered Maintenance Application for the Development of Maintenance Strategy for a Cement Plant

Authors: Nabil Hameed Al-Farsi

Abstract:

This study’s main goal is to develop a model and a maintenance strategy for a cement factory, the Arabian Cement Company Rabigh Plant. The proposed work relies on the reliability-centered maintenance (RCM) approach to develop a strategy and maintenance schedule that increases the reliability of the production system components, thus ensuring continuous productivity. Cost-effective maintenance of the plant’s dependability performance is the key goal of reliability-based maintenance. The cement plant consists of 7 major process steps, so the RCM-based maintenance plan is developed in 10 corresponding steps, from selecting units and data to performing and updating the model. The calciner unit was chosen as the processing unit for this case analysis; for validation, the model used maintenance history data obtained from the maintenance department of Travancore Titanium Products Ltd (TTP). After applying the proposed model, the maintenance simulation results justified reconsidering the plant's existing scheduled maintenance policy. The results indicate the need for preventive maintenance for all Class A criticality equipment instead of planned maintenance, and breakdown maintenance for all other equipment depending on its criticality and an FMEA report. Consequently, the additional cost of preventive maintenance would be offset by the cost savings from breakdown maintenance for the remaining equipment.
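
The FMEA-driven criticality split behind the Class A recommendation is typically based on Risk Priority Numbers; a sketch with hypothetical failure modes and scores (none of these names, scores, or the 150 threshold come from the study):

```python
def rpn(severity, occurrence, detection):
    """Risk Priority Number: each FMEA factor is scored on a 1-10 scale."""
    return severity * occurrence * detection

# Hypothetical failure modes for a calciner unit (illustrative scores).
failure_modes = {
    "kiln drive gearbox wear":  (9, 6, 5),
    "fan bearing seizure":      (8, 4, 3),
    "duct refractory spalling": (5, 3, 4),
}

ranked = sorted(failure_modes.items(), key=lambda kv: rpn(*kv[1]), reverse=True)

# Equipment in the top criticality band (here, RPN >= 150 as an assumed cut)
# would get preventive maintenance; the rest stays on breakdown maintenance.
class_a = [name for name, s in failure_modes.items() if rpn(*s) >= 150]
```

In practice the criticality classes and thresholds are set by the plant's own FMEA workshop, not by a fixed numeric cut.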

Keywords: engineering, reliability, strategy, maintenance, failure modes, effects and criticality analysis (FMEA)

Procedia PDF Downloads 166
15332 Computational Fluid Dynamics Simulation and Comparison of Flow through Mechanical Heart Valve Using Newtonian and Non-Newtonian Fluid

Authors: D. Šedivý, S. Fialová

Abstract:

The main purpose of this study is to show the differences between numerical solutions of the flow through an artificial heart valve using Newtonian and non-Newtonian fluids. The simulation was carried out with a commercial computational fluid dynamics (CFD) package based on the finite-volume method. An aortic bileaflet heart valve (Sorin Bicarbon) was used as the pattern for the model of a real heart valve replacement. Computed tomography (CT) was used to obtain the accurate parameters of the valve. Data from CT were transferred into a commercial 3D design tool, where the model for CFD was built. The Carreau rheology model was applied for the non-Newtonian fluid. Physiological data of the cardiac cycle were used as boundary conditions. The outputs were the leaflet excursion from opening to closure and the fluid dynamics through the valve. This study also includes experimental measurement of pressure fields in the vicinity of the valve to verify the numerical outputs. The results show a favorable comparison between the computational solutions of flow through the mechanical heart valve using Newtonian and non-Newtonian fluids.
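
The Carreau model used for the non-Newtonian fluid gives an effective viscosity that decreases smoothly between two Newtonian plateaus as shear rate grows. A sketch with commonly quoted Carreau blood-fit parameters (assumed here; the paper's actual coefficients are not stated in the abstract):

```python
def carreau_viscosity(shear_rate, mu0=0.056, mu_inf=0.00345, lam=3.313, n=0.3568):
    """Carreau model: eta = mu_inf + (mu0 - mu_inf) * (1 + (lam*g)^2)^((n-1)/2).
    mu0/mu_inf are the zero/infinite-shear viscosities (Pa.s), lam a time
    constant (s), n the power-law index; values are assumed blood-like fits."""
    return mu_inf + (mu0 - mu_inf) * (1.0 + (lam * shear_rate) ** 2) ** ((n - 1.0) / 2.0)
```

At low shear the fluid behaves like a more viscous Newtonian fluid (mu0), at high shear like a thinner one (mu_inf); a constant-viscosity Newtonian run simply fixes eta between these limits.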

Keywords: computational modeling, dynamic mesh, mechanical heart valve, non-Newtonian fluid

Procedia PDF Downloads 382
15331 Modeling of the Random Impingement Erosion Due to the Impact of Solid Particles

Authors: Siamack A. Shirazi, Farzin Darihaki

Abstract:

Solid particles can be found in many multiphase flows, including transport pipelines and pipe fittings. Such particles interact with the pipe material and cause erosion, which threatens the integrity of the system. Therefore, predicting the erosion rate is an important factor in the design and monitoring of such systems. Mechanistic models can provide reliable predictions for many conditions while demanding only relatively low computational cost. They utilize a representative particle trajectory to predict the impact characteristics of the majority of the particle impacts that cause the maximum erosion rate in the domain. The erosion caused by particle impacts is due not only to direct impacts but also to random impingements. In the present study, an alternative model is introduced to describe the erosion due to random impingement of particles. The model provides a realistic trend for erosion with changes in the particle size and particle Stokes number. It is examined against experimental data and CFD simulation results and shows better agreement with the data in comparison to the available models in the literature.
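
The particle Stokes number that the erosion trend depends on is the ratio of the particle response time to a characteristic flow time; a small sketch with illustrative sand-in-water values (the paper's geometry and fluid properties are not given in the abstract):

```python
def stokes_number(rho_p, d_p, velocity, mu_f, length):
    """Particle Stokes number St = rho_p * d_p**2 * V / (18 * mu_f * L).
    rho_p: particle density (kg/m^3), d_p: diameter (m), velocity: flow
    speed (m/s), mu_f: fluid viscosity (Pa.s), length: flow scale (m)."""
    return rho_p * d_p ** 2 * velocity / (18.0 * mu_f * length)

# 250-micron sand in water approaching a 5 cm fitting (assumed values)
st = stokes_number(rho_p=2650.0, d_p=250e-6, velocity=3.0, mu_f=1e-3, length=0.05)
# St << 1: particles follow the streamlines (few direct impacts);
# St >> 1: particles travel ballistically and strike the wall directly.
```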

Keywords: erosion, mechanistic modeling, particles, multiphase flow, gas-liquid-solid

Procedia PDF Downloads 166
15330 Mathematical Modeling for Continuous Reactive Extrusion of Poly Lactic Acid Formation by Ring Opening Polymerization Considering Metal/Organic Catalyst and Alternative Energies

Authors: Satya P. Dubey, Hrushikesh A Abhyankar, Veronica Marchante, James L. Brighton, Björn Bergmann

Abstract:

Aims: To develop a mathematical model that simulates the ROP of PLA, taking into account the effect of alternative energies, to be implemented in a continuous reactive extrusion production process of PLA. Introduction: The production of large amounts of waste is one of the major challenges at the present time, and polymers represent 70% of global waste. PLA has emerged as a promising polymer, as it is a compostable, biodegradable thermoplastic made from renewable sources. However, the main limitation for the application of PLA is the presence of traces of toxic metal catalyst in the final product. Thus, a safe and efficient production process needs to be developed to avoid the potential hazards and toxicity. It has been found that alternative energy sources (laser, ultrasound, microwaves) could be a prominent option to facilitate the ROP of PLA via continuous reactive extrusion. This process may allow complete removal of the metal catalysts and facilitate less active organic catalysts. Methodology: Initial investigations were performed using the data available in the literature for the reaction mechanism of the ROP of PLA based on the conventional metal catalyst stannous octoate. A mathematical model was developed by considering significant parameters such as different initial concentration ratios of catalyst, co-catalyst, and impurity. The effects of temperature variation and alternative energies were implemented in the model. Results: The mathematical model was validated using data from the literature as well as actual experiments. Validation of the model including alternative energies is in progress, based on experimental data from partners of the InnoREX project consortium. Conclusion: The model reproduces the polymerisation reaction accurately when applying alternative energy. Alternative energies have a strong positive effect, increasing the conversion and molecular weight of the PLA.
This model could be a very useful tool to complement the Ludovic® software in predicting the large-scale production process using reactive extrusion.
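
The core of such a ROP kinetic model is a propagation rate equation of the form dM/dt = -kp·C·M with an Arrhenius temperature dependence for kp. A deliberately simplified sketch (first-order in monomer only; A, Ea, and the catalyst concentration C are placeholder values, not the actual constants of the stannous octoate mechanism, which also involves co-catalyst and impurity balances):

```python
import math

def rop_conversion(T, hours=1.0, A=1e7, Ea=70e3, C=1e-3, dt=1.0):
    """Monomer conversion from dM/dt = -kp*C*M with kp = A*exp(-Ea/(R*T)).
    T in K, Ea in J/mol, C in mol/L; all kinetic values are assumed."""
    R = 8.314
    kp = A * math.exp(-Ea / (R * T))   # Arrhenius propagation constant
    M = 1.0                            # normalized monomer concentration
    for _ in range(int(hours * 3600 / dt)):
        M -= kp * C * M * dt           # explicit Euler step
    return 1.0 - M                     # conversion X
```

The Arrhenius factor is what makes extrusion temperature (and any extra energy input that raises the effective rate constant) dominate the predicted conversion.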

Keywords: polymer, poly-lactic acid (PLA), ring opening polymerization (ROP), metal-catalyst, bio-degradable, renewable source, alternative energy (AE)

Procedia PDF Downloads 358
15329 Additive Weibull Model Using Warranty Claims and Finite Element Fatigue Analysis

Authors: Kanchan Mondal, Dasharath Koulage, Dattatray Manerikar, Asmita Ghate

Abstract:

This paper presents an additive reliability model using warranty data and Finite Element Analysis (FEA) data. Warranty data for any product give insight into its underlying issues, and reliability engineers often use them to build prediction models to forecast the failure rate of parts. But there is one major limitation in using warranty data for prediction: warranty periods constitute only a small fraction of a product's total lifetime, and most of the time they cover only the infant-mortality and useful-life zones of a bathtub curve. Predicting with warranty data alone therefore does not generally provide results with the desired accuracy. The failure rate of a mechanical part is driven by random issues initially and by wear-out or usage-related issues at later stages of the lifetime. For better predictability of the failure rate, one needs to explore the failure-rate behavior in the wear-out zone of the bathtub curve. Due to cost and time constraints, it is not always possible to test samples to failure, but FEA fatigue analysis can provide the failure-rate behavior of a part well beyond the warranty period, faster and at lower cost. In this work, the authors propose an Additive Weibull Model that uses both warranty and FEA fatigue analysis data to predict failure rates. It involves modeling two data sets for a part: one with existing warranty claims and the other with fatigue life data. Hazard-rate-based Weibull estimation is used to model the warranty data, whereas S-N-curve-based Weibull parameter estimation is used for the FEA data. The two Weibull models' parameters are estimated separately and combined to form the proposed Additive Weibull Model for prediction.
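
The combined model's hazard rate is simply the sum of the two independently fitted Weibull hazards; a sketch with illustrative shape/scale parameters (β < 1 for the warranty-fitted early-life term, β > 1 for the FEA-fitted wear-out term; the numbers are assumptions, not the paper's fits):

```python
def weibull_hazard(t, beta, eta):
    """Weibull hazard rate h(t) = (beta/eta) * (t/eta)**(beta - 1)."""
    return (beta / eta) * (t / eta) ** (beta - 1.0)

def additive_hazard(t, infant=(0.8, 2.0), wearout=(3.5, 10.0)):
    """Additive Weibull model: warranty-fitted early-life hazard plus an
    FEA-fatigue-fitted wear-out hazard; (beta, eta) pairs are illustrative."""
    b1, e1 = infant
    b2, e2 = wearout
    return weibull_hazard(t, b1, e1) + weibull_hazard(t, b2, e2)
```

With a decreasing early-life term and an increasing wear-out term, the sum traces the familiar bathtub curve, which is exactly what neither data set can produce on its own.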

Keywords: bathtub curve, fatigue, FEA, reliability, warranty, Weibull

Procedia PDF Downloads 68
15328 Heat and Mass Transfer Modelling of Industrial Sludge Drying at Different Pressures and Temperatures

Authors: L. Al Ahmad, C. Latrille, D. Hainos, D. Blanc, M. Clausse

Abstract:

A two-dimensional finite-volume axisymmetric model is developed to predict the simultaneous heat and mass transfers during the drying of industrial sludge. The simulations were run using COMSOL-Multiphysics 3.5a. The input parameters of the numerical model were acquired from a preliminary experimental work. The results permit the establishment of correlations describing the evolution of the various parameters as a function of the drying temperature and the sludge water content. The selection and coupling of the equations are validated based on the drying kinetics acquired experimentally at a temperature range of 45-65 °C and an absolute pressure range of 200-1000 mbar. The model, incorporating the heat and mass transfer mechanisms at different operating conditions, yields simulated values of temperature and water content. Simulated results agree with the experimental values only at the first and last drying stages, where sludge shrinkage is insignificant. Simulated and experimental results show that sludge drying is favored at high temperatures and low pressure. As experimentally observed, the drying time is reduced by 68% for drying at 65 °C compared to 45 °C under 1 atm. At 65 °C, a 200-mbar absolute pressure vacuum leads to an additional reduction in drying time estimated at 61%. However, the drying rate is underestimated in the intermediate stage. This underestimation could be improved in the model by considering the shrinkage phenomena that occur during sludge drying.

Keywords: industrial sludge drying, heat transfer, mass transfer, mathematical modelling

Procedia PDF Downloads 127
15327 Regionalization of IDF Curves with L-Moments for Storm Events

Authors: Noratiqah Mohd Ariff, Abdul Aziz Jemain, Mohd Aftar Abu Bakar

Abstract:

The construction of Intensity-Duration-Frequency (IDF) curves is one of the most common and useful tools for designing hydraulic structures and providing a mathematical relationship between rainfall characteristics. IDF curves, especially those in Peninsular Malaysia, are often built using moving windows of rainfall. However, these windows do not represent actual rainfall events, since the duration of rainfall is usually prefixed. Hence, instead of using moving windows, this study aims to find regionalized distributions for IDF curves of extreme rainfall based on storm events. A homogeneity test is performed on the annual maxima of storm intensities to identify homogeneous regions of storms in Peninsular Malaysia. The L-moment method is then used to regionalize the Generalized Extreme Value (GEV) distribution of these annual maxima, and IDF curves are subsequently constructed using the regional distributions. The differences between the IDF curves obtained and those found using at-site GEV distributions are assessed through the coefficient of variation of the root mean square error, the mean percentage difference, and the coefficient of determination. The small differences imply that the construction of IDF curves can be simplified by finding a general probability distribution for each region. This will also help in constructing IDF curves for sites with no rainfall station.
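
The regionalization step rests on sample L-moments, which are computed from probability-weighted moments of the ordered data; a compact numpy sketch of the first three (l1 is the L-location, l2 the L-scale, t3 the L-skewness ratio used to fit the GEV shape):

```python
import numpy as np

def sample_l_moments(x):
    """First three sample L-moments (l1, l2, t3) via unbiased
    probability-weighted moments b0, b1, b2 of the sorted sample."""
    x = np.sort(np.asarray(x, dtype=float))
    n = len(x)
    i = np.arange(1, n + 1)
    b0 = x.mean()
    b1 = np.sum((i - 1) / (n - 1) * x) / n
    b2 = np.sum((i - 1) * (i - 2) / ((n - 1) * (n - 2)) * x) / n
    l1 = b0
    l2 = 2 * b1 - b0
    l3 = 6 * b2 - 6 * b1 + b0
    return l1, l2, l3 / l2
```

In a regional analysis, the at-site L-moment ratios are averaged (weighted by record length) across the homogeneous region before the GEV parameters are solved from them.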

Keywords: IDF curves, L-moments, regionalization, storm events

Procedia PDF Downloads 523
15326 Evaluation of Toxic Elements in Thai Rice Samples

Authors: W. Srinuttrakul, V. Permnamtip

Abstract:

Toxic elements in rice samples are of great concern in Thailand because rice (Oryza sativa) is a staple food for Thai people. Furthermore, rice is an economic crop of Thailand for export. In this study, the concentrations of arsenic (As), cadmium (Cd) and lead (Pb) in rice samples collected from paddy fields in the northern, northeastern and southern regions of Thailand were determined by inductively coupled plasma mass spectrometry. The mean concentrations of As, Cd and Pb in 55 rice samples were 0.112±0.056, 0.029±0.037 and 0.031±0.033 mg kg-1, respectively. All rice samples showed As, Cd and Pb levels below the Codex limits. The estimated daily intakes (EDIs) of As, Cd, and Pb from rice consumption were 0.026±0.013, 0.007±0.009 and 0.007±0.008 mg day-1, respectively. The percentage contributions to the Provisional Tolerable Weekly Intake (PTWI) values of As, Cd and Pb were 17.6%, 9.7%, and 2.9%, respectively, for Thai males (body weight of 69 kg), and 21.3%, 11.7% and 3.5%, respectively, for Thai females (body weight of 57 kg). The findings indicate that all studied rice samples are safe for consumption.
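
The %PTWI figures follow from a simple conversion of the estimated daily intake to a weekly per-kilogram dose. A sketch that reproduces the arsenic figure for males, assuming the former JECFA PTWI for inorganic arsenic of 0.015 mg/kg bw/week (the abstract does not state which PTWI values were used):

```python
def percent_ptwi(edi_mg_day, ptwi_mg_kg_week, body_weight_kg):
    """Weekly intake as a percentage of the Provisional Tolerable
    Weekly Intake: %PTWI = (EDI * 7) / (PTWI * bw) * 100."""
    return edi_mg_day * 7.0 / (ptwi_mg_kg_week * body_weight_kg) * 100.0

# Arsenic, Thai male (bw 69 kg), EDI 0.026 mg/day, assumed PTWI 0.015:
as_male = percent_ptwi(0.026, 0.015, 69.0)   # about 17.6%, as in the abstract
```

The same call with the female body weight of 57 kg and the Cd/Pb EDIs yields the remaining percentages, given the corresponding PTWI values.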

Keywords: arsenic, cadmium, ICP-MS, lead, rice

Procedia PDF Downloads 256
15325 Homeless Population Modeling and Trend Prediction Through Identifying Key Factors and Machine Learning

Authors: Shayla He

Abstract:

Background and Purpose: According to Chamie (2017), it is estimated that no fewer than 150 million people, or about 2 percent of the world’s population, are homeless. The homeless population in the United States has grown rapidly in the past four decades. In New York City, the sheltered homeless population increased from 12,830 in 1983 to 62,679 in 2020. Knowing the trend of the homeless population is crucial to helping states and cities make affordable housing plans and other community service plans ahead of time, to better prepare for the situation. This study utilized data from New York City, examined the key factors associated with homelessness, and developed systematic modeling to predict future homeless populations. Using the best model developed, named HP-RNN, an analysis of the homeless population change during the months of 2020 and 2021, which were impacted by the COVID-19 pandemic, was conducted. Moreover, HP-RNN was tested on data from Seattle. Methods: The methodology involves four phases in developing robust prediction methods. Phase 1 gathered and analyzed raw data on homeless populations and demographic conditions from five urban centers. Phase 2 identified the key factors that contribute to the rate of homelessness. In Phase 3, three models were built using Linear Regression, Random Forest, and a Recurrent Neural Network (RNN), respectively, to predict the future trend of the homeless population. Each model was trained and tuned on the New York City dataset, with accuracy measured by Mean Squared Error (MSE). In Phase 4, the final phase, the best model from Phase 3 was evaluated using the data from Seattle, which was not part of the model training and tuning process in Phase 3. Results: Compared to the Linear Regression based model used by HUD et al. (2019), HP-RNN significantly improved the prediction metrics, raising the Coefficient of Determination (R2) from -11.73 to 0.88 and reducing MSE by 99%.
HP-RNN was then validated on the data from Seattle, WA, which showed a peak error of 14.5% between the actual and the predicted counts. Finally, the modeling results were used to predict the trend during the COVID-19 pandemic, showing a good correlation between the actual and the predicted homeless populations, with a peak error of less than 8.6%. Conclusions and Implications: This is the first work to apply an RNN to model time series of homeless-related data. The model shows a close correlation between the actual and the predicted homeless population. There are two major implications of this result. First, the model can be used to predict the homeless population for the next several years, and the prediction can help states and cities plan ahead on affordable housing allocation and other community services. Moreover, the prediction can serve as a reference for policy makers and legislators as they seek to make changes that may impact the factors closely associated with the future homeless population trend.
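
The recurrent core of a model like HP-RNN can be sketched as an Elman cell rolled over a window of normalized monthly counts; the weights below are untrained placeholders, purely to show the data flow (HP-RNN's actual architecture and features are not detailed in the abstract):

```python
import numpy as np

def rnn_step(x, h, Wx, Wh, bh):
    """One Elman-RNN cell update: the hidden state carries the history
    of the series seen so far."""
    return np.tanh(Wx * x + Wh @ h + bh)

def forecast(series, Wx, Wh, Wo, bh, bo):
    """Roll the cell over a window of monthly counts and emit a
    one-step-ahead prediction from the final hidden state."""
    h = np.zeros(Wh.shape[0])
    for x in series:
        h = rnn_step(x, h, Wx, Wh, bh)
    return float(Wo @ h + bo)

hidden = 4
Wx = np.full(hidden, 0.1)          # input weights (untrained placeholders)
Wh = np.eye(hidden) * 0.1          # recurrent weights
Wo = np.full(hidden, 0.5)          # output weights
bh = np.zeros(hidden)
bo = 0.0
y = forecast([0.2, 0.4, 0.6], Wx, Wh, Wo, bh, bo)   # normalized counts in
```

Training would fit these weights by backpropagation through time against the historical counts, with MSE (and R2 on held-out data) as the reported metrics.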

Keywords: homeless, prediction, model, RNN

Procedia PDF Downloads 118
15324 Investigations of Bergy Bits and Ship Interactions in Extreme Waves Using Smoothed Particle Hydrodynamics

Authors: Mohammed Islam, Jungyong Wang, Dong Cheol Seo

Abstract:

The Smoothed Particle Hydrodynamics (SPH) method is a novel, meshless, Lagrangian numerical technique that has shown promise in accurately predicting the hydrodynamics of water-structure interactions in violent flow conditions. The main goal of this study is to build confidence in the versatility of an SPH-based tool, so that it can be used as a complement to physical model testing capabilities and support research needs in the performance evaluation of ships and offshore platforms exposed to extreme and harsh environments. In the current endeavor, an open-source SPH-based tool was used and validated for modeling and predicting the hydrodynamic interactions of a 6-DOF ship and bergy bits. The study involved the modeling of a modern generic drillship and simplified bergy bits in floating and towing scenarios and in regular and irregular wave conditions. The predictions were validated using model-scale measurements of a moored ship towed at multiple oblique angles approaching a floating bergy bit in waves. Overall, this study provides a thorough comparison between the model-scale measurements and the predictions from the SPH tool in terms of performance and accuracy. The SPH-predicted ship motions and forces were mostly within ±5% of the measurements. The velocity and pressure distributions and the wave characteristics over the free surface depict realistic interactions of the wave, the ship, and the bergy bit. This work identifies and presents several challenges in preparing the input file, particularly in defining the mass properties of complex geometry, the computational requirements, and the post-processing of the outcomes.
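To make the method concrete: SPH replaces the mesh with particles whose fields are smoothed by a kernel. The sketch below implements the widely used Monaghan cubic-spline kernel and a summation-density estimate; the abstract does not state which kernel or formulation the open-source tool uses, so both choices here are illustrative assumptions.

```python
import numpy as np

def cubic_spline_kernel(r, h):
    """Monaghan cubic-spline smoothing kernel in 3-D with support radius 2h.
    sigma = 1/(pi h^3) normalizes the kernel to a unit volume integral."""
    q = np.asarray(r, dtype=float) / h
    sigma = 1.0 / (np.pi * h ** 3)
    w = np.where(q < 1.0, 1.0 - 1.5 * q ** 2 + 0.75 * q ** 3,
                 np.where(q < 2.0, 0.25 * (2.0 - q) ** 3, 0.0))
    return sigma * w

def sph_density(positions, masses, h):
    """Summation density rho_i = sum_j m_j W(|r_i - r_j|, h)."""
    diff = positions[:, None, :] - positions[None, :, :]
    r = np.sqrt((diff ** 2).sum(axis=-1))
    return (masses[None, :] * cubic_spline_kernel(r, h)).sum(axis=1)
```

Pressure forces, viscosity, and the ship's 6-DOF rigid-body motion are built on the same kernel-weighted sums, which is why kernel choice and particle spacing drive both the accuracy and the computational cost the abstract mentions.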

Keywords: SPH, ship and bergy bit, hydrodynamic interactions, model validation, physical model testing

Procedia PDF Downloads 129
15323 Structure Function and Violation of Scale Invariance in NCSM: Theory and Numerical Analysis

Authors: M. R. Bekli, N. Mebarki, I. Chadou

Abstract:

In this study, we focus on the structure functions and the violation of scale invariance in the context of the non-commutative standard model (NCSM). We find that this violation appears at first order in perturbation theory, and a non-commutative version of the DGLAP evolution equation is deduced. Numerical analysis and comparison with experimental data impose a new bound on the non-commutative parameter.
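For orientation, the ordinary (commutative) DGLAP equation that the non-commutative version generalizes can be written, for a non-singlet quark distribution q(x, Q²), in its standard form (the NCSM corrections derived in the paper would modify this at first order in the non-commutativity parameter):

```latex
\frac{\partial q(x, Q^2)}{\partial \ln Q^2}
  = \frac{\alpha_s(Q^2)}{2\pi}
    \int_x^1 \frac{dz}{z}\, P_{qq}(z)\, q\!\left(\frac{x}{z}, Q^2\right)
```

Here P_{qq}(z) is the quark splitting function; the violation of scale invariance shows up as the Q² dependence of the structure functions built from q(x, Q²).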

Keywords: NCSM, structure function, DGLAP equation, standard model

Procedia PDF Downloads 610
15322 Comparing Forecasting Performances of the Bass Diffusion Model and Time Series Methods for Sales of Electric Vehicles

Authors: Andreas Gohs, Reinhold Kosfeld

Abstract:

This study should be of interest to practitioners who want to predict precisely the sales numbers of vehicles equipped with an innovative propulsion technology, as well as to researchers interested in applied (regional) time series analysis. The study is based on the numbers of new registrations of pure electric and hybrid cars. Methods of time series analysis such as ARIMA are compared with the Bass diffusion model with respect to their forecasting performance for new registrations in Germany at the national and federal state levels. In particular, it is investigated whether the additional information content of regional data increases the forecasting accuracy at the national level when predictions for the federal states are aggregated. Estimates of the Bass diffusion model's parameters are reported for Germany and its sixteen federal states. While the focus of this research is on the German market, estimation results are also provided for selected European and other countries. Concerning Bass parameters and forecasting performance, we get very different results for Germany's federal states and the member states of the European Union. This corresponds to differences across the EU member states in the adoption process of this innovative technology. Within the German market, adoption is more advanced in southern Germany and lags behind in eastern Germany, except for Berlin.
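The Bass diffusion model that the study estimates has a closed form. A small sketch follows, with purely illustrative parameters (the p, q, and market potential m below are not the paper's estimates for Germany or its federal states):

```python
import numpy as np

def bass_cumulative(t, p, q, m):
    """Cumulative adoptions N(t) = m * F(t) under the Bass model.
    p: coefficient of innovation, q: coefficient of imitation,
    m: market potential (eventual total adoptions)."""
    e = np.exp(-(p + q) * t)
    return m * (1.0 - e) / (1.0 + (q / p) * e)

def bass_adoptions(t, p, q, m):
    """Adoptions per period, m * f(t), peaking at t* = ln(q/p)/(p+q)."""
    e = np.exp(-(p + q) * t)
    return m * ((p + q) ** 2 / p) * e / (1.0 + (q / p) * e) ** 2

# Illustrative values in the range often reported for consumer durables
t = np.arange(0, 20)
sales = bass_adoptions(t, p=0.03, q=0.38, m=1_000_000)
peak_year = int(t[np.argmax(sales)])  # ln(0.38/0.03)/0.41 is about 6.2
```

With q > p the model produces the S-shaped, imitation-driven takeoff that distinguishes it from a pure time-series extrapolation such as ARIMA; regional differences in adoption then show up as different (p, q) estimates per federal state.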

Keywords: bass diffusion model, electric vehicles, forecasting performance, market diffusion

Procedia PDF Downloads 162
15321 Design and Implementation of Active Radio Frequency Identification on Wireless Sensor Network-Based System

Authors: Che Z. Zulkifli, Nursyahida M. Noor, Siti N. Semunab, Shafawati A. Malek

Abstract:

Wireless sensors, also known as wireless sensor nodes, have been making a significant impact on human daily life. Radio Frequency Identification (RFID) and Wireless Sensor Networks (WSN) are two complementary technologies; hence, an integrated implementation of these technologies expands the overall functionality in obtaining long-range and real-time information on the location and properties of objects and people. An approach for integrating ZigBee and RFID networks is proposed in this paper to create an energy-efficient network that benefits from combining the ZigBee and RFID architectures. Furthermore, the compatibility and requirements of the ZigBee device and communication links in a typical RFID system are presented, together with a real-world experiment on the capabilities of the proposed RFID system.

Keywords: mesh network, RFID, wireless sensor network, zigbee

Procedia PDF Downloads 454
15320 Traffic Calming Measures at Rural Roads in Dhofar

Authors: Mohammed Bakhit Kashoob, Mohammed Salim Al-Maashani, Ahmed Abdullah Al-Marhoon

Abstract:

Traffic calming measures are design features or strategies used to reduce the speed of traveling vehicles on a particular road. These calming measures are common on the rural roads of Oman; among them are road speed limits, vertical deflections, horizontal deflections, and road signs. In general, vertical deflections such as rumble strips, road studs (cat's eyes), speed tables, and speed humps are widely used. In this paper, since vehicle speeding is a major cause of road traffic crashes and high fatality rates in Oman, the effectiveness of existing traffic calming measures at their current locations on rural roads is assessed. The study was conducted on the rural roads of Dhofar Governorate, which is located in the south of Oman, with a special focus on the calming measures implemented on the mountain roads of Dhofar. It is shown that vertical deflection calming measures are effective in reducing vehicle speeds to 20 to 40 km/h, depending on the deflection type and spacing. Calming measures are also proposed at locations with a high probability of traffic crashes, based on the number of traffic crashes at these locations, the road type, and the road geometry.

Keywords: road safety, rural roads, speed, traffic calming measures, traffic crash

Procedia PDF Downloads 111
15319 Management of Local Towns (Tambon) According to Philosophy of Sufficiency Economy

Authors: Wichian Sriprachan, Chutikarn Sriviboon

Abstract:

The objectives of this research were to study the management of local towns and to develop a better model of town management according to the Philosophy of Sufficiency Economy. This study combined qualitative research, field research, and documentary research. A total of 10 local towns, or Tambons, of Supanburi province, Thailand were selected for in-depth interviews. The findings revealed that the management of local towns according to the Philosophy of Sufficiency Economy was at a level of "good" and that the management model has five basic guidelines: 1) ability to manage budget information and keep it up to date, 2) ability to make decisions according to democratic rules, 3) ability to use a check-and-balance system, 4) ability to control, monitor, and evaluate, and 5) ability to allow the general public to participate. In addition, the findings revealed that human resource management according to the Philosophy of Sufficiency Economy includes obeying laws, using proper knowledge, and having integrity in five areas: planning, recruiting, selecting, training, and maintaining human resources.

Keywords: management, local town (Tambon), principles of sufficiency economy, marketing management

Procedia PDF Downloads 341
15318 Hybrid Model: An Integration of Machine Learning with Traditional Scorecards

Authors: Golnush Masghati-Amoli, Paul Chin

Abstract:

In recent years, with rapid increases in data availability and computing power, Machine Learning (ML) techniques have been called on in a range of industries for their strong predictive capability. However, the use of Machine Learning in commercial banking has been limited due to a special challenge imposed by numerous regulations that require lenders to be able to explain their analytic models, not only to regulators but often to consumers. In other words, although Machine Learning techniques enable better prediction with a higher level of accuracy, they are, in comparison with other industries, adopted less frequently in commercial banking, especially for scoring purposes. This is because Machine Learning techniques are often considered a black box and fail to provide information on why a certain risk score is given to a customer. In order to bridge this gap between the explainability and the performance of Machine Learning techniques, a Hybrid Model was developed at Dun & Bradstreet that blends Machine Learning algorithms with traditional approaches such as scorecards. The Hybrid Model maximizes the efficiency of traditional scorecards by merging their practical benefits, such as explainability and the ability to incorporate domain knowledge, with the deep insights of Machine Learning techniques, which can uncover patterns that scorecard approaches cannot. First, through the development of Machine Learning models, engineered features, latent variables, and feature interactions that demonstrate high information value in the prediction of customer risk are identified. Then, these features are employed to introduce observed non-linear relationships between the explanatory and dependent variables into traditional scorecards.
Moreover, instead of directly computing the Weight of Evidence (WoE) from good and bad data points, the Hybrid Model tries to match the score distribution generated by a Machine Learning algorithm, which ends up providing an estimate of the WoE for each bin. This capability helps to build powerful scorecards for sparse cases that cannot be handled with traditional approaches. The proposed Hybrid Model is tested on different portfolios where a significant gap is observed between the performance of traditional scorecards and Machine Learning models. The analysis shows that the Hybrid Model can improve the performance of traditional scorecards by introducing non-linear relationships between explanatory and target variables from Machine Learning models into traditional scorecards. It is also observed that in some scenarios the Hybrid Model can be almost as predictive as the Machine Learning techniques while being as transparent as traditional scorecards. Therefore, it is concluded that, with the Hybrid Model, Machine Learning algorithms can be used in the commercial banking industry without concern about the difficulty of explaining the models for regulatory purposes.
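To make the WoE step concrete, the sketch below shows the traditional computation from good/bad counts per bin, plus one way to read a WoE-like value off ML scores by converting a bin's mean predicted default probability to log-odds relative to the population. The second function is only a sketch of the distribution-matching idea; the paper's exact procedure is not given in the abstract, and the toy counts are hypothetical.

```python
import numpy as np

def weight_of_evidence(bin_goods, bin_bads):
    """Traditional WoE per bin: ln((good_i / G) / (bad_i / B)).
    Positive WoE marks bins with a better-than-average good/bad mix."""
    goods = np.asarray(bin_goods, dtype=float)
    bads = np.asarray(bin_bads, dtype=float)
    return np.log((goods / goods.sum()) / (bads / bads.sum()))

def woe_from_ml_scores(mean_bin_pd, overall_pd):
    """WoE-style estimate from an ML model's mean predicted default
    probability in a bin, expressed as log-odds relative to the population.
    Useful when a bin holds too few bads for the direct count-based WoE."""
    bin_odds = np.log((1.0 - mean_bin_pd) / mean_bin_pd)
    pop_odds = np.log((1.0 - overall_pd) / overall_pd)
    return bin_odds - pop_odds

# Hypothetical score bands: goods and bads per bin
woe = weight_of_evidence([100, 300, 600], [50, 30, 20])
```

The binned WoE values then enter the scorecard as the transformed predictor, which is how non-linear ML-discovered relationships can be carried into a linear, explainable model.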

Keywords: machine learning algorithms, scorecard, commercial banking, consumer risk, feature engineering

Procedia PDF Downloads 130
15317 Modelling the Dynamics and Optimal Control Strategies of Terrorism within the Southern Borno State Nigeria

Authors: Lubem Matthew Kwaghkor

Abstract:

Terrorism, which remains one of the largest threats faced by nations and communities around the world, including Nigeria, is the calculated use of violence to create a general climate of fear in a population in order to attain particular goals that may be political, religious, or economic. Several terrorist groups are currently active in Nigeria, leading to attacks on both civilian and military targets. Among these groups, Boko Haram is the deadliest terrorist group, operating mainly in Borno State. The southern part of Borno State in North-Eastern Nigeria has been plagued by terrorism, insurgency, and conflict for several years. Understanding the dynamics of terrorism is crucial for developing effective strategies to mitigate its impact on communities and to facilitate peace-building efforts. This research aims to develop a mathematical model that captures the dynamics of terrorism within the southern part of Borno State, Nigeria, incorporating both government and local community intervention strategies as control measures for combating terrorism. A compartmental model of five nonlinear differential equations is formulated. The model analysis shows that a feasible solution set of the model exists and is bounded. Stability analysis shows that both the terrorism-free equilibrium and the terrorism-endemic equilibrium are asymptotically stable, making the model meaningful. Optimal control theory will be employed to identify the most effective strategy to prevent or minimize acts of terrorism. The research outcomes are expected to contribute to enhancing security and stability in southern Borno State while providing valuable insights for policymakers, security agencies, and researchers. This is ongoing research.
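The abstract does not give the five compartments, so as a purely hypothetical sketch of what such a compartmental model looks like, the three-compartment system below tracks susceptible (S), terrorist (T), and removed (R) populations, with `u` standing in for the combined government/community intervention rate; all parameter values are illustrative assumptions, not the paper's.

```python
# Hypothetical compartmental sketch: S susceptible, T terrorists,
# R removed (captured or deradicalised). beta is the recruitment
# contact rate, u the intervention (removal) rate, mu natural turnover.
def step(state, beta=0.3, u=0.2, mu=0.05, dt=0.01):
    S, T, R = state
    N = S + T + R
    dS = mu * N - beta * S * T / N - mu * S   # inflow minus recruitment/turnover
    dT = beta * S * T / N - (u + mu) * T      # recruitment minus removal
    dR = u * T - mu * R                       # intervention removals
    return (S + dt * dS, T + dt * dT, R + dt * dR)  # forward Euler

state = (0.99, 0.01, 0.0)   # population fractions
for _ in range(40_000):     # integrate to t = 400
    state = step(state)

# Threshold analogous to a basic reproduction number: terrorism persists
# when beta / (u + mu) > 1 and dies out otherwise.
R0 = 0.3 / (0.2 + 0.05)
```

In this sketch the stability dichotomy mirrors the abstract's equilibrium analysis: the system settles at a terrorism-endemic level when R0 > 1 and at the terrorism-free equilibrium otherwise; optimal control would then choose u(t) to minimize T plus the cost of intervention.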

Keywords: modelling, terrorism, optimal control, susceptible, non-susceptible, community intervention

Procedia PDF Downloads 11
15316 Business-to-Business Deals Based on a Co-Utile Collaboration Mechanism: Designing Trust Company of the Future

Authors: Riccardo Bonazzi, Michaël Poli, Abeba Nigussie Turi

Abstract:

This paper presents applied research on a new module for the financial administration and management industry, the Personalizable and Automated Checklists Integrator, Overseeing Legal Investigations (PACIOLI). It aims at designing the business model of the trust company of the future. By identifying the key stakeholders, we draw a general business process design of the industry. The business model focuses on disintermediating the traditional form of business through the new technological solutions of a software company based in Switzerland, and hence on creating a new interactive platform. The key stakeholders of this interactive platform are identified as IT experts, legal experts, and the New Edge Trust Company (NATC). The mechanism we design and propose is of great importance for improving the efficiency of the financial business administration and management industry, and it also helps to foster the provision of high-value-added services in the sector.

Keywords: new edge trust company, business model design, automated checklists, financial technology

Procedia PDF Downloads 365
15315 The Current Importance of the Rules of Civil Procedure in the Portuguese Legal Order: Between Legalism and Adequation

Authors: Guilherme Gomes, Jose Lebre de Freitas

Abstract:

The rules of Civil Procedure defined in the Portuguese Civil Procedure Code of 2013, particularly its articles 552 to 626, represent the model that the legislator thought would be most suitable for national civil litigation, from the moment the action is brought by the plaintiff to the moment the sentence is issued. However, procedural legalism is no longer a reality in Portuguese Civil Procedural Law. According to article 547 of the code of 2013, the civil judge has a duty to adopt the procedure that best suits the circumstances of the case, whether or not it is the one defined by law. The main goal of our paper is to answer the question of whether the formal adequation imposed by this article diminishes the importance of the Portuguese rules of Civil Procedure and their daily application by national civil judges. We will start by explaining the appearance of the abovementioned rules in the Civil Procedure Code of 2013. Then we will analyse, using specific examples obtained from the literature, how the legal procedure defined in the abovementioned code does not suit the circumstances of some specific cases and is totally inefficient in some situations. After that, using the data obtained in the practical research that we are conducting in the Portuguese civil courts within the scope of our Ph.D. thesis (so far, we have been able to consult 150 civil lawsuits), we will verify whether and how judges and parties make the procedure more efficient and effective in the case sub judice. Within the scope of our research, we have already reached some preliminary findings: 1) despite the fact that the legal procedure does not suit the circumstances of some civil lawsuits, there are only two situations in which formal adequation is frequently used (the judge allowing the plaintiff to respond to the procedural exceptions raised in the written defense, and the exemption from the prior hearing by judges who never summon it); 2) the other forms of procedural adequation (anticipation of the production of expert evidence, waiving of oral argument at the final hearing, written allegations, dismissal of the dispatch on the controversial facts, and the examination of witnesses at the domicile of one of the lawyers) are still little used; and 3) formal adequation tends to happen on the initiative of the judge, as plaintiffs and defendants are wary of concluding procedural agreements in most situations. In short, we can say that, in the Portuguese legal order of the 21st century, the flexibility of the legal procedure, as defined in the law and applied by procedural subjects, does not affect the importance of the rules of Civil Procedure of the code of 2013.

Keywords: casuistic adequation, civil procedure code of 2013, procedural subjects, rules of civil procedure

Procedia PDF Downloads 127
15314 Demonstration of Land Use Changes Simulation Using Urban Climate Model

Authors: Barbara Vojvodikova, Katerina Jupova, Iva Ticha

Abstract:

Cities in their historical evolution have always adapted their internal structure to the needs of society (for example, protective city walls lost their defensive function in the classicist era, became unnecessary, were demolished, and gave space to new features such as roads, museums, or parks). Today it is necessary to modify the internal structure of the city in order to minimize the impact of climate change on the environment of the population. This article discusses results from the Urban Climate model owned by VITO, obtained as part of a project under the European Union's Horizon 2020 grant agreement No. 730004, Pan-European Urban Climate Services Climate-Fit city. The model was applied to changes in land use and land cover in cities in relation to urban heat islands (UHI). The task of the application was to evaluate possible land use change scenarios in connection with city requirements and ideas. Two pilot areas in the Czech Republic were selected: one is Ostrava and the other Hodonín. The paper demonstrates the application of the model to various possible future development scenarios and contains an assessment of the suitability or unsuitability of those scenarios depending on the resulting temperature increase. Cities that are preparing to reconstruct public space are interested in eliminating, as early as the assignment phase, proposals that would lead to an increase in temperature stress. If they have an evaluation showing that a certain type of design is unsuitable, they can exclude it in the proposal phase. Therefore, especially in the application of the model at the local level, at 1 m spatial resolution, it was necessary to show which types of proposals would create a significant temperature island if implemented; such proposals are considered unsuitable. The model shows that a building itself can create a shaded place and thus contribute to the reduction of the UHI. If the design sensitively protects existing greenery, new construction may not pose a significant problem. More massive interventions that reduce existing greenery create a new heat island.

Keywords: climate model, heat islands, Hodonin, land use changes, Ostrava

Procedia PDF Downloads 136
15313 Development of a 3D Model of Real Estate Properties in Fort Bonifacio, Taguig City, Philippines Using Geographic Information Systems

Authors: Lyka Selene Magnayi, Marcos Vinas, Roseanne Ramos

Abstract:

As the real estate industry continually grows in the Philippines, Geographic Information Systems (GIS) provide advantages in generating spatial databases for the efficient delivery of information and services. The real estate sector not only provides qualitative data about real estate properties but also utilizes various spatial aspects of these properties for different applications, such as hazard mapping and assessment. In this study, a three-dimensional (3D) model and a spatial database of real estate properties in Fort Bonifacio, Taguig City are developed using GIS and SketchUp. Spatial datasets include political boundaries, buildings, the road network, a digital terrain model (DTM) derived from an Interferometric Synthetic Aperture Radar (IFSAR) image, Google Earth satellite imagery, and hazard maps. Multiple model layers were created based on property listings by a partner real estate company, including existing and future property buildings. Actual building dimensions, building facades, and building floor plans are incorporated in these 3D models for geovisualization. Hazard model layers are determined through spatial overlays, and different hazard scenarios are also presented in the models. Animated maps and walkthrough videos were created for company presentation and evaluation. Model evaluation was conducted through client surveys requiring scores for the appropriateness, information content, and design of the 3D models. Survey results show very satisfactory ratings, with the highest average evaluation score equivalent to 9.21 out of 10. The output maps and videos obtained passing rates based on the criteria and standards set by the intended users in the partner real estate company. The methodologies presented in this study were found useful and have remarkable advantages in the real estate industry. This work may be extended to automated mapping and the creation of online spatial databases for better storage of and access to real property listings, and to an interactive platform using web-based GIS.

Keywords: geovisualization, geographic information systems, GIS, real estate, spatial database, three-dimensional model

Procedia PDF Downloads 157
15312 Quasistationary States and Mean Field Model

Authors: Sergio Curilef, Boris Atenas

Abstract:

Systems with long-range interactions are very common in nature. They are observed from the atomic scale to the astronomical scale and exhibit anomalies such as inequivalence of ensembles, negative heat capacity, ergodicity breaking, nonequilibrium phase transitions, quasistationary states, and anomalous diffusion. These anomalies are exacerbated when special initial conditions are imposed; in particular, we use the so-called water-bag initial conditions, which stand for a uniform distribution. Several theoretical and practical implications are discussed here. A potential energy inspired by dipole-dipole interactions is proposed to build a dipole-type Hamiltonian mean-field model. As expected, the dynamics is novel yet typical of systems with long-range interactions; it is obtained through the molecular dynamics technique. Two plateaus emerge sequentially before the system arrives at equilibrium, corresponding to two different quasistationary states. The first plateau is a type of quasistationary state whose lifetime depends on a power law of N, and the second plateau seems to be a true quasistationary state as reported in the literature. The general behavior of the model according to its dynamics and thermodynamics is described. Using numerical simulation, we characterize the mean kinetic energy, the caloric curve, and the diffusion law through the mean-square displacement. The present challenge is to characterize the distributions in phase space. Certainly, the equilibrium state is well characterized by the Gaussian distribution, but quasistationary states in general depart from any Gaussian function.
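The diffusion-law diagnostic mentioned above is the mean-square displacement. A minimal sketch follows (plain NumPy, with synthetic random-walk trajectories rather than the model's molecular-dynamics output):

```python
import numpy as np

def mean_square_displacement(x):
    """MSD over an ensemble of 1-D trajectories.
    x has shape (n_particles, n_steps); returns MSD(tau) for tau = 1..n-1,
    averaging over particles and over all time origins."""
    n_steps = x.shape[1]
    msd = np.empty(n_steps - 1)
    for tau in range(1, n_steps):
        disp = x[:, tau:] - x[:, :-tau]
        msd[tau - 1] = np.mean(disp ** 2)
    return msd

# Normal diffusion gives MSD ~ 2 D tau (log-log slope 1); ballistic motion
# in a quasistationary state would instead give slope 2.
rng = np.random.default_rng(0)
walk = np.cumsum(rng.normal(size=(500, 400)), axis=1)
msd = mean_square_displacement(walk)
```

Fitting the log-log slope of MSD(tau) within each plateau's lifetime is one way to distinguish normal from anomalous diffusion across the two quasistationary states.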

Keywords: dipole-type interactions, dynamics and thermodynamics, mean field model, quasistationary states

Procedia PDF Downloads 208