Search results for: cluster model approach
25117 A Time since Onset of Injection Model for Hepatitis C Amongst People Who Inject Drugs
Authors: Nader Al-Rashidi, David Greenhalgh
Abstract:
Mathematical modelling techniques are now being used by health organizations worldwide to help understand the likely impact that intervention strategies, treatment options, and combinations of these have on the prevalence and incidence of hepatitis C virus (HCV) in the people who inject drugs (PWID) population. In this poster, we develop a deterministic, compartmental mathematical model to approximate the spread of HCV in a PWID population that has been divided into two groups by time since onset of injection. The model assumes that after injection, needles adopt the more infectious of their previous state or that of the PWID who last injected with them. Using analytical techniques, we find that the model behaviour is determined by the basic reproductive number R₀, where R₀ = 1 is a critical threshold separating two different outcomes. The disease-free equilibrium is globally stable if R₀ ≤ 1 and unstable if R₀ > 1. Additionally, our simulations confirm that, with realistic parameter values, the model tends to the endemic equilibrium, giving a realistic HCV prevalence.
Keywords: hepatitis C, people who inject drugs, HCV, PWID
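The threshold behaviour around R₀ = 1 can be illustrated with a minimal SIS-type sketch (a deliberately simplified one-group stand-in for the two-group model described above; parameter values are illustrative, not the paper's):

```python
def simulate(beta, gamma, i0=0.01, dt=0.01, steps=100000):
    """Euler-integrate a minimal SIS-type model: di/dt = beta*i*(1 - i) - gamma*i."""
    i = i0
    for _ in range(steps):
        i += dt * (beta * i * (1.0 - i) - gamma * i)
    return i

# The basic reproductive number R0 = beta/gamma separates the two outcomes.
dies_out = simulate(beta=0.8, gamma=1.0)  # R0 = 0.8 <= 1: disease-free equilibrium
endemic = simulate(beta=2.0, gamma=1.0)   # R0 = 2.0 > 1: endemic level 1 - 1/R0
```

With β/γ ≤ 1 the infected fraction decays toward zero; with β/γ > 1 it settles at the endemic level 1 − 1/R₀.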
Procedia PDF Downloads 144
25116 Training for Digital Manufacturing: A Multilevel Teaching Model
Authors: Luís Rocha, Adam Gąska, Enrico Savio, Michael Marxer, Christoph Battaglia
Abstract:
The changes observed in recent years in the field of manufacturing and production engineering, popularly known as the "Fourth Industrial Revolution", draw on achievements in different areas of computer science, introducing new solutions at almost every stage of the production process: mass customization, cloud computing, knowledge-based engineering, virtual reality, rapid prototyping, and virtual models of measuring systems, to mention a few. To effectively speed up the production process and make it more flexible, it is necessary to tighten the bonds connecting individual stages of the production process and to raise the awareness and knowledge of employees in each sector about the nature and specificity of work in the other stages. Discovering and developing a suitable education method adapted to the specificities of each stage of the production process is therefore crucial to properly exploit the potential of the Fourth Industrial Revolution. For this reason, the project “Train4Dim” (T4D) intends to develop comprehensive training material for digital manufacturing, including content for design, manufacturing, and quality control, with a focus on coordinate metrology and portable measuring systems. In this paper, the authors present an approach to using an active learning methodology for digital manufacturing. T4D's main objective is to develop a multi-degree (apprenticeship up to master's degree studies) educational approach that can be adapted to different teaching levels. The process of creating the underlying methodology is also described. The paper shares the steps taken to achieve the aims of the project (a training model for digital manufacturing): 1) surveying the stakeholders, 2) defining the learning aims, 3) producing all contents and curriculum, 4) training the tutors, and 5) piloting, testing and improving the courses.
Keywords: learning, Industry 4.0, active learning, digital manufacturing
Procedia PDF Downloads 97
25115 Ensemble Machine Learning Approach for Estimating Missing Data from CO₂ Time Series
Authors: Atbin Mahabbati, Jason Beringer, Matthias Leopold
Abstract:
To address the global challenges of climate and environmental changes, there is a need for quantifying and reducing uncertainties in environmental data, including observations of carbon, water, and energy. Global eddy covariance flux tower networks (FLUXNET) and their regional counterparts (i.e., OzFlux, AmeriFlux, ChinaFlux, etc.) were established in the late 1990s and early 2000s to address this demand. Despite the capability of eddy covariance in validating process modelling analyses, field surveys and remote sensing assessments, there are some serious concerns regarding the challenges associated with the technique, e.g. data gaps and uncertainties. To address these concerns, this research has developed an ensemble model to fill the data gaps of CO₂ flux, avoiding the limitations of a single algorithm and therefore providing lower error and reduced uncertainty in the gap-filling process. In this study, data from five towers in the OzFlux network (Alice Springs Mulga, Calperum, Gingin, Howard Springs and Tumbarumba) during 2013 were used to develop an ensemble machine learning model, using five feedforward neural networks (FFNN) with different structures combined with an eXtreme Gradient Boosting (XGB) algorithm. The former, the FFNNs, provided the primary estimations in the first layer, while the latter, XGB, used the outputs of the first layer as its input to provide the final estimations of CO₂ flux. The introduced model showed slight superiority over each single FFNN and over XGB used individually (overall RMSE: 2.64, 2.91, and 3.54 g C m⁻² yr⁻¹, respectively, with 3.54 provided by the best FFNN). The most significant improvement occurred in the estimation of extreme diurnal values (during midday and sunrise), as well as nocturnal estimations, which are generally considered among the most challenging parts of CO₂ flux gap-filling.
The towers, as well as seasonality, showed different levels of sensitivity to the improvements provided by the ensemble model. For instance, Tumbarumba showed more sensitivity than Calperum, where the differences between the ensemble model on the one hand and the FFNNs and XGB on the other were the smallest of all five sites. Moreover, the performance difference between the ensemble model and its components used individually was more pronounced during the warm season (Jan, Feb, Mar, Oct, Nov, and Dec) than during the cold season (Apr, May, Jun, Jul, Aug, and Sep), due to higher rates of photosynthesis, which led to a larger range of CO₂ exchange. In conclusion, the introduced ensemble model slightly improved the accuracy and robustness of CO₂ flux gap-filling. Ensemble machine learning models are therefore potentially capable of improving data estimation and regression outcomes when there appears to be no more room for improvement with a single algorithm.
Keywords: carbon flux, eddy covariance, extreme gradient boosting, gap-filling comparison, hybrid model, OzFlux network
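The two-layer idea above (base learners feeding a meta-learner) can be sketched with toy stand-ins: two simple base predictors replace the five FFNNs, and a convex blend fitted on training data replaces XGB. All data and models here are hypothetical.

```python
# Two-layer stacking sketch mirroring the FFNN -> XGB layering described
# above, with toy stand-ins. All data and models here are hypothetical.
def fit_line(xs, ys):
    """Ordinary least squares for y = a*x + b with a single feature."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    a = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
        sum((x - mx) ** 2 for x in xs)
    return a, my - a * mx

xs = [0.0, 1.0, 2.0, 3.0, 4.0, 5.0]
ys = [0.1, 1.9, 4.2, 5.8, 8.1, 9.9]      # roughly y = 2x

# First layer: base predictions from two simple learners.
a1, b1 = fit_line(xs, ys)                # linear base learner
mean_y = sum(ys) / len(ys)               # constant (mean) base learner
base = [(a1 * x + b1, mean_y) for x in xs]

# Second layer: the meta-learner picks the convex weight that minimises
# squared error of the blended prediction on the training data.
def blend(pair, w):
    return w * pair[0] + (1.0 - w) * pair[1]

best_sse, best_w = min(
    (sum((blend(p, w / 10.0) - y) ** 2 for p, y in zip(base, ys)), w / 10.0)
    for w in range(11)
)
```

Here the meta-learner correctly puts all weight on the stronger base learner; with real data the second layer can exploit complementary errors among base models.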
Procedia PDF Downloads 139
25114 Impact of Surface Roughness on Light Absorption
Authors: V. Gareyan, Zh. Gevorkian
Abstract:
We study oblique-incidence light absorption in opaque media with rough surfaces. An analytical approach with modified boundary conditions, taking into account the surface roughness of metallic or dielectric films, is discussed. Our approach reveals interference-linked terms that modify the dependence of absorption on different characteristics. We discuss the limits within which our approach holds, from the visible to the microwave region. Polarization and angular dependences of roughness-induced absorption are revealed. The existence of an incident angle or a wavelength at which the absorptance of a rough surface becomes equal to that of a flat surface is predicted. Based on this phenomenon, a method of determining the roughness correlation length is suggested.
Keywords: light, absorption, surface, roughness
Procedia PDF Downloads 54
25113 Geo-Additive Modeling of Family Size in Nigeria
Authors: Oluwayemisi O. Alaba, John O. Olaomi
Abstract:
The 2013 Nigerian Demographic Health Survey (NDHS) data were used to investigate the determinants of family size in Nigeria using a geo-additive model. The fixed effects of categorical covariates were modelled using diffuse priors, with a P-spline with second-order random walk for the nonlinear effects of continuous variables; spatial effects followed Markov random field priors, while exchangeable normal priors were used for the random effects of community and household. The negative binomial distribution was used to handle overdispersion of the dependent variable. Inference followed a fully Bayesian approach. Results showed a declining effect on family size of secondary and higher education of the mother, Yoruba tribe, Christianity, family planning, mother giving birth by caesarean section, and having a partner with secondary education. Big family size is positively associated with age at first birth, number of daughters in a household, being gainfully employed, being married and living with a partner, and community and household effects.
Keywords: Bayesian analysis, family size, geo-additive model, negative binomial
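As a minimal illustration of why the negative binomial suits overdispersed counts such as family size: under the NB2 parameterisation with mean μ and dispersion θ, the variance is μ + μ²/θ, which exceeds the Poisson variance μ. A numeric check with illustrative parameters (not the NDHS data):

```python
import math

def nb_pmf(k, mu, theta):
    """NB2 pmf with mean mu and dispersion theta (Var = mu + mu**2 / theta)."""
    p = theta / (theta + mu)
    log_coef = math.lgamma(k + theta) - math.lgamma(theta) - math.lgamma(k + 1)
    return math.exp(log_coef + theta * math.log(p) + k * math.log(1.0 - p))

mu, theta = 5.0, 2.0
mean = sum(k * nb_pmf(k, mu, theta) for k in range(400))
var = sum((k - mean) ** 2 * nb_pmf(k, mu, theta) for k in range(400))
# Overdispersion: var ~= mu + mu**2/theta = 17.5, well above the Poisson value mu = 5.
```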
Procedia PDF Downloads 541
25112 Structural Behavior of Composite Hollow RC Column under Combined Loads
Authors: Abdul Qader Melhm, Hussein Elrafidi
Abstract:
This paper studies the structural behavior of a steel-composite hollow reinforced concrete (RC) column model under combined eccentric loading. The composite model consists of an inner steel tube surrounded by a concrete core with longitudinal and circular transverse reinforcement. The radius of gyration is calculated according to American and European specifications, in order to calculate the slenderness ratio for this type of composite column model, in addition to the flexural rigidity. Formulas for the interaction diagram are given for this model, covering the general loading condition in which an element is subjected to an axial load and bending at the same time. The structural capacity of the model, including elastic and plastic loads and strains, is computed and compared with experimental results. The total eccentric axial load of the column model is calculated based on the effective length KL, available from several relationships provided in the paper. Furthermore, the buckling failure that the inner tube experiences after reaching its maximum strength is investigated.
Keywords: column, composite, eccentric, inner tube, interaction, reinforcement
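The radius of gyration mentioned above has a closed form for a plain hollow circular section; the sketch below uses illustrative dimensions and an assumed effective length factor K, not the paper's test values, and ignores the composite contribution.

```python
import math

def radius_of_gyration(D, d):
    """Hollow circular section: r = sqrt(I/A), with I = pi*(D**4 - d**4)/64
    and A = pi*(D**2 - d**2)/4; this simplifies to sqrt(D**2 + d**2)/4."""
    I = math.pi * (D ** 4 - d ** 4) / 64.0
    A = math.pi * (D ** 2 - d ** 2) / 4.0
    return math.sqrt(I / A)

r = radius_of_gyration(D=0.20, d=0.18)  # outer/inner diameter in metres (illustrative)
slenderness = 0.7 * 3.0 / r             # KL/r with assumed K = 0.7 and L = 3.0 m
```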
Procedia PDF Downloads 192
25111 On Unification of the Electromagnetic, Strong and Weak Interactions
Authors: Hassan Youssef Mohamed
Abstract:
In this paper, we show new wave equations, and by using these equations, we conclude that the strong force and the weak force are not fundamental but are quantum effects of electromagnetism. This result differs entirely from the current scientific understanding of the strong and weak interactions. We introduce three pieces of evidence for our theory. First, we prove the asymptotic freedom phenomenon in the strong force by using our model. Second, we derive the nuclear shell model as an approximation of our model. Third, we prove that leptons do not participate in the strong interactions, and we prove the short ranges of the weak and strong interactions. Thus, our model is consistent with the current understanding of physics. Finally, we introduce the electron-positron model as the basic ingredient of protons, neutrons, and all matter, so we can study all particle interactions and nuclear interactions as many-body problems of electrons and positrons. We also prove the violation of parity conservation in the weak interaction as further evidence for our theory, and we calculate the average binding energy per nucleon.
Keywords: new wave equations, the strong force, the grand unification theory, hydrogen atom, weak force, the nuclear shell model, the asymptotic freedom, electron-positron model, the violation of parity conservation, the binding energy
Procedia PDF Downloads 185
25110 Modified Plastic-Damage Model for FRP-Confined Repaired Concrete Columns
Authors: I. A Tijani, Y. F Wu, C.W. Lim
Abstract:
The Concrete Damaged Plasticity Model (CDPM) is capable of modeling the stress-strain behavior of confined concrete. Nevertheless, the accuracy of the model largely depends on its parameters. To date, most research works mainly focus on the identification and modification of the parameters for fiber-reinforced polymer (FRP) confined concrete prior to damage, and it has been established that FRP-strengthened concrete behaves differently from FRP-repaired concrete. This paper presents a modified plastic damage model, within the context of the CDPM in ABAQUS, for modelling uniformly FRP-confined repaired concrete under monotonic loading. The proposed model includes inflicted damage, elastic stiffness, a yield criterion and a strain hardening rule. The distinct feature of damaged concrete is elastic stiffness reduction, and this is included in the model. The test results were obtained from physical testing of repaired concrete. The dilation model is expressed as a function of the lateral stiffness of the FRP jacket. The finite element predictions are shown to be in close agreement with the obtained test results for the repaired concrete. It was observed from the study that, with the necessary modifications, the finite element method is capable of modeling FRP-repaired concrete structures.
Keywords: concrete, FRP, damage, repair, plasticity, finite element method
Procedia PDF Downloads 137
25109 Spring Water Quality Appraisement for Drinking and Irrigation Application in Nigeria: A Multi-Criteria Approach
Authors: Hillary Onyeka Abugu, Valentine Chinakwugwo Ezea, Janefrances Ngozi Ihedioha, Nwachukwu Romanus Ekere
Abstract:
The study assessed spring water quality in Igbo-Etiti, Nigeria, for drinking and irrigation application using physico-chemical parameters, a water quality index, mineral and trace elements, pollution indices and risk assessment. Standard methods were used to determine the physico-chemical properties of the spring water in the rainy and dry seasons. Trace metals such as Pb, Cd, Zn and Cu were determined with an atomic absorption spectrophotometer. The results showed that most of the physico-chemical properties studied were within the guideline values set by the Nigerian Standard for Drinking Water Quality (NSDWQ), WHO and US EPA for drinking water purposes. However, the pH of all the spring water (4.27-4.73 and 4.95-5.73) and the lead (Pb) (0.01-1.08 mg/L) and cadmium (Cd) (0.01-0.15 mg/L) concentrations were outside the guideline values in both seasons. This could be attributed to the lithology of the study area, the Nsukka formation: leaching of lead and sulphides from the embedded coal deposits could have raised lead levels and made the water acidic. Two-way ANOVA showed significant differences in most of the parameters studied between the dry and rainy seasons. Pearson correlation analysis and cluster analysis showed strong significant positive and negative correlations among some of the parameters studied in both seasons. The water quality index showed that none of the springs had excellent water status; one spring (Iyi Ase) had poor water status in the dry season and is considered unsafe for drinking. Iyi Ase was also considered unsuitable for irrigation application by most of the pollution indices, while the others were generally considered suitable for irrigation. Probable cancer and non-cancer risk assessment revealed a probable risk associated with consumption of the spring water in the Igbo-Etiti area, Nigeria.
Keywords: water quality, pollution index, risk assessment, physico-chemical parameters
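Water quality indices of the kind used above are commonly computed with the weighted arithmetic method, where each parameter's weight is inversely proportional to its guideline standard. The sketch below uses hypothetical measurements, standards and ideal values, not the study's data.

```python
# Weighted arithmetic water quality index (WQI) sketch.
# Measurements, guideline standards and ideal values are hypothetical.
params = {          # parameter: (measured, guideline standard, ideal value)
    "pH": (4.5, 8.5, 7.0),
    "Pb": (0.02, 0.01, 0.0),    # mg/L
    "Cd": (0.005, 0.003, 0.0),  # mg/L
}

k = 1.0 / sum(1.0 / std for _, std, _ in params.values())  # proportionality constant
num = den = 0.0
for measured, std, ideal in params.values():
    w = k / std                                           # unit weight
    q = 100.0 * abs(measured - ideal) / abs(std - ideal)  # quality sub-index
    num += w * q
    den += w

wqi = num / den  # values above 100 generally indicate unsuitability for drinking
```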
Procedia PDF Downloads 167
25108 Pure and Mixed Nash Equilibria Domain of a Discrete Game Model with Dichotomous Strategy Space
Authors: A. S. Mousa, F. Shoman
Abstract:
We present a discrete game-theoretical model with homogeneous individuals who make simultaneous decisions. In this model, the strategy space of all individuals is a discrete and dichotomous set consisting of two strategies. We fully characterize the coherent, split and mixed strategies that form Nash equilibria, and we determine the corresponding Nash domains for all individuals. We find all strategic thresholds at which individuals can change their minds if small perturbations in the parameters of the model occur.
Keywords: coherent strategy, split strategy, pure strategy, mixed strategy, Nash equilibrium, game theory
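With a dichotomous strategy space, pure Nash equilibria can be found by brute force over unilateral deviations; the payoff table below is an illustrative two-player coordination game, not the authors' model.

```python
from itertools import product

# Brute-force pure Nash equilibria for two players with the dichotomous
# strategy set {0, 1}. The payoff table is an illustrative coordination game.
payoff = {             # (s1, s2): (payoff to player 1, payoff to player 2)
    (0, 0): (2, 2), (0, 1): (0, 0),
    (1, 0): (0, 0), (1, 1): (1, 1),
}

def is_nash(s1, s2):
    """A profile is a Nash equilibrium if no player gains by deviating alone."""
    best1 = all(payoff[(s1, s2)][0] >= payoff[(d, s2)][0] for d in (0, 1))
    best2 = all(payoff[(s1, s2)][1] >= payoff[(s1, d)][1] for d in (0, 1))
    return best1 and best2

equilibria = [s for s in product((0, 1), repeat=2) if is_nash(*s)]
```

The coordination game has the two coherent profiles (0, 0) and (1, 1) as its pure Nash equilibria.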
Procedia PDF Downloads 148
25107 Studying Projection Distance and Flow Properties by Shape Variations of Foam Monitor
Authors: Hyun-Kyu Cho, Jun-Su Kim, Choon-Geun Huh, Geon Lee Young-Chul Park
Abstract:
In this study, the relationship between flow properties and fluid projection distance is investigated for shape variations of a foam monitor. A numerical analysis technique for fluid analysis of a foam monitor was developed for this prediction. The shape of the foam monitor determines the flow path of the fluid, and fluid losses were calculated from the flow analysis results. The modified designs comprised a model with a lengthened flow path and a model with a straightened flow path. The inlet pressure was 7 bar and the outlet was at atmospheric conditions. The results showed that both the lengthened and the straightened flow path models improved the nozzle projection distance.
Keywords: injection performance, finite element method, foam monitor, projection distance
Procedia PDF Downloads 347
25106 Development of an in vitro Fermentation Chicken Ileum Microbiota Model
Authors: Bello Gonzalez, Setten Van M., Brouwer M.
Abstract:
The chicken small intestine represents a dynamic and complex organ in which the enzymatic digestion and absorption of nutrients take place. The development of an in vitro fermentation chicken small intestinal model could be used as an alternative to explore the interaction between the microbiota and nutrient metabolism and to enhance the efficacy of targeted interventions to improve animal health. In the present study, we have developed an in vitro fermentation chicken ileum microbiota model for unravelling the complex interactions of the ileum microbial community under physiological conditions. A two-vessel continuous fermentation process simulating in real time the physiological conditions of the ileum content (pH, temperature, microaerophilic/anoxic conditions, and peristaltic movements) has been standardized as a proof of concept. As inoculum, we used a pool of the ileum microbial community obtained from broiler chickens at 14 days of age. The development and validation of the model provide insight into the initial characterization of the ileum microbial community and its dynamics over time related to nutrient assimilation and fermentation. Samples can be collected at different time points and can be used to determine the microbial compositional structure, dynamics, and diversity over time. The results of studies using this in vitro model will serve as the foundation for the development of a whole small intestine in vitro fermentation chicken gastrointestinal model to complement our already established in vitro fermentation chicken caeca model. The insight gained from this model could provide us with information about nutritional strategies to restore and maintain chicken gut homeostasis.
Moreover, the in vitro fermentation model will allow us to study relationships between gut microbiota composition and its dynamics over time associated with nutrients, antimicrobial compounds, and disease modelling.
Keywords: broilers, in vitro model, ileum microbiota, fermentation
Procedia PDF Downloads 57
25105 Exploring Tweet Geolocation: Leveraging Large Language Models for Post-Hoc Explanations
Authors: Sarra Hasni, Sami Faiz
Abstract:
In recent years, location prediction on social networks has gained significant attention, with short and unstructured texts like tweets posing additional challenges. Advanced geolocation models have been proposed, increasing the need to explain their predictions. In this paper, we provide explanations for a geolocation black-box model using LIME and SHAP, two state-of-the-art XAI (eXplainable Artificial Intelligence) methods. We extend our evaluations to Large Language Models (LLMs) as post hoc explainers for tweet geolocation. Our preliminary results show that LLMs outperform LIME and SHAP by generating more accurate explanations. Additionally, we demonstrate that prompts with examples and meta-prompts containing phonetic spelling rules improve the interpretability of these models, even with informal input data. This approach highlights the potential of advanced prompt engineering techniques to enhance the effectiveness of black-box models in geolocation tasks on social networks.
Keywords: large language model, post hoc explainer, prompt engineering, local explanation, tweet geolocation
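The occlusion idea behind such post hoc explainers can be sketched in a few lines: remove each token, re-score the black-box model, and rank tokens by the score drop. The "classifier" and its lexicon below are hypothetical toys, not a real geolocation model or the LIME/SHAP algorithms themselves.

```python
# Occlusion-style post hoc explanation sketch: token importance is the
# score drop when that token is removed. Scorer and lexicon are toys.
CITY_WORDS = {"bagel": 0.9, "subway": 0.6, "taxi": 0.3}

def score_new_york(tokens):
    """Toy black-box score that a tweet originates from New York."""
    return sum(CITY_WORDS.get(t, 0.0) for t in tokens)

tweet = ["grabbed", "a", "bagel", "near", "the", "subway"]
base_score = score_new_york(tweet)

importance = {
    t: base_score - score_new_york([u for u in tweet if u != t])
    for t in set(tweet)
}
top_token = max(importance, key=importance.get)
```

Real explainers such as LIME fit a local surrogate over many random perturbations rather than single-token deletions, but the attribution intuition is the same.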
Procedia PDF Downloads 26
25104 Modified Form of Margin Based Angular Softmax Loss for Speaker Verification
Authors: Jamshaid ul Rahman, Akhter Ali, Adnan Manzoor
Abstract:
Learning-based systems have received increasing interest in recent years; recognition structures, including end-to-end speaker recognition, are one of the hot topics in this area. A well-known work on end-to-end speaker verification using the angular softmax loss gained significant importance and is considered useful for directly training a discriminative model instead of the traditionally adopted i-vector approach. The margin-based strategy in angular softmax is beneficial for learning discriminative speaker embeddings, but the random selection of margin values is a major issue in both additive angular margin and multiplicative angular margin. As a better solution to this matter, we present an alternative approach by introducing a similar form of an additive parameter that was originally introduced for face recognition; it has the capacity to adjust automatically to the corresponding margin values and is applicable to learning more discriminative features than Softmax. Experiments are conducted on part of the Fisher dataset, where it is observed that the additive parameter with angular softmax to train the front-end, and probabilistic linear discriminant analysis (PLDA) in the back-end, boosts the performance of the structure.
Keywords: additive parameter, angular softmax, speaker verification, PLDA
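The additive-angular-margin idea can be sketched as follows (a generic AAM-Softmax with a fixed margin, not the authors' self-adjusting parameter; cosine values are illustrative): the target-class logit cos(θ) is replaced by cos(θ + m) before scaling and softmax, making the target class harder to satisfy and the learned embeddings more discriminative.

```python
import math

def aam_softmax_probs(cosines, target, margin=0.2, scale=30.0):
    """Additive angular margin: penalise the target logit cos(t) -> cos(t + m)."""
    logits = []
    for j, c in enumerate(cosines):
        theta = math.acos(max(-1.0, min(1.0, c)))
        logits.append(scale * (math.cos(theta + margin) if j == target else c))
    z = max(logits)                      # stabilised softmax
    exps = [math.exp(l - z) for l in logits]
    total = sum(exps)
    return [e / total for e in exps]

cosines = [0.8, 0.5, 0.1]  # cosine similarity of an embedding to each speaker
with_margin = aam_softmax_probs(cosines, target=0)
no_margin = aam_softmax_probs(cosines, target=0, margin=0.0)
```

The margin lowers the target-class probability for the same cosines, so the training loss stays higher and pushes embeddings of the target speaker further from the decision boundary.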
Procedia PDF Downloads 103
25103 Extending Image Captioning to Video Captioning Using Encoder-Decoder
Authors: Sikiru Ademola Adewale, Joe Thomas, Bolanle Hafiz Matti, Tosin Ige
Abstract:
This project demonstrates the implementation and use of an encoder-decoder model to perform a many-to-many mapping of video data to text captions. The many-to-many mapping occurs via an input temporal sequence of video frames to an output sequence of words to form a caption sentence. Data preprocessing, model construction, and model training are discussed. Caption correctness is evaluated using 2-gram BLEU scores across the different splits of the dataset. Specific examples of output captions were shown to demonstrate model generality over the video temporal dimension. Predicted captions were shown to generalize over video action, even in instances where the video scene changed dramatically. Model architecture changes are discussed to improve sentence grammar and correctness.
Keywords: decoder, encoder, many-to-many mapping, video captioning, 2-gram BLEU
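The 2-gram BLEU metric used for evaluation can be computed directly; this is a simplified single-reference sketch (clipped 1- and 2-gram precision with a brevity penalty, no smoothing), with made-up caption strings.

```python
import math
from collections import Counter

def ngrams(tokens, n):
    return Counter(tuple(tokens[i:i + n]) for i in range(len(tokens) - n + 1))

def bleu2(candidate, reference):
    """Single-reference BLEU-2: brevity penalty times the geometric mean of
    clipped 1-gram and 2-gram precision."""
    precisions = []
    for n in (1, 2):
        cand, ref = ngrams(candidate, n), ngrams(reference, n)
        overlap = sum(min(c, ref[g]) for g, c in cand.items())
        precisions.append(overlap / max(1, sum(cand.values())))
    if 0.0 in precisions:
        return 0.0
    bp = min(1.0, math.exp(1.0 - len(reference) / len(candidate)))
    return bp * math.exp(sum(math.log(p) for p in precisions) / 2)

ref = "a man is playing a guitar".split()
hyp = "a man plays a guitar".split()
score = bleu2(hyp, ref)
```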
Procedia PDF Downloads 108
25102 Energy States of Some Diatomic Molecules: Exact Quantization Rule Approach
Authors: Babatunde J. Falaye
Abstract:
In this study, we obtain approximate analytical solutions of the radial Schrödinger equation for the Deng-Fan diatomic molecular potential by using the exact quantization rule approach. The wave functions have been expressed in terms of hypergeometric functions via the functional analysis approach. An extension to the rotational-vibrational energy eigenvalues of some diatomic molecules is also presented. It is shown that the calculated energy levels are in good agreement with those obtained previously for the shifted Deng-Fan potential (E_nl-D).
Keywords: Schrödinger equation, exact quantization rule, functional analysis, Deng-Fan potential
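For reference, the Deng-Fan potential commonly takes the following form in the literature (notation may differ from the authors'):

```latex
V(r) = D_e \left( 1 - \frac{b}{e^{a r} - 1} \right)^2, \qquad b = e^{a r_e} - 1,
```

where \(D_e\) is the dissociation energy, \(r_e\) the equilibrium bond length, and \(a\) the range parameter of the potential.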
Procedia PDF Downloads 500
25101 Towards Value-Based Healthcare through a Nursing Sector Management Approach
Authors: Hadeer Hegazy, Wael Ewieda, Ranin Soliman, Samah Elway, Asmaa Tawfik, Ragaa Sayed, Sahar Mousa
Abstract:
The current healthcare system is facing major challenges in terms of cost, quality of care, and access to services. In response, the concept of value-based healthcare has emerged as a new approach to healthcare delivery. This concept puts the focus on patient values rather than on the traditional medical model of care. To achieve this, healthcare organizations must be agile and able to anticipate and respond quickly to changing needs. Agile management is essential for healthcare organizations to achieve value-based care, as it allows them to rapidly adjust their strategies to changing circumstances. Additionally, it is argued that agile management can help healthcare organizations gain a better understanding of the needs of their patients and develop better care delivery models, as well as develop new services, innovate, and become more efficient. The authors provide evidence to support their argument, drawing on examples from successful value-based healthcare initiatives at Children's Cancer Hospital Egypt 57357. The paper offers insight into how agile management can be used to facilitate the shift towards value-based healthcare and to maximize value in the healthcare system.
Keywords: value-based healthcare, agility in healthcare, nursing department, patient outcomes
Procedia PDF Downloads 768
25100 Glycoside Hydrolase Clan GH-A-like Structure Complete Evaluation
Authors: Narin Salehiyan
Abstract:
The three iodothyronine selenodeiodinases catalyze the initiation and termination of thyroid hormone effects in vertebrates. Structural studies of these proteins have been hindered by their integral membrane nature and the inefficient eukaryote-specific pathway for selenoprotein synthesis. Hydrophobic cluster analysis, used in combination with Position-Specific Iterated BLAST, reveals that their extramembrane portion belongs to the thioredoxin-fold superfamily, for which experimental structural information exists. Furthermore, a large deiodinase region embedded within the thioredoxin fold shows strong similarities with the active site of iduronidase, a member of the clan GH-A fold of glycoside hydrolases. This model can explain a number of results from previous mutagenesis studies and provides new, verifiable insights into the structural and functional properties of these enzymes.
Keywords: glycoside, hydrolase, GH-A-like structure, catalyze
Procedia PDF Downloads 70
25099 Optimizing Perennial Plants Image Classification by Fine-Tuning Deep Neural Networks
Authors: Khairani Binti Supyan, Fatimah Khalid, Mas Rina Mustaffa, Azreen Bin Azman, Amirul Azuani Romle
Abstract:
Perennial plant classification plays a significant role in various agricultural and environmental applications, assisting in plant identification, disease detection, and biodiversity monitoring. Nevertheless, attaining high accuracy in perennial plant image classification remains challenging due to the complex variations in plant appearance, the diverse range of environmental conditions under which images are captured, and the inherent variability in image quality stemming from factors such as lighting conditions, camera settings, and focus. This paper proposes an adaptation approach to optimize perennial plant image classification by fine-tuning pre-trained DNN models. It explores the efficacy of fine-tuning prevalent architectures, namely VGG16, ResNet50, and InceptionV3, leveraging transfer learning to tailor the models to the specific characteristics of perennial plant datasets. A subset of the MYLPHerbs dataset, consisting of 6 perennial plant species with 13,481 images captured under various environmental conditions, was used in the experiments. Different strategies for fine-tuning, including adjusting learning rates, training set sizes, data augmentation, and architectural modifications, were investigated. The experimental outcomes underscore the effectiveness of fine-tuning deep neural networks for perennial plant image classification, with ResNet50 showcasing the highest accuracy of 99.78%. Despite ResNet50's superior performance, both VGG16 and InceptionV3 achieved commendable accuracies of 99.67% and 99.37%, respectively. The overall outcomes reaffirm the robustness of the fine-tuning approach across different deep neural network architectures, offering insights into strategies for optimizing model performance in the domain of perennial plant image classification.
Keywords: perennial plants, image classification, deep neural networks, fine-tuning, transfer learning, VGG16, ResNet50, InceptionV3
Procedia PDF Downloads 66
25098 Algorithm for Modelling Land Surface Temperature and Land Cover Classification and Their Interaction
Authors: Jigg Pelayo, Ricardo Villar, Einstine Opiso
Abstract:
The rampant and unintended spread of urban areas has increased the artificial component features in the land cover of the countryside, bringing forth the urban heat island (UHI). This has paved the way to a wide range of negative influences on human health and the environment, commonly related to air pollution, drought, higher energy demand, and water shortage. Land cover type also plays a relevant role in understanding the interaction between ground surfaces and local temperature. At the moment, the depiction of land surface temperature (LST) at city/municipality scale, particularly in certain areas of Misamis Oriental, Philippines, is inadequate to support efficient mitigation of and adaptation to the surface urban heat island (SUHI). Thus, this study attempts to apply Landsat 8 satellite data and low-density Light Detection and Ranging (LiDAR) products to produce a quality automated LST model and crop-level land cover classification at a local scale, through a theoretical and algorithm-based approach utilizing data analysis principles applied to a multi-dimensional image object model. The paper also aims to explore the relationship between the derived LST and the land cover classification. The results of the presented model showed the ability of comprehensive data analysis and GIS functionalities, integrated with an object-based image analysis (OBIA) approach, to automate complex map production processes with considerable efficiency and high accuracy. The findings may potentially lead to expanded investigation of the temporal dynamics of the land surface UHI.
It is worthwhile to note that the environmental significance of these interactions, through the combined application of remote sensing, geographic information tools, mathematical morphology and data analysis, can provide microclimate perception, awareness and improved decision-making for land use planning and characterization at local and neighborhood scales. As a result, it can aid in facilitating problem identification and support mitigation and adaptation more efficiently.
Keywords: LiDAR, OBIA, remote sensing, local scale
Procedia PDF Downloads 282
25097 Influence of a High-Resolution Land Cover Classification on Air Quality Modelling
Authors: C. Silveira, A. Ascenso, J. Ferreira, A. I. Miranda, P. Tuccella, G. Curci
Abstract:
Poor air quality is one of the main environmental causes of premature deaths worldwide, mainly in cities, where the majority of the population lives. It is a consequence of successive land cover (LC) and use changes resulting from the intensification of human activities. Knowing these landscape modifications in a comprehensive spatiotemporal dimension is, therefore, essential for understanding variations in air pollutant concentrations. In this sense, air quality models are very useful to simulate the physical and chemical processes that affect the dispersion and reaction of chemical species in the atmosphere. However, the modelling performance should always be evaluated, since the resolution of the input datasets largely dictates the reliability of the air quality outcomes. Among these data, updated LC is an important parameter to be considered in atmospheric models, since it takes into account the Earth’s surface changes due to natural and anthropic actions, and regulates the exchanges of fluxes (emissions, heat, moisture, etc.) between the soil and the air. This work aims to evaluate the performance of the Weather Research and Forecasting model coupled with Chemistry (WRF-Chem) when different LC classifications are used as input. The influence of two LC classifications was tested: i) the 24-class USGS (United States Geological Survey) LC database included by default in the model, and ii) the CLC (Corine Land Cover) and specific high-resolution LC data for Portugal, reclassified according to the new USGS nomenclature (33 classes). Two distinct WRF-Chem simulations were carried out to assess the influence of the LC on air quality over Europe and Portugal, as a case study, for the year 2015, using the nesting technique over three simulation domains (25 km, 5 km and 1 km horizontal resolution).
Based on the 33-class LC approach, particular emphasis was given to Portugal, given the detail and higher LC spatial resolution (100 m x 100 m) compared with the CLC data (5000 m x 5000 m). Regarding air quality, only the LC impacts on tropospheric ozone concentrations were evaluated, because ozone pollution episodes typically occur in Portugal, particularly during spring/summer, and there are few research works relating this pollutant to LC changes. The WRF-Chem results were validated by season and station typology using background measurements from the Portuguese air quality monitoring network. As expected, a better model performance was achieved at rural stations: moderate correlation (0.4 – 0.7), bias (10 – 21 µg.m-3) and RMSE (20 – 30 µg.m-3), and these are where higher average ozone concentrations were estimated. Comparing both simulations, small differences, related to the Leaf Area Index and air temperature values, were found, although the high-resolution LC approach shows a slight enhancement in the model evaluation. This highlights the role of the LC in the exchange of atmospheric fluxes, and stresses the need to consider a high-resolution LC characterization combined with other detailed model inputs, such as the emission inventory, to improve air quality assessment.
Keywords: land use, spatial resolution, WRF-Chem, air quality assessment
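The station-level validation statistics named above (correlation, bias, RMSE) can be sketched as follows; the observed and modelled ozone concentrations are hypothetical values for illustration, not the study's data:

```python
import math

def evaluation_metrics(observed, modelled):
    """Return (bias, rmse, pearson_r) for paired observed/modelled values."""
    n = len(observed)
    # Mean bias: average model-minus-observation difference
    bias = sum(m - o for o, m in zip(observed, modelled)) / n
    # Root mean square error
    rmse = math.sqrt(sum((m - o) ** 2 for o, m in zip(observed, modelled)) / n)
    # Pearson correlation coefficient
    mean_o = sum(observed) / n
    mean_m = sum(modelled) / n
    cov = sum((o - mean_o) * (m - mean_m) for o, m in zip(observed, modelled))
    var_o = sum((o - mean_o) ** 2 for o in observed)
    var_m = sum((m - mean_m) ** 2 for m in modelled)
    r = cov / math.sqrt(var_o * var_m)
    return bias, rmse, r

# Hypothetical hourly ozone concentrations (ug/m3) at one rural background station
obs = [60.0, 75.0, 90.0, 80.0, 65.0]
mod = [70.0, 85.0, 105.0, 95.0, 75.0]
bias, rmse, r = evaluation_metrics(obs, mod)
```

A positive bias with a high correlation, as in this toy series, would indicate a systematic overestimation on top of well-captured temporal variability.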
Procedia PDF Downloads 158
25096 Computing Transition Intensity Using Time-Homogeneous Markov Jump Process: Case of South African HIV/AIDS Disposition
Authors: A. Bayaga
Abstract:
This research provides a technical account of estimating transition intensities using a time-homogeneous Markov jump process applied to South African HIV/AIDS data from Statistics South Africa. It employs a Maximum Likelihood Estimator (MLE) model to explore the possible influence of transition probabilities on mortality cases, with the data based on actual Statistics South Africa records. This was conducted via an integrated demographic and epidemiological model of the South African HIV/AIDS epidemic. The model was fitted to age-specific HIV prevalence data and recorded death data using the MLE model. Though previous model results suggest that HIV prevalence in South Africa declined and AIDS mortality rates fell over 2002 – 2013, our results differ evidently from the generally accepted HIV models (Spectrum/EPP and ASSA2008) in South Africa. However, supplementary research is needed to enhance the demographic parameters in the model, and to apply it to each of the nine (9) provinces of South Africa.
Keywords: AIDS mortality rates, epidemiological model, time-homogeneous Markov jump process, transition probability, Statistics South Africa
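The estimation machinery above can be illustrated with a minimal two-state sketch: under a time-homogeneous Markov jump process with a constant mortality intensity mu, the MLE is the observed number of transitions divided by the total time at risk, and the transition probability follows in closed form. The counts below are invented for illustration, not Statistics South Africa data:

```python
import math

def mle_transition_intensity(n_deaths, total_exposure_years):
    """MLE of a constant transition intensity mu for a time-homogeneous
    Markov jump process: observed transitions / total person-years at risk."""
    return n_deaths / total_exposure_years

def transition_probability(mu, t):
    """P(alive at time 0 -> dead by time t) for a two-state chain with a
    single absorbing state and constant intensity mu: 1 - exp(-mu * t)."""
    return 1.0 - math.exp(-mu * t)

# Hypothetical cohort: 50 deaths over 1000 person-years of exposure
mu_hat = mle_transition_intensity(n_deaths=50, total_exposure_years=1000.0)
p5 = transition_probability(mu_hat, 5.0)  # five-year mortality probability
```

In the full model, the single intensity would be replaced by an age-specific intensity matrix Q, with transition probabilities obtained from the matrix exponential P(t) = exp(Qt).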
Procedia PDF Downloads 497
25095 Grouping and the Use of Drums in the Teaching of Word Stress at the Middle Basic: A Pragmatic Approach
Authors: Onwumere O. J.
Abstract:
The teaching of stress at any level of education can be a daunting task for the second language teacher because, most times, they are bereft of the right approach to use in teaching it. But the fact is that teaching stress, even at the middle basic, could be interesting if the right approach is employed. To this end, the researcher was of the view that grouping could be a very good strategy to employ in order to sustain the interest of the learner, and that the use of drums would be a good way to concretise the teaching of stress at this level. He was able to do this by discussing stress, grouping as a good technique, and the use of drums in teaching stress. To establish that the use of drums would be very effective, four research questions contained in a questionnaire were structured. Three hundred (300) teachers of English in four tertiary institutions, three secondary schools and three primary schools in Nigeria were used. Based on the data analysis and findings, suggestions were given on how teachers and learners could use drums to make the teaching and learning of stress enjoyable for both parties at the middle basic of education.
Keywords: concretise, grouping, right approach, second language
Procedia PDF Downloads 545
25094 Mathematics as the Foundation for the STEM Disciplines: Different Pedagogical Strategies Addressed
Authors: Marion G. Ben-Jacob, David Wang
Abstract:
There is a mathematics requirement for entry-level college and university students, especially those who plan to study STEM (Science, Technology, Engineering and Mathematics). Most of them take College Algebra, and to continue their studies, they need to succeed in this course. Different pedagogical strategies are employed to promote the success of our students. There is, of course, the traditional method of teaching: lecture, examples, and problems for students to solve. The Emporium Model, another pedagogical approach, replaces traditional lectures with a learning resource center model featuring interactive software and on-demand personalized assistance. This presentation will compare these two methods of pedagogy and present the study, with its results, on this comparison. Math is the foundation for science, technology, and engineering. It is generally used in STEM to find patterns in data. These patterns can be used to test relationships, draw general conclusions about data, and model the real world. In STEM, solutions to problems are analyzed, reasoned, and interpreted using math abilities in an assortment of real-world scenarios. This presentation will examine specific examples of how math is used in the different STEM disciplines. Math becomes practical in science when it is used to model natural and artificial experiments to identify a problem and develop a solution for it. As we analyze data, we are using math to find the statistical correlation between a cause and its effect. Scientists who use math include data scientists, biologists and geologists. Without math, most technology would not be possible. Math is the basis of binary, and without programming, you just have the hardware. Addition, subtraction, multiplication, and division are also used in almost every program written. Mathematical algorithms are inherent in software as well.
Mechanical engineers analyze scientific data to design robots by applying math and using software. Electrical engineers use math to help design and test electrical equipment. They also use math when creating computer simulations and designing new products. Chemical engineers often use mathematics in the lab. Advanced computer software is used to aid in their research and production processes, to model theoretical synthesis techniques and properties of chemical compounds. Mathematics mastery is crucial for success in the STEM disciplines. Pedagogical research on formative strategies and the necessary topics to be covered is essential.
Keywords: emporium model, mathematics, pedagogy, STEM
Procedia PDF Downloads 75
25093 Computing Customer Lifetime Value in E-Commerce Websites with Regard to Returned Orders and Payment Method
Authors: Morteza Giti
Abstract:
As online shopping becomes increasingly popular, computing customer lifetime value in order to know customers better is also gaining importance. Two distinct factors that can affect the value of a customer in the context of online shopping are the number of returned orders and the payment method. Returned orders are those which have been shipped but not collected by the customer and are returned to the store. Payment method refers to the way that customers choose to pay for the order, usually one of two: pre-pay and cash-on-delivery. In this paper, a novel model called RFMSP is presented to calculate customer lifetime value, taking these two parameters into account. The RFMSP model is based on the common RFM model while adding two extra parameters: S represents the order status and P indicates the payment method. As a case study for this model, the purchase history of customers in an online shop is used to compute customer lifetime value over a period of twenty months.
Keywords: RFMSP model, AHP, customer lifetime value, k-means clustering, e-commerce
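A minimal sketch of an RFMSP-style score, extending RFM with the order-status (S) and payment-method (P) components described above. The normalisations, weights and example customers are illustrative assumptions, not the paper's fitted model:

```python
def rfmsp_score(recency_days, frequency, monetary, returned_ratio, prepaid_ratio,
                weights=(0.2, 0.2, 0.2, 0.2, 0.2)):
    """Toy RFMSP score: weighted sum of five components normalised to [0, 1].

    returned_ratio: share of the customer's orders shipped but not collected (S).
    prepaid_ratio:  share of orders paid up front rather than cash-on-delivery (P).
    """
    # Normalise each component; smaller recency and fewer returns are better
    r = 1.0 / (1.0 + recency_days / 30.0)   # recent buyers score higher
    f = min(frequency / 20.0, 1.0)          # cap purchase count at 20
    m = min(monetary / 1000.0, 1.0)         # cap spend at a reference amount
    s = 1.0 - returned_ratio                # order status: penalise returns
    p = prepaid_ratio                       # payment method: reward pre-pay
    wr, wf, wm, ws, wp = weights
    return wr * r + wf * f + wm * m + ws * s + wp * p

# Two hypothetical customers over a twenty-month history
loyal = rfmsp_score(recency_days=10, frequency=15, monetary=800,
                    returned_ratio=0.05, prepaid_ratio=0.9)
risky = rfmsp_score(recency_days=90, frequency=2, monetary=100,
                    returned_ratio=0.5, prepaid_ratio=0.1)
```

In practice the weights would come from a method such as AHP (listed in the keywords), and the resulting scores could feed a k-means segmentation of the customer base.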
Procedia PDF Downloads 321
25092 Highly Glazed Office Spaces: Simulated Visual Comfort vs Real User Experiences
Authors: Zahra Hamedani, Ebrahim Solgi, Henry Skates, Gillian Isoardi
Abstract:
Daylighting plays a pivotal role in promoting productivity and user satisfaction in office spaces. There is an ongoing trend in designing office buildings with a high proportion of glazing, which relatively increases the risk of visual discomfort. Providing a more realistic lighting analysis can be of high value at the early stages of building design, when necessary changes can be made at very low cost. This holistic approach can be achieved by incorporating subjective evaluation and user behaviour in computer simulation to provide a comprehensive lighting analysis. In this research, a detailed computer simulation model has been made using Radiance and Daysim. Afterwards, this model was validated by measurements and user feedback. The case study building is the School of Science at Griffith University, Gold Coast, Queensland, which features highly glazed office spaces. In this paper, the visual comfort predicted by the model is compared with a preliminary survey of the building users to evaluate how user behaviour, such as desk position, orientation selection, and user movement caused by daylight changes and other visual variations, can inform perceptions of visual comfort. This work supports preliminary design analysis of visual comfort, incorporating the effects of gaze shift patterns and views, with the goal of designing effective layouts for office spaces.
Keywords: lighting simulation, office buildings, user behaviour, validation, visual comfort
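One common way to summarise simulated daylight at a desk position is daylight autonomy: the fraction of occupied hours in which illuminance meets a threshold. A minimal sketch with hypothetical hourly values (not actual Radiance/Daysim output) and the conventional 300 lux threshold:

```python
def daylight_autonomy(illuminance_lux, threshold=300.0):
    """Fraction of occupied hours with illuminance at or above the threshold.

    A standard climate-based daylighting metric; the input would normally be
    a year of simulated hourly illuminances for one sensor point.
    """
    hits = sum(1 for lux in illuminance_lux if lux >= threshold)
    return hits / len(illuminance_lux)

# Hypothetical illuminances (lux) for six occupied hours at a desk near glazing
desk_near_window = [450, 800, 1200, 600, 250, 150]
da = daylight_autonomy(desk_near_window)  # 4 of 6 hours meet 300 lux
```

Comparing such per-desk metrics against surveyed comfort and observed desk choices is one way to test whether the simulation predicts real user experience.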
Procedia PDF Downloads 213
25091 Modelling Spatial Dynamics of Terrorism
Authors: André Python
Abstract:
To this day, terrorism persists as a worldwide threat, exemplified by the recent deadly attacks in January 2015 in Paris and the ongoing massacres perpetrated by ISIS in Iraq and Syria. In response to this threat, states deploy various counterterrorism measures, the cost of which could be reduced through effective preventive measures. In order to increase the efficiency of preventive measures, policy-makers may benefit from accurate predictive models that are able to capture the complex spatial dynamics of terrorism occurring at a local scale. Despite empirical research carried out at country level that has confirmed theories explaining the diffusion processes of terrorism across space and time, scholars have failed to assess theories of diffusion on a local scale. Moreover, since scholars have not made the most of recent statistical modelling approaches, they have been unable to build predictive models accurate in both space and time. In an effort to address these shortcomings, this research suggests a novel approach to systematically assess the theories of terrorism’s diffusion on a local scale and provide a predictive model of the local spatial dynamics of terrorism worldwide. With a focus on the lethal terrorist events that occurred after 9/11, this paper addresses the following question: why and how does lethal terrorism diffuse in space and time? Based on geolocalised data on worldwide terrorist attacks and covariates gathered from 2002 to 2013, a binomial spatio-temporal point process is used to model the probability of terrorist attacks on a sphere (the world), the surface of which is discretised in the form of Delaunay triangles and refined in areas of specific interest. Within a Bayesian framework, the model is fitted through an integrated nested Laplace approximation - a recent fitting approach that computes fast and accurate estimates of posterior marginals.
Hence, for each location in the world, the model provides a probability of encountering a lethal terrorist attack and measures of volatility, which inform on the model’s predictability. Diffusion processes are visualised through interactive maps that highlight space-time variations in the probability and volatility of encountering a lethal attack from 2002 to 2013. Based on the previous twelve years of observation, the location and lethality of terrorist events in 2014 are accurately predicted. Throughout the global scope of this research, local diffusion processes such as escalation and relocation are systematically examined: the former describes an expansion from high-concentration areas of lethal terrorist events (hotspots) to neighbouring areas, while the latter is characterised by changes in the location of hotspots. By controlling for the effect of geographical, economic and demographic variables, the results of the model suggest that the diffusion processes of lethal terrorism are jointly driven by contagious and non-contagious factors that operate on a local scale, as predicted by theories of diffusion. Moreover, by providing a quantitative measure of predictability, the model prevents policy-makers from making decisions based on highly uncertain predictions. Ultimately, this research may provide important complementary tools to enhance the efficiency of policies that aim to prevent and combat terrorism.
Keywords: diffusion process, terrorism, spatial dynamics, spatio-temporal modeling
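A crude stand-in for the fitted spatio-temporal model can illustrate the idea of per-location attack probabilities and an escalation check: each spatial cell gets a smoothed probability of seeing a lethal attack in a year, and a hotspot whose neighbours are also becoming active suggests escalation-style diffusion. The counts, smoothing parameters and thresholds below are all illustrative assumptions, not the paper's Bayesian INLA model:

```python
def cell_attack_probability(attack_years, total_years, alpha=1.0, beta=1.0):
    """Smoothed probability that a grid cell records a lethal attack in a
    given year, using Beta-binomial shrinkage so that cells with little
    history are pulled toward the prior rather than toward 0 or 1."""
    return (attack_years + alpha) / (total_years + alpha + beta)

def neighbourhood_escalation(cell_p, neighbour_ps):
    """Flag escalation-style diffusion: a high-probability hotspot whose
    neighbouring cells also show elevated activity (simplified heuristic)."""
    return cell_p > 0.5 and sum(neighbour_ps) / len(neighbour_ps) > 0.2

# Hypothetical counts over a 12-year observation window (2002-2013)
hotspot = cell_attack_probability(attack_years=10, total_years=12)
quiet = cell_attack_probability(attack_years=0, total_years=12)
escalating = neighbourhood_escalation(hotspot, [0.3, 0.1, 0.4])
```

The real model replaces these per-cell counts with a spatio-temporal point process on Delaunay triangles and covariates, but the interpretation of the output (a probability surface plus uncertainty) is the same.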
Procedia PDF Downloads 351
25090 Smart Container Farming: Innovative Urban Strawberry Farming Model from Japan to the World
Authors: Nishantha Giguruwa
Abstract:
This research investigates the transformative potential of smart container farming, building upon the successful cultivation of Japanese mushrooms at Sakai Farms in Aichi Prefecture, Japan, under the strategic collaboration with the Daikei Group. Inspired by this success, the study focuses on establishing an advanced urban strawberry farming laboratory with the aim of understanding strawberry farming technologies, fostering collaboration, and strategizing marketing approaches for both local and global markets. Positioned within the business framework of Sakai Farms and the Daikei Group, the study underscores the sustainability and forward-looking solutions offered by smart container farming in agriculture. The global significance of strawberries is emphasized, acknowledging their economic and cultural importance. The detailed examination of strawberry farming intricacies informs the technological framework developed for smart containers, implemented at Sakai Farms. Integral to this research is the incorporation of controlled bee pollination, a groundbreaking addition to the smart container farming model. The study anticipates future trends, outlining avenues for continued exploration, stakeholder collaborations, policy considerations, and expansion strategies. Notably, the author expresses a strategic intent to approach the global market, leveraging the foreign student/faculty base at Ritsumeikan Asia Pacific University, where the author is affiliated. This unique approach aims to disseminate the research findings globally, contributing to the broader landscape of agricultural innovation. The integration of controlled bee pollination within this innovative framework not only enhances sustainability but also marks a significant stride in the evolution of urban agriculture, aligning with global agricultural trends.
Keywords: smart container farming, urban agriculture, strawberry farming technologies, controlled bee pollination, agricultural innovation
Procedia PDF Downloads 56
25089 Reliability and Availability Analysis of Satellite Data Reception System using Reliability Modeling
Authors: Ch. Sridevi, S. P. Shailender Kumar, B. Gurudayal, A. Chalapathi Rao, K. Koteswara Rao, P. Srinivasulu
Abstract:
System reliability and availability evaluation plays a crucial role in ensuring the seamless operation of a complex satellite data reception system with consistent performance over long periods. This paper presents a novel approach to this evaluation using a case study on one of the antenna systems at a satellite data reception ground station in India. The methodology involves analyzing the system's components and their failure rates and the system's architecture, generating a logical reliability block diagram model, and estimating the reliability of the system from the component-level mean time between failures, assuming an exponential distribution, to derive a baseline estimate of the system's reliability. The model is then validated against system-level field failure data collected from the operational satellite data reception systems, which includes the failures that occurred, failure times, criticality of failures and repair times, using statistical techniques like median rank, regression and Weibull analysis to extract meaningful insights regarding failure patterns and the practical reliability of the system, and to assess the accuracy of the developed reliability model. The study mainly focused on the identification of critical units within the system, which are prone to failures and have a significant impact on overall performance, and brought out a reliability model of the identified critical unit. This model takes into account the interdependencies among system components and their impact on overall system reliability; it provides valuable insights into the performance of the system, allowing the improvement or degradation of the system over a period of time to be understood, and will be a vital input to arrive at an optimized design for future development. It also provides a plug-and-play framework to understand the effect on system performance of any upgrades or new designs of the unit.
It helps in effective planning and in formulating contingency plans to address potential system failures, ensuring continuity of operations. Furthermore, to instill confidence in system users, the duration for which the system can operate continuously with the desired level of 3-sigma reliability was estimated, which turned out to be a vital input to the maintenance plan. System availability and station availability were also assessed by considering clash and non-clash scenarios to determine the overall system performance and potential bottlenecks. Overall, this paper establishes a comprehensive methodology for reliability and availability analysis of complex satellite data reception systems. The results derived from this approach facilitate effective planning of contingency measures, provide users with confidence in system performance, and enable decision-makers to make informed choices about system maintenance, upgrades and replacements. It also aids in identifying critical units and assessing system availability in various scenarios, and helps in minimizing downtime and optimizing resource allocation.
Keywords: exponential distribution, reliability modeling, reliability block diagram, satellite data reception system, system availability, Weibull analysis
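Under the exponential-distribution assumption named above, component reliability is R(t) = exp(-t / MTBF), and a series reliability block diagram multiplies component reliabilities, since every unit must work for the system to work. A minimal sketch with hypothetical unit names and MTBF values (not the ground station's actual figures):

```python
import math

def component_reliability(t_hours, mtbf_hours):
    """Exponential-distribution reliability R(t) = exp(-t / MTBF)."""
    return math.exp(-t_hours / mtbf_hours)

def series_system_reliability(t_hours, mtbf_list):
    """Series reliability block diagram: the system survives to time t only
    if every component does, so the reliabilities multiply. Equivalent to a
    single exponential unit whose failure rate is the sum of the rates."""
    r = 1.0
    for mtbf in mtbf_list:
        r *= component_reliability(t_hours, mtbf)
    return r

# Hypothetical antenna chain: servo, feed, LNA, receiver (MTBFs in hours)
mtbfs = [50000.0, 80000.0, 100000.0, 60000.0]
r_one_year = series_system_reliability(8760.0, mtbfs)
```

The component with the smallest MTBF dominates the sum of failure rates, which is one quantitative way to identify the critical unit discussed in the abstract.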
Procedia PDF Downloads 84
25088 Advanced Model for Calculation of the Neutral Axis Shifting and the Wall Thickness Distribution in Rotary Draw Bending Processes
Abstract:
Rotary draw bending is a method used in tube forming. In the tube bending process, the neutral axis moves towards the inner arc, and the wall thickness distribution changes over the tube's cross section. Thinning takes place in the outer arc of the tube (extrados) due to the stretching of the material, whereas thickening occurs in the inner arc of the tube (intrados) due to the compression of the material. The calculations of the wall thickness distribution, neutral axis shifting, and strain distribution have not been accurate enough so far. The previous model (the geometrical model) describes the neutral axis shifting and wall thickness distribution. The geometry of the tube, the bending radius and the bending angle are considered in the geometrical model, while the influence of the material properties on the tube forming is ignored. The advanced model is a modification of the previous model using material properties that depend on a correction factor. The correction factor is a purely empirically determined factor. The advanced model was compared with finite element simulation (FE simulation) using different bending factors (Bf = bending radius / diameter of the tube), wall thickness factors (Wf = diameter of the tube / wall thickness), and material properties (strain hardening exponent). A finite element model of rotary draw bending has been built in the PAM-TUBE program (version 2012). Results from the advanced model resemble the FE simulation and the experimental test.
Keywords: rotary draw bending, material properties, neutral axis shifting, wall thickness distribution
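The purely geometrical estimate can be sketched as follows: the extrados fibre stretches by (R + D/2)/R, thinning the wall, while the intrados fibre shortens, thickening it; a correction factor c scales the geometric strain to stand in for material effects. The exact form of the correction and the example dimensions are assumptions for illustration, not the paper's calibrated advanced model:

```python
def wall_thickness_after_bending(t0, tube_diameter, bend_radius, c=1.0):
    """Geometric estimate of wall thickness at extrados and intrados after
    rotary draw bending. c = 1 recovers the purely geometrical model; the
    empirical correction factor would shift c to match measured thicknesses.
    """
    half_d = tube_diameter / 2.0
    strain = half_d / bend_radius                 # geometric bending strain
    t_extrados = t0 / (1.0 + c * strain)          # thinning from stretching
    t_intrados = t0 / (1.0 - c * strain)          # thickening from compression
    return t_extrados, t_intrados

# Hypothetical tube: 2 mm wall, 30 mm diameter, 60 mm bend radius (Bf = 2)
t_out, t_in = wall_thickness_after_bending(t0=2.0, tube_diameter=30.0,
                                           bend_radius=60.0)
```

A smaller bending factor Bf (tighter bend relative to the tube diameter) raises the geometric strain and therefore the predicted thinning at the extrados, matching the qualitative behaviour described in the abstract.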
Procedia PDF Downloads 397