Search results for: virtual and constructive models
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 7774

7144 Tip60 Histone Acetyltransferase Activators as Neuroepigenetic Therapeutic Modulators for Alzheimer’s Disease

Authors: Akanksha Bhatnagar, Sandhya Kortegare, Felice Elefant

Abstract:

Context: Alzheimer's disease (AD) is a neurodegenerative disorder characterized by progressive cognitive decline and memory loss. The cause of AD is not fully understood, but it is thought to arise from a combination of genetic, environmental, and lifestyle factors. One of the hallmarks of AD is the loss of neurons in the hippocampus, a brain region important for memory and learning. This loss of neurons is thought to be caused by a decrease in histone acetylation, a process that regulates gene expression. Research Aim: The aim of the study was to develop small molecule compounds that can enhance the activity of Tip60, a histone acetyltransferase that is important for memory and learning. Methodology/Analysis: The researchers used in silico structural modeling and a pharmacophore-based virtual screening approach to design and synthesize small molecule compounds strongly predicted to target and enhance Tip60’s HAT activity. The compounds were then tested in vitro and in vivo to assess their ability to enhance Tip60 activity and rescue cognitive deficits in AD models. Findings: The researchers found that several of the compounds were able to enhance Tip60 activity and rescue cognitive deficits in AD models. The compounds were also designed to cross the blood-brain barrier, which is an important factor for the development of potential AD therapeutics. Theoretical Importance: The findings of this study suggest that Tip60 HAT activators have the potential to be developed as therapeutic agents for AD. The compounds are specific to Tip60, which suggests that they may have fewer side effects than broad-spectrum HDAC inhibitors. Additionally, the compounds are able to cross the blood-brain barrier, which is a major hurdle for the development of AD therapeutics. Data Collection: The study collected data from a variety of sources, including in vitro assays and animal models. 
The in vitro assays assessed the ability of the compounds to enhance Tip60 activity using histone acetyltransferase (HAT) enzyme assays and chromatin immunoprecipitation assays. Animal models were used to assess the ability of the compounds to rescue cognitive deficits in AD models using a variety of behavioral tests, including locomotor ability, sensory learning, and recognition tasks. Planned human clinical trials will assess the safety and efficacy of the compounds in humans. Questions: The question addressed by this study was whether Tip60 HAT activators could be developed as therapeutic agents for AD. Conclusions: The findings of this study suggest that Tip60 HAT activators have the potential to be developed as therapeutic agents for AD. The compounds are specific to Tip60, which suggests that they may have fewer side effects than broad-spectrum HDAC inhibitors. Additionally, the compounds are able to cross the blood-brain barrier, which is a major hurdle for the development of AD therapeutics. Further research is needed to confirm the safety and efficacy of these compounds in humans.

Keywords: Alzheimer's disease, cognition, neuroepigenetics, drug discovery

Procedia PDF Downloads 50
7143 Probing Language Models for Multiple Linguistic Information

Authors: Bowen Ding, Yihao Kuang

Abstract:

In recent years, large-scale pre-trained language models have achieved state-of-the-art performance on a variety of natural language processing tasks. The word vectors produced by these language models can be viewed as dense encoded representations of natural language in text form. However, it is unknown how much linguistic information is encoded, or how. In this paper, we construct probing tasks for multiple types of linguistic information to clarify the encoding capabilities of different language models and present a visual comparison. We first obtain word representations in vector form from different language models, including BERT, ELMo, RoBERTa, and GPT. Classifiers with a small number of parameters, together with unsupervised tasks, are then applied to these word vectors to assess their capacity to encode the corresponding linguistic information. The constructed probing tasks cover both semantic and syntactic aspects. The semantic aspect includes the model's ability to understand semantic entities such as numbers, time, and characters; the syntactic aspect includes the language model's ability to understand grammatical structures such as dependency relationships and reference relationships. We also compare the encoding capabilities of different layers in the same language model to infer how linguistic information is encoded in the model.
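A minimal sketch of the probing setup described above. A small logistic-regression "probe" is trained on frozen word vectors to test whether a linguistic property is linearly decodable; the vectors and the planted binary property here are synthetic stand-ins for real BERT/ELMo/RoBERTa/GPT embeddings and real annotations.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n, d = 1000, 64
X = rng.normal(size=(n, d))            # stand-in for frozen embeddings
w_true = rng.normal(size=d)
y = (X @ w_true > 0).astype(int)       # plant a linearly decodable "property"

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)
probe = LogisticRegression(max_iter=1000).fit(X_tr, y_tr)
acc = probe.score(X_te, y_te)
print(f"probe accuracy: {acc:.2f}")    # high accuracy => property is linearly encoded
```

In a real probing study, the probe's test accuracy (compared against a random-embedding baseline) is what indicates how strongly a given layer encodes the property.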

Keywords: language models, probing task, text presentation, linguistic information

Procedia PDF Downloads 83
7142 Application Difference between Cox and Logistic Regression Models

Authors: Idrissa Kayijuka

Abstract:

The logistic regression and Cox regression (proportional hazards) models are currently employed in prospective epidemiologic research to analyze risk factors for chronic diseases, and a theoretical relationship between the two models has been studied. By definition, the Cox regression model, also called the Cox proportional hazards model, is a procedure used to model data on the time leading up to an event when censored cases exist, whereas the logistic regression model applies when the independent variables consist of numerical as well as nominal values and the outcome variable is binary (dichotomous). Many researchers have provided overviews of the Cox and logistic regression models and their applications in different areas. In this work, the analysis is done on secondary data from an SPSS exercise dataset on breast cancer, with a sample size of 1121 women; the main objective is to show the difference in application between the Cox and logistic regression models based on factors that cause women to die of breast cancer. Part of the analysis (on lymph node status) was done manually, and SPSS software was used to analyze the remaining data. This study found that there is a difference in application between the two models: the Cox regression model is used when one wishes to analyze data that include follow-up time, whereas the logistic regression model analyzes data without follow-up time. They also have different measures of association: the hazard ratio for the Cox model and the odds ratio for the logistic model. A similarity between the two models is that both are applicable in predicting the outcome of a categorical variable, i.e., a variable that can accommodate only a restricted number of categories. 
In conclusion, the Cox regression model differs from logistic regression by assessing a rate instead of a proportion. Both models can be applied in many other studies since they are suitable methods for analyzing data, but where follow-up time is available, the Cox regression model is the more recommended of the two.
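The rate-versus-proportion contrast can be made concrete with a small numeric sketch. The counts and follow-up times below are purely hypothetical, and the crude incidence-rate ratio stands in for the hazard ratio a full Cox fit would produce (proper Cox estimation needs a survival-analysis library and individual-level censored data).

```python
# exposed vs unexposed groups: (deaths, group size, total follow-up in person-years)
deaths_e, n_e, time_e = 30, 100, 380.0   # hypothetical numbers
deaths_u, n_u, time_u = 15, 100, 460.0

# Logistic-regression view: ignore follow-up time, compare the odds of the event.
odds_e = deaths_e / (n_e - deaths_e)
odds_u = deaths_u / (n_u - deaths_u)
odds_ratio = odds_e / odds_u

# Cox/survival view: events per unit person-time (a crude rate ratio,
# the quantity the hazard ratio refines).
rate_ratio = (deaths_e / time_e) / (deaths_u / time_u)
print(round(odds_ratio, 2), round(rate_ratio, 2))
```

With these numbers the two measures happen to be close, but they diverge whenever follow-up times differ strongly between groups, which is exactly why the choice of model matters.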

Keywords: logistic regression model, Cox regression model, survival analysis, hazard ratio

Procedia PDF Downloads 434
7141 Comparison of Wake Oscillator Models to Predict Vortex-Induced Vibration of Tall Chimneys

Authors: Saba Rahman, Arvind K. Jain, S. D. Bharti, T. K. Datta

Abstract:

The present study compares semi-empirical wake-oscillator models used to predict vortex-induced vibration of structures, including those proposed by Facchinetti, by Farshidian and Dolatabadi, and by Skop and Griffin. These models combine a wake oscillator resembling the Van der Pol oscillator with a single-degree-of-freedom (SDOF) structural oscillator. To use these models for estimating the top displacement of chimneys, only the first vibration mode of the chimney is considered; the modal equation of the chimney constitutes the SDOF model. The equations of the wake oscillator and the SDOF model are solved simultaneously using an iterative procedure, with MATLAB's ODE solver used for the integration. The empirical parameters in the wake-oscillator models are estimated using a newly developed approach, and the predicted response compares well with experimental data. For the comparative study, a tall concrete chimney of height 210 m has been chosen, with a base diameter of 28 m, a top diameter of 20 m, and a thickness of 0.3 m. The responses of the chimney are also determined using the linear model proposed by E. Simiu and the deterministic model given in the Eurocode. The comparative study shows that the responses predicted by the Facchinetti model and by the Skop and Griffin model are nearly the same, while the Farshidian and Dolatabadi model predicts a higher response. The linear model, which neglects the aero-elastic phenomenon, predicts a smaller response than the non-linear models. Further, for large damping, the Eurocode prediction agrees relatively well with those of the non-linear models.
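A minimal sketch of the coupled system these models share: a Van der Pol-type wake variable driving an SDOF structural oscillator. This is a generic displacement-coupled variant in non-dimensional form with illustrative parameter values, not the fitted equations of any of the cited models, and SciPy's ODE solver stands in for the MATLAB solver mentioned in the abstract.

```python
import numpy as np
from scipy.integrate import solve_ivp

# Non-dimensional parameters (illustrative values, not fitted to any chimney)
zeta, omega = 0.01, 1.0   # structural damping ratio and natural frequency
eps, Omega = 0.3, 1.0     # Van der Pol parameter and shedding frequency (lock-in)
A, c = 0.05, 0.05         # wake-structure coupling coefficients

def rhs(t, z):
    y, ydot, q, qdot = z                                   # structure y, wake q
    yddot = c * q - 2 * zeta * omega * ydot - omega**2 * y  # SDOF equation
    qddot = A * y - eps * Omega * (q**2 - 1) * qdot - Omega**2 * q  # Van der Pol
    return [ydot, yddot, qdot, qddot]

sol = solve_ivp(rhs, (0.0, 300.0), [0.0, 0.0, 0.1, 0.0],
                t_eval=np.linspace(200.0, 300.0, 2000), rtol=1e-8)
y_amp = np.abs(sol.y[0]).max()   # late-time structural amplitude
q_amp = np.abs(sol.y[2]).max()   # late-time wake-variable amplitude
print(f"y amplitude ~ {y_amp:.2f}, q amplitude ~ {q_amp:.2f}")
```

The wake variable self-excites to a limit cycle and sustains the structural oscillation, which is the essential mechanism behind all the compared models; the models differ mainly in the coupling term and parameter tuning.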

Keywords: chimney, deterministic model, van der pol, vortex-induced vibration

Procedia PDF Downloads 200
7140 Digital Self-Identity and the Role of Interactivity in Psychiatric Assessment and Treatment

Authors: Kevin William Taylor

Abstract:

This work draws upon research in the fields of games development and mental health treatment to assess the influence that interactive entertainment has on the populace, and the potential of technology to affect areas of psychiatric assessment and treatment. It will use studies to establish the evolving direction of interactive media in the development of ‘digital self-identity,’ and how this can be incorporated into treatment to the benefit of psychiatry. It will determine that this approach will require collaborative production between developers and psychiatrists in order to ensure precise goals are met, improving the success of serious gaming for psychiatric assessment and treatment. Analysis documents the reach of video games across a growing global community of gamers, highlighting cases of the positives and negatives of video game usage. The games industry is largely oblivious to the psychological negatives, while psychiatrists encounter new conditions such as gaming addiction, which is now recognized by the World Health Organization. With an increasing number of gamers worldwide, and additional time per day invested in online gaming and character development, the concept of virtual identity as a means of expressing the id needs further study to ensure successful treatment. In conclusion, the assessment and treatment of game-related conditions are currently reactionary, and while some mental health professionals have begun utilizing interactive technologies to assist with the assessment and treatment of conditions, this study will determine how the success of these products can be enhanced. This will include collaboration between software developers and psychiatrists, allowing new avenues of skill-sharing in interactive design and development. Outlining how to innovate approaches to engagement will reap greater rewards in future interactive products developed for psychiatric assessment and treatment.

Keywords: virtual reality, virtual identity, interactivity, psychiatry

Procedia PDF Downloads 127
7139 Analysis of Moving Loads on Bridges Using Surrogate Models

Authors: Susmita Panda, Arnab Banerjee, Ajinkya Baxy, Bappaditya Manna

Abstract:

The design of short- to medium-span high-speed bridges in critical locations is an essential aspect of vehicle-bridge interaction. Due to the dynamic interaction between the moving load and the bridge, mathematical-model or finite element computations become time-consuming. Thus, to reduce the computational effort, a universal approximator using an artificial neural network (ANN) has been used to evaluate the dynamic response of the bridge. Data set generation and training of the surrogate models were conducted on the results obtained from mathematical modeling. Further, the robustness of the surrogate model has been investigated, showing an error of less than 10% relative to conventional methods. Additionally, the dependency of the dynamic response of the bridge on various load and bridge parameters has been highlighted through a parametric study.
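A toy sketch of the surrogate-modeling idea: an ANN is trained on samples from an "expensive" model and then queried cheaply. The closed-form function below is a hypothetical stand-in for the vehicle-bridge interaction model, and the network size and parameter ranges are illustrative only.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

# Hypothetical "expensive" model: peak response as a smooth function of load speed
def expensive_model(v):
    return np.sin(2.0 * v) + 0.3 * v

rng = np.random.default_rng(1)
v_train = rng.uniform(0.0, 3.0, size=300).reshape(-1, 1)   # sampled load speeds
y_train = expensive_model(v_train).ravel()

# ANN surrogate trained on the sampled responses
surrogate = MLPRegressor(hidden_layer_sizes=(32, 32), activation="tanh",
                         max_iter=5000, random_state=0).fit(v_train, y_train)

v_test = np.linspace(0.0, 3.0, 50).reshape(-1, 1)
err = np.abs(surrogate.predict(v_test) - expensive_model(v_test).ravel())
print(f"max abs error: {err.max():.3f}")
```

Once trained, the surrogate answers parametric-study queries in microseconds instead of re-running the full dynamic computation.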

Keywords: artificial neural network, mode superposition method, moving load analysis, surrogate models

Procedia PDF Downloads 82
7138 Applying Multiplicative Weight Update to Skin Cancer Classifiers

Authors: Animish Jain

Abstract:

This study deals with using Multiplicative Weight Update within artificial intelligence and machine learning to create models that can diagnose skin cancer using microscopic images of cancer samples. The multiplicative weight update method combines the predictions of multiple models to try to acquire more accurate results. Logistic Regression, Convolutional Neural Network (CNN), and Support Vector Machine Classifier (SVMC) models are employed within the Multiplicative Weight Update system. These models are trained on pictures of skin cancer from the ISIC-Archive to look for patterns that label unseen scans as either benign or malignant. The models are combined in a multiplicative weight update algorithm that accounts for the precision and accuracy of each model through each successive guess, applying weights to each guess. These guesses and weights are then analyzed together to obtain the final predictions. The research hypothesis stated that there would be a significant difference in accuracy between the three models and the Multiplicative Weight Update system. The SVMC model had an accuracy of 77.88%, the CNN model 85.30%, and the Logistic Regression model 79.09%. Using Multiplicative Weight Update, the algorithm achieved an accuracy of 72.27%. The conclusion drawn was that there was a significant difference in accuracy between the three models and the Multiplicative Weight Update system, and that a CNN model would be the best option for this problem rather than a Multiplicative Weight Update system. This may be because Multiplicative Weight Update is less effective in a binary setting where there are only two possible classifications. 
In a categorical setting with multiple classes and groupings, a Multiplicative Weight Update system might become more proficient as it takes into account the strengths of multiple different models to classify images into multiple categories rather than only two categories, as shown in this study. This experimentation and computer science project can help to create better algorithms and models for the future of artificial intelligence in the medical imaging field.
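The core update rule described above can be sketched in a few lines: each expert (model) starts with equal weight, a weighted vote produces the combined prediction, and experts that guess wrong have their weight multiplied down. The experts and data below are synthetic, not the study's trained classifiers.

```python
import numpy as np

def mwu_predict(expert_preds, labels, eta=0.3):
    """Weighted-majority prediction with multiplicative weight updates.

    expert_preds: (n_rounds, n_experts) array of 0/1 predictions.
    labels:       (n_rounds,) array of true 0/1 labels.
    Returns the combined predictions and the final expert weights.
    """
    n_rounds, n_experts = expert_preds.shape
    w = np.ones(n_experts)
    combined = np.zeros(n_rounds, dtype=int)
    for t in range(n_rounds):
        vote = w @ expert_preds[t] / w.sum()   # weighted fraction voting "1"
        combined[t] = int(vote >= 0.5)
        wrong = expert_preds[t] != labels[t]   # penalize mistaken experts
        w[wrong] *= (1.0 - eta)
    return combined, w

# toy run: expert 0 is always right, experts 1-2 are noisy
rng = np.random.default_rng(0)
labels = rng.integers(0, 2, size=200)
preds = np.stack([labels,
                  np.where(rng.random(200) < 0.7, labels, 1 - labels),
                  np.where(rng.random(200) < 0.6, labels, 1 - labels)], axis=1)
combined, w = mwu_predict(preds, labels)
print((combined == labels).mean(), w.argmax())
```

In this toy run the weights concentrate on the reliable expert, which is the behavior the study relied on when combining the SVMC, CNN, and logistic-regression predictions.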

Keywords: artificial intelligence, machine learning, multiplicative weight update, skin cancer

Procedia PDF Downloads 53
7137 Chemometric Estimation of Inhibitory Activity of Benzimidazole Derivatives by Linear Least Squares and Artificial Neural Networks Modelling

Authors: Sanja O. Podunavac-Kuzmanović, Strahinja Z. Kovačević, Lidija R. Jevrić, Stela Jokić

Abstract:

The subject of this paper is to correlate the antibacterial behavior of benzimidazole derivatives with their molecular characteristics using a chemometric QSAR (Quantitative Structure–Activity Relationships) approach. QSAR analysis has been carried out on the inhibitory activity of benzimidazole derivatives against Staphylococcus aureus. The data were processed by linear least squares (LLS) and artificial neural network (ANN) procedures. The LLS mathematical models have been developed as calibration models for prediction of the inhibitory activity. The quality of the models was validated by the leave-one-out (LOO) technique and by using an external data set. High agreement between experimental and predicted inhibitory activities indicated the good quality of the derived models. These results are part of the CMST COST Action No. CM1306 "Understanding Movement and Mechanism in Molecular Machines".
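The LOO validation step can be sketched as follows: each compound is left out in turn, a least-squares model is refit on the rest, and the held-out compound is predicted; the cross-validated Q² summarizes the result. The descriptor matrix below is synthetic, standing in for real molecular descriptors.

```python
import numpy as np

def loo_q2(X, y):
    """Leave-one-out cross-validated Q^2 for a linear least-squares model."""
    n = len(y)
    preds = np.empty(n)
    for i in range(n):
        mask = np.arange(n) != i
        coef, *_ = np.linalg.lstsq(X[mask], y[mask], rcond=None)
        preds[i] = X[i] @ coef
    press = np.sum((y - preds) ** 2)        # predictive residual sum of squares
    ss_tot = np.sum((y - y.mean()) ** 2)
    return 1.0 - press / ss_tot

# synthetic descriptor matrix (intercept + two "descriptors") and activities
rng = np.random.default_rng(0)
X = np.column_stack([np.ones(30), rng.normal(size=(30, 2))])
y = X @ np.array([1.0, 2.0, -0.5]) + rng.normal(scale=0.1, size=30)
print(f"Q^2 = {loo_q2(X, y):.3f}")
```

A Q² close to 1 indicates that the calibration model predicts left-out compounds well, which is the criterion the study used alongside external-set validation.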

Keywords: antibacterial, benzimidazoles, chemometric, QSAR

Procedia PDF Downloads 296
7136 Fusion of MOLA-based DEMs and HiRISE Images for Large-Scale Mars Mapping

Authors: Ahmed F. Elaksher, Islam Omar

Abstract:

In this project, we used MOLA-based DEMs to orthorectify HiRISE optical images. The MOLA data was interpolated using the kriging interpolation technique. Corresponding tie points were then digitized from both datasets. These points were employed in co-registering both datasets using GIS analysis tools. Different transformation models, including the affine and projective transformation models, were used with different sets and distributions of tie points. Additionally, we evaluated the use of the MOLA elevations in co-registering the MOLA and HiRISE datasets. The planimetric RMSEs achieved for each model are reported. Results suggested the use of 3D-2D transformation models.
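The affine co-registration step can be sketched as a least-squares fit over tie points: each MOLA point maps to its HiRISE counterpart through six affine parameters estimated from the correspondences. The tie points and transform below are synthetic, not the project's data.

```python
import numpy as np

def fit_affine(src, dst):
    """Estimate a 2D affine transform dst ~ [x, y, 1] @ A from tie points."""
    n = len(src)
    G = np.hstack([src, np.ones((n, 1))])              # (n, 3) design matrix
    params, *_ = np.linalg.lstsq(G, dst, rcond=None)   # (3, 2) affine parameters
    return params

rng = np.random.default_rng(0)
src = rng.uniform(0, 1000, size=(12, 2))               # e.g. MOLA tie points
A_true = np.array([[1.02, 0.01], [-0.01, 0.98], [15.0, -7.0]])
dst = np.hstack([src, np.ones((12, 1))]) @ A_true      # matching HiRISE points

A_est = fit_affine(src, dst)
resid = np.hstack([src, np.ones((12, 1))]) @ A_est - dst
print(f"planimetric RMSE: {np.sqrt((resid ** 2).mean()):.2e}")
```

With real, noisy tie points the residuals give the planimetric RMSE reported per model; a projective (8-parameter) transform follows the same pattern with a different design matrix.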

Keywords: photogrammetry, Mars, MOLA, HiRISE

Procedia PDF Downloads 56
7135 Evaluation of QSRR Models by Sum of Ranking Differences Approach: A Case Study of Prediction of Chromatographic Behavior of Pesticides

Authors: Lidija R. Jevrić, Sanja O. Podunavac-Kuzmanović, Strahinja Z. Kovačević

Abstract:

The present study deals with the selection of the most suitable quantitative structure-retention relationship (QSRR) models which should be used in prediction of the retention behavior of basic, neutral, acidic and phenolic pesticides which belong to different classes: fungicides, herbicides, metabolites, insecticides and plant growth regulators. Sum of ranking differences (SRD) approach can give a different point of view on selection of the most consistent QSRR model. SRD approach can be applied not only for ranking of the QSRR models, but also for detection of similarity or dissimilarity among them. Applying the SRD analysis, the most similar models can be found easily. In this study, selection of the best model was carried out on the basis of the reference ranking (“golden standard”) which was defined as the row average values of logarithm of retention time (logtr) defined by high performance liquid chromatography (HPLC). Also, SRD analysis based on experimental logtr values as reference ranking revealed similar grouping of the established QSRR models already obtained by hierarchical cluster analysis (HCA).
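The SRD statistic itself is simple to compute: rank the objects by the reference values (here, the row-average log tr) and by each model's predictions, then sum the absolute rank differences. The numbers below are illustrative, not the study's retention data.

```python
import numpy as np

def srd(model_values, reference_values):
    """Sum of ranking differences between a model's ranking and the reference."""
    ref_rank = np.argsort(np.argsort(reference_values))
    mod_rank = np.argsort(np.argsort(model_values))
    return int(np.abs(mod_rank - ref_rank).sum())

# reference = row-average log tr over methods; models = predicted log tr
reference = np.array([1.2, 0.8, 2.5, 1.9, 0.3])
model_a   = np.array([1.1, 0.9, 2.4, 2.0, 0.4])   # preserves the ordering
model_b   = np.array([2.0, 1.0, 0.5, 1.5, 2.5])   # scrambles the ordering
print(srd(model_a, reference), srd(model_b, reference))
```

A model that preserves the reference ordering scores SRD = 0; larger values mean less consistency, and comparing SRD scores across models gives the ranking and similarity grouping described above.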

Keywords: chemometrics, chromatography, pesticides, sum of ranking differences

Procedia PDF Downloads 359
7134 Cryptocurrency-Based Mobile Payments with Near-Field Communication-Enabled Devices

Authors: Marko Niinimaki

Abstract:

Cryptocurrencies are getting increasingly popular, but very few of them can be conveniently used in daily mobile phone purchases. To solve this problem, we demonstrate how to build a functional prototype of a mobile cryptocurrency-based e-commerce application that communicates with Near-Field Communication (NFC) tags. Using the system, users are able to purchase physical items with an NFC tag that contains an e-commerce URL. The payment is done simply by touching the tag with a mobile device and accepting the payment. Our method is constructive: we describe the design and technologies used in the implementation and evaluate the security and performance of the solution. Our main finding is that the analysis and measurements show that our solution is feasible for e-commerce.
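Storing an e-commerce URL on an NFC tag typically means writing an NDEF well-known URI record (NFC Forum URI RTD). The sketch below encodes a URL into the short-record form of that format; the URL is hypothetical, and a production tag writer would also handle the long-record form and tag capacity checks.

```python
URI_PREFIXES = {"http://www.": 0x01, "https://www.": 0x02,
                "http://": 0x03, "https://": 0x04}

def ndef_uri_record(url: str) -> bytes:
    """Encode a URL as a short NDEF well-known URI record (NFC Forum URI RTD)."""
    code, rest = 0x00, url
    for prefix, c in URI_PREFIXES.items():   # compress a known scheme prefix
        if url.startswith(prefix):
            code, rest = c, url[len(prefix):]
            break
    payload = bytes([code]) + rest.encode("utf-8")
    # 0xD1 = MB | ME | SR flags with TNF=0x01 (well-known); type length 1; type 'U'
    return bytes([0xD1, 0x01, len(payload)]) + b"U" + payload

record = ndef_uri_record("https://shop.example.com/item/42")
print(record.hex())
```

When the user taps the tag, the phone's NFC stack parses this record and dispatches the URL, which is the trigger for the payment flow described above.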

Keywords: cryptocurrency, e-commerce, NFC, mobile devices

Procedia PDF Downloads 160
7133 Dual Language Immersion Models in Theory and Practice

Authors: S. Gordon

Abstract:

Dual language immersion is growing fast in language teaching today. This study provides an overview and evaluation of the different models of Dual language immersion programs in US K-12 schools. First, the paper provides a brief current literature review on the theory of Dual Language Immersion (DLI) in Second Language Acquisition (SLA) studies. Second, examples of several types of DLI language teaching models in US K-12 public schools are presented (including 50/50 models, 90/10 models, etc.). Third, we focus on the unique example of DLI education in the state of Utah, a successful, growing program in K-12 schools that includes: French, Chinese, Spanish, and Portuguese. The project investigates the theory and practice particularly of the case of public elementary and secondary school children that study half their school day in the L1 and the other half in the chosen L2, from kindergarten (age 5-6) through high school (age 17-18). Finally, the project takes the observations of Utah French DLI elementary through secondary programs as a case study. To conclude, we look at the principal challenges, pedagogical objectives and outcomes, and important implications for other US states and other countries (such as France currently) that are in the process of developing similar language learning programs.

Keywords: dual language immersion, second language acquisition, language teaching, pedagogy, teaching, French

Procedia PDF Downloads 149
7132 Fixed-Bed Column Studies of Green Malachite Removal by Use of Alginate-Encapsulated Aluminium Pillared Clay

Authors: Lazhar Mouloud, Chemat Zoubida, Ouhoumna Faiza

Abstract:

The main objective of this study concerns the modeling of breakthrough curves obtained in the adsorption column of malachite green onto alginate-encapsulated aluminium pillared clay in a fixed bed, according to various operating parameters such as the initial concentration, the feed rate, and the fixed-bed height, applying mathematical models, namely those of Bohart and Adams, Wolborska, Bed Depth Service Time, Clark, and Yoon-Nelson. These models allow us to express the different parameters controlling the performance of the dynamic adsorption system. The results have shown that all models were found suitable for describing the whole or a definite part of the dynamic behavior of the column with respect to the flow rate, the inlet dye concentration, and the height of the fixed bed.
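As an illustration of how such breakthrough models are fitted, the sketch below uses the Yoon-Nelson form, whose standard expression is C/C0 = 1 / (1 + exp(k(tau − t))), with k the rate constant and tau the time to 50% breakthrough. The "data" here are synthetic, not the study's column measurements.

```python
import numpy as np
from scipy.optimize import curve_fit

def yoon_nelson(t, k, tau):
    """Yoon-Nelson breakthrough model: C/C0 = 1 / (1 + exp(k * (tau - t)))."""
    return 1.0 / (1.0 + np.exp(k * (tau - t)))

# synthetic breakthrough data standing in for column measurements
t = np.linspace(0, 200, 40)
c_ratio = yoon_nelson(t, 0.05, 100.0)

(k_fit, tau_fit), _ = curve_fit(yoon_nelson, t, c_ratio, p0=[0.01, 80.0])
print(f"k = {k_fit:.3f} 1/min, tau = {tau_fit:.1f} min")
```

The other models (Bohart-Adams, Wolborska, BDST, Clark) are fitted the same way, each yielding parameters that characterize a different part of the column's dynamic behavior.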

Keywords: adsorption column, malachite green, pillared clays, alginate, modeling, mathematical models, encapsulation

Procedia PDF Downloads 490
7131 Augmented Reality to Support the Design of Innovative Agroforestry Systems

Authors: Laetitia Lemiere, Marie Gosme, Gerard Subsol, Marc Jaeger

Abstract:

Agroforestry is recognized as a way of developing sustainable and resilient agriculture that can fight against climate change. However, the number of species combinations, spatial configurations, and management options for trees and crops is vast. These choices must be adapted to the pedoclimatic and socio-economic contexts and to the objectives of the farmer, who therefore needs support in designing his system. Participative design workshops are a good way to integrate the knowledge of several experts in order to design such complex systems. The design of agroforestry systems should take into account both spatial aspects (e.g., spacing of trees within the lines and between lines, tree line orientation, tree-crop distance, species spatial patterns) and temporal aspects (e.g., crop rotations, tree thinning and pruning, tree planting in the case of successional agroforestry). Furthermore, the interactions between trees and crops evolve as the trees grow. However, agroforestry design workshops generally emphasize the spatial aspect only, through the use of static tokens to represent the different species when designing the spatial configuration of the system. Augmented reality (AR) may overcome this limitation, making it possible to visualize dynamic representations of trees and crops and their interactions, while retaining the possibility to physically interact with the system being designed (i.e., move trees, add or remove species, etc.). We propose an ergonomic digital solution capable of assisting a group of agroforestry experts to design an agroforestry system and to represent it. We investigated the use of web-based marker-based AR, which requires neither specific hardware nor installation, so that all users can use their own smartphones straight out of the pocket. We developed a prototype mobilizing the AR.js, ArToolKit.js, and Three.js open source libraries. 
In our implementation, we gradually build a virtual agroforestry system pattern scene from the users' interactions. A specific set of markers initializes the scene properties, and the various plant species are added and located during the workshop design session. The full virtual scene, including the tree positions and their neighborhood, is saved for further uses, such as virtual or augmented instantiation in the farmer's fields. The number of tree species available in the application is gradually increasing; we mobilize 3D digital models for walnut, poplar, wild cherry, and other popular species used in agroforestry systems. The prototype allows shadow computations and the representation of trees at various growth stages, as well as different tree generations, and is thus able to visualize the dynamics of the system over time. Future work will focus on i) the design of complex patterns mobilizing several tree/shrub organizations, not restricted to lines; ii) the design of interfaces related to cultural practices, such as clearing or pruning; iii) the representation of tree-crop interactions. Besides tree shade (light competition), our objective is to also represent below-ground competition (water, nitrogen) and other variables of interest for the design of agroforestry systems (e.g., predicted crop yield).
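The shadow computation mentioned above reduces, in its simplest flat-ground form, to basic trigonometry: shadow length equals tree height divided by the tangent of the sun's elevation. The sketch below uses hypothetical heights and elevations; a full AR scene would also account for tree crown shape, terrain, and sun azimuth.

```python
import math

def shadow_length(tree_height_m, sun_elevation_deg):
    """Length of the shadow cast on flat ground by a tree of given height."""
    return tree_height_m / math.tan(math.radians(sun_elevation_deg))

# a hypothetical 12 m walnut at three sun elevations over the day
for elev in (20, 45, 60):
    print(f"elevation {elev:2d} deg -> shadow {shadow_length(12.0, elev):5.1f} m")
```

Recomputing this per tree and per growth stage is what lets the prototype show how the shaded area on the crop alleys changes as the trees grow.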

Keywords: agroforestry system design, augmented reality, marker-based AR, participative design, web-based AR

Procedia PDF Downloads 151
7130 Augmented Reality in Advertising and Brand Communication: An Experimental Study

Authors: O. Mauroner, L. Le, S. Best

Abstract:

Digital technologies offer many opportunities in the design and implementation of brand communication and advertising. Augmented reality (AR) is an innovative technology in marketing communication that builds on the fact that virtual interaction with a product ad offers additional value to consumers. AR enables consumers to obtain (almost) real product experiences by way of virtual information even before the purchase of a certain product. The aim of AR applications in advertising is the in-depth examination of product characteristics to enhance product knowledge as well as brand knowledge. Interactive design of advertising provides observers with an intense examination of a specific advertising message and therefore leads to better brand knowledge. The elaboration likelihood model and the central route to persuasion strongly support this argumentation. Nevertheless, AR in brand communication is still at an initial stage, and scientific findings about the impact of AR on information processing and brand attitude are therefore rare. The aim of this paper is to empirically investigate the potential of AR applications in combination with traditional print advertising. To that effect, an experimental design with different levels of interactivity is built to measure the impact of the interactivity of an ad on different variables of advertising effectiveness.

Keywords: advertising effectiveness, augmented reality, brand communication, brand recall

Procedia PDF Downloads 282
7129 An Improvement of a Dynamic Model of the Secondary Sedimentation Tank and Field Validation

Authors: Zahir Bakiri, Saci Nacefa

Abstract:

In this paper, a comparison is made between two models, with and without a dispersion term, focusing on the characterization of the movement of the sludge blanket in the secondary sedimentation tank using solid flux theory and the settling velocity. This allowed us to develop one-dimensional models, with and without dispersion, based on a thorough experimental study carried out in situ and on online data comprising the mass load flow, the transfer concentration, and the influent characteristics. In the proposed model, the new settling velocity law used (a double-exponential function) is based on the Vesilind function.
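A double-exponential settling law of the kind referred to above (a Takács-type extension of the Vesilind law) can be sketched as follows. The parameter values here are purely illustrative, not the calibrated values of the proposed model.

```python
import numpy as np

def settling_velocity(X, v0=6.0, rh=0.4, rp=2.5, Xmin=0.1):
    """Double-exponential settling velocity (m/h) vs. solids concentration X (kg/m^3).
    Approaches the Vesilind law v0*exp(-rh*X) at high concentrations."""
    Xs = np.maximum(X - Xmin, 0.0)
    return np.clip(v0 * (np.exp(-rh * Xs) - np.exp(-rp * Xs)), 0.0, None)

X = np.array([0.05, 0.5, 2.0, 6.0])       # sludge concentrations, kg/m^3
v = settling_velocity(X)
flux = X * v                               # gravity solids flux G = X * vs(X)
print(v.round(3), flux.round(3))
```

Plugging this flux into a one-dimensional mass balance over the tank layers, with or without an added dispersion term, gives the two model variants being compared.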

Keywords: wastewater, activated sludge, sedimentation, settling velocity, settling models

Procedia PDF Downloads 368
7128 Relation of the Anomalous Magnetic Moment of Electron with the Proton and Neutron Masses

Authors: Sergei P. Efimov

Abstract:

The anomalous magnetic moment of the electron is calculated by introducing the effective mass of the virtual part of the electron structure. In this case, the anomalous moment is inversely proportional to the effective mass Meff, which is shown to be a linear combination of the neutron, proton, and electrostatic electron field masses. The spin of a rotating structure is assumed to be equal to 3/2, while the spin of a 'bare' electron is equal to unity, the resultant spin being 1/2. A simple analysis gives the coefficients for a linear combination of proton and electron masses, the approximation precision giving here nine significant digits after the decimal point. The summand proportional to α² adds four more digits. Thus, the conception of the effective mass Meff leads to the formula for the total magnetic moment of the electron, which is accurate to fourteen digits. Association with the virtual beta-decay reaction and possible reasons for simplicity of the derived formula are discussed.
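The abstract's specific linear-combination coefficients are not given here, but the standard leading-order QED benchmark it is compared against is easy to reproduce: Schwinger's term a_e = alpha / (2*pi), the summand proportional to alpha mentioned above.

```python
import math

alpha = 1 / 137.035999084            # fine-structure constant (CODATA value)
a_e_leading = alpha / (2 * math.pi)  # Schwinger's leading-order term
print(f"a_e (leading order) ~ {a_e_leading:.9f}")
```

This leading term already agrees with the measured anomalous moment (about 0.001159652) to within roughly 0.2%; higher-order terms in alpha, like the alpha-squared summand the abstract mentions, close the remaining gap.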

Keywords: anomalous magnetic moment of electron, comparison with quantum electrodynamics, effective mass, fifteen significant figures, proton and neutron masses

Procedia PDF Downloads 110
7127 Development of Advanced Virtual Radiation Detection and Measurement Laboratory (AVR-DML) for Nuclear Science and Engineering Students

Authors: Lily Ranjbar, Haori Yang

Abstract:

Online education has been around for several decades, but its importance became evident after the COVID-19 pandemic. Even though the online delivery approach works well for knowledge building through delivering content and oversight processes, it has limitations in developing hands-on laboratory skills, especially in the STEM fields. During the pandemic, many educational institutions faced numerous challenges in delivering lab-based courses, especially in the STEM fields. Also, many students worldwide were unable to practice working with lab equipment due to social distancing or the significant cost of highly specialized equipment. The laboratory plays a crucial role in nuclear science and engineering education: it can engage students and improve their learning outcomes. In addition, online education and virtual labs have gained substantial popularity in engineering and science education. Therefore, developing virtual labs is vital for institutions to deliver high-class education to their students, including their online students. The School of Nuclear Science and Engineering (NSE) at Oregon State University, in partnership with the SpectralLabs company, has developed an Advanced Virtual Radiation Detection and Measurement Lab (AVR-DML) to offer a fully online Master of Health Physics program. It was essential for us to use a system that could simulate nuclear modules that accurately replicate the underlying physics, the nature of radiation and radiation transport, and the mechanics of the instrumentation used in a real radiation detection lab. This was all accomplished using a Realistic, Adaptive, Interactive Learning System (RAILS). RAILS is a comprehensive software simulation-based learning system for use in training. It comprises a web-based learning management system located on a central server, together with a 3D simulation package that is downloaded locally to user machines. 
Users will find that the graphics, animations, and sounds in RAILS create a realistic, immersive environment in which to practice detecting different radiation sources. These features allow students to interact and engage with a STEM lab in all its dimensions: they feel as if they are in a real lab environment and see the same systems they would in one. Unique interactive interfaces were designed and developed by integrating all the tools and equipment needed to run each lab. These interfaces give students full functionality for changing the experimental setup and collecting live data with real-time updates for each experiment. Students can perform all experimental setup and parameter changes manually, and experimental results can then be tracked and analyzed with an oscilloscope, a multi-channel analyzer (MCA), or a single-channel analyzer (SCA). The advanced virtual radiation detection and measurement laboratory developed in this study enabled the NSE school to offer a fully online MHP program. This flexibility of course modality helped attract more non-traditional students, including international students. The lab is a valuable educational tool: students can walk around the virtual lab, make mistakes, and learn from them, and they have unlimited time to repeat and engage with experiments. This lab will also help speed up training in nuclear science and engineering.

Keywords: advanced radiation detection and measurement, virtual laboratory, realistic adaptive interactive learning system (rails), online education in stem fields, student engagement, stem online education, stem laboratory, online engineering education

Procedia PDF Downloads 74
7126 Mapping Poverty in the Philippines: Insights from Satellite Data and Spatial Econometrics

Authors: Htet Khaing Lin

Abstract:

This study explores the relationship between poverty levels in the Philippines and a diverse set of environmental and socio-economic variables for the years 2012, 2015, and 2018. Employing Ordinary Least Squares (OLS), Spatial Lag Models (SLM), and Spatial Error Models (SEM), it examines the dynamics of key indicators, including daytime and nighttime land surface temperature, cropland surface, urban land surface, rainfall, population size, the normalized difference water and vegetation indices, and drought indices. The findings reveal consistent patterns as well as unexpected correlations, highlighting the need for nuanced policies that address the multifaceted challenges arising from the interplay of environmental and socio-economic factors.
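The spatial lag specification behind an SLM is y = ρWy + Xβ + ε, where W is a spatial weight matrix and the lagged outcome Wy is endogenous. The sketch below is a minimal, hypothetical illustration of one standard way to estimate it, spatial two-stage least squares with instruments [X, WX, W²X], on synthetic data; the ring-shaped weight matrix stands in for the actual contiguity structure of Philippine administrative units, which is not given in the abstract.

```python
import numpy as np

def spatial_lag_2sls(y, X, W):
    """Estimate y = rho*W y + X beta + eps by spatial two-stage least
    squares, instrumenting the endogenous lag W y with [X, WX, W^2 X]."""
    n = len(y)
    Xc = np.column_stack([np.ones(n), X])        # add intercept
    Z = np.column_stack([Xc, W @ y])             # regressors incl. spatial lag
    H = np.column_stack([Xc, W @ X, W @ W @ X])  # instrument set
    P = H @ np.linalg.solve(H.T @ H, H.T)        # projection onto instruments
    Zhat = P @ Z
    return np.linalg.solve(Zhat.T @ Z, Zhat.T @ y)  # [intercept, beta..., rho]

# Synthetic check with known parameters
rng = np.random.default_rng(0)
n = 400
W = np.zeros((n, n))                 # row-standardised weights on a ring
for i in range(n):
    W[i, (i - 1) % n] = W[i, (i + 1) % n] = 0.5
X = rng.normal(size=(n, 2))
rho, beta = 0.4, np.array([1.0, 2.0, -1.0])      # intercept and two slopes
eps = rng.normal(scale=0.1, size=n)
# Reduced form: y = (I - rho W)^-1 (X beta + eps)
y = np.linalg.solve(np.eye(n) - rho * W,
                    np.column_stack([np.ones(n), X]) @ beta + eps)
coef = spatial_lag_2sls(y, X, W)
print(coef)  # approximately [1.0, 2.0, -1.0, 0.4]
```

In practice a dedicated spatial econometrics library (e.g. PySAL's spreg module) would be used, which also supplies the SEM estimator and diagnostics; the hand-rolled version above only shows the mechanics.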

Keywords: poverty analysis, OLS, spatial lag models, spatial error models, Philippines, google earth engine, satellite data, environmental dynamics, socio-economic factors

Procedia PDF Downloads 73
7125 Geopotential Models Evaluation in Algeria Using Stochastic Method, GPS/Leveling and Topographic Data

Authors: M. A. Meslem

Abstract:

For precise geoid determination with the remove-compute-restore technique, a reference field is used to subtract the long and medium wavelengths of the gravity field from the observation data. A comparison between candidate models should therefore be made in order to select the optimal reference gravity field. In this context, two recent global geopotential models have been selected for a comparison study over Northern Algeria: the Earth Gravitational Model (EGM2008) and the Global Gravity Model (GECO), the latter conceived as a combination of EGM2008 with the anomalous potential derived from a GOCE satellite-only global model. Free-air gravity anomalies in the study area have been used to compute residual data from both gravity field models, with a Digital Terrain Model (DTM) used to subtract the residual terrain effect from the gravity observations. The residual data were used to generate local empirical covariance functions, which were fitted to a closed form in order to compare their statistical behavior in both cases. Finally, height anomalies were computed from both geopotential models and compared, using least squares adjustment, to a set of GPS-levelled benchmark points. The results, described in detail in this paper, point to a slight advantage of the GECO model overall, both in the comparison of error degree variances and in the ground-truth evaluation.

Keywords: quasigeoid, gravity anomalies, covariance, GGM

Procedia PDF Downloads 122
7124 A Pilot Study Assessing the Effectiveness of a Virtual Reality Intervention for Alleviating Pain and Anxiety in the Pediatric Emergency Room

Authors: Muqadis Shazia Rajpar, Lawrence Mitelberg, Rubaiat S. Ahmed, Jemer Garrido, Rukhsana Hossain, Sergey M. Motov

Abstract:

Distraction techniques have been used to reduce pain, anxiety, and stress in various healthcare settings to facilitate care and make visits less unpleasant. Using virtual reality (VR) in the pediatric emergency setting can be a valuable, effective, and safe non-pharmacological alternative to the current standard of care for pain and anxiety management in pediatric patients. Our pilot study aimed to evaluate the effectiveness of a VR-based intervention as an alternative distraction modality to alleviate pain and anxiety associated with pediatric emergency department (ED) visits and acute pain conditions. The pilot study ran from November 16 to December 9, 2022, and covered pediatric ED visits for pain, anxiety, or both. Patients were selected based on a novel VR protocol to receive the VR intervention, with pre- and post-intervention surveys administered concerning pain/anxiety ratings and pain scores (Wong-Baker FACES/NRS). Descriptive statistics, paired t-tests, and Fisher's exact test were used for data analysis, with a p-value of 0.05 assumed for significance. A total of 33 patients (21 females, 12 males), ages 5-20 (M = 10.5, SD = 3.43), participated in this study: 12 patients had pain, 2 had anxiety, and 19 had both pain and anxiety. There was a statistically significant decrease of just under one point in post-intervention pain scores (6.48 vs. 5.62, p < .001). There was a statistically significant reduction in the percentage of patients suffering from "considerable" or "great" pain after the VR intervention (51.6% to 42.3%, p < .001), with a corresponding increase in the proportion of patients with "slight" or "moderate" pain (48.4% to 57.7%, p < .001). Lastly, we demonstrated a decrease in anxiety among patients after utilizing VR (63.6% vs. 36.4%, p < .001). To conclude, VR can alleviate pain and anxiety in pediatric patients and is a useful non-pharmacological tool in the emergency setting.
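The core comparison here, the same patients scored before and after the intervention, is the textbook use case for a paired t-test. The sketch below runs one on invented pre/post pain scores (the study's raw data are not reproduced in the abstract) to show the mechanics.

```python
import numpy as np
from scipy import stats

# Hypothetical pre/post pain scores on a 0-10 scale for ten patients
# (illustrative values, not the study's data)
pre  = np.array([7, 8, 6, 9, 5, 7, 8, 6, 7, 9])
post = np.array([6, 7, 5, 8, 5, 6, 7, 5, 6, 8])

# Paired t-test: the same patients are measured before and after VR,
# so the test operates on the within-patient differences
t, p = stats.ttest_rel(pre, post)
print(f"mean change = {np.mean(pre - post):.2f}, t = {t:.2f}, p = {p:.4f}")
```

For the categorical outcome (proportion of patients in "considerable"/"great" pain before vs. after), the analogous call would be `scipy.stats.fisher_exact` on the 2x2 contingency table.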

Keywords: anxiety, emergency room, pain management, pediatric emergency medicine, virtual reality

Procedia PDF Downloads 71
7123 The Role of Planning and Memory in the Navigational Ability

Authors: Greeshma Sharma, Sushil Chandra, Vijander Singh, Alok Prakash Mittal

Abstract:

Navigational ability requires spatial representation, planning, and memory. It covers three interdependent domains: cognitive and perceptual factors, neural information processing, and variability in brain microstructure. Many attempts have been made to examine the role of spatial representation in navigational ability, and individual differences have been identified in the neural substrate, but the influence of planning and memory on navigational ability also needs to be addressed. The present study aims to evaluate the relation of these factors to navigational ability. A total of 30 participants volunteered for a study set in a virtual shopping complex and were subsequently classified as good or bad navigators based on their performance. The results showed that planning ability was the factor most strongly correlated with navigational ability and also the factor that discriminated between good and bad navigators. A correlation was also found between spatial memory recall and navigational ability, while non-verbal episodic memory and spatial memory recall were additionally correlated with the learning variable. This study thus attempts to distinguish people with greater and lesser navigational ability on the basis of planning and memory.
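The analysis pattern described, correlating cognitive scores with navigation performance and then contrasting good and bad navigators, can be sketched in a few lines. All data below are simulated (the study's own scores are not published in the abstract), with planning deliberately generated to drive navigation performance.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
# Simulated scores for 30 participants (illustrative, not the study's data):
# planning drives navigation performance with noise; memory is unrelated here
planning   = rng.normal(50, 10, size=30)
navigation = 0.8 * planning + rng.normal(0, 6, size=30)
memory     = rng.normal(50, 10, size=30)

r_plan, p_plan = stats.pearsonr(planning, navigation)
r_mem,  p_mem  = stats.pearsonr(memory, navigation)
print(f"planning vs navigation: r = {r_plan:.2f} (p = {p_plan:.3g})")
print(f"memory   vs navigation: r = {r_mem:.2f} (p = {p_mem:.3g})")

# Median split into good/bad navigators, then compare planning scores
good = navigation >= np.median(navigation)
t, p = stats.ttest_ind(planning[good], planning[~good])
print(f"good vs bad navigators, planning: t = {t:.2f}, p = {p:.3g}")
```

The median split is one simple way to operationalise the good/bad classification; the abstract does not state which criterion the authors actually used.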

Keywords: memory, planning, navigational ability, virtual reality

Procedia PDF Downloads 309
7122 A Novel Study Contrasting Traditional Autopsy with Post-Mortem Computed Tomography in Falls Leading to Death

Authors: Balaji Devanathan, Gokul G., Abilash S., Abhishek Yadav, Sudhir K. Gupta

Abstract:

Background: As an alternative to the traditional autopsy, a virtual autopsy is carried out using scanning and imaging technologies, mainly post-mortem computed tomography (PMCT). This facility aims to supplement traditional autopsy results and to reduce or eliminate internal dissection in subsequent autopsies, which the deceased's relatives have historically disapproved of for emotional and religious reasons. The non-invasive, objective, and preservative PMCT is what family and friends would prefer over a traditional autopsy. The study also examines the benefits and drawbacks of each technology, demonstrating the significance of contemporary imaging in the field of forensic medicine. Results: One hundred fatal falls were analysed by the authors. Before autopsy, each case underwent a PMCT examination using a 16-slice multi-slice spiral CT scanner. Using specialised software, MPR and VR reconstructions were carried out after the raw images were captured. Fractures of the skull, facial bones, clavicle, scapula, and vertebrae were detected more accurately than in a routine autopsy, and the interpretation of pneumothorax, pneumoperitoneum, pneumocephalus, and hemosinus is much enhanced by PMCT compared with traditional autopsy. Conclusion: A PMCT-based virtual autopsy is useful for visualising skeletal damage in fall-from-height cases and is thus an ideal tool in trauma cases. When assessing trauma victims, PMCT should be viewed as a helpful adjunct to traditional autopsy, because it can identify additional bone fractures in body parts that are challenging to examine during autopsy, such as posterior regions, which helps the pathologist reconstruct the circumstances and determine the cause of death.

Keywords: PMCT, fall from height, autopsy, fracture

Procedia PDF Downloads 18
7121 The Influence of Negative Online Word of Mouth on Consumer's Online Purchasing Intention in Sri Lanka through Virtual Snowball Sampling Method: A Special Reference from Northern Province

Authors: Sutharsini Jesuthasan, N. Umakanth

Abstract:

The impact of electronic word of mouth (E-WOM) on consumers' purchasing intentions has been a popular research topic for a long time, and E-WOM has now undergone a new evolution through social media. Previously, people could speak with anyone on the internet; social media additionally enables them to talk with colleagues, friends, and other people they know. This new pathway for E-WOM may therefore be more powerful in shaping purchase intention, and its negative side is especially important in this competitive era. This study accordingly examines negative E-WOM within the context of social media such as Facebook and, in particular, identifies the influence of negative E-WOM on social media on consumers' purchase intention. The virtual snowball sampling method was used by the researcher to reach this hidden population, and SPSS 20.0 was used for data analysis. Conclusions and recommendations are given based on the findings, which should be of value to both researchers and participants.

Keywords: word of mouth, social media, purchase intention, electronic word of mouth

Procedia PDF Downloads 127
7120 Plant Identification Using Convolution Neural Network and Vision Transformer-Based Models

Authors: Virender Singh, Mathew Rees, Simon Hampton, Sivaram Annadurai

Abstract:

Plant identification is a challenging task that aims to identify the family, genus, and species of a plant according to its morphological features. Automated deep learning-based computer vision algorithms are widely used for identifying plants and can help users narrow down the possibilities. However, numerous morphological similarities between and within species make correct classification difficult. In this paper, we tested custom convolution neural network (CNN) and vision transformer (ViT) based models, using the PyTorch framework, to classify plants. We used a large dataset of 88,000 images provided by the Royal Horticultural Society (RHS) and a smaller dataset of 16,000 images from the PlantClef 2015 dataset for classifying plants at the genus and species levels, respectively. Our results show that for classification at the genus level, ViT models outperform the CNN-based models ResNet50 and ResNet-RS-420, as well as other state-of-the-art CNN-based models suggested in previous studies on a similar dataset, with a top accuracy of 83.3%. At the species level, ViT models again outperform ResNet50 and ResNet-RS-420, with a top accuracy of 92.5%. We also show that the correct set of augmentation techniques plays an important role in classification success. In conclusion, these results could help end users, professionals, and the general public alike identify plants more quickly and with improved accuracy.

Keywords: plant identification, CNN, image processing, vision transformer, classification

Procedia PDF Downloads 73
7119 Evaluation of a Method for the Virtual Design of a Software-based Approach for Electronic Fuse Protection in Automotive Applications

Authors: Dominic Huschke, Rudolf Keil

Abstract:

New driving functionalities such as highly automated driving have a major impact on the electrics/electronics architecture of future vehicles and inevitably lead to higher safety requirements. Partly due to these increased requirements, the vehicle industry is increasingly looking at semiconductor switches as an alternative to conventional melting fuses. The protective functionality of semiconductor switches can be implemented in hardware as well as in software. A current approach discussed in science and industry is to implement a model of the protected low voltage power cable on a microcontroller in order to calculate the cable's temperature. The current information is provided by the continuous current measurement of the semiconductor switch, and the microcontroller issues the signal to open the switch when a previously defined limit for the temperature of the low voltage power cable is exceeded. A setup for testing the described principle of electronic fuse protection of a low voltage power cable is built and successfully validated with experiments. The evaluation criterion is the deviation of the measured temperature of the low voltage power cable from the specified limit temperature at the moment the semiconductor switch is opened; the analysis is carried out both with an assumed and with a measured ambient temperature. Subsequently, the experimentally performed investigations are reproduced in a virtual environment, with an explicit focus on simulating the behavior of the microcontroller, running the implemented cable model, in a real-time environment. The generated results are then compared with those of the experiments, and on this basis the completely virtual design of the described approach is considered valid.
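The principle described, integrating a cable temperature model from the measured current and tripping at a limit, can be illustrated with a lumped first-order thermal model. All parameter values below are assumed for illustration; real cable models are calibrated from the wire harness specification.

```python
# Minimal sketch of a software fuse: a lumped thermal model of the cable,
# updated each control cycle from the switch's current measurement.
# All parameters are assumed, not taken from the paper.
R_el  = 0.005   # electrical resistance of the cable [ohm]
R_th  = 8.0     # thermal resistance to ambient [K/W]
C_th  = 60.0    # thermal capacitance [J/K]
T_amb = 25.0    # ambient temperature [degC]
T_max = 105.0   # limit temperature that triggers the switch [degC]
dt    = 0.1     # update interval of the microcontroller loop [s]

def step(T, current):
    """One integration step: Joule heating in, conduction to ambient out."""
    p_in  = current ** 2 * R_el      # I^2 * R losses [W]
    p_out = (T - T_amb) / R_th       # heat flow to ambient [W]
    return T + dt * (p_in - p_out) / C_th

T, t = T_amb, 0.0
overload_current = 60.0              # A, well above the cable's rating
while T < T_max:                     # control loop runs until the trip
    T = step(T, overload_current)
    t += dt
print(f"switch opened after {t:.1f} s at {T:.1f} degC")
```

With these parameters the steady-state temperature under 60 A would be well above the limit, so the model trips after a few hundred seconds; a melting fuse would be characterised by a comparable time-current curve, which is exactly what the software model replaces.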

Keywords: automotive wire harness, electronic fuse protection, low voltage power cable, semiconductor-based fuses, software-based validation

Procedia PDF Downloads 88
7118 Sensitivity and Uncertainty Analysis of One Dimensional Shape Memory Alloy Constitutive Models

Authors: A. B. M. Rezaul Islam, Ernur Karadogan

Abstract:

Shape memory alloys (SMAs) are known for their shape memory effect and pseudoelastic behavior. Their thermomechanical behavior has been modeled by numerous researchers from both microscopic thermodynamic and macroscopic phenomenological points of view. The Tanaka, Liang-Rogers, and Ivshin-Pence models are among the most popular SMA macroscopic phenomenological constitutive models; they describe SMA behavior in terms of stress, strain, and temperature. These models involve material parameters that carry associated uncertainty. At different operating temperatures, this uncertainty propagates to the output when the material is subjected to loading followed by unloading, and such propagation in real-life applications can result in performance discrepancies or failure at extreme conditions. To address this, we used a probabilistic approach to perform sensitivity and uncertainty analyses of the Tanaka, Liang-Rogers, and Ivshin-Pence models. The Sobol and extended Fourier Amplitude Sensitivity Testing (eFAST) methods were used for the sensitivity analysis of simulated isothermal loading/unloading at various operating temperatures. The results make it evident that the models' behavior varies with operating temperature and loading condition; the average and stress-dependent sensitivity indices identify the most significant parameters at several temperatures. This work presents the sensitivity and uncertainty analysis results and compares them across temperatures and loading conditions for all three models. The analysis will aid in designing engineering applications by reducing the probability of model failure due to uncertainty in the input parameters. A proper understanding of the sensitive parameters and of uncertainty propagation at several operating temperatures and loading conditions is therefore recommended for the Tanaka, Liang-Rogers, and Ivshin-Pence models.
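First-order Sobol indices measure the share of output variance attributable to each input alone, S_i = Var(E[Y|X_i]) / Var(Y). The sketch below estimates them with Saltelli's pick-freeze Monte Carlo scheme on a stand-in model with a known answer; the study applies the same indices to the Tanaka, Liang-Rogers, and Ivshin-Pence stress-strain responses, whose parameter lists are not reproduced here.

```python
import numpy as np

def model(p):
    # Stand-in response: strongly nonlinear in p1, weaker in p2, tiny in p3
    return p[:, 0] ** 2 + 0.5 * p[:, 1] + 0.05 * p[:, 2]

rng = np.random.default_rng(0)
n, k = 100_000, 3
A = rng.uniform(size=(n, k))     # two independent sample matrices on [0,1]^k
B = rng.uniform(size=(n, k))
yA, yB = model(A), model(B)
var_y = np.var(np.concatenate([yA, yB]))

S = []
for i in range(k):
    ABi = A.copy()
    ABi[:, i] = B[:, i]          # "pick-freeze": swap in column i from B
    # First-order index estimator (Saltelli 2010 formulation)
    S.append(np.mean(yB * (model(ABi) - yA)) / var_y)

print([round(s, 3) for s in S])  # S1 dominates, S3 is near zero
```

For this toy model the variance decomposition is analytic (S1 ≈ 0.81, S2 ≈ 0.19, S3 ≈ 0.002), so the Monte Carlo estimates can be checked directly, which is a useful sanity test before pointing the same machinery at an expensive constitutive model.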

Keywords: constitutive models, FAST sensitivity analysis, sensitivity analysis, sobol, shape memory alloy, uncertainty analysis

Procedia PDF Downloads 123
7117 Hierarchical Queue-Based Task Scheduling with CloudSim

Authors: Wanqing You, Kai Qian, Ying Qian

Abstract:

The concepts of cloud computing provide users with infrastructure, platform, and software as a service, making those services more accessible via the Internet. To better analyze the performance of cloud computing provisioning policies as well as resource allocation strategies, a toolkit named CloudSim was proposed. With CloudSim, a cloud computing environment can easily be constructed by modelling and simulating cloud computing components such as datacenters, hosts, and virtual machines. A good scheduling strategy is the key to achieving load balancing among different machines as well as to improving the utilization of basic resources. Existing scheduling algorithms may work well in some presumptive cases on a single machine; however, they are unable to make the best decision for an unforeseen future. In a real-world scenario, there would be numerous tasks as well as several virtual machines working in parallel. Based on the concept of multiple queues, this paper presents a new scheduling algorithm for scheduling tasks with CloudSim that takes into account several parameters: the machines' capacity, the priority of tasks, and the history log.
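CloudSim itself is a Java toolkit, so the following is a language-agnostic sketch of the multi-queue idea rather than CloudSim's actual API: tasks are binned into one queue per priority level, the queues are drained from highest priority down, and each task goes to the currently least-loaded VM. All class and field names are illustrative.

```python
import heapq
from dataclasses import dataclass
from itertools import count

@dataclass
class Task:
    name: str
    priority: int        # 0 = highest priority
    length: float        # execution cost (arbitrary instruction units)

@dataclass
class VM:
    name: str
    capacity: float      # instructions per second
    load: float = 0.0    # accumulated busy time so far

def schedule(tasks, vms):
    queues = {}                       # one arrival-ordered heap per priority
    ticket = count()
    for t in tasks:
        heapq.heappush(queues.setdefault(t.priority, []), (next(ticket), t))
    plan = []
    for prio in sorted(queues):       # drain queues from highest priority down
        while queues[prio]:
            _, t = heapq.heappop(queues[prio])
            vm = min(vms, key=lambda v: v.load)   # least-loaded VM first
            vm.load += t.length / vm.capacity
            plan.append((t.name, vm.name))
    return plan

vms = [VM("vm0", capacity=1000), VM("vm1", capacity=500)]
tasks = [Task("t1", 1, 4000), Task("t2", 0, 2000), Task("t3", 0, 1000)]
plan = schedule(tasks, vms)
for task, vm in plan:
    print(task, "->", vm)
```

The history-log parameter mentioned in the abstract would extend `VM.load` with observed past performance rather than the optimistic nominal capacity; the dispatch structure stays the same.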

Keywords: hierarchical queue, load balancing, CloudSim, information technology

Procedia PDF Downloads 402
7116 Measuring Environmental Efficiency of Energy in OPEC Countries

Authors: Bahram Fathi, Seyedhossein Sajadifar, Naser Khiabani

Abstract:

Data envelopment analysis (DEA) has recently gained popularity in energy efficiency analysis. A common feature of previously proposed DEA models for measuring energy efficiency performance is that they treat energy consumption as an input within a production framework without considering undesirable outputs. However, energy use generates undesirable outputs as byproducts of producing desirable outputs. Within a joint production framework covering both desirable and undesirable outputs, this paper presents several DEA-type linear programming models for measuring energy efficiency performance. In addition to considering undesirable outputs, our models treat different energy sources as different inputs, so that changes in the energy mix can be accounted for when evaluating energy efficiency. The proposed models are applied to measure the energy efficiency performance of 12 OPEC countries, and the results obtained are presented.
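A DEA score is the solution of one linear program per decision-making unit (DMU). The sketch below solves the plain input-oriented CCR envelopment model with SciPy on made-up data; the paper's actual models additionally handle undesirable outputs (e.g. emissions) and multiple energy inputs, which add constraints but keep the same LP structure.

```python
import numpy as np
from scipy.optimize import linprog

# Toy data: one row per DMU; two inputs, one output (illustrative only)
X = np.array([[2.0, 4.0], [3.0, 2.0], [6.0, 5.0]])   # inputs
Y = np.array([[1.0], [1.0], [1.0]])                   # desirable output

def ccr_efficiency(j0):
    """Input-oriented CCR score for DMU j0: min theta s.t.
    sum_j lam_j x_j <= theta * x_j0 and sum_j lam_j y_j >= y_j0."""
    n, m = X.shape
    s = Y.shape[1]
    c = np.r_[1.0, np.zeros(n)]                       # variables: [theta, lam]
    A_in = np.c_[-X[j0].reshape(m, 1), X.T]           # inputs:  X lam - theta x0 <= 0
    A_out = np.c_[np.zeros((s, 1)), -Y.T]             # outputs: -Y lam <= -y0
    res = linprog(c, A_ub=np.r_[A_in, A_out],
                  b_ub=np.r_[np.zeros(m), -Y[j0]],
                  bounds=[(0, None)] * (n + 1))
    return res.x[0]                                   # efficiency score theta*

for j in range(len(X)):
    print(f"DMU {j}: theta = {ccr_efficiency(j):.3f}")
```

In this example DMUs 0 and 1 lie on the frontier (theta = 1) while DMU 2 is dominated; a joint-production variant would add rows forcing undesirable outputs not to be understated, following whichever treatment (e.g. weak disposability) the model adopts.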

Keywords: energy efficiency, undesirable outputs, data envelopment analysis

Procedia PDF Downloads 714
7115 Enhancing Model Interoperability and Reuse by Designing and Developing a Unified Metamodel Standard

Authors: Arash Gharibi

Abstract:

Mankind has always used models to solve problems. Essentially, models are simplified versions of reality; the need for them stems from having to deal with complexity, since many processes and phenomena are too complex to be described completely. Thus a fundamental requirement is that a model contain the characteristic features that are essential in the context of the problem to be solved or described. Models are used in virtually every scientific domain, and during recent decades their number has increased exponentially. Publication of models as part of original research has traditionally taken place in scientific periodicals, series, monographs, agency reports, national journals, and laboratory reports, which makes it difficult for interested groups and communities to stay informed about the state of the art. During the modeling process, many important decisions are made that impact the final form of the model. Without a record of these considerations, the final model remains ill-defined and open to varying interpretations. Unfortunately, the details of these considerations are often lost, and where information about a model does exist, it is likely to be written intuitively, in differing layouts and degrees of detail. To overcome these issues, different domains have attempted to implement their own approaches to preserving model information in the form of model documentation. The most frequently cited model documentation approaches are domain-specific, are not applicable to existing models, and do not allow evolutionary flexibility or intrinsic corrections and improvements. All of these issues stem from the lack of a unified standard for model documentation. As a way forward, this research proposes a new standard for capturing and managing model information in a unified way, so that interoperability and reusability of models become possible. The standard will also be evolutionary, meaning that members of the modeling community can contribute to its ongoing development and improvement. In this paper, three of the most common current metamodels are reviewed, and based on the pros and cons of each, a new metamodel is proposed.

Keywords: metamodel, modeling, interoperability, reuse

Procedia PDF Downloads 180