Search results for: Irene A. Monte
144 Investigation of Efficient Production of ¹³⁵La for the Auger Therapy Using Medical Cyclotron in Poland
Authors: N. Zandi, M. Sitarz, J. Jastrzebski, M. Vagheian, J. Choinski, A. Stolarz, A. Trzcinska
Abstract:
¹³⁵La, with a half-life of 19.5 h, can be considered a good candidate for Auger therapy. ¹³⁵La decays almost 100% by electron capture to stable ¹³⁵Ba. In this study, all important reactions leading to ¹³⁵La production are investigated in detail, and the corresponding theoretical yield for each reaction, obtained using the Monte Carlo method (MCNPX code), is presented. Among them, the best reaction was selected based on cost-effectiveness and production yield, considering the Polish facilities equipped with a medical cyclotron. ¹³⁵La is produced using the 16.5 MeV proton beam of a General Electric PETtrace cyclotron through the ¹³⁵Ba(p,n)¹³⁵La reaction. Moreover, to allow a consistent comparison between the theoretical calculations and the experimental measurements, both the beam current and the proton beam energy were measured experimentally. The measured proton energy was then used as the entrance energy for the theoretical calculations. Finally, the production yield was measured and compared with the results obtained using the MCNPX code. The results show that the experimental measurements and the theoretical calculations are in good agreement.
Keywords: efficient ¹³⁵La production, proton cyclotron energy measurement, MCNPX code, theoretical and experimental production yield
Procedia PDF Downloads 142
143 Subjective Temporal Resources: On the Relationship Between Time Perspective and Chronic Time Pressure to Burnout
Authors: Diamant Irene, Dar Tamar
Abstract:
Burnout, conceptualized within the framework of stress research, is to a large extent a result of a threat to time resources or a feeling of time shortage. In reaction to numerous tasks, deadlines, high output, and the management of different duties encompassing work-home conflicts, many individuals experience ‘time pressure’. Time pressure is characterized as the perception of a lack of available time relative to the amount of workload. It can be a result of local objective constraints, but it can also be a chronic attribute of coping with life. As such, time pressure is associated in the literature with the general experience of stress and can therefore be a direct contributory factor to burnout. The present study examines the relation of chronic time pressure – the feeling of time shortage and of being rushed – to another central aspect of subjective temporal experience: time perspective. Time perspective is a stable personal disposition, capturing the extent to which people subjectively remember the past, live in the present, and/or anticipate the future. Based on Hobfoll’s Conservation of Resources theory, it was hypothesized that individuals with chronic time pressure would experience a permanent threat to their time resources, resulting in relatively increased burnout. In addition, it was hypothesized that different time perspective profiles, based on Zimbardo’s typology of five dimensions – Past Positive, Past Negative, Present Hedonistic, Present Fatalistic, and Future – would be related to different magnitudes of chronic time pressure and burnout. We expected that individuals with ‘Past Negative’ or ‘Present Fatalistic’ time perspectives would experience more burnout, with chronic time pressure being a moderator variable. Conversely, individuals with a ‘Present Hedonistic’ perspective – with little concern for the future consequences of actions – would experience less chronic time pressure and less burnout.
Another angle of temporal experience examined in this study is the difference between the actual distribution of time (as in a typical day) and the desired distribution of time (how time would optimally be distributed during a day). It was hypothesized that there would be a positive correlation between the gap between these two time distributions and both chronic time pressure and burnout. Data was collected through an online self-reporting survey distributed on social networks, with 240 participants (aged 21-65) recruited through convenience and snowball sampling methods from various organizational sectors. The results of the present study support the hypotheses and constitute a basis for future debate regarding the elements of burnout in the modern work environment, with an emphasis on subjective temporal experience. Our findings point to the importance of chronic and stable temporal experiences, such as time pressure and time perspective, in occupational experience. The findings are also discussed with a view to the development of practical methods of burnout prevention.
Keywords: conservation of resources, burnout, time pressure, time perspective
Procedia PDF Downloads 176
142 Measurement and Analysis of Radiation Doses to Radiosensitive Organs from CT Examination of the Cervical Spine Using Radiochromic Films and Monte Carlo Simulation Based Software
Authors: Khaled Soliman, Abdullah Alrushoud, Abdulrahman Alkhalifah, Raed Albathi, Salman Altymiat
Abstract:
Radiation doses received by patients undergoing Computed Tomography (CT) examination of the cervical spine were evaluated using Gafchromic XR-QA2 films and the CT-EXPO software (ver. 2.3), in order to document our clinical dose values and to compare our results with benchmarks reported in the current literature. Radiochromic films have recently been used as a practical dosimetry tool that provides dose profile information not available from the standard ionisation chamber routinely used in CT dosimetry. We developed an in-house program to use the films to calculate the Entrance Dose Length Product (EDLP, in mGy.cm) and to relate the EDLP to various organ doses calculated using the CT-EXPO software. We also calculated a conversion factor (in mSv/mGy.cm) relating the EDLP to the effective dose (ED) from the examination using CT-EXPO. Variability among different types of CT scanners and dose modulation methods is reported for at least three major CT brands available at our medical institution. Our work describes the dosimetry method, and the results are reported. The method can be used for in-vivo dosimetry, but this work only reports results obtained from adult female anthropomorphic phantom studies.
Keywords: CT dosimetry, gafchromic films, XR-QA2, CT-Expo software
Procedia PDF Downloads 471
141 Seismic Fragility of Base-Isolated Multi-Story Piping System in Critical Facilities
Authors: Bu Seog Ju, Ho Young Son, Yong Hee Ryu
Abstract:
This study focuses on the evaluation of the seismic fragility of a multi-story piping system installed in critical structures isolated with triple friction pendulum bearings. The concept of this study is to isolate the critical building structure as well as its nonstructural components, especially the piping system, in order to mitigate earthquake damage and achieve a reliable seismic design. The building system and the multi-story piping system were modeled in OpenSees. In particular, the triple friction pendulum isolator model accounted for the vertical and horizontal coupling behavior of the building system subjected to seismic ground motions. Consequently, in order to generate the seismic fragility of the base-isolated multi-story piping system, analyses with 21 selected seismic ground motions were carried out using Monte Carlo simulation to account for the uncertainties in demand. Finally, system-level fragility curves corresponding to the limit state of the piping system were derived at each T-joint, a common failure point in piping systems during and after an earthquake. Additionally, the system-level fragilities were evaluated at the first- and second-floor levels of the critical structures.
Keywords: fragility, friction pendulum bearing, nonstructural component, seismic
Procedia PDF Downloads 150
140 Numerical Response of Planar HPGe Detector for 241Am Contamination of Various Shapes
Authors: M. Manohari, Himanshu Gupta, S. Priyadharshini, R. Santhanam, S. Chandrasekaran, B. Venkatraman
Abstract:
Injection is one of the potential routes of intake in a radioactive facility. The internal dose due to this intake is monitored at the Radiation Emergency Medical Centre, IGCAR, using a portable planar HPGe detector. A contaminated wound may have different shapes, and in a reprocessing facility the potential for wound contamination with actinides is higher. Efficiency is one of the input parameters for the estimation of internal dose. Estimating these efficiencies experimentally would be tedious and cumbersome; numerical estimation can be a supplement to experiment. As an initial step in this study, 241Am contamination of different shapes is studied. The portable planar HPGe detector was modeled using the Monte Carlo code FLUKA, and the effects of different parameters, such as the distance of the contamination from the detector and the radius of a circular contamination, were studied. Efficiency values for point and surface contamination located at different distances were estimated. The effect of the surface-source radius on the efficiency was more predominant when the source was at 1 cm than at a source-to-detector distance of 10 cm: at 1 cm the efficiency decreased quadratically as the radius increased, and at 10 cm it decreased linearly. The point-source efficiency varied exponentially with the source-to-detector distance.
Keywords: Planar HPGe, efficiency value, injection, surface source
Procedia PDF Downloads 42
139 Modified CUSUM Algorithm for Gradual Change Detection in a Time Series Data
Authors: Victoria Siriaki Jorry, I. S. Mbalawata, Hayong Shin
Abstract:
The main objective in a change detection problem is to develop algorithms for the efficient detection of gradual and/or abrupt changes in the parameter distribution of a process or time series data. In this paper, we present a modified cumulative sum (MCUSUM) algorithm to detect the start and end of a time-varying linear drift in the mean value of a time series, based on a likelihood ratio test procedure. The design, implementation, and performance of the proposed algorithm for linear drift detection are evaluated and compared to the existing CUSUM algorithm using different performance measures. An approach to accurately approximate the threshold of the MCUSUM is also provided. The performance of the MCUSUM for gradual change-point detection is compared to that of the standard cumulative sum (CUSUM) control chart, designed for abrupt shift detection, using Monte Carlo simulations. In terms of the expected time to detection, the MCUSUM procedure is found to perform better than a standard CUSUM chart for detection of a gradual change in mean. The algorithm is then applied to randomly generated time series data with a gradual linear trend in mean to demonstrate its usefulness.
Keywords: average run length, CUSUM control chart, gradual change detection, likelihood ratio test
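As a point of reference for the standard chart that the MCUSUM is compared against, a minimal one-sided CUSUM detector can be sketched as follows. This is a generic illustration, not the authors' MCUSUM; the reference value `k`, threshold `h`, and the simulated drift are assumed example settings:

```python
import numpy as np

def cusum_detect(x, mu0, k, h):
    """One-sided (upper) CUSUM chart: return the index of the first
    sample where the cumulative statistic exceeds h, or None."""
    s = 0.0
    for i, xi in enumerate(x):
        s = max(0.0, s + (xi - mu0 - k))  # recursive CUSUM statistic
        if s > h:
            return i
    return None

rng = np.random.default_rng(0)
# in-control segment followed by a gradual linear drift in the mean
x = np.concatenate([rng.normal(0, 1, 200),
                    rng.normal(0, 1, 200) + np.linspace(0, 3, 200)])
alarm = cusum_detect(x, mu0=0.0, k=0.5, h=5.0)
print(alarm)  # index of the first alarm
```

With the drift starting at sample 200, the alarm is typically raised some tens of samples later, once the cumulative evidence crosses the threshold.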
Procedia PDF Downloads 299
138 Medication Side Effects: Implications on the Mental Health and Adherence Behaviour of Patients with Hypertension
Authors: Irene Kretchy, Frances Owusu-Daaku, Samuel Danquah
Abstract:
Hypertension is the leading risk factor for cardiovascular diseases and a major cause of death and disability worldwide. This study examined whether psychosocial variables influenced patients’ perception and experience of the side effects of their medicines, how they coped with these experiences, and the impact on mental health and adherence to conventional hypertension therapies. Methods: A hospital-based mixed-methods study, using quantitative and qualitative approaches, was conducted on hypertensive patients. Participants were asked about side effects, medication adherence, common psychological symptoms, and coping mechanisms with the aid of standard questionnaires. Information from the quantitative phase was analyzed with the Statistical Package for the Social Sciences (SPSS) version 20. The interviews from the qualitative study were audio-taped with a digital audio recorder, manually transcribed, and analyzed using thematic content analysis. The themes originated from the participant interviews a posteriori. Results: The experiences of side effects – such as palpitations, frequent urination, recurrent bouts of hunger, erectile dysfunction, dizziness, cough, and physical exhaustion – were categorized as no/low (39.75%), moderate (53.0%), and high (7.25%). Significant relationships between depression (χ² = 24.21, p < 0.0001), anxiety (χ² = 42.33, p < 0.0001), stress (χ² = 39.73, p < 0.0001) and side effects were observed. Adjusted results from a logistic regression model for these associations are reported – depression [OR = 1.9 (1.03 – 3.57), p = 0.04], anxiety [OR = 1.5 (1.22 – 1.77), p < 0.001], and stress [OR = 1.3 (1.02 – 1.71), p = 0.04]. Side effects significantly increased the probability of individuals being non-adherent [OR = 4.84 (95% CI 1.07 – 1.85), p = 0.04], with social factors, media influences, and the attitudes of primary caregivers further explaining this relationship.
The personal adoption of medication-modifying strategies, the use of complementary and alternative treatments, and interventions made by clinicians were the main forms of coping with side effects. Conclusions: Results from this study show that, contrary to a biomedical approach, the experience of side effects has biological, social, and psychological interrelations. The results offer further support for a multi-disciplinary approach to healthcare in which all forms of expertise are incorporated into health provision and patient care. Additionally, medication side effects should be considered a possible cause of non-adherence among hypertensive patients; addressing this problem from a biopsychosocial perspective in any intervention may therefore improve adherence and, invariably, blood pressure control.
Keywords: biopsychosocial, hypertension, medication adherence, psychological disorders
Procedia PDF Downloads 371
137 Travel Behavior Simulation of Bike-Sharing System Users in Kaohsiung City
Authors: Hong-Yi Lin, Feng-Tyan Lin
Abstract:
In a bike-sharing system (BSS), users can easily rent bikes from any station in the city for mid-range or short-range trips. A BSS can also be integrated with other types of transport systems, especially green transportation systems such as rail transport and buses. Since a BSS records the time and place of each pickup and return, the operational data can reflect a more authentic and dynamic state of user behavior. Furthermore, land uses around docking stations are highly associated with the origins and destinations of BSS users. As urban researchers, what concerns us more is to take the BSS into consideration during the urban planning process and enhance the quality of urban life. This research focuses on the simulation of the travel behavior of BSS users in Kaohsiung. First, rules of user behavior were derived by analyzing operational data and land use patterns near docking stations. Then, integrated with the Monte Carlo method, these rules were embedded into a travel behavior simulation model, implemented in NetLogo, an agent-based modeling tool. The simulation model allows us to foresee the rent-return behavior of the BSS in order to choose potential locations for docking stations. It can also provide insights and recommendations about planning and policies for the future BSS.
Keywords: agent-based model, bike-sharing system, BSS operational data, simulation
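The rent-return dynamics that such a model captures can be illustrated with a minimal Monte Carlo sketch of a single docking station. This is a toy illustration only, not the NetLogo model from the study; the capacity, hourly demand probabilities, and horizon are assumed example values:

```python
import random

def simulate_station(capacity, bikes, p_rent, p_return, hours, rng):
    """Monte Carlo sketch of one docking station: each hour a rental
    attempt occurs with probability p_rent and a return attempt with
    p_return. Returns counts of failed rentals (station empty) and
    failed returns (station full)."""
    lost_rentals = lost_returns = 0
    for _ in range(hours):
        if rng.random() < p_rent:
            if bikes > 0:
                bikes -= 1
            else:
                lost_rentals += 1      # user found no bike
        if rng.random() < p_return:
            if bikes < capacity:
                bikes += 1
            else:
                lost_returns += 1      # user found no free dock
    return lost_rentals, lost_returns

rng = random.Random(3)
lost_r, lost_q = simulate_station(capacity=20, bikes=10,
                                  p_rent=0.6, p_return=0.5,
                                  hours=1000, rng=rng)
print(lost_r, lost_q)
```

Because rentals outpace returns here, the station drains over time and failed rentals accumulate, which is exactly the kind of imbalance a station-siting simulation is meant to reveal.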
Procedia PDF Downloads 333
136 Nanocellulose Reinforced Biocomposites Based on Wheat Plasticized Starch for Food Packaging
Authors: Belen Montero, Carmen Ramirez, Maite Rico, Rebeca Bouza, Irene Derungs
Abstract:
Starch is a promising polymer for producing biocomposite materials because it is renewable, completely biodegradable, and easily available at a low cost. Thermoplastic starches (TPS) can be obtained after the disruption and plasticization of native starch with a plasticizer. In this work, the solvent casting method was used to obtain TPS films from wheat starch plasticized with glycerol and reinforced with nanocellulose (CNC). X-ray diffraction analysis was used to follow the evolution of the crystallinity. The native wheat starch granules showed a profile corresponding to A-type crystal structures, typical of cereal starches. When the TPS films are analyzed, a broad amorphous halo centered on 19º is obtained, indicating that the plasticization process is complete. SEM imaging was performed in order to analyse the morphology. The image of the raw wheat starch granules shows a bimodal granule size distribution, with some granules in large round disk-shaped forms (A-type) and the others as smaller spherical particles (B-type). The image of the neat TPS surface shows a continuous surface; no starch aggregates or swollen granules can be seen, so the plasticization process is complete. On the surfaces of the reinforced TPS films, aggregates are seen as the CNC concentration in the matrix increases. The influence of CNC on the mechanical properties of the TPS films was studied by dynamic mechanical analysis. A direct relation exists between the storage modulus values, E’, and the CNC content of the reinforced TPS films: the higher the nanocellulose content in the composite, the higher the value of E’. This reinforcement effect can be explained by the appearance of a strong and crystalline nanoparticle-TPS interphase. The thermal stability of the films was analysed by TGA. No influence of CNC incorporation on the thermal degradation behaviour of the films was observed.
Finally, the resistance of the films to water absorption was analysed following the standard UNE-EN ISO 1998:483, and the percentage of water absorbed by the samples at each time was calculated. The addition of 5 wt % of CNC to the TPS matrix leads to a significant improvement in the moisture resistance of the starch-based material, decreasing its diffusivity. This has been associated with the formation of a nanocrystal network that prevents swelling of the starch, and therefore water absorption, and with the high crystallinity of cellulose compared to starch. In conclusion, the wheat starch film reinforced with 5 wt % of cellulose nanocrystals seems to be a good alternative for short-life applications in the packaging industry because of its greater rigidity, thermal stability, and moisture sorption resistance.
Keywords: biocomposites, nanocellulose, starch, wheat
Procedia PDF Downloads 212
135 Study on Reusable, Non-Adhesive Silicone Male External Catheter: Clinical Proof of Study and Quality Improvement Project
Authors: Venkata Buddharaju, Irene Mccarron, Hazel Alba
Abstract:
Introduction: Male external catheters (MECs) are commonly used to collect and drain urine. MECs are increasingly used in acute care and long-term acute care hospitals, in nursing facilities, and in other patients as an alternative to invasive urinary catheters, in order to reduce catheter-associated urinary tract infections (CAUTI). MECs are also used to avoid the need for incontinence pads and diapers. Most male external catheters are held in place by skin adhesive, with the exception of a few, which use a foam strap clamped around the penile shaft. The adhesive condom catheters typically stay on for 24 hours or less. It is also common practice to wrap extra skin adhesive tape around the condom catheter for additional security of the device. The fixed nature of the adhesive does not allow the normal expansion of penile size over time, and the adhesive can cause skin irritation, redness, erosion, and skin damage. The Acanthus condom catheter (ACC) is a patented, specially designed, stretchable silicone catheter without adhesive that adapts to the size and contour of the penis. It is held in place with a single elastic strap that wraps around the lower back and is tied criss-cross to the opposite catheter ring holes. It can be reused for up to 5 days on the same patient after daily cleaning and washing, potentially reducing cost. Methods: The study was conducted from September 17th to October 8th, 2020. The nursing staff was educated and trained on how to use and reuse the catheter. After identifying five (5) appropriate patients, the catheter was placed and maintained by nursing staff. Data on the ease of use, leaks, and skin damage were collected and reported by nurses to the nursing education department of the hospital for analysis. Setting: RML Chicago, a long-term acute care hospital, an affiliate of Loyola University Medical Center, Chicago, IL, USA.
Results: The data showed that the catheter was easy to apply, remove, wash, and reuse, without skin problems or urine infections. One patient used the catheter for 16 days, with washing, reuse, and replacement, without any urine leak or skin issues. A minimal leak was observed in two patients. Conclusion: The Acanthus condom catheter was easy to use and functioned well, with minimal or no leakage during use and reuse. The skin was intact in all patients studied, and there were no urinary tract infections in any of the studied patients.
Keywords: CAUTI, male external catheter, reusable, skin adhesive
Procedia PDF Downloads 106
134 Probability-Based Damage Detection of Structures Using Kriging Surrogates and Enhanced Ideal Gas Molecular Movement Algorithm
Authors: M. R. Ghasemi, R. Ghiasi, H. Varaee
Abstract:
Surrogate models have received increasing attention for use in detecting damage of structures based on vibration modal parameters. However, uncertainties existing in the measured vibration data may lead to false or unreliable output from such models. In this study, an efficient approach based on Monte Carlo simulation is proposed to take into account the effect of uncertainties in developing a surrogate model. The probability of damage existence (PDE) is calculated based on the probability density functions of the undamaged and damaged states. The kriging technique allows one to genuinely quantify the surrogate error, and it is therefore chosen as the metamodeling technique. An enhanced version of the ideal gas molecular movement (EIGMM) algorithm is used as the main algorithm for model updating. The developed approach is applied to detect simulated damage in numerical models of a 72-bar space truss and a 120-bar dome truss. The simulation results show that the proposed method can perform well in probability-based damage detection of structures with less computational effort compared to a direct finite element model.
Keywords: probability-based damage detection (PBDD), kriging, surrogate modeling, uncertainty quantification, artificial intelligence, enhanced ideal gas molecular movement (EIGMM)
Procedia PDF Downloads 239
133 Explicit Numerical Approximations for a Pricing Weather Derivatives Model
Authors: Clarinda V. Nhangumbe, Ercília Sousa
Abstract:
Weather derivatives are financial instruments used to cover non-catastrophic weather events and can be expressed in the form of standard or plain vanilla products, or structured or exotic products. The underlying asset in this case is a weather index, such as temperature, rainfall, humidity, wind, or snowfall. The complexity of the structure of weather derivatives exposes the weakness of the Black-Scholes framework. Therefore, under the risk-neutral probability measure, the option price of a weather contract can be given as the unique solution of a two-dimensional partial differential equation (parabolic in one direction and hyperbolic in the other), with an initial condition and subject to adequate boundary conditions. To calculate the price of the option, one can use numerical methods such as Monte Carlo simulations and implicit finite difference schemes conjugated with semi-Lagrangian methods. This paper proposes two explicit methods: first-order upwind in the hyperbolic direction combined with Lax-Wendroff in the parabolic direction, and first-order upwind in the hyperbolic direction combined with second-order upwind in the parabolic direction. One of the advantages of these methods is that they take into consideration the boundary conditions obtained from the financial interpretation and deal efficiently with the different choices of the convection coefficients.
Keywords: incomplete markets, numerical methods, partial differential equations, stochastic process, weather derivatives
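The first-order upwind building block used in the hyperbolic direction can be sketched in one dimension for the model advection equation u_t + a u_x = 0. This is a generic illustration of the scheme, not the paper's two-dimensional pricing solver; the grid, Courant number, and initial pulse are assumed example settings:

```python
import numpy as np

def upwind_advection(u0, a, dx, dt, steps):
    """First-order upwind scheme for u_t + a*u_x = 0 with a > 0:
    backward difference in x, stable for the Courant number c <= 1."""
    c = a * dt / dx
    u = u0.copy()
    for _ in range(steps):
        u[1:] = u[1:] - c * (u[1:] - u[:-1])  # upwind update (left boundary held)
    return u

x = np.linspace(0.0, 1.0, 101)
u0 = np.exp(-200 * (x - 0.2) ** 2)          # Gaussian pulse centred at x = 0.2
u = upwind_advection(u0, a=1.0, dx=x[1] - x[0], dt=0.005, steps=80)
print(x[np.argmax(u)])                       # pulse transported toward x ≈ 0.6
```

The pulse is advected at the correct speed but is smeared by the scheme's numerical diffusion, which is why the paper pairs upwinding with a higher-order scheme in the parabolic direction.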
Procedia PDF Downloads 85
132 Motion Performance Analyses and Trajectory Planning of the Movable Leg-Foot Lander
Authors: Shan Jia, Jinbao Chen, Jinhua Zhou, Jiacheng Qian
Abstract:
Fixed landers currently rely on wheeled rovers, with their unavoidable path repeatability, to expand the detection range in deep space exploration. In response to these functional limitations, a movable lander based on a leg-foot walking mechanism is presented. Firstly, a quadruped landing mechanism based on pushrod damping is proposed. The configuration has bionic characteristics, such as hip, knee, and ankle joints, and multi-function main/auxiliary buffers based on crumple energy absorption and a screw-nut mechanism. Secondly, the workspace of the end of the leg-foot mechanism is solved by the Monte Carlo method, and the key points on the desired trajectory of the end of the leg-foot mechanism are fitted by a cubic spline curve. Finally, an optimal time-jerk trajectory based on weight coefficients is planned and analyzed by an adaptive genetic algorithm (AGA). The simulation results prove the rationality and stability of the walking motion of the movable leg-foot lander on the planetary surface. In addition, this research can provide a technical solution integrating soft landing, large-scale inspection, and material transfer for future planetary exploration, and can even serve as the technical basis for developing reusable landers.
Keywords: motion performance, trajectory planning, movable, leg-foot lander
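The Monte Carlo workspace step can be sketched for a planar three-joint leg: sample random joint angles within their limits, run the forward kinematics, and the cloud of foot positions approximates the reachable workspace. The link lengths and joint limits below are hypothetical illustration values, not the lander's actual parameters:

```python
import numpy as np

def foot_position(q, lengths):
    """Forward kinematics of a planar 3-joint leg (hip, knee, ankle):
    returns the (x, y) position of the foot tip."""
    x = y = 0.0
    angle = 0.0
    for qi, li in zip(q, lengths):
        angle += qi                 # joint angles accumulate along the chain
        x += li * np.cos(angle)
        y += li * np.sin(angle)
    return x, y

rng = np.random.default_rng(1)
lengths = (0.30, 0.25, 0.10)                         # hypothetical link lengths (m)
lo = np.array([-0.5 * np.pi, 0.0, -0.25 * np.pi])    # hypothetical joint limits
hi = np.array([0.5 * np.pi, 0.75 * np.pi, 0.25 * np.pi])
q = rng.uniform(lo, hi, size=(20000, 3))             # Monte Carlo joint samples
pts = np.array([foot_position(qi, lengths) for qi in q])
reach = np.hypot(pts[:, 0], pts[:, 1])
print(reach.max())  # bounded above by the summed link lengths, 0.65 m
```

The sampled point cloud can then be post-processed (e.g. with a boundary fit) to extract the workspace envelope used for trajectory key-point selection.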
Procedia PDF Downloads 139
131 Working From Home: On the Relationship Between Place Attachment to Work Place, Extraversion and Segmentation Preference to Burnout
Authors: Diamant Irene, Shklarnik Batya
Abstract:
In addition to its widespread effects on health and the economy, Covid-19 shook the world of work and employment. Among the prominent changes during the pandemic is the work-from-home trend, complete or partial, as part of social distancing. In fact, these changes accelerated an existing tendency toward work flexibility already underway before the pandemic. Technology and advanced means of communication have led to a re-assessment of the “place of work” as a physical space in which work takes place. Today, workers can remotely carry out meetings, manage projects, and work in groups, and different research studies point to the fact that this type of work has no adverse effect on productivity. However, from the worker’s perspective, despite the numerous advantages associated with working from home, such as convenience, flexibility, and autonomy, various drawbacks have been identified, such as loneliness, reduced commitment, and erosion of the home-work boundary, all risk factors for reduced quality of life and burnout. Thus, a real need has arisen to explore differences in work-from-home experiences and to understand the relationship between psychological characteristics and the prevalence of burnout. This understanding may be of significant value to organizations considering a future hybrid work model combining in-office and remote working. Based on Hobfoll’s theory of Conservation of Resources, we hypothesized that burnout would mainly be found among workers whose physical remoteness from the workplace threatens or hinders their ability to retain significant individual resources. In the present study, we compared fully remote and partially remote (hybrid) workers, and we examined psychological characteristics and their connection to the formation of burnout.
Based on the conceptualization of place attachment as the cognitive-emotional bond of an individual to a meaningful place and the need to maintain closeness to it, we assumed that individuals characterized by place attachment to the workplace would suffer more from burnout when working from home. We also assumed that extroverted individuals, characterized by a need for social interaction at the workplace, and individuals with a segmentation preference – a need for separation between different life domains – would suffer more from burnout, especially among fully remote workers relative to partially remote workers. 194 workers aged 19-53 from different sectors, of whom 111 worked fully from home and 83 worked partially from home, were tested using an online questionnaire distributed through social media. The results of the study supported our assumptions. The repercussions of these findings for the future occupational experience are discussed, with an emphasis on suitable occupational adjustment according to the psychological characteristics and needs of workers.
Keywords: working from home, burnout, place attachment, extraversion, segmentation preference, Covid-19
Procedia PDF Downloads 190
130 A Case Study on the Numerical-Probability Approach for Deep Excavation Analysis
Authors: Komeil Valipourian
Abstract:
Urban advances and the growing need for developing infrastructure have increased the importance of deep excavations. In this study, after introducing probability analysis as an important issue, an attempt has been made to apply it to the deep excavation project of the Bangkok Metro as a case study. For this, a numerical probability model was developed based on the finite difference method and a Monte Carlo sampling approach. The results indicate that disregarding probability in this project would result in an inappropriate design of the retaining structure. Therefore, a probabilistic redesign of the support is proposed and carried out as one application of probability analysis: a 50% reduction in the flexural strength of the structure increases the failure probability by just 8%, keeping it within the allowable range, and helps improve economic conditions while maintaining mechanical efficiency. With regard to the lack of efficient design in most deep excavations, an attempt was made, by considering geometrical and geotechnical variability, to develop an optimum practical design standard for deep excavations based on failure probability. On this basis, a practical relationship is presented for estimating the maximum allowable horizontal displacement, which can help improve design conditions without requiring a full probability analysis.
Keywords: numerical probability modeling, deep excavation, allowable maximum displacement, finite difference method (FDM)
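The core Monte Carlo step behind a failure-probability estimate of this kind is simple to sketch: sample the uncertain demand and capacity, and count the fraction of samples where demand exceeds capacity. The distributions and parameter values below are entirely hypothetical illustration values, not the Bangkok case-study numbers:

```python
import numpy as np

rng = np.random.default_rng(42)
n = 200_000

# hypothetical lognormal demand (e.g. bending moment) and normal capacity
demand = rng.lognormal(mean=5.0, sigma=0.30, size=n)    # e.g. kN·m
capacity = rng.normal(loc=300.0, scale=30.0, size=n)    # e.g. kN·m

pf = np.mean(demand > capacity)         # Monte Carlo failure probability
se = np.sqrt(pf * (1 - pf) / n)         # standard error of the estimate
print(pf, se)
```

The standard error shows how the sample size controls the precision of the estimated failure probability, which is what makes sampling-based redesign checks (such as the 50% flexural-strength reduction above) tractable.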
Procedia PDF Downloads 127
129 A Flexible Bayesian State-Space Modelling for Population Dynamics of Wildlife and Livestock Populations
Authors: Sabyasachi Mukhopadhyay, Joseph Ogutu, Hans-Peter Piepho
Abstract:
We aim to model the dynamics of wildlife and pastoral livestock populations in order to understand their population changes, and hence to support wildlife conservation and human welfare. The study is motivated by age-sex structured population counts in different regions of the Serengeti-Mara during the period 1989-2003. Developing reliable and realistic models for the population dynamics of large herbivore populations can be a very complex and challenging exercise. However, the Bayesian statistical domain offers flexible computational methods that enable the development and efficient implementation of complex population dynamics models. In this work, we have used a novel Bayesian state-space model to analyse the dynamics of topi and hartebeest populations in the Serengeti-Mara ecosystem of East Africa. The state-space model involves survival probabilities of the animals, which in turn depend on various factors, such as monthly rainfall and habitat size, that have caused recent declines in the numbers of the herbivore populations and potentially threaten their future viability in the ecosystem. Our study shows that seasonal rainfall is the most important factor shaping the population size of the animals, and it indicates the age class most severely affected by any change in weather conditions.
Keywords: Bayesian state-space model, Markov chain Monte Carlo, population dynamics, conservation
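The generative side of a state-space model of this kind can be sketched with a minimal latent process (binomial survival plus Poisson recruitment) and a noisy observation layer. This is a toy forward simulation only, not the authors' age-sex structured model or its MCMC fitting; the survival, recruitment, and initial-abundance values are assumed illustration numbers:

```python
import numpy as np

def simulate_counts(n0, years, survival, recruitment, rng):
    """Minimal state-space sketch: latent abundance evolves by binomial
    survival and Poisson recruitment; observed counts add Poisson noise."""
    n = n0
    latent, observed = [], []
    for _ in range(years):
        n = rng.binomial(n, survival) + rng.poisson(recruitment * n)
        latent.append(n)                   # true (unobserved) abundance
        observed.append(rng.poisson(n))    # count with observation error
    return np.array(latent), np.array(observed)

rng = np.random.default_rng(11)
latent, obs = simulate_counts(1000, 15, survival=0.85, recruitment=0.10, rng=rng)
print(latent[-1])  # per-capita growth 0.85 + 0.10 < 1, so the trend declines
```

Fitting such a model to real counts would invert this simulation with MCMC, estimating survival and recruitment (and their covariate effects, e.g. rainfall) from the observed series.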
Procedia PDF Downloads 208128 Autonomous Kuka Youbot Navigation Based on Machine Learning and Path Planning
Authors: Carlos Gordon, Patricio Encalada, Henry Lema, Diego Leon, Dennis Chicaiza
Abstract:
The following work presents a proposal for the autonomous navigation of mobile robots, implemented on an omnidirectional Kuka Youbot robot. We have integrated the Robot Operating System (ROS) with machine learning algorithms. Two ROS distributions are used: ROS Hydro and ROS Kinetic. ROS Hydro manages the odometry, kinematics, and path planning nodes with statistical and probabilistic global and local algorithms based on Adaptive Monte Carlo Localization (AMCL) and Dijkstra's algorithm. Meanwhile, ROS Kinetic is responsible for the detection of dynamic objects that may lie on the planned trajectory and obstruct the path of the Kuka Youbot. Detection is handled by an artificial vision module running a neural network trained with the Single Shot MultiBox Detector (SSD) system, where the main dynamic objects of interest are human beings and domestic animals, among others. When objects are detected, the system either modifies the trajectory or waits for the dynamic obstacle to move. Finally, obstacles are avoided along the planned trajectory, and the Kuka Youbot reaches its goal thanks to the machine learning algorithms.Keywords: autonomous navigation, machine learning, path planning, robotic operative system, open source computer vision library
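The global planning step can be illustrated with a bare-bones Dijkstra search on an occupancy grid; this is a sketch of the algorithm family ROS global planners use, not the actual navigation-stack code.

```python
import heapq

def dijkstra(grid, start, goal):
    """Shortest path on a 4-connected occupancy grid (1 = obstacle)."""
    rows, cols = len(grid), len(grid[0])
    dist = {start: 0}
    prev = {}
    pq = [(0, start)]
    while pq:
        d, node = heapq.heappop(pq)
        if node == goal:
            break
        if d > dist.get(node, float("inf")):
            continue  # stale queue entry
        r, c = node
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if 0 <= nr < rows and 0 <= nc < cols and grid[nr][nc] == 0:
                nd = d + 1
                if nd < dist.get((nr, nc), float("inf")):
                    dist[(nr, nc)] = nd
                    prev[(nr, nc)] = node
                    heapq.heappush(pq, (nd, (nr, nc)))
    # Walk the predecessor chain back from the goal.
    path = [goal]
    while path[-1] != start:
        path.append(prev[path[-1]])
    return path[::-1]

grid = [[0, 0, 0],
        [1, 1, 0],   # a wall forces a detour
        [0, 0, 0]]
path = dijkstra(grid, (0, 0), (2, 0))
```

When SSD detects a dynamic obstacle, the corresponding cells are marked occupied and the same search is re-run, which is what "modifying the trajectory" amounts to.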
Procedia PDF Downloads 177127 The Diffusion of Membrane Nanodomains with Specific Ganglioside Composition
Authors: Barbora Chmelova, Radek Sachl
Abstract:
Gangliosides are amphipathic membrane lipids. Owing to their bulky oligosaccharide chains, which contain one or more sialic acids linked to a hydrophobic ceramide base, gangliosides are classified among glycosphingolipids. This unique structure gives gangliosides a strong self-aggregating tendency and, therefore, leads to the formation of nanoscopic clusters called nanodomains. Gangliosides are preferentially present in the extracellular membrane leaflet of all human tissues and thus have an impact on a huge number of biological processes, such as intercellular communication, cell signalling, membrane trafficking, and regulation of receptor activity. Defects in their metabolism, impairment of proper ganglioside function, or changes in their organization lead to serious health conditions such as Alzheimer´s and Parkinson´s diseases, autoimmune diseases, tumour growth, etc. This work focuses mainly on ganglioside organization into nanodomains and their dynamics within the plasma membrane. Current research has characterized ganglioside nanodomains statically; nevertheless, information about their diffusion is missing. In our study, fluorescence correlation spectroscopy is implemented together with stimulated emission depletion (STED-FCS), which combines diffraction-unlimited spatial resolution with high temporal resolution. By comparing experiments performed on model vesicles containing 4% of either GM1, GM2, or GM3 with Monte Carlo simulations of diffusion on the plasma membrane, we describe ganglioside clustering, the diffusion of nanodomains, and even the diffusion of individual ganglioside molecules inside the investigated nanodomains.Keywords: gangliosides, nanodomains, STED-FCS, fluorescence microscopy, membrane diffusion
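A minimal Monte Carlo sketch of free lateral diffusion, the baseline against which hindered diffusion inside nanodomains would be compared: for a 2D Gaussian random walk the mean squared displacement grows as n·σ² per axis. Step size and particle counts below are arbitrary, not the fitted membrane parameters.

```python
import random

def mean_squared_displacement(n_particles, n_steps, step_sigma, seed=0):
    """Monte Carlo free diffusion in the membrane plane: each molecule takes
    independent Gaussian steps; returns the ensemble-averaged MSD."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n_particles):
        x = y = 0.0
        for _ in range(n_steps):
            x += rng.gauss(0.0, step_sigma)
            y += rng.gauss(0.0, step_sigma)
        total += x * x + y * y
    return total / n_particles

# Expected value: 2 * n_steps * step_sigma^2 = 200 for these settings.
msd = mean_squared_displacement(2000, 100, 1.0)
```

Trapping in nanodomains would be modelled by making the step statistics position-dependent, which suppresses the MSD below this free-diffusion line at intermediate time scales.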
Procedia PDF Downloads 81126 City-Wide Simulation on the Effects of Optimal Appliance Scheduling in a Time-of-Use Residential Environment
Authors: Rudolph Carl Barrientos, Juwaln Diego Descallar, Rainer James Palmiano
Abstract:
Household Appliance Scheduling Systems (HASS) coupled with a Time-of-Use (TOU) pricing scheme, a form of Demand Side Management (DSM), are not widely utilized in the Philippines’ residential electricity sector. This paper’s goal is to encourage distribution utilities (DUs) to adopt HASS and TOU by analyzing the effect of household schedulers on the electricity price and load profile in a residential environment. To establish this, a model city based on survey data is generated using Monte Carlo Analysis (MCA). Then, a Binary Particle Swarm Optimization (BPSO) algorithm-based HASS is developed considering user satisfaction, electricity budget, appliance prioritization, energy storage systems, solar power, and electric vehicles. The simulations were assessed under varying levels of user compliance. Results showed that the average electricity cost, peak demand, and peak-to-average ratio (PAR) of the city load profile were all reduced. Therefore, the deployment of the HASS and TOU pricing scheme is beneficial for both stakeholders.Keywords: appliance scheduling, DSM, TOU, BPSO, city-wide simulation, electric vehicle, appliance prioritization, energy storage system, solar power
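The cost side of the TOU mechanism is straightforward to sketch: the same appliances cost less when a scheduler shifts them into off-peak hours. Tariff levels and appliance loads below are invented for illustration, not Philippine rates.

```python
def daily_cost(schedule, tou_price):
    """Cost of running appliances under a Time-of-Use tariff.

    schedule maps appliance -> (start_hour, duration_h, load_kw);
    tou_price is a 24-entry list of price per kWh."""
    cost = 0.0
    for start, duration, load_kw in schedule.values():
        for h in range(start, start + duration):
            cost += load_kw * tou_price[h % 24]  # wrap past midnight
    return cost

# Off-peak (23:00-07:00) at 0.05/kWh, peak hours at 0.20/kWh.
price = [0.05 if (h >= 23 or h < 7) else 0.20 for h in range(24)]
peak_plan = {"washer": (18, 2, 1.5), "dishwasher": (19, 1, 1.2)}
shifted_plan = {"washer": (23, 2, 1.5), "dishwasher": (1, 1, 1.2)}
peak_cost = daily_cost(peak_plan, price)
shifted_cost = daily_cost(shifted_plan, price)
```

A BPSO-based HASS searches over binary on/off decisions per time slot to minimize exactly this kind of cost function, subject to the satisfaction and priority constraints listed above.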
Procedia PDF Downloads 99125 Wrist Pain, Technological Device Used, and Perceived Academic Performance Among the College of Computer Studies Students
Authors: Maquiling Jhuvie Jane R., Ojastro Regine B., Peroja Loreille Marie B., Pinili Joy Angela., Salve Genial Gail M., Villavicencio Marielle Irene B., Yap Alther Francis Garth B.
Abstract:
Introduction: This study investigated the impact of prolonged device usage on wrist pain and perceived academic performance among college students in Computer Studies. The research aims to explore the correlation between the frequency of technological device use and the incidence of wrist pain, as well as how this pain affects students' academic performance. The study seeks to provide insights that could inform interventions to promote better musculoskeletal health among students engaged in intensive technology use and thereby improve their academic performance. Method: The study utilized a descriptive-correlational and comparative design, focusing on bona fide students of Silliman University’s College of Computer Studies during the second semester of 2023-2024. Participants were recruited through a survey sent via school email, with responses collected until March 30, 2024. Data were gathered using a password-protected device and Google Forms, ensuring restricted access to raw data. The demographic profile was summarized, and the prevalence of wrist pain and device usage were analyzed using percentages and weighted means. Statistical analyses included Spearman’s rank correlation coefficient to assess the relationship between wrist pain and device usage and an independent t-test to evaluate differences in academic performance based on the presence of wrist pain. Alpha was set at 0.05. Results: The study revealed that 40% of College of Computer Studies students experience wrist pain. Laptops and desktops were the most frequently used devices for academic work, with a weighted mean of 4.511, while mobile phones and tablets received lower means of 4.183 and 1.911, respectively. The average academic performance score among students was 29.7, classified as ‘Good Performance.’ Notably, there was no significant relationship between the frequency of device usage and wrist pain, as indicated by p-values exceeding 0.05. However, a significant difference in perceived academic performance was observed: students without wrist pain scored an average of 30.39 compared to 28.72 for those with wrist pain, with a p-value of 0.0134 confirming this distinction. Conclusion: The study revealed that about 40% of students in the College of Computer Studies experience wrist pain, but there is no significant link between device usage and pain occurrence. However, students without wrist pain demonstrated better academic performance than those with pain, suggesting that wrist health may impact academic success. These findings imply that physical therapy practices in the Philippines should focus on preventive strategies and ergonomic education to improve student health and performance.Keywords: wrist pain, frequency of use of technological devices, perceived academic performance, physical therapy
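Spearman's rank correlation coefficient, used above to relate device-usage frequency to wrist pain, reduces to a simple rank computation when there are no ties; a sketch (not the study's actual analysis code):

```python
def spearman_rho(x, y):
    """Spearman's rank correlation via 1 - 6*sum(d^2)/(n(n^2-1));
    assumes no tied values for simplicity."""
    def ranks(values):
        order = sorted(range(len(values)), key=lambda i: values[i])
        r = [0] * len(values)
        for rank, i in enumerate(order, start=1):
            r[i] = rank
        return r

    n = len(x)
    rx, ry = ranks(x), ranks(y)
    d2 = sum((a - b) ** 2 for a, b in zip(rx, ry))
    return 1 - 6 * d2 / (n * (n ** 2 - 1))

rho_perfect = spearman_rho([1, 2, 3, 4, 5], [10, 20, 30, 40, 50])
rho_inverse = spearman_rho([1, 2, 3, 4, 5], [50, 40, 30, 20, 10])
```

With tied ranks (common for Likert-style usage scores), the tie-corrected formula or the Pearson correlation of mid-ranks should be used instead.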
Procedia PDF Downloads 14124 Bayesian Analysis of Topp-Leone Generalized Exponential Distribution
Authors: Najrullah Khan, Athar Ali Khan
Abstract:
The Topp-Leone distribution was introduced by Topp and Leone in 1955. In this paper, an attempt has been made to fit the Topp-Leone Generalized Exponential (TPGE) distribution. A real survival data set is used for illustration. Implementation is done using R and JAGS, and appropriate illustrations are made. R and JAGS codes have been provided to implement the censoring mechanism using both optimization and simulation tools. The main aim of this paper is to describe and illustrate the Bayesian modelling approach to the analysis of survival data. Emphasis is placed on the modelling of data and the interpretation of the results. Crucial to this is an understanding of the nature of the incomplete or 'censored' data encountered. Analytic approximation and simulation tools are covered here, but most of the emphasis is on Markov chain Monte Carlo methods, including the independent Metropolis algorithm, which is currently the most popular technique. For analytic approximation, among various optimization algorithms, the trust region method is found to be the best. In this paper, the TPGE model is also used to analyze the lifetime data in the Bayesian paradigm. Results are evaluated from the above-mentioned real survival data set. The analytic approximation and simulation methods are implemented using some software packages. It is clear from our findings that simulation tools provide better results than those obtained by asymptotic approximation.Keywords: Bayesian Inference, JAGS, Laplace Approximation, LaplacesDemon, posterior, R Software, simulation
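The independent Metropolis algorithm mentioned above draws each proposal from a fixed distribution and corrects with both the target and proposal densities. A minimal sketch targeting a standard normal with a wider normal proposal (toy densities, not the TPGE posterior):

```python
import math
import random

def independent_metropolis(log_target, sample_prop, log_prop, n, seed=7):
    """Independence Metropolis sampler: proposals come from a fixed
    distribution, not from a kernel centred at the current state."""
    rng = random.Random(seed)
    x = sample_prop(rng)
    out = []
    for _ in range(n):
        y = sample_prop(rng)
        # Acceptance ratio corrects for both target and proposal densities:
        # alpha = p(y)q(x) / (p(x)q(y)), computed in log space.
        log_alpha = (log_target(y) - log_target(x)) + (log_prop(x) - log_prop(y))
        if math.log(rng.random()) < log_alpha:
            x = y
        out.append(x)
    return out

# Target: standard normal; proposal: normal with sd = 2 (illustrative choice).
draws = independent_metropolis(
    log_target=lambda v: -0.5 * v * v,
    sample_prop=lambda rng: rng.gauss(0.0, 2.0),
    log_prop=lambda v: -0.5 * (v / 2.0) ** 2,
    n=20000,
)
mean = sum(draws) / len(draws)
var = sum((d - mean) ** 2 for d in draws) / len(draws)
```

Choosing a proposal with heavier tails than the target (as here) is what keeps the importance ratio bounded and the sampler well behaved.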
Procedia PDF Downloads 535123 Joint Modeling of Longitudinal and Time-To-Event Data with Latent Variable
Authors: Xinyuan Y. Song, Kai Kang
Abstract:
Joint models for analyzing longitudinal and survival data are widely used to investigate the relationship between a failure time process and time-variant predictors. A common assumption in conventional joint models in the survival analysis literature is that all predictors are observable. However, this assumption may not always hold, because unobservable traits, namely latent variables, which are only indirectly observable and must be measured through multiple observed variables, are commonly encountered in medical, behavioral, and financial research settings. In this study, a joint modeling approach to deal with this feature is proposed. The proposed model comprises three parts. The first part is a dynamic factor analysis model for characterizing latent variables through multiple observed indicators over time. The second part is a random coefficient trajectory model for describing the individual trajectories of latent variables. The third part is a proportional hazards model for examining the effects of time-invariant predictors and the longitudinal trajectories of time-variant latent risk factors on the hazards of interest. A Bayesian approach coupled with a Markov chain Monte Carlo algorithm is adopted to perform statistical inference. An application of the proposed joint model to a study of the Alzheimer's Disease Neuroimaging Initiative is presented.Keywords: Bayesian analysis, joint model, longitudinal data, time-to-event data
Procedia PDF Downloads 144122 Salmonella Emerging Serotypes in Northwestern Italy: Genetic Characterization by Pulsed-Field Gel Electrophoresis
Authors: Clara Tramuta, Floris Irene, Daniela Manila Bianchi, Monica Pitti, Giulia Federica Cazzaniga, Lucia Decastelli
Abstract:
This work presents the results obtained by the Regional Reference Centre for Salmonella Typing (CeRTiS) in a retrospective study aimed to investigate, through Pulsed-field Gel Electrophoresis (PFGE) analysis, the genetic relatedness of emerging Salmonella serotypes of human origin circulating in North-West of Italy. Furthermore, the goal of this work was to create a Regional database to facilitate foodborne outbreak investigation and to monitor them at an earlier stage. A total of 112 strains, isolated from 2016 to 2018 in hospital laboratories, were included in this study. The isolates were previously identified as Salmonella according to standard microbiological techniques and serotyping was performed according to ISO 6579-3 and the Kaufmann-White scheme using O and H antisera (Statens Serum Institut®). All strains were characterized by PFGE: analysis was conducted according to a standardized PulseNet protocol. The restriction enzyme XbaI was used to generate several distinguishable genomic fragments on the agarose gel. PFGE was performed on a CHEF Mapper system, separating large fragments and generating comparable genetic patterns. The agarose gel was then stained with GelRed® and photographed under ultraviolet transillumination. The PFGE patterns obtained from the 112 strains were compared using Bionumerics version 7.6 software with the Dice coefficient with 2% band tolerance and 2% optimization. For each serotype, the data obtained with the PFGE were compared according to the geographical origin and the year in which they were isolated. Salmonella strains were identified as follow: S. Derby n. 34; S. Infantis n. 38; S. Napoli n. 40. All the isolates had appreciable restricted digestion patterns ranging from approximately 40 to 1100 kb. In general, a fairly heterogeneous distribution of pulsotypes has emerged in the different provinces. Cluster analysis indicated high genetic similarity (≥ 83%) among strains of S. Derby (n. 30; 88%), S. Infantis (n. 
36; 95%) and S. Napoli (n. 38; 95%) circulating in north-western Italy. The study underlines the genomic similarities shared by the emerging Salmonella strains in Northwest Italy and allowed us to create a database to detect outbreaks at an early stage. The results therefore confirm that PFGE is a powerful and discriminatory tool for investigating the genetic relationships among strains in order to monitor and control the spread of salmonellosis outbreaks. Pulsed-field gel electrophoresis (PFGE) still represents one of the most suitable approaches for characterizing strains, in particular for laboratories where NGS techniques are not available.Keywords: emerging Salmonella serotypes, genetic characterization, human strains, PFGE
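The Dice band-matching step with a 2% tolerance can be sketched as follows; the band sizes here are made up, and real software such as BioNumerics additionally handles run-to-run normalization and optimization.

```python
def dice_similarity(bands_a, bands_b, tolerance=0.02):
    """Dice band-matching coefficient between two PFGE fingerprints.

    Bands (fragment sizes, e.g. in kb) match if they differ by at most the
    given relative tolerance; each band is matched at most once."""
    matched = 0
    used = set()
    for a in bands_a:
        for j, b in enumerate(bands_b):
            if j not in used and abs(a - b) <= tolerance * max(a, b):
                matched += 1
                used.add(j)
                break
    return 2 * matched / (len(bands_a) + len(bands_b))

s_identical = dice_similarity([100, 250, 600], [100, 250, 600])
s_partial = dice_similarity([100, 250, 600], [101, 250, 900])
```

Clustering strains at a similarity threshold (here, >= 83%) then groups fingerprints that share most of their restriction bands, which is how the pulsotype clusters above were formed.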
Procedia PDF Downloads 105121 Bayesian Semiparametric Geoadditive Modelling of Underweight Malnutrition of Children under 5 Years in Ethiopia
Authors: Endeshaw Assefa Derso, Maria Gabriella Campolo, Angela Alibrandi
Abstract:
Objectives: Early childhood malnutrition can have long-term and irreversible effects on a child's health and development. This study uses a Bayesian method with spatial variation to investigate the flexible trends of metrical covariates and to identify communities at high risk of underweight malnutrition. Methods: Cross-sectional data on underweight are taken from the 2016 Ethiopian Demographic and Health Survey (EDHS). A Bayesian geo-additive model is fitted. Appropriate prior distributions were specified for the scale parameters in the models, and the inference is entirely Bayesian, using Markov chain Monte Carlo (MCMC) simulation. Results: The results show that metrical covariates such as child age, maternal body mass index (BMI), and maternal age affect a child's underweight status non-linearly. Both lower and higher maternal BMI appear to have a significant impact on childhood underweight. There was also significant spatial heterogeneity, and based on IDW interpolation of predicted values, the western, central, and eastern parts of the country are hotspot areas. Conclusion: Socio-demographic and community-based program development should be considered comprehensively in Ethiopian policy to combat childhood underweight malnutrition.Keywords: bayesX, Ethiopia, malnutrition, MCMC, semi-parametric bayesian analysis, spatial distribution, P-splines
Procedia PDF Downloads 89120 Parameter Estimation for the Mixture of Generalized Gamma Model
Authors: Wikanda Phaphan
Abstract:
The mixture generalized gamma distribution is a combination of two distributions: the generalized gamma distribution and the length-biased generalized gamma distribution. These two distributions were presented by Suksaengrakcharoen and Bodhisuwan in 2014. The findings showed that the probability density function (pdf) is fairly complex, which creates problems in parameter estimation. The problem is that the estimators cannot be calculated in closed form, so numerical estimation must be used to find them. In this study, we presented a new method of parameter estimation using the expectation-maximization (EM) algorithm, the conjugate gradient method, and the quasi-Newton method. The data were generated by the acceptance-rejection method, which is used for estimating α, β, λ and p, where λ is the scale parameter, p is the weight parameter, and α and β are the shape parameters. We use the Monte Carlo technique to assess the estimators' performance, with sample sizes of 10, 30, and 100; the simulations were repeated 20 times in each case. We evaluated the effectiveness of the proposed estimators by considering the mean squared errors and the bias. The findings revealed that the EM algorithm comes closest to the actual values. Also, the maximum likelihood estimators obtained via the conjugate gradient and quasi-Newton methods are less precise than the maximum likelihood estimators obtained via the EM algorithm.Keywords: conjugate gradient method, quasi-Newton method, EM-algorithm, generalized gamma distribution, length biased generalized gamma distribution, maximum likelihood method
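The Monte Carlo evaluation of estimator performance (bias and MSE across repeated samples) can be sketched generically; the exponential-mean example below stands in for the mixture generalized gamma estimators, whose likelihood is too involved for a short illustration.

```python
import random

def monte_carlo_bias_mse(estimator, sampler, true_value, n, reps, seed=3):
    """Repeatedly draw samples of size n, apply the estimator, and report
    the average error (bias) and mean squared error across replications."""
    rng = random.Random(seed)
    estimates = [estimator(sampler(rng, n)) for _ in range(reps)]
    bias = sum(estimates) / reps - true_value
    mse = sum((e - true_value) ** 2 for e in estimates) / reps
    return bias, mse

# Illustrative target: estimate the mean of an Exponential(rate=2) sample
# by the sample mean (true mean = 0.5).
sampler = lambda rng, n: [rng.expovariate(2.0) for _ in range(n)]
mean_est = lambda xs: sum(xs) / len(xs)
bias_small, mse_small = monte_carlo_bias_mse(mean_est, sampler, 0.5, 10, 500)
bias_large, mse_large = monte_carlo_bias_mse(mean_est, sampler, 0.5, 100, 500)
```

The same harness, with the EM, conjugate-gradient, or quasi-Newton fit substituted for `mean_est`, is what produces the bias and MSE comparison reported above.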
Procedia PDF Downloads 219119 Development and Verification of the Idom Shielding Optimization Tool
Authors: Omar Bouhassoun, Cristian Garrido, César Hueso
Abstract:
Radiation shielding design is an optimization problem with multiple constrained objective functions (radiation dose, weight, price, etc.) that depend on several parameters (material, thickness, position, etc.). The classical approach to shielding design consists of a brute-force trial-and-error process guided by the designer's prior experience. The result is therefore an empirical but not optimal solution, which can degrade the overall performance of the shielding. In order to automate the shielding design procedure, the IDOM Shielding Optimization Tool (ISOT) has been developed. This software combines optimization algorithms with the capability to read/write input files, run calculations, and parse output files for different radiation transport codes. In the first stage, the software was set up to adjust the input files for two well-known Monte Carlo codes (MCNP and Serpent) and optimize the result (weight, volume, price, dose rate) using multi-objective genetic algorithms. Nevertheless, its modular implementation easily allows the inclusion of more radiation transport codes and optimization algorithms. The work related to the development of ISOT and its verification on a simple 3D multi-layer shielding problem using both MCNP and Serpent will be presented. ISOT looks very promising for achieving optimal solutions to complex shielding problems.Keywords: optimization, shielding, nuclear, genetic algorithm
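The optimization layer can be illustrated with a minimal real-coded genetic algorithm applied to a toy shielding-style objective: minimize total thickness subject to an attenuation constraint enforced by a penalty. This is a sketch of the class of optimizers a tool like ISOT wraps, not the tool itself, and the attenuation coefficient is invented.

```python
import math
import random

def genetic_optimize(fitness, n_genes, pop_size=40, gens=60, seed=5):
    """Minimal real-coded genetic algorithm: truncation selection,
    uniform crossover, Gaussian mutation, genes clamped to [0, 1]."""
    rng = random.Random(seed)
    pop = [[rng.random() for _ in range(n_genes)] for _ in range(pop_size)]
    for _ in range(gens):
        scored = sorted(pop, key=fitness)
        elite = scored[: pop_size // 2]       # keep the better half
        children = []
        while len(children) < pop_size - len(elite):
            a, b = rng.sample(elite, 2)
            child = [x if rng.random() < 0.5 else y for x, y in zip(a, b)]
            child = [min(1.0, max(0.0, g + rng.gauss(0, 0.05))) for g in child]
            children.append(child)
        pop = elite + children
    return min(pop, key=fitness)

# Toy objective: minimise total layer thickness while the attenuation
# constraint exp(-5 * total) <= 0.1 is enforced by a penalty term.
def objective(layers):
    total = sum(layers)
    transmission = math.exp(-5.0 * total)
    return total + 100.0 * max(0.0, transmission - 0.1)

best = genetic_optimize(objective, n_genes=3)
```

A multi-objective variant (e.g., NSGA-II style) would return a Pareto front of weight/price/dose trade-offs instead of a single penalized optimum.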
Procedia PDF Downloads 110118 Analysis of Nonlinear Dynamic Systems Excited by Combined Colored and White Noise Excitations
Authors: Siu-Siu Guo, Qingxuan Shi
Abstract:
In this paper, single-degree-of-freedom (SDOF) systems subjected to combined white noise and colored noise excitations are investigated. By expressing the colored noise excitation as a second-order filtered white noise process and introducing the colored noise as an additional state variable, the equation of motion for the SDOF system under colored noise is artificially transformed into that of a multi-degree-of-freedom (MDOF) system under white noise excitations. As a consequence, the corresponding Fokker-Planck-Kolmogorov (FPK) equation governing the joint probability density function (PDF) of the state variables increases to four dimensions (4-D), and the solution procedure and computer program become much more sophisticated. The exponential-polynomial closure (EPC) method, widely applied to SDOF systems under white noise excitations, is developed and improved for systems under colored noise excitations and for solving the complex 4-D FPK equation. In addition, the Monte Carlo simulation (MCS) method is performed to test the approximate EPC solutions. Two examples associated with Gaussian and non-Gaussian colored noise excitations are considered, and the corresponding band-limited power spectral densities (PSDs) of the colored noise excitations are given separately. Numerical studies show that the developed EPC method provides relatively accurate estimates of the stationary probabilistic solutions. Moreover, the statistical parameter of mean up-crossing rate (MCR) is taken into account, which is important for reliability and failure analysis.Keywords: filtered noise, narrow-banded noise, nonlinear dynamic, random vibration
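The MCS benchmark can be sketched with an Euler-Maruyama simulation of a linear SDOF oscillator under white noise, where the stationary displacement variance is known analytically, sigma^2 / (4*zeta*omega^3), allowing a direct check; the parameters below are chosen so the exact value is 1 (a linear stand-in for the nonlinear systems studied above).

```python
import random

def simulate_sdof_variance(zeta, omega, sigma, dt, n_steps, seed=11):
    """Euler-Maruyama Monte Carlo simulation of a linear SDOF oscillator
    x'' + 2*zeta*omega*x' + omega^2*x = sigma*w(t), w = unit white noise;
    returns the time-averaged stationary displacement variance."""
    rng = random.Random(seed)
    x = v = 0.0
    xs = []
    for i in range(n_steps):
        dw = rng.gauss(0.0, dt ** 0.5)          # Wiener increment
        a = -2.0 * zeta * omega * v - omega ** 2 * x
        x += v * dt
        v += a * dt + sigma * dw
        if i > n_steps // 4:                    # discard the transient
            xs.append(x)
    mean = sum(xs) / len(xs)
    return sum((s - mean) ** 2 for s in xs) / len(xs)

# Exact stationary variance: sigma^2 / (4*zeta*omega^3) = 1 here.
var_x = simulate_sdof_variance(zeta=0.25, omega=1.0, sigma=1.0,
                               dt=0.01, n_steps=400_000)
```

For the colored-noise cases, the same integrator is applied to the augmented state vector that includes the second-order filter variables.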
Procedia PDF Downloads 225117 The “Bright Side” of COVID-19: Effects of Livestream Affordances on Consumer Purchase Willingness: Explicit IT Affordances Perspective
Authors: Isaac Owusu Asante, Yushi Jiang, Hailin Tao
Abstract:
Live streaming marketing, a new element of electronic commerce, became an important optional marketing channel following the COVID-19 pandemic. Many sellers have leveraged the features presented by live streaming to increase sales. Studies on live streaming have focused on gaming and consumers’ loyalty to brands through live streaming, using interview questionnaires. This study, however, was conducted to measure real-time observable interactions between consumers and sellers. Based on affordance theory, this study conceptualized constructs representing the interactive features and examined how they drive consumers’ purchase willingness during live streaming sessions, using 1,238 records from Amazon Live obtained through manual observation of transaction records. Using structural equation modeling, ordinary least squares regression suggests that live viewers, new followers, live chats, and likes positively affect purchase willingness. The Sobel and Monte Carlo tests show that new followers, live chats, and likes significantly mediate the relationship between live viewers and purchase willingness. The study introduces a new way of measuring interactions in live streaming commerce and proposes a way to manually gather data on consumer behavior on live streaming platforms when the application programming interface (API) of such platforms does not support data mining algorithms.Keywords: livestreaming marketing, live chats, live viewers, likes, new followers, purchase willingness
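The Sobel test for a mediated effect a·b combines the two path coefficients and their standard errors into a z-statistic; a sketch with made-up coefficients, not the Amazon Live estimates:

```python
import math

def sobel_test(a, se_a, b, se_b):
    """Sobel z-statistic for the indirect effect a*b, where a is the
    predictor->mediator path and b the mediator->outcome path."""
    z = (a * b) / math.sqrt(b * b * se_a * se_a + a * a * se_b * se_b)
    # Two-sided p-value from the standard normal, via the error function.
    p = 2.0 * (1.0 - 0.5 * (1.0 + math.erf(abs(z) / math.sqrt(2.0))))
    return z, p

# Hypothetical paths: viewers -> live chats (a), live chats -> purchase (b).
z, p = sobel_test(a=0.40, se_a=0.05, b=0.30, se_b=0.06)
```

The Monte Carlo mediation test replaces this normal approximation by resampling a and b from their estimated distributions and inspecting the empirical distribution of a·b, which behaves better for small samples.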
Procedia PDF Downloads 81116 Optimization of Air Pollution Control Model for Mining
Authors: Zunaira Asif, Zhi Chen
Abstract:
Sustainable air quality management is recognized as one of the most serious environmental concerns in mining regions. Mining operations emit various types of pollutants which have significant impacts on the environment. This study presents a stochastic control strategy, developing an air pollution control model to achieve a cost-effective solution. The optimization method is formulated to predict the cost of treatment using linear programming with an objective function and multiple constraints. The constraints mainly address two factors: production of metal should not exceed the available resources, and air quality should meet the standard criteria for each pollutant. The applicability of this model is explored through a case study of an open-pit metal mine in Utah, USA. The method uses meteorological data as a dispersion transfer function to reflect practical local conditions. The probabilistic analysis and the uncertainties in the meteorological conditions are handled by Monte Carlo simulation. Reasonable results have been obtained for selecting the optimized treatment technology for PM2.5, PM10, NOx, and SO2. Additional comparison analysis shows that a baghouse is the least-cost option compared to an electrostatic precipitator and wet scrubbers for particulate matter, whereas non-selective catalytic reduction and dry flue gas desulfurization are suitable for NOx and SO2 reduction, respectively. Thus, this model can aid planners in reducing these pollutants at marginal cost by suggesting pollution control devices, while accounting for dynamic meteorological conditions and mining activities.Keywords: air pollution, linear programming, mining, optimization, treatment technologies
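The device-selection logic behind the baghouse/ESP/wet-scrubber comparison reduces to a constrained least-cost choice; the efficiencies and annual costs below are placeholders, not the case-study figures.

```python
def least_cost_device(devices, required_removal):
    """Pick the cheapest control device whose removal efficiency meets the
    air-quality target; returns None if no device is feasible."""
    feasible = [d for d in devices if d["efficiency"] >= required_removal]
    return min(feasible, key=lambda d: d["annual_cost"]) if feasible else None

# Hypothetical particulate-matter options.
pm_devices = [
    {"name": "baghouse", "efficiency": 0.99, "annual_cost": 120_000},
    {"name": "electrostatic precipitator", "efficiency": 0.98, "annual_cost": 150_000},
    {"name": "wet scrubber", "efficiency": 0.95, "annual_cost": 140_000},
]
choice = least_cost_device(pm_devices, required_removal=0.97)
```

In the full model this choice is embedded in a linear program, and the `required_removal` target itself varies with the Monte Carlo-sampled meteorological dispersion conditions.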
Procedia PDF Downloads 208115 A Hierarchical Method for Multi-Class Probabilistic Classification Vector Machines
Authors: P. Byrnes, F. A. DiazDelaO
Abstract:
The Support Vector Machine (SVM) has become widely recognised as one of the leading algorithms in machine learning for both regression and binary classification. It expresses predictions in terms of a linear combination of kernel functions centred on a subset of the training data, referred to as support vectors. Despite its popularity amongst practitioners, SVM has some limitations, the most significant being that it generates point predictions rather than predictive distributions. Stemming from this issue, a probabilistic model, namely Probabilistic Classification Vector Machines (PCVM), has been proposed, which respects the original functional form of SVM whilst also providing a predictive distribution. As physical system designs become more complex, an increasing number of classification tasks in industrial applications involve more than two classes. Consequently, this research proposes a framework which allows for the extension of PCVM to a multi-class setting. Additionally, the original PCVM framework relies on type II maximum likelihood to provide estimates for both the kernel hyperparameters and the model evidence. In a high-dimensional multi-class setting, however, this approach has been shown to be ineffective due to poor scaling as the number of classes increases. Accordingly, we propose the application of Markov Chain Monte Carlo (MCMC) based methods to provide a posterior distribution over both parameters and hyperparameters. The proposed framework will be validated against current multi-class classifiers through synthetic and real-life implementations.Keywords: probabilistic classification vector machines, multi class classification, MCMC, support vector machines
Procedia PDF Downloads 221