Search results for: method of multiple scales
21976 Hyper-Immunoglobulin E (Hyper-IgE) Syndrome In Skin Of Color: A Retrospective Single-Centre Observational Study
Authors: Rohit Kothari, Muneer Mohamed, Vivekanandh K., Sunmeet Sandhu, Preema Sinha, Anuj Bhatnagar
Abstract:
Introduction: Hyper-IgE syndrome is a rare primary immunodeficiency characterised by a triad of severe atopic dermatitis, recurrent pulmonary infections, and recurrent staphylococcal skin infections. The diagnosis requires a high degree of suspicion and typical clinical features, not merely a rise in serum IgE levels, which may be seen in multiple conditions. Genetic studies are not always possible in a resource-poor setting. This study highlights various presentations of Hyper-IgE syndrome in children with skin of color. Case series: Our study included six children with Hyper-IgE syndrome aged two months to ten years. All had onset in the first ten months of life except one with late onset at two years. All had a recurrent eczematoid rash that responded poorly to conventional treatment, secondary infection, multiple episodes of hospitalisation for pulmonary infection, and raised serum IgE levels. One case had occasional vesicles, bullae, and crusted plaques over both extremities. Genetic study was possible in only one child, who was found to have pathogenic homozygous deletions of exons 15 to 18 in the DOCK8 gene; he underwent bone marrow transplant (BMT) but succumbed to a lower respiratory tract infection two months after BMT. The rest received multiple courses of antibiotics, oral/topical steroids, and intermittent cyclosporine with variable response. Discussion: Our study highlights the characteristics, presentation, and management of this rare syndrome in children. Knowledge of these manifestations in skin of color will facilitate early identification and contribute to optimal care of patients, as representative data on this population are limited in the literature.
Keywords: absolute eosinophil count, atopic dermatitis, eczematous rash, hyper-immunoglobulin E syndrome, pulmonary infection, serum IgE, skin of color
Procedia PDF Downloads 138
21975 Using the UK as a Case Study to Assess the Current State of Large Woody Debris Restoration as a Tool for Improving the Ecological Status of Natural Watercourses Globally
Authors: Isabelle Barrett
Abstract:
Natural watercourses provide a range of vital ecosystem services, notably freshwater provision. They also offer highly heterogeneous habitat which supports an extreme diversity of aquatic life. Exploitation of rivers, changing land use and flood prevention measures have led to habitat degradation and subsequent biodiversity loss; indeed, freshwater species currently face a disproportionate rate of extinction compared to their terrestrial and marine counterparts. Large woody debris (LWD) encompasses the trees, large branches and logs which fall into watercourses, and is responsible for important habitat characteristics. Historically, natural LWD has been removed from streams under the assumption that it is not aesthetically pleasing and is thus ecologically unfavourable, despite extensive evidence contradicting this. Restoration efforts aim to replace lost LWD in order to reinstate habitat heterogeneity. This paper aims to assess the current state of such restoration schemes for improving fluvial ecological health in the UK. A detailed review of the scientific literature was conducted alongside a meta-analysis of 25 UK-based projects involving LWD restoration. Projects were chosen for which sufficient information was attainable for analysis, covering a broad range of budgets and scales. The most effective strategies for river restoration encompass ecological success, stakeholder engagement and scientific advancement, however few projects surveyed showed sensitivity to all three; for example, only 32% of projects stated biological aims. Focus tended to be on stakeholder engagement and public approval, since this is often a key funding driver. Consequently, there is a tendency to focus on the aesthetic outcomes of a project, however physical habitat restoration does not necessarily lead to direct biodiversity increases. This highlights the significance of rivers as highly heterogeneous environments with multiple interlinked processes, and emphasises a need for a stronger scientific presence in project planning. Poor scientific rigour means monitoring is often lacking, with varying, if any, definitions of success which are rarely pre-determined. A tendency to overlook negative or neutral results was apparent, with unjustified focus often put on qualitative results. The temporal scale of monitoring is typically inadequate to facilitate scientific conclusions, with only 20% of projects surveyed reporting any pre-restoration monitoring. Furthermore, monitoring is often limited to a few variables, with biotic monitoring often fish-focussed. Due to their longer life cycles and dispersal capability, fish are usually poor indicators of environmental change, making it difficult to attribute any changes in ecological health to restoration efforts. Although the potential impact of LWD restoration may be positive, this method of restoration could simply be making short-term, small-scale improvements; without addressing the underlying symptoms of degradation, for example water quality, the issue cannot be fully resolved. Promotion of standardised monitoring for LWD projects could help establish a deeper understanding of the ecology surrounding the practice, supporting movement towards adaptive management in which scientific evidence feeds back to practitioners, enabling the design of more efficient projects with greater ecological success. 
By highlighting LWD, this study hopes to address the difficulties faced within river management, and emphasise the need for a more holistic international and inter-institutional approach to tackling problems associated with degradation.
Keywords: biological monitoring, ecological health, large woody debris, river management, river restoration
Procedia PDF Downloads 216
21974 The Sr-Nd Isotope Data of the Platreef Rocks from the Northern Limb of the Bushveld Igneous Complex: Evidence of Contrasting Magma Composition and Origin
Authors: Tshipeng Mwenze, Charles Okujeni, Abdi Siad, Russel Bailie, Dirk Frei, Marcelene Voigt, Petrus Le Roux
Abstract:
The Platreef is a platinum group element (PGE) deposit in the northern limb of the Bushveld Igneous Complex (BIC) which was emplaced as a series of mafic and ultramafic sills between the Main Zone (MZ) and the country rocks. The PGE mineralisation in the Platreef is hosted in different rock types, and its distribution and style vary with depth and along strike. This study contributes towards understanding the processes involved in the genesis of the Platreef. Twenty-four Platreef (2 harzburgites, 4 olivine pyroxenites, 17 feldspathic pyroxenites and 1 gabbronorite) and a few MZ (1 gabbronorite and 1 leucogabbronorite) quarter-core samples were collected from four drill cores (TN754, TN200, SS339, and OY482) and analysed for whole-rock Sr-Nd isotope data. The results show positive ɛNd values (+3.53 to +7.51) for the harzburgites, suggesting that their parental magmas were derived from the depleted mantle. The remaining Platreef rocks have negative ɛNd values (-2.91 to -22.88) and show significant variations in Sr-Nd isotopic composition. The first group of Platreef samples has relatively high isotopic compositions (ɛNd = -2.91 to -5.68; ⁸⁷Sr/⁸⁶Sri = 0.709177-0.711998). The second group has Sr ratios (⁸⁷Sr/⁸⁶Sri = 0.709816-0.712106) overlapping with the first group but slightly lower ɛNd values (-7.44 to -8.39). Lastly, the third group has lower ɛNd values (-10.82 to -14.32) and lower Sr ratios (⁸⁷Sr/⁸⁶Sri = 0.707545-0.710042) than the samples of the two Platreef groups mentioned above. There is, however, one Platreef sample with an ɛNd value (-5.26) in the range of the first group whose Sr ratio (0.707281) is the lowest, even when compared to samples of the third group. Five other Platreef samples have either anomalous ɛNd or Sr ratios, which makes it difficult to assess their isotopic compositions relative to the other samples. These isotopic variations in the Platreef samples indicate both multiple sources and multiple magma chambers in which varying styles of crustal contamination operated during the evolution of these magmas prior to their emplacement as sills in the Platreef setting. Furthermore, the MZ rocks have different Sr-Nd isotopic compositions (OY482 gabbronorite: ɛNd = +0.65, ⁸⁷Sr/⁸⁶Sri = 0.711746; TN754 leucogabbronorite: ɛNd = -7.44, ⁸⁷Sr/⁸⁶Sri = 0.709322), which indicate not only different MZ magma chambers but also magmas different from those of the Platreef. Although the Platreef is still considered a single stratigraphic unit in the northern limb of the BIC, its genesis involved multiple magmatic processes which evolved independently of each other.
Keywords: crustal contamination styles, magma chambers, magma sources, multiple sills emplacement
Procedia PDF Downloads 167
21973 Separating Landform from Noise in High-Resolution Digital Elevation Models through Scale-Adaptive Window-Based Regression
Authors: Anne M. Denton, Rahul Gomes, David W. Franzen
Abstract:
High-resolution elevation data are becoming increasingly available, but typical approaches for computing topographic features, like slope and curvature, still assume small sliding windows, for example, of size 3x3. That means that the digital elevation model (DEM) has to be resampled to the scale of the landform features that are of interest. Any higher resolution is lost in this resampling. When the topographic features are computed through regression that is performed at the resolution of the original data, the accuracy can be much higher, and the reported result can be adjusted to the length scale that is relevant locally. Slope and variance are calculated for overlapping windows, meaning that one regression result is computed per raster point. The number of window centers per area is the same for the output as for the original DEM. Slope and variance are computed by performing regression on the points in the surrounding window. Such an approach is computationally feasible because of the additive nature of regression parameters and variance. Any doubling of window size in each direction only takes a single pass over the data, corresponding to a logarithmic scaling of the resulting algorithm as a function of the window size. Slope and variance are stored for each aggregation step, allowing the reported slope to be selected to minimize variance. The approach thereby adjusts the effective window size to the landform features that are characteristic to the area within the DEM. Starting with a window size of 2x2, each iteration aggregates 2x2 non-overlapping windows from the previous iteration. Regression results are stored for each iteration, and the slope at minimal variance is reported in the final result. As such, the reported slope is adjusted to the length scale that is characteristic of the landform locally. The length scale itself and the variance at that length scale are also visualized to aid in interpreting the results for slope. The relevant length scale is taken to be half of the window size of the window over which the minimum variance was achieved. The resulting process was evaluated for 1-meter DEM data and for artificial data that was constructed to have defined length scales and added noise. A comparison with ESRI ArcMap was performed and showed the potential of the proposed algorithm. The resolution of the resulting output is much higher and the slope and aspect much less affected by noise. Additionally, the algorithm adjusts to the scale of interest within the region of the image. These benefits are gained without additional computational cost in comparison with resampling the DEM and computing the slope over 3x3 images in ESRI ArcMap for each resolution. In summary, the proposed approach extracts slope and aspect of DEMs at the lengths scales that are characteristic locally. The result is of higher resolution and less affected by noise than existing techniques.Keywords: high resolution digital elevation models, multi-scale analysis, slope calculation, window-based regression
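A simplified sketch of the additive-sums aggregation described above is given below. It is an illustration only, not the authors' algorithm: it uses non-overlapping 2x2 aggregation, a least-squares plane fit per window, and leaves the per-location minimum-variance selection to the caller; the array names, the 1 m cell size, and the residual-variance definition are assumptions.

```python
# Minimal sketch, assuming a small in-memory DEM and non-overlapping aggregation.
import numpy as np

def plane_stats(dem, cell=1.0):
    """Per-cell sufficient statistics [n, Sx, Sy, Sz, Sxx, Syy, Sxy, Sxz, Syz, Szz]."""
    ny, nx = dem.shape
    y, x = np.mgrid[0:ny, 0:nx] * cell
    return np.stack([np.ones_like(dem), x, y, dem,
                     x * x, y * y, x * y, x * dem, y * dem, dem * dem], axis=-1)

def aggregate2x2(stats):
    """Sum statistics of non-overlapping 2x2 blocks: additive, one pass per doubling."""
    ny, nx, _ = stats.shape
    s = stats[:ny - ny % 2, :nx - nx % 2]
    return s[0::2, 0::2] + s[0::2, 1::2] + s[1::2, 0::2] + s[1::2, 1::2]

def slope_and_variance(stats):
    """Least-squares plane fit z = a*x + b*y + c from the accumulated sums of each window."""
    n, sx, sy, sz, sxx, syy, sxy, sxz, syz, szz = np.moveaxis(stats, -1, 0)
    slope = np.full(n.shape, np.nan)
    var = np.full(n.shape, np.inf)
    for idx in np.ndindex(n.shape):
        A = np.array([[sxx[idx], sxy[idx], sx[idx]],
                      [sxy[idx], syy[idx], sy[idx]],
                      [sx[idx],  sy[idx],  n[idx]]])
        rhs = np.array([sxz[idx], syz[idx], sz[idx]])
        try:
            a_, b_, c_ = np.linalg.solve(A, rhs)
        except np.linalg.LinAlgError:
            continue
        rss = szz[idx] - a_ * sxz[idx] - b_ * syz[idx] - c_ * sz[idx]
        var[idx] = max(rss, 0.0) / n[idx]   # simplified; a dof correction may be preferable
        slope[idx] = np.degrees(np.arctan(np.hypot(a_, b_)))
    return slope, var

def scale_adaptive_slope(dem, levels=5, cell=1.0):
    """Return (window_size, slope, variance) per level; pick minimum variance per location."""
    stats = plane_stats(dem, cell)
    results = []
    for level in range(1, levels + 1):
        stats = aggregate2x2(stats)          # window size doubles at each level
        if min(stats.shape[:2]) < 1:
            break
        slope, var = slope_and_variance(stats)
        results.append((2 ** level, slope, var))
    return results
```

The key property exploited is that the ten sums needed for the plane fit are additive, so each doubling of the window size costs only a single pass over the previous level.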
Procedia PDF Downloads 129
21972 Heroin Withdrawal, Prison and Multiple Temporalities
Authors: Ian Walmsley
Abstract:
The aim of this paper is to explore the influence of time and temporality on the experience of coming off heroin in prison. The presentation draws on qualitative data collected during a small-scale pilot study of the role of self-care in the process of coming off drugs in prison. Time and temporality emerged as a key theme in the interview transcripts. Drug-dependent prisoners' experience of time in prison has not been recognized in the research literature. Instead, the literature on prison time typically views prisoners as a homogenous group or tends to focus on the influence of aging and gender on prison time. Furthermore, there is a tendency in the literature on prison drug treatment and recovery to conceptualize drug-dependent prisoners as passive recipients of prison healthcare rather than active agents. In addressing these gaps, this paper argues that drug-dependent prisoners experience multiple temporalities involving an interaction between the body-times of the drug-dependent prisoner and the economy of time in prison. One consequence of this interaction is the feeling that, at this point in their prison sentence, they are doing double prison time. The second part of the argument is that time and temporality were a means through which they governed their withdrawing bodies. In addition, this paper will comment on the challenges of prison research in England.
Keywords: heroin withdrawal, time and temporality, prison, body
Procedia PDF Downloads 276
21971 Effectiveness of Using Multiple Non-pharmacological Interventions to Prevent Delirium in the Hospitalized Elderly
Authors: Yi Shan Cheng, Ya Hui Yeh, Hsiao Wen Hsu
Abstract:
Delirium is an acute state of confusion that mainly results from the interaction of many factors, including age over 65 years, comorbidity, cognitive and visual/auditory impairment, dehydration, pain, sleep disorder, retained tubes and catheters, general anesthesia, and major surgery. Research shows that the prevalence of delirium in hospitalized elderly patients exceeds 50%. If it does not improve in time, it may cause cognitive decline or impairment, prolong the length of hospital stay, and increase mortality. Some studies have shown that multiple non-pharmacological interventions (reorientation, early mobility, promoting sleep, and nutritional support, including water intake) are the most effective and common strategies and could improve or prevent delirium in the hospitalized elderly. In Taiwan, only one study has compared the incidence of delirium between multiple non-pharmacological interventions and general routine care in older patients who received orthopedic surgery. Therefore, the purpose of this study is to address the prevention or reduction of delirium incidence density in hospitalized elderly medical patients, to provide clinical nurses with a reference for implementation, and to support follow-up research. This study uses a quasi-experimental design with purposive sampling. Samples are drawn from two wards, the geriatric ward and the general medicine ward, at a medical center in central Taiwan. The sample size is estimated at a minimum of 100, and data will be collected through a self-administered structured questionnaire that includes demographic and professional evaluation items. Case recruitment began on 5/13/2023. The results will be analyzed with SPSS for Windows 22.0, including descriptive statistics and inferential statistics: logistic regression, Generalized Estimating Equation (GEE), and multivariate analysis of variance (MANOVA).
Keywords: multiple nonpharmacological interventions, hospitalized elderly, delirium incidence, delirium
Procedia PDF Downloads 78
21970 The Survey of Sexual Health and Pornography among Divorce-Asking Women in West Azerbaijan-Iran: A Cross-Sectional Study
Authors: Soheila Rabiepoor, Elham Sadeghi
Abstract:
Introduction: Divorce is both a personal and a social issue. Nowadays, due to factors such as rapid social, economic, and cultural change, the family structure has undergone many rough changes; out of every three marriages, two lead to divorce. One of the factors affecting the incidence of divorce and relationship problems between couples is sexual and marital behavior. There are several reasons to suspect that pornography might affect divorce in either a positive or a negative way. Therefore, this study evaluated the sexual health of women seeking divorce in Urmia, Iran. Methods: This cross-sectional descriptive study was conducted on 71 married women in Urmia, Iran, in 2016. Participants were women seeking divorce (referred to a divorce center) who were selected using a convenience sampling method. Data-gathering tools included scales measuring demographic characteristics and sexual health (sexual satisfaction and function), as well as researcher-made questions on pornography. Data were analyzed with SPSS 16 software, and p-values less than 0.05 were considered significant. Results: The mean age of the sample was 28.98 ± 7.44 years, with a mean marriage duration of 8.12 ± 6.53 years (range 1 to 28 years). Most participants had a diploma-level education (45.1%). Sixty-nine percent of the women reported their income and expenditure as equal. Nearly 42% of the women and 59% of their partners had watched pornographic clips, and 45.5% of participants reported comparing their own sexual relationship with such clips. The total sexual satisfaction score was 51.50 ± 17.92, and the mean total sexual function score was 16.62 ± 10.58. According to these findings, most of the women experienced sexual dissatisfaction and dysfunction. Conclusions: The results indicated that women with lower sexual satisfaction scores had a higher rate of watching pornographic clips. Based on the current study, paying attention to family education and counseling programs, especially in the sexual field, would be fruitful.
Keywords: divorce-asking, pornography, sexual satisfaction, sexual function, women
Procedia PDF Downloads 586
21969 A Critical Review and Bibliometric Analysis on Measures of Achievement Motivation
Authors: Kanupriya Rawat, Aleksandra Błachnio, Paweł Izdebski
Abstract:
Achievement motivation, which drives a person to strive for success, is an important construct in sports psychology. This systematic review aims to analyze the methods of measuring achievement motivation used in previous studies published over the past four decades and to find out which method of measuring achievement motivation is the most prevalent and the most effective by thoroughly examining measures of achievement motivation used in each study and by evaluating most highly cited achievement motivation measures in sport. In order to understand this latent construct, thorough measurement is necessary, hence a critical evaluation of measurement tools is required. The literature search was conducted in the following databases: EBSCO, MEDLINE, APA PsychARTICLES, Academic Search Ultimate, Open Dissertations, ERIC, Science direct, Web of Science, as well as Wiley Online Library. A total of 26 articles met the inclusion criteria and were selected. From this review, it was found that the Achievement Goal Questionnaire- Sport (AGQ-Sport) and the Task and Ego Orientation in Sport Questionnaire (TEOSQ) were used in most of the research, however, the average weighted impact factor of the Achievement Goal Questionnaire- Sport (AGQ-Sport) is the second highest and most relevant in terms of research articles related to the sport psychology discipline. Task and Ego Orientation in Sport Questionnaire (TEOSQ) is highly popular in cross-cultural adaptation but has the second last average IF among other scales due to the less impact factor of most of the publishing journals. All measures of achievement motivation have Cronbach’s alpha value of more than .70, which is acceptable. The advantages and limitations of each measurement tool are discussed, and the distinction between using implicit and explicit measures of achievement motivation is explained. Overall, both implicit and explicit measures of achievement motivation have different conceptualizations of achievement motivation and are applicable at either the contextual or situational level. The conceptualization and degree of applicability are perhaps the most crucial factors for researchers choosing a questionnaire, even though they differ in their development, reliability, and use.Keywords: achievement motivation, task and ego orientation, sports psychology, measures of achievement motivation
Procedia PDF Downloads 96
21968 The Importance of Functioning and Disability Status Follow-Up in People with Multiple Sclerosis
Authors: Sanela Slavkovic, Congor Nad, Spela Golubovic
Abstract:
Background: The diagnosis of multiple sclerosis (MS) is a major life challenge and has repercussions on all aspects of the daily functioning of those attained by it – personal activities, social participation, and quality of life. Regular follow-up of only the neurological status is not informative enough so that it could provide data on the sort of support and rehabilitation that is required. Objective: The aim of this study was to establish the current level of functioning of persons attained by MS and the factors that influence it. Methods: The study was conducted in Serbia, on a sample of 108 persons with relapse-remitting form of MS, aged 20 to 53 (mean 39.86 years; SD 8.20 years). All participants were fully ambulatory. Methods applied in the study include Expanded Disability Status Scale-EDSS and World Health Organization Disability Assessment Schedule, WHODAS 2.0 (36-item version, self-administered). Results: Participants were found to experience the most problems in the domains of Participation, Mobility, Life activities and Cognition. The least difficulties were found in the domain of Self-care. Symptom duration was the only control variable with a significant partial contribution to the prediction of the WHODAS scale score (β=0.30, p < 0.05). The total EDSS score correlated with the total WHODAS 2.0 score (r=0.34, p=0.00). Statistically significant differences in the domain of EDSS 0-5.5 were found within categories (0-1.5; 2-3.5; 4-5.5). The more pronounced a participant’s EDSS score was, although not indicative of large changes in the neurological status, the more apparent the changes in the functional domain, i.e. in all areas covered by WHODAS 2.0. Pyramidal (β=0.34, p < 0.05) and Bowel and bladder (β=0.24, p < 0.05) functional systems were found to have a significant partial contribution to the prediction of the WHODAS score. Conclusion: Measuring functioning and disability is important in the follow-up of persons suffering from MS in order to plan rehabilitation and define areas in which additional support is needed.Keywords: disability, functionality, multiple sclerosis, rehabilitation
Procedia PDF Downloads 122
21967 Relationship of Workplace Stress and Mental Wellbeing among Health Professionals
Authors: Rabia Mushtaq, Uroosa Javaid
Abstract:
It has been observed that health professionals are at higher risk of stress because their profession is physically and emotionally demanding. The study aimed to investigate the relationship between workplace stress and mental wellbeing among health professionals. A sample of 120 male and female health professionals belonging to two age groups, early adulthood and middle adulthood, was recruited through a purposive sampling technique. The Job Stress Scale, the Mindful Attention Awareness Scale, and the Warwick-Edinburgh Mental Wellbeing Scale were used to measure the study variables. Results indicated that job stress has a significant negative relationship with mental wellbeing among health professionals. The current study opens the door for more exploratory work on mindfulness among health professionals. The findings can help in consolidating coping strategies among workers to improve their mental wellbeing and lessen job stress.
Keywords: health professionals, job stress, mental wellbeing, mindfulness
Procedia PDF Downloads 175
21966 High-Frequency Cryptocurrency Portfolio Management Using Multi-Agent System Based on Federated Reinforcement Learning
Authors: Sirapop Nuannimnoi, Hojjat Baghban, Ching-Yao Huang
Abstract:
Over the past decade, with the fast development of blockchain technology since the birth of Bitcoin, there has been a massive increase in the usage of Cryptocurrencies. Cryptocurrencies are not seen as an investment opportunity due to the market’s erratic behavior and high price volatility. With the recent success of deep reinforcement learning (DRL), portfolio management can be modeled and automated. In this paper, we propose a novel DRL-based multi-agent system to automatically make proper trading decisions on multiple cryptocurrencies and gain profits in the highly volatile cryptocurrency market. We also extend this multi-agent system with horizontal federated transfer learning for better adapting to the inclusion of new cryptocurrencies in our portfolio; therefore, we can, through the concept of diversification, maximize our profits and minimize the trading risks. Experimental results through multiple simulation scenarios reveal that this proposed algorithmic trading system can offer three promising key advantages over other systems, including maximized profits, minimized risks, and adaptability.Keywords: cryptocurrency portfolio management, algorithmic trading, federated learning, multi-agent reinforcement learning
Procedia PDF Downloads 119
21965 A Comprehensive Study on CO₂ Capture and Storage: Advances in Technology and Environmental Impact Mitigation
Authors: Oussama Fertaq
Abstract:
This paper investigates the latest advancements in CO₂ capture and storage (CCS) technologies, which are vital for addressing the growing challenge of climate change. The study focuses on multiple techniques for CO₂ capture, including chemical absorption, membrane separation, and adsorption, analyzing their efficiency, scalability, and environmental impact. The research further explores geological storage options such as deep saline aquifers and depleted oil fields, providing insights into the challenges and opportunities presented by each method. This paper emphasizes the importance of integrating CCS with existing industrial processes to reduce greenhouse gas emissions effectively. It also discusses the economic and policy frameworks required to promote wider adoption of CCS technologies. The findings of this study offer a comprehensive view of the potential of CCS in achieving global climate goals, particularly in hard-to-abate sectors such as energy and manufacturing.Keywords: CO₂ capture, carbon storage, climate change mitigation, carbon sequestration, environmental sustainability
Procedia PDF Downloads 12
21964 Development of Electrochemical Biosensor Based on Dendrimer-Magnetic Nanoparticles for Detection of Alpha-Fetoprotein
Authors: Priyal Chikhaliwala, Sudeshna Chandra
Abstract:
Liver cancer is one of the most common malignant tumors with poor prognosis. This is because liver cancer does not exhibit any symptoms in early stage of disease. Increased serum level of AFP is clinically considered as a diagnostic marker for liver malignancy. The present diagnostic modalities include various types of immunoassays, radiological studies, and biopsy. However, these tests undergo slow response times, require significant sample volumes, achieve limited sensitivity and ultimately become expensive and burdensome to patients. Considering all these aspects, electrochemical biosensors based on dendrimer-magnetic nanoparticles (MNPs) was designed. Dendrimers are novel nano-sized, three-dimensional molecules with monodispersed structures. Poly-amidoamine (PAMAM) dendrimers with eight –NH₂ groups using ethylenediamine as a core molecule were synthesized using Michael addition reaction. Dendrimers provide added the advantage of not only stabilizing Fe₃O₄ NPs but also displays capability of performing multiple electron redox events and binding multiple biological ligands to its dendritic end-surface. Fe₃O₄ NPs due to its superparamagnetic behavior can be exploited for magneto-separation process. Fe₃O₄ NPs were stabilized with PAMAM dendrimer by in situ co-precipitation method. The surface coating was examined by FT-IR, XRD, VSM, and TGA analysis. Electrochemical behavior and kinetic studies were evaluated using CV which revealed that the dendrimer-Fe₃O₄ NPs can be looked upon as electrochemically active materials. Electrochemical immunosensor was designed by immobilizing anti-AFP onto dendrimer-MNPs by gluteraldehyde conjugation reaction. The bioconjugates were then incubated with AFP antigen. The immunosensor was characterized electrochemically indicating successful immuno-binding events. The binding events were also further studied using magnetic particle imaging (MPI) which is a novel imaging modality in which Fe₃O₄ NPs are used as tracer molecules with positive contrast. Multicolor MPI was able to clearly localize AFP antigen and antibody and its binding successfully. Results demonstrate immense potential in terms of biosensing and enabling MPI of AFP in clinical diagnosis.Keywords: alpha-fetoprotein, dendrimers, electrochemical biosensors, magnetic nanoparticles
Procedia PDF Downloads 136
21963 Managing Incomplete PSA Observations in Prostate Cancer Data: Key Strategies and Best Practices for Handling Loss to Follow-Up and Missing Data
Authors: Madiha Liaqat, Rehan Ahmed Khan, Shahid Kamal
Abstract:
Multiple imputation with delta adjustment is a versatile and transparent technique for addressing univariate missing data in the presence of various missing mechanisms. This approach allows for the exploration of sensitivity to the missing-at-random (MAR) assumption. In this review, we outline the delta-adjustment procedure and illustrate its application for assessing the sensitivity to deviations from the MAR assumption. By examining diverse missingness scenarios and conducting sensitivity analyses, we gain valuable insights into the implications of missing data on our analyses, enhancing the reliability of our study's conclusions. In our study, we focused on assessing logPSA, a continuous biomarker in incomplete prostate cancer data, to examine the robustness of conclusions against plausible departures from the MAR assumption. We introduced several approaches for conducting sensitivity analyses, illustrating their application within the pattern mixture model (PMM) under the delta adjustment framework. This proposed approach effectively handles missing data, particularly loss to follow-up.Keywords: loss to follow-up, incomplete response, multiple imputation, sensitivity analysis, prostate cancer
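A minimal sketch of the delta-adjustment idea described above is shown below for a single incomplete continuous variable. It is illustrative only: the covariate names, the stochastic-regression imputation model, the delta grid, and the pooled quantity (a simple mean) are assumptions, not the study's actual model.

```python
# Sketch of delta-adjusted multiple imputation as an MNAR sensitivity analysis.
import numpy as np
import pandas as pd

rng = np.random.default_rng(0)

def impute_once(df, delta, outcome="log_psa", covariates=("age", "stage")):
    """One stochastic regression imputation, shifted by delta (delta = 0 is MAR)."""
    obs = df[df[outcome].notna()]
    mis = df[df[outcome].isna()]
    X_obs = np.column_stack([np.ones(len(obs))] + [obs[c] for c in covariates])
    X_mis = np.column_stack([np.ones(len(mis))] + [mis[c] for c in covariates])
    beta, res, *_ = np.linalg.lstsq(X_obs, obs[outcome], rcond=None)
    sigma = np.sqrt(res[0] / (len(obs) - X_obs.shape[1])) if len(res) else 1.0
    draw = X_mis @ beta + rng.normal(0.0, sigma, size=len(mis)) + delta
    completed = df.copy()
    completed.loc[completed[outcome].isna(), outcome] = draw
    return completed

def pooled_mean(df, delta, m=20, outcome="log_psa"):
    """Rubin's rules for the pooled mean of the outcome under a given delta."""
    ests, variances = [], []
    for _ in range(m):
        comp = impute_once(df, delta, outcome)
        ests.append(comp[outcome].mean())
        variances.append(comp[outcome].var(ddof=1) / len(comp))
    q_bar = np.mean(ests)                 # pooled estimate
    u_bar = np.mean(variances)            # within-imputation variance
    b = np.var(ests, ddof=1)              # between-imputation variance
    return q_bar, np.sqrt(u_bar + (1 + 1 / m) * b)

# Sensitivity analysis: repeat over a grid of deltas and compare conclusions.
# for delta in (-0.5, -0.25, 0.0, 0.25, 0.5):
#     print(delta, pooled_mean(df, delta))
```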
Procedia PDF Downloads 89
21962 Parameter Estimation of Gumbel Distribution with Maximum-Likelihood Based on Broyden Fletcher Goldfarb Shanno Quasi-Newton
Authors: Dewi Retno Sari Saputro, Purnami Widyaningsih, Hendrika Handayani
Abstract:
Extreme data in an observation can occur due to unusual circumstances in the observation. Such data can provide important information that other data cannot, so their existence needs to be investigated further. One method for obtaining extreme data is the block maxima method. The distribution of extreme data sets taken with the block maxima method is called the extreme value distribution; here it is the Gumbel distribution with two parameters. The exact maximum likelihood (ML) estimates of the Gumbel parameters are difficult to determine analytically, so a numerical approach is necessary. The purpose of this study was to determine the parameter estimates of the Gumbel distribution with the quasi-Newton BFGS method. The quasi-Newton BFGS method is a numerical method for unconstrained nonlinear optimization, so it can be used for parameter estimation of the Gumbel distribution, whose distribution function has the form of a double exponential function. The quasi-Newton BFGS method is a development of Newton's method. Newton's method uses the second derivative to calculate the change in parameter values at each iteration. Newton's method is then modified with the addition of a step length to guarantee convergence when the second derivative requires complex calculations. In the quasi-Newton BFGS method, Newton's method is modified by updating an approximation of the second derivative at each iteration. The parameters of the Gumbel distribution are estimated numerically by finding the parameter values that maximize the likelihood function; this requires the gradient vector and (an approximation of) the Hessian matrix. This research combines theory and application, drawing on several journals and textbooks. The results of this study are the quasi-Newton BFGS algorithm and parameter estimates for the Gumbel distribution. The estimation method is then applied to daily rainfall data in Purworejo District to estimate the distribution parameters. The results indicate that the intensity of the high rainfall that occurred in Purworejo District decreased, and the range of rainfall that occurred also decreased.
Keywords: parameter estimation, Gumbel distribution, maximum likelihood, Broyden-Fletcher-Goldfarb-Shanno (BFGS) quasi-Newton
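As an illustration of the estimation procedure described here, the following sketch fits the two Gumbel parameters by minimising the negative log-likelihood with SciPy's BFGS implementation. The simulated block-maxima data and the moment-based starting values are assumptions for demonstration; they are not the Purworejo rainfall series.

```python
# Maximum-likelihood fit of the Gumbel (double exponential) distribution via BFGS.
import numpy as np
from scipy.optimize import minimize

def gumbel_neg_loglik(params, x):
    """Negative log-likelihood: n*log(beta) + sum(z + exp(-z)), z = (x - mu)/beta."""
    mu, log_beta = params              # optimise log(beta) to keep beta > 0
    beta = np.exp(log_beta)
    z = (x - mu) / beta
    return x.size * np.log(beta) + np.sum(z + np.exp(-z))

def fit_gumbel(x):
    """BFGS needs only the gradient (approximated numerically here); its Hessian
    approximation is updated internally at every iteration."""
    start = np.array([x.mean(), np.log(x.std(ddof=1))])   # moment-based start
    result = minimize(gumbel_neg_loglik, start, args=(x,), method="BFGS")
    return result.x[0], np.exp(result.x[1]), result

# Example with simulated annual-maximum rainfall (mm), not the study's data:
rng = np.random.default_rng(1)
maxima = rng.gumbel(loc=90.0, scale=20.0, size=40)
mu_hat, beta_hat, res = fit_gumbel(maxima)
print(f"mu = {mu_hat:.2f}, beta = {beta_hat:.2f}, converged = {res.success}")
```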
Procedia PDF Downloads 324
21961 Passive Attenuation with Multiple Resonator Rings for Musical Instruments Equalization
Authors: Lorenzo Bonoldi, Gianluca Memoli, Abdelhalim Azbaid El Ouahabi
Abstract:
In this paper, a series of ring-shaped attenuators utilizing Helmholtz and quarter wavelength resonators in variable, fixed, and combined configurations have been manufactured using a 3D printer. We illustrate possible uses by incorporating such devices into musical instruments (e.g. in acoustic guitar sound holes) and audio speakers with a view to controlling such devices tonal emissions without electronic equalization systems. Numerical investigations into the transmission loss values of these ring-shaped attenuators using finite element method simulations (COMSOL Multiphysics) have been presented in the frequency range of 100– 1000 Hz. We compare such results for each attenuator model with experimental measurements using different driving sources such as white noise, a maximum-length sequence (MLS), square and sine sweep pulses, and point scans in the frequency domain. Finally, we present a preliminary discussion on the comparison of numerical and experimental results.Keywords: equaliser, metamaterials, musical, instruments
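For orientation, the standard textbook resonance formulas for the two resonator types used in these rings are sketched below; they are useful for a first sizing before the finite element simulations described above. The neck and cavity dimensions are illustrative assumptions, not the dimensions of the 3D-printed prototypes.

```python
# First-order sizing of Helmholtz and quarter-wavelength resonators.
import math

C_AIR = 343.0  # speed of sound in air, m/s (around 20 degC)

def helmholtz_frequency(neck_radius, neck_length, cavity_volume, c=C_AIR):
    """f0 = c/(2*pi) * sqrt(S / (V * L_eff)), with a common end-corrected neck length."""
    area = math.pi * neck_radius ** 2
    l_eff = neck_length + 1.7 * neck_radius
    return c / (2 * math.pi) * math.sqrt(area / (cavity_volume * l_eff))

def quarter_wave_frequency(tube_length, c=C_AIR):
    """Fundamental of a closed-open tube: f0 = c / (4 * L)."""
    return c / (4 * tube_length)

# Example targets near the low end of the 100-1000 Hz band studied above:
print(helmholtz_frequency(neck_radius=0.005, neck_length=0.02,
                          cavity_volume=2e-4))   # roughly 200 Hz
print(quarter_wave_frequency(tube_length=0.30))  # roughly 286 Hz
```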
Procedia PDF Downloads 174
21960 Performance Evaluation of MIMO-OFDM Communication Systems
Authors: M. I. Youssef, A. E. Emam, M. Abd Elghany
Abstract:
This paper evaluates the bit error rate (BER) performance of MIMO-OFDM communication system. MIMO system uses multiple transmitting and receiving antennas with different coding techniques to either enhance the transmission diversity or spatial multiplexing gain. Utilizing alamouti algorithm were the same information transmitted over multiple antennas at different time intervals and then collected again at the receivers to minimize the probability of error, combat fading and thus improve the received signal to noise ratio. While utilizing V-BLAST algorithm, the transmitted signals are divided into different transmitting channels and transferred over the channel to be received by different receiving antennas to increase the transmitted data rate and achieve higher throughput. The paper provides a study of different diversity gain coding schemes and spatial multiplexing coding for MIMO systems. A comparison of various channels' estimation and equalization techniques are given. The simulation is implemented using MATLAB, and the results had shown the performance of transmission models under different channel environments.Keywords: MIMO communication, BER, space codes, channels, alamouti, V-BLAST
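To make the Alamouti transmit-diversity step concrete, the sketch below simulates the BER of a 2x1 Alamouti scheme with BPSK over flat Rayleigh fading. It is a stripped-down illustration (single carrier, perfect channel knowledge, no OFDM or V-BLAST), not a reproduction of the paper's MATLAB models.

```python
# BER sketch for 2x1 Alamouti space-time block coding with BPSK.
import numpy as np

rng = np.random.default_rng(2)

def alamouti_ber(snr_db, n_pairs=200_000):
    bits = rng.integers(0, 2, size=2 * n_pairs)
    s = 1.0 - 2.0 * bits                      # BPSK mapping: 0 -> +1, 1 -> -1
    s1, s2 = s[0::2], s[1::2]

    # One Rayleigh coefficient per transmit antenna, constant over the two slots
    h1 = (rng.normal(size=n_pairs) + 1j * rng.normal(size=n_pairs)) / np.sqrt(2)
    h2 = (rng.normal(size=n_pairs) + 1j * rng.normal(size=n_pairs)) / np.sqrt(2)

    n0 = 10 ** (-snr_db / 10)
    noise = lambda: np.sqrt(n0 / 2) * (rng.normal(size=n_pairs)
                                       + 1j * rng.normal(size=n_pairs))
    # Slot 1: antennas send (s1, s2); slot 2: (-s2*, s1*); half power per antenna
    r1 = (h1 * s1 + h2 * s2) / np.sqrt(2) + noise()
    r2 = (-h1 * np.conj(s2) + h2 * np.conj(s1)) / np.sqrt(2) + noise()

    # Alamouti combining yields two decoupled symbol estimates
    y1 = np.conj(h1) * r1 + h2 * np.conj(r2)
    y2 = np.conj(h2) * r1 - h1 * np.conj(r2)

    est = np.empty_like(s)
    est[0::2] = np.sign(y1.real)
    est[1::2] = np.sign(y2.real)
    return np.mean(est != s)

for snr in (0, 5, 10, 15, 20):
    print(snr, "dB ->", alamouti_ber(snr))
```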
Procedia PDF Downloads 175
21959 Determination of Frequency Relay Setting during Distributed Generators Islanding
Authors: Tarek Kandil, Ameen Ali
Abstract:
Distributed generation (DG) has recently gained a lot of momentum in power industry due to market deregulation and environmental concerns. One of the most technical challenges facing DGs is islanding of distributed generators. The current industry practice is to disconnect all distributed generators immediately after the occurrence of islands within 200 to 350 ms after loss of main supply. To achieve such goal, each DG must be equipped with an islanding detection device. Frequency relays are one of the most commonly used loss of mains detection method. However, distribution utilities may be faced with concerns related to false operation of these frequency relays due to improper settings. The commercially available frequency relays are considering standard tight setting. This paper investigates some factors related to relays internal algorithm that contribute to their different operating responses. Further, the relay operation in the presence of multiple distributed at the same network is analyzed. Finally, the relay setting can be accurately determined based on these investigation and analysis.Keywords: frequency relay, distributed generation, islanding detection, relay setting
Procedia PDF Downloads 534
21958 Evaluating Factors Influencing Information Quality in Large Firms
Authors: B. E. Narkhede, S. K. Mahajan, B. T. Patil, R. D. Raut
Abstract:
Information quality is a major performance measure for an Enterprise Resource Planning (ERP) system of any firm. This study identifies various critical success factors of information quality. The effect of various critical success factors like project management, reengineering efforts and interdepartmental communications on information quality is analyzed using a multiple regression model. Here quantitative data are collected from respondents from various firms through structured questionnaire for assessment of the information quality, project management, reengineering efforts and interdepartmental communications. The validity and reliability of the data are ensured using techniques like factor analysis, computing of Cronbach’s alpha. This study gives relative importance of each of the critical success factors. The findings suggest that among the various factors influencing information quality careful reengineering efforts are the most influencing factor. This paper gives clear insight to managers and practitioners regarding the relative importance of critical success factors influencing information quality so that they can formulate a strategy at the beginning of ERP system implementation.Keywords: Enterprise Resource Planning (ERP), information systems (IS), multiple regression, information quality
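A short sketch of the reliability and regression steps mentioned above is given below: Cronbach's alpha for an item set and an OLS regression of information quality on the three critical success factors. The DataFrame and column names are hypothetical placeholders, not the study's questionnaire items.

```python
# Reliability check and multiple regression, assuming one row per respondent.
import pandas as pd
import statsmodels.api as sm

def cronbach_alpha(items: pd.DataFrame) -> float:
    """alpha = k/(k-1) * (1 - sum of item variances / variance of the total score)."""
    k = items.shape[1]
    item_var = items.var(axis=0, ddof=1).sum()
    total_var = items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_var / total_var)

def fit_information_quality(survey: pd.DataFrame):
    """OLS: information_quality ~ project_mgmt + reengineering + interdept_comm."""
    X = sm.add_constant(survey[["project_mgmt", "reengineering", "interdept_comm"]])
    return sm.OLS(survey["information_quality"], X).fit()

# survey = pd.read_csv("erp_survey.csv")                       # hypothetical file
# print(cronbach_alpha(survey[["iq1", "iq2", "iq3", "iq4"]]))  # construct reliability
# print(fit_information_quality(survey).summary())             # relative importance
```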
Procedia PDF Downloads 333
21957 The Role of Leapfrogging: Cross-Level Interactions and MNE Decision-Making in Conflict-Settings
Authors: Arrian Cornwell, Larisa Yarovaya, Mary Thomson
Abstract:
This paper seeks to examine the transboundary nature of foreign subsidiary exit vs. stay decisions when threatened by conflict in a host country. Using the concepts of nested vulnerability and teleconnections, we show that the threat of conflict can transcend bounded territories and have non-linear outcomes for actors, institutions and systems at broader scales of analysis. To the best of our knowledge, this has not been done before. By introducing the concepts of ‘leapfrogging upwards’ and ‘cascading downwards’, we develop a two-stage model which characterises the impacts of conflict as transboundary phenomena. We apply our model to a dataset of 266 foreign subsidiaries in six conflict-afflicted host countries over 2011-2015. Our results indicate that information is transmitted upwards and subsequent pressure flows cascade downwards, which, in turn, influence exit decisions.Keywords: subsidiary exit, conflict, information transmission, pressure flows, transboundary
Procedia PDF Downloads 277
21956 Role of Tele-health in Expansion of Medical Care
Authors: Garima Singh, Kunal Malhotra
Abstract:
Objective: The expansion of telehealth has been instrumental in increasing access to medical services, especially for underserved and rural communities. In 2020, 14 million patients received virtual care through telemedicine and the global telemedicine market is expected to reach up to $185 million by 2023. It provides a platform and allows a patient to receive primary care as well as specialized care using technology and the comfort of their homes. Telemedicine was particularly useful during COVID-pandemic and the number of telehealth visits increased by 5000% during that time. It continues to serve as a significant resource for patients seeking care and to bridge the gap between the disease and the treatment. Method: As per APA (American Psychiatric Association), Telemedicine is the process of providing health care from a distance through technology. It is a subset of telemedicine, and can involve providing a range of services, including evaluations, therapy, patient education and medication management. It can involve direct interaction between a physician and the patient. It also encompasses supporting primary care providers with specialist consultation and expertise. It can also involve recording medical information (images, videos, etc.) and sending this to a distant site for later review. Results: In our organization, we are using telepsychiatry and serving 25 counties and approximately 1.4 million people. We provide multiple services, including inpatient, outpatient, crisis intervention, Rehab facility, autism services, case management, community treatment and multiple other modalities. With project ECHO (Extension for Community Healthcare Outcomes) it has been used to advise and assist primary care providers in treating mental health. It empowers primary care providers to treat patients in their own community by sharing knowledge. Conclusion: Telemedicine has shown to be a great medium in meeting patients’ needs and accessible mental health. It has been shown to improve access to care in both urban and rural settings by bringing care to a patient and reducing barriers like transportation, financial stress and resources. Telemedicine is also helping with reducing ER visits, integrating primary care and improving the continuity of care and follow-up. There has been substantial evidence and research about its effectiveness and its usage.Keywords: telehealth, telemedicine, access to care, medical technology
Procedia PDF Downloads 103
21955 Self-Efficacy of Preschool Teachers and Their Perception of Excellent Preschools
Authors: Yael Fisher
Abstract:
Little is known about the perceived self-efficacy of public preschool teachers, their perception of preschool excellence, or the relations between the two. This research had three purposes: defining the professional self-efficacy of preschool teachers (PTSE); defining preschool teachers' perception of preschool excellence (PTPPE); and investigating the relationship between the two. Scales for PTSE and PTPPE were developed especially for this study. Public preschool teachers (N = 202) participated during the 2013 school year. Structural Equation Modeling was performed to test the fit between the research model and the obtained data. The PTSE scale (α = 0.91) comprised three subscales: pedagogy (α = 0.84), organization (α = 0.85) and staff (α = 0.72). The PTPPE scale (α = 0.92) is also composed of three subscales: organization and pedagogy (α = 0.88), staff (α = 0.84) and parents (α = 0.83). The goodness-of-fit measures were RMSEA = 0.045, CFI = 0.97, NFI = 0.89, df = 173, χ² = 242.94, p = .000, with χ²/df = 1.4 (< 3) indicating a good fit. Understanding the self-efficacy of preschool teachers and their perception of preschool excellence could and should lead to better professional development (in-service training) of preschool teachers.
Keywords: self-efficacy, public preschools, preschool excellence, SEM
Procedia PDF Downloads 130
21954 Implementation of a Method of Crater Detection Using Principal Component Analysis in FPGA
Authors: Izuru Nomura, Tatsuya Takino, Yuji Kageyama, Shin Nagata, Hiroyuki Kamata
Abstract:
We propose a method of crater detection from the image of the lunar surface captured by the small space probe. We use the principal component analysis (PCA) to detect craters. Nevertheless, considering severe environment of the space, it is impossible to use generic computer in practice. Accordingly, we have to implement the method in FPGA. This paper compares FPGA and generic computer by the processing time of a method of crater detection using principal component analysis.Keywords: crater, PCA, eigenvector, strength value, FPGA, processing time
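The sketch below illustrates, in plain Python, the kind of PCA-based matching this abstract describes: training eigen-crater components and scoring sliding windows against them. The paper's exact "strength value" and its FPGA implementation are not reproduced; reconstruction error in the eigen-crater subspace is used here as a stand-in score, and the patch size, step, and threshold are arbitrary assumptions.

```python
# Software-only sketch of PCA-based crater matching on a lunar-surface image.
import numpy as np

def train_eigencraters(crater_patches, n_components=8):
    """crater_patches: (n_samples, h*w) array of flattened training crater patches."""
    mean = crater_patches.mean(axis=0)
    centered = crater_patches - mean
    _, _, vt = np.linalg.svd(centered, full_matrices=False)  # rows = principal axes
    return mean, vt[:n_components]

def crater_score(patch, mean, components):
    """Lower reconstruction error in the eigen-crater subspace -> more crater-like."""
    x = patch.ravel() - mean
    coeffs = components @ x
    reconstruction = components.T @ coeffs
    return -np.linalg.norm(x - reconstruction)   # higher score = better match

def scan_image(image, mean, components, patch=16, step=4, threshold=-5.0):
    """Slide a window over the image and keep locations whose score clears the threshold."""
    detections = []
    h, w = image.shape
    for r in range(0, h - patch + 1, step):
        for c in range(0, w - patch + 1, step):
            score = crater_score(image[r:r + patch, c:c + patch], mean, components)
            if score > threshold:                # threshold is illustrative only
                detections.append((r, c, score))
    return detections
```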
Procedia PDF Downloads 555
21953 Quantification of Magnetic Resonance Elastography for Tissue Shear Modulus using U-Net Trained with Finite-Differential Time-Domain Simulation
Authors: Jiaying Zhang, Xin Mu, Chang Ni, Jeff L. Zhang
Abstract:
Magnetic resonance elastography (MRE) non-invasively assesses tissue elastic properties, such as shear modulus, by measuring tissue’s displacement in response to mechanical waves. The estimated metrics on tissue elasticity or stiffness have been shown to be valuable for monitoring physiologic or pathophysiologic status of tissue, such as a tumor or fatty liver. To quantify tissue shear modulus from MRE-acquired displacements (essentially an inverse problem), multiple approaches have been proposed, including Local Frequency Estimation (LFE) and Direct Inversion (DI). However, one common problem with these methods is that the estimates are severely noise-sensitive due to either the inverse-problem nature or noise propagation in the pixel-by-pixel process. With the advent of deep learning (DL) and its promise in solving inverse problems, a few groups in the field of MRE have explored the feasibility of using DL methods for quantifying shear modulus from MRE data. Most of the groups chose to use real MRE data for DL model training and to cut training images into smaller patches, which enriches feature characteristics of training data but inevitably increases computation time and results in outcomes with patched patterns. In this study, simulated wave images generated by Finite Differential Time Domain (FDTD) simulation are used for network training, and U-Net is used to extract features from each training image without cutting it into patches. The use of simulated data for model training has the flexibility of customizing training datasets to match specific applications. The proposed method aimed to estimate tissue shear modulus from MRE data with high robustness to noise and high model-training efficiency. Specifically, a set of 3000 maps of shear modulus (with a range of 1 kPa to 15 kPa) containing randomly positioned objects were simulated, and their corresponding wave images were generated. The two types of data were fed into the training of a U-Net model as its output and input, respectively. For an independently simulated set of 1000 images, the performance of the proposed method against DI and LFE was compared by the relative errors (root mean square error or RMSE divided by averaged shear modulus) between the true shear modulus map and the estimated ones. The results showed that the estimated shear modulus by the proposed method achieved a relative error of 4.91%±0.66%, substantially lower than 78.20%±1.11% by LFE. Using simulated data, the proposed method significantly outperformed LFE and DI in resilience to increasing noise levels and in resolving fine changes of shear modulus. The feasibility of the proposed method was also tested on MRE data acquired from phantoms and from human calf muscles, resulting in maps of shear modulus with low noise. In future work, the method’s performance on phantom and its repeatability on human data will be tested in a more quantitative manner. In conclusion, the proposed method showed much promise in quantifying tissue shear modulus from MRE with high robustness and efficiency.Keywords: deep learning, magnetic resonance elastography, magnetic resonance imaging, shear modulus estimation
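The relative-error metric quoted above (RMSE divided by the averaged shear modulus of the true map) can be written out in a few lines; the example values below are illustrative, not the study's data.

```python
# Relative error between a true and an estimated shear-modulus map (kPa).
import numpy as np

def relative_error(true_map: np.ndarray, estimated_map: np.ndarray) -> float:
    rmse = np.sqrt(np.mean((estimated_map - true_map) ** 2))
    return rmse / np.mean(true_map)

# Example: a ~4.9% relative error on a map averaging 8 kPa corresponds to an
# RMSE of roughly 0.39 kPa.
true = np.full((64, 64), 8.0)
est = true + np.random.default_rng(3).normal(0.0, 0.39, size=true.shape)
print(f"{100 * relative_error(true, est):.2f}%")
```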
Procedia PDF Downloads 68
21952 Evaluating Contextually Targeted Advertising with Attention Measurement
Authors: John Hawkins, Graham Burton
Abstract:
Contextual targeting is a common strategy for advertising that places marketing messages in media locations that are expected to be aligned with the target audience. There are multiple major challenges to contextual targeting: the ideal categorisation scheme needs to be known, as well as the most appropriate subsections of that scheme for a given campaign or creative. In addition, the campaign reach is typically limited when targeting becomes narrow, so a balance must be struck between requirements. Finally, refinement of the process is limited by the use of evaluation methods that are either rapid but non-specific (click through rates), or reliable but slow and costly (conversions or brand recall studies). In this study we evaluate the use of attention measurement as a technique for understanding the performance of targeting on the basis of specific contextual topics. We perform the analysis using a large scale dataset of impressions categorised using the iAB V2.0 taxonomy. We evaluate multiple levels of the categorisation hierarchy, using categories at different positions within an initial creative specific ranking. The results illustrate that measuring attention time is an affective signal for the performance of a specific creative within a specific context. Performance is sustained across a ranking of categories from one period to another.Keywords: contextual targeting, digital advertising, attention measurement, marketing performance
Procedia PDF Downloads 104
21951 Portable Hands-Free Process Assistant for Gas Turbine Maintenance
Authors: Elisabeth Brandenburg, Robert Woll, Rainer Stark
Abstract:
This paper presents how smart glasses and voice commands can be used for improving the maintenance process of industrial gas turbines. It presents the process of inspecting a gas turbine’s combustion chamber and how it is currently performed using a set of paper-based documents. In order to improve this process, a portable hands-free process assistance system has been conceived. In the following, it will be presented how the approach of user-centered design and the method of paper prototyping have been successfully applied in order to design a user interface and a corresponding workflow model that describes the possible interaction patterns between the user and the interface. The presented evaluation of these results suggests that the assistance system could help the user by rendering multiple manual activities obsolete, thus allowing him to work hands-free and to save time for generating protocols.Keywords: paper prototyping, smart glasses, turbine maintenance, user centered design
Procedia PDF Downloads 323
21950 Connections among Personality, Teacher-Student Relationship, Belief in a Just World for Others and Teacher Bullying
Authors: Hui-Yu Peng, Hsiu-I Hsueh, Li-Ming Chen
Abstract:
Most studies focused on bullying behaviors among students, however few research concerns about teachers’ bullying behaviors against students. In order to have more understandings and reduce teacher bullying, it is important to examine what factors may affect teachers’ bullying behaviors. This study aimed to explore the connections between different psychological variables and teacher bullying. Four variables, neuroticism, extraversion, teacher-student relationship, and belief in a just world for others (BJW-others), were selected in this study. Four hundred and five elementary and secondary school teachers in Taiwan endorsed the self-reported surveys. Multiple regression method was used to analyze data. Results revealed that teachers’ BJW-others and extraversion did not have significant correlations with teacher bullying scores. However, closed teacher-student relationship and neuroticism can negatively and positively predict teachers’ bullying behaviors against students, respectively. Implications for preventing teacher bullying were discussed at the end of this study.Keywords: belief in a just world for others, big five personality traits, teacher bullying, teacher-student relationship
Procedia PDF Downloads 213
21949 MapReduce Logistic Regression Algorithms with RHadoop
Authors: Byung Ho Jung, Dong Hoon Lim
Abstract:
Logistic regression is a statistical method for analyzing a dataset in which there are one or more independent variables that determine an outcome. Logistic regression is used extensively in numerous disciplines, including the medical and social science fields. In this paper, we address the problem of estimating parameters in the logistic regression based on MapReduce framework with RHadoop that integrates R and Hadoop environment applicable to large scale data. There exist three learning algorithms for logistic regression, namely Gradient descent method, Cost minimization method and Newton-Rhapson's method. The Newton-Rhapson's method does not require a learning rate, while gradient descent and cost minimization methods need to manually pick a learning rate. The experimental results demonstrated that our learning algorithms using RHadoop can scale well and efficiently process large data sets on commodity hardware. We also compared the performance of our Newton-Rhapson's method with gradient descent and cost minimization methods. The results showed that our newton's method appeared to be the most robust to all data tested.Keywords: big data, logistic regression, MapReduce, RHadoop
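As a concrete illustration of the Newton-Raphson learner compared in this paper, the sketch below implements the update in plain NumPy on a single machine; no learning rate is needed because the Hessian is used directly. In a MapReduce setting, the gradient X'(y - p) and the Hessian X'WX are sums over data blocks, so mappers can emit partial sums and a reducer adds them before the solve; that distributed wiring is not shown here, and the synthetic data are for demonstration only.

```python
# Newton-Raphson (IRLS) updates for logistic regression.
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def logistic_newton(X, y, n_iter=25, tol=1e-8):
    """X: (n, d) design matrix including an intercept column; y: (n,) labels in {0, 1}."""
    beta = np.zeros(X.shape[1])
    for _ in range(n_iter):
        p = sigmoid(X @ beta)
        gradient = X.T @ (y - p)              # score vector
        W = p * (1.0 - p)                     # diagonal IRLS weights
        hessian = X.T @ (X * W[:, None])      # X' W X
        step = np.linalg.solve(hessian, gradient)
        beta += step
        if np.max(np.abs(step)) < tol:
            break
    return beta

# Quick check on synthetic data
rng = np.random.default_rng(4)
X = np.column_stack([np.ones(5000), rng.normal(size=(5000, 2))])
true_beta = np.array([-0.5, 1.2, -2.0])
y = rng.binomial(1, sigmoid(X @ true_beta))
print(logistic_newton(X, y))   # should be close to true_beta
```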
Procedia PDF Downloads 285
21948 Mapping Intertidal Changes Using Polarimetry and Interferometry Techniques
Authors: Khalid Omari, Rene Chenier, Enrique Blondel, Ryan Ahola
Abstract:
Northern Canadian coasts have vulnerable and very dynamic intertidal zones with very high tides occurring in several areas. The impact of climate change presents challenges not only for maintaining this biodiversity but also for navigation safety adaptation due to the high sediment mobility in these coastal areas. Thus, frequent mapping of shorelines and intertidal changes is of high importance. To help in quantifying the changes in these fragile ecosystems, remote sensing provides practical monitoring tools at local and regional scales. Traditional methods based on high-resolution optical sensors are often used to map intertidal areas by benefiting of the spectral response contrast of intertidal classes in visible, near and mid-infrared bands. Tidal areas are highly reflective in visible bands mainly because of the presence of fine sand deposits. However, getting a cloud-free optical data that coincide with low tides in intertidal zones in northern regions is very difficult. Alternatively, the all-weather capability and daylight-independence of the microwave remote sensing using synthetic aperture radar (SAR) can offer valuable geophysical parameters with a high frequency revisit over intertidal zones. Multi-polarization SAR parameters have been used successfully in mapping intertidal zones using incoherence target decomposition. Moreover, the crustal displacements caused by ocean tide loading may reach several centimeters that can be detected and quantified across differential interferometric synthetic aperture radar (DInSAR). Soil moisture change has a significant impact on both the coherence and the backscatter. For instance, increases in the backscatter intensity associated with low coherence is an indicator for abrupt surface changes. In this research, we present primary results obtained following our investigation of the potential of the fully polarimetric Radarsat-2 data for mapping an inter-tidal zone located on Tasiujaq on the south-west shore of Ungava Bay, Quebec. Using the repeat pass cycle of Radarsat-2, multiple seasonal fine quad (FQ14W) images are acquired over the site between 2016 and 2018. Only 8 images corresponding to low tide conditions are selected and used to build an interferometric stack of data. The observed displacements along the line of sight generated using HH and VV polarization are compared with the changes noticed using the Freeman Durden polarimetric decomposition and Touzi degree of polarization extrema. Results show the consistency of both approaches in their ability to monitor the changes in intertidal zones.Keywords: SAR, degree of polarization, DInSAR, Freeman-Durden, polarimetry, Radarsat-2
Procedia PDF Downloads 137
21947 Prioritizing the Evaluation Factors of Hospital Information Systems with the Analytical Hierarchy Process
Authors: F.Sadoughi, A. Sarsarshahi, L, Eerfannia, S.M.A. Khatami
Abstract:
Hospital information systems, with their many capabilities, can lead to improvements in health care quality. Evaluation of these systems has been carried out according to different methods and criteria. The main goal of the present study is to prioritize the most important factors that influence the evaluation of these systems. In the first step, based on the relevant literature, three main factors and 29 subfactors were extracted; the study framework was then designed. Based on the analytical hierarchy process (AHP), 28 paired comparisons on the Saaty scale were obtained in a questionnaire format. Questionnaires were completed by 10 experts in the fields of health information management and medical informatics. Human factors, with a weight of 0.55, were ranked as the most important; organization (0.25) and technology (0.14) followed. It seems that MADM methods such as AHP have sufficient potential for use in health research and provide positive opportunities for decision makers in the health domain.
Keywords: analytical hierarchy process, multiple criteria decision-making (MCDM), hospital information system, evaluation factors
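To illustrate the AHP weighting step described above, the sketch below derives priority weights from the principal eigenvector of a Saaty pairwise-comparison matrix and checks consistency. The 3x3 matrix is a hypothetical example; it is not the aggregated judgement matrix from the ten experts and will not reproduce the reported 0.55/0.25/0.14 weights exactly.

```python
# AHP priority weights and consistency ratio from a reciprocal comparison matrix.
import numpy as np

RANDOM_INDEX = {1: 0.0, 2: 0.0, 3: 0.58, 4: 0.90, 5: 1.12, 6: 1.24, 7: 1.32}

def ahp_weights(A):
    """Return (weights, consistency_ratio) via the principal eigenvector of A."""
    eigvals, eigvecs = np.linalg.eig(A)
    k = np.argmax(eigvals.real)                 # principal (Perron) eigenvalue
    w = np.abs(eigvecs[:, k].real)
    w = w / w.sum()
    n = A.shape[0]
    ci = (eigvals[k].real - n) / (n - 1)        # consistency index
    cr = ci / RANDOM_INDEX[n] if RANDOM_INDEX[n] else 0.0
    return w, cr

# Hypothetical judgements: human vs. organization vs. technology factors
A = np.array([[1.0, 3.0, 4.0],
              [1/3, 1.0, 2.0],
              [1/4, 1/2, 1.0]])
weights, cr = ahp_weights(A)
print(weights)   # roughly [0.63, 0.24, 0.14] for this example matrix
print(cr)        # roughly 0.02; judgements are acceptable when cr < 0.10
```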
Procedia PDF Downloads 454