Search results for: exclusion probability
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 1684

874 Applied Bayesian Regularized Artificial Neural Network for Up-Scaling Wind Speed Profile and Distribution

Authors: Aghbalou Nihad, Charki Abderafi, Saida Rahali, Reklaoui Kamal

Abstract:

Maximizing the benefit from wind energy potential is a primary interest of wind power stakeholders. As a result, wind tower sizes are increasing rapidly. Nevertheless, choosing an appropriate wind turbine for a selected site requires an accurate estimate of the vertical wind profile, which is also imperative from a cost and maintenance-strategy point of view. However, installing tall towers, or even more expensive devices such as LIDAR or SODAR, raises the costs of a wind power project. Various models have been developed within this framework, but they suffer from complexity, poor generalization, and a lack of accuracy. In this work, we investigate the ability of a neural network trained using the Bayesian regularization technique to estimate the wind speed profile up to a height of 100 m based on knowledge of wind speed at lower heights. Results show that the proposed approach achieves satisfactory predictions and demonstrates the suitability of the proposed method for generating wind speed profiles and probability distributions from wind speeds measured at lower heights.
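As a rough, self-contained illustration of the up-scaling task described above, the sketch below fits an L2-regularized (ridge) linear model, a crude stand-in for the paper's Bayesian-regularized neural network, to synthetic log-law wind profiles. All heights, roughness lengths, and noise levels are assumed for illustration, not taken from the study.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic training data: log-law wind profiles u(z) = (u*/0.4) * ln(z/z0)
# with a random friction velocity u* and roughness length z0 per profile.
heights_in = np.array([10.0, 40.0, 60.0])   # known lower heights (m)
height_out = 100.0                           # target height (m)

def profiles(n):
    u_star = rng.uniform(0.2, 0.6, size=(n, 1))
    z0 = rng.uniform(0.01, 0.1, size=(n, 1))
    X = (u_star / 0.4) * np.log(heights_in / z0) + rng.normal(0, 0.05, (n, 3))
    y = (u_star[:, 0] / 0.4) * np.log(height_out / z0[:, 0])
    return X, y

X_train, y_train = profiles(500)
X_test, y_test = profiles(100)

# Ridge (L2-regularized) linear regression: a simplified stand-in for the
# Bayesian regularization used in the paper.
lam = 1e-2
A = np.hstack([X_train, np.ones((len(X_train), 1))])
w = np.linalg.solve(A.T @ A + lam * np.eye(A.shape[1]), A.T @ y_train)

A_test = np.hstack([X_test, np.ones((len(X_test), 1))])
pred = A_test @ w
rmse = np.sqrt(np.mean((pred - y_test) ** 2))
print(f"RMSE at 100 m: {rmse:.3f} m/s")
```

Because the log-law profile is nearly linear in these features, even this crude model extrapolates well; the paper's neural network targets the nonlinear cases where such a linear fit breaks down.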

Keywords: Bayesian regularization, neural network, wind shear, accuracy

Procedia PDF Downloads 489
873 A Systematic Review on Prevalence, Serotypes and Antibiotic Resistance of Salmonella in Ethiopia

Authors: Atsebaha Gebrekidan Kahsay, Tsehaye Asmelash, Enquebaher Kassaye

Abstract:

Background: Salmonella remains a global public health problem, with a significant burden in sub-Saharan African countries. S. Typhi and S. Paratyphi are human-restricted causes of typhoid and paratyphoid fever, whereas S. Enteritidis and S. Typhimurium are causative agents of invasive nontyphoidal disease in humans, with animals as their reservoir. The antibiotic resistance of Salmonella is a further public health threat around the globe. To compile comprehensive information about human and animal salmonellosis, we conducted a systematic review of the prevalence, serotypes, and antibiotic resistance of Salmonella in Ethiopia. Methods: This systematic review used the Google Scholar and PubMed search engines to find articles from Ethiopia published in English in peer-reviewed international journals from 2010 to 2022. We used keywords to identify the intended research articles and a Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) checklist to apply the inclusion and exclusion criteria. Frequencies and percentages were analyzed using Microsoft Excel. Results: Two hundred seven published articles were retrieved, and 43 were selected for systematic review: 28 on humans and 15 on animals. The prevalence of Salmonella in humans and animals was 434 (5.2%) and 641 (10.1%), respectively. Fourteen serotypes were identified from animals, with S. Typhimurium among the top five. Among ciprofloxacin-resistant isolates in human studies, 16.7% was the highest rate reported, whereas for ceftriaxone, 100% resistance was reported. Conclusions: The prevalence of Salmonella among diarrheic patients and food handlers (5.2%) was lower than in food animals (10.1%). The human studies did not report serotypes, although fourteen serotypes were identified in food-animal studies, with S. Typhimurium among the top five. Salmonella species from some human studies were non-susceptible to ceftriaxone. We recommend further study of invasive nontyphoidal Salmonella and predisposing factors among humans and animals in Ethiopia.

Keywords: antibiotic resistance, prevalence, systematic review, serotypes, Salmonella, Ethiopia

Procedia PDF Downloads 66
872 Effectiveness of the Bundle Care to Relieve the Thirst for Intensive Care Unit Patients: Meta-Analysis

Authors: Wen Hsin Hsu, Pin Lin

Abstract:

Objective: Thirst discomfort is the most common yet often overlooked symptom in patients in the intensive care unit (ICU), with an incidence rate of 69.8%. If not properly cared for, it can easily lead to irritability, affect sleep quality, and increase the incidence of delirium, thereby extending the length of hospital stay. Research indicates that the sensation of coldness is an effective strategy to alleviate thirst. Using a bundled care approach for thirst can prolong the sensation of coldness in the mouth and reduce thirst discomfort; its effectiveness therefore needs further analysis and review. Methods: This study used systematic literature review and meta-analysis methodologies and searched PubMed, MEDLINE, EMBASE, Cochrane, CINAHL, and two Chinese databases (CEPS and CJTD) by keyword. JBI was used to appraise the quality of the literature. The RevMan 5.4 software package was used, and a fixed-effect model was applied for data analysis. We selected experimental articles in English and Chinese that met the inclusion and exclusion criteria. Three research articles were included in total, with a sample size of 416 people; two were randomized controlled trials, and one was a quasi-experimental design. Results: The results show that bundled care for thirst, which includes ice water spray or oral swab wipes, menthol mouthwash, and lip balm, can significantly relieve thirst intensity, MD = -1.36 (3 studies, 95% CI -1.77 to -0.95, p < 0.001), and thirst distress, MD = -0.71 (2 studies, 95% CI -1.32 to -0.10, p = 0.02). Therefore, it is recommended that medical staff identify high-risk groups for thirst early on. Implications for Practice: For patients who cannot eat orally, providing bundled care for thirst can increase oral comfort and improve the quality of care.
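The fixed-effect pooling step reported above can be sketched with standard inverse-variance weighting; the per-study mean differences and standard errors below are hypothetical, not the data of the three included trials.

```python
import math

# Hypothetical per-study mean differences and standard errors (illustrative
# only; not the actual data from the three trials in the abstract).
studies = [(-1.5, 0.30), (-1.2, 0.35), (-1.4, 0.40)]  # (MD, SE)

# Fixed-effect (inverse-variance) pooling, as implemented in RevMan:
# each study is weighted by 1/SE^2, and the pooled variance is 1/sum(weights).
weights = [1.0 / se**2 for _, se in studies]
pooled_md = sum(w * md for (md, _), w in zip(studies, weights)) / sum(weights)
pooled_se = math.sqrt(1.0 / sum(weights))
ci_low = pooled_md - 1.96 * pooled_se
ci_high = pooled_md + 1.96 * pooled_se
print(f"Pooled MD = {pooled_md:.2f}, 95% CI ({ci_low:.2f}, {ci_high:.2f})")
```

The pooled estimate lands between the individual study effects, pulled toward the most precise (smallest-SE) study.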

Keywords: thirst bundle care, intensive care units, meta-analysis, ice water spray, menthol

Procedia PDF Downloads 54
871 Impact of Instagram Food Bloggers on Consumer (Generation Z) Decision Making Process in Islamabad, Pakistan

Authors: Tabinda Sadiq, Tehmina Ashfaq Qazi, Hoor Shumail

Abstract:

Recently, the advent of emerging technology has created a new generation of restaurant marketing. This study explores the aspects that influence customers’ decision-making process in selecting a restaurant after reading food bloggers' reviews online. The motivation behind this research is to investigate the correlation between the credibility of the source and consumers' attitude toward restaurant visits. The researcher collected data by distributing a survey questionnaire through Google Forms, employing source credibility theory. A non-probability purposive sampling technique was used, and the questionnaire employed a pre-developed and validated scale by Ohanian to measure the relationship. Data were collected from 250 respondents in order to investigate the influence of food bloggers on Generation Z's decision-making process, and SPSS version 26 was used for statistical testing and analysis. The findings of the survey revealed a moderate positive correlation between the variables, indicating that food bloggers do have an impact on Generation Z's decision-making process.
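A minimal sketch of the correlation analysis this abstract describes, run on synthetic Likert-style scores; the variable names, scale parameters, and effect size below are illustrative assumptions, not the survey's data.

```python
import numpy as np

rng = np.random.default_rng(42)

# Synthetic scores for 250 hypothetical respondents: perceived blogger
# credibility and intention to visit the reviewed restaurant.
n = 250
credibility = rng.normal(3.5, 0.8, n)
intention = 0.5 * credibility + rng.normal(1.5, 0.7, n)

# Pearson correlation coefficient -- the statistic behind the reported
# "moderate positive correlation".
r = np.corrcoef(credibility, intention)[0, 1]
print(f"Pearson r = {r:.2f}")
```

With the assumed effect size, r falls in the conventional "moderate" band (roughly 0.3 to 0.7).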

Keywords: credibility, decision making, food bloggers, generation z, e-wom

Procedia PDF Downloads 59
870 Disrupting Certainties: Reimagined History Curriculum as Critical Pedagogy in Secondary Teacher Education

Authors: Philippa Hunter

Abstract:

How might history education support teachers and students to see the past as a provocation, be open to possible futures, and act differently? As teacher educators in an age of diversity and uncertainty, we need to question history’s curriculum nature, pedagogy, and policy intent. The cultural politics of history’s identity in the senior secondary curriculum influences educational socialization (disciplinary, professional, research) and engagement with curriculum decision-making. This paper reflects on curriculum disturbance that shaped a critical pedagogy stance to problematize school history’s certainties. The context is an Aotearoa New Zealand university-based initial teacher education programme. A pedagogic innovation was activated whereby problematized history pedagogy [PHP] was conceptualized as both the phenomenon and the method of inquiry, and storied in doctoral work. The PHP was a reciprocal research process involving history class participants and the teacher as researcher in fashioning teaching identities, identifying with, and thinking critically about history pedagogy. PHP findings revealed evocative discourses of embodiment, nostalgia, and connectedness about living ‘inside the past’. Participants expressed certainty about their abilities as teachers living ‘outside the past’ to interpret historical perspectives. However, discomfort was evident in relation to ‘difficult knowledge’ or unfamiliar contexts of the past that exposed exclusion, powerlessness, or silenced voices. Participants identified history programmes as strongly masculine and conflict-focused. A normalized inquiry-transmission approach to history pedagogy was identified and critiqued. Individuals’ reflexive accounts of PHP implemented whilst on practicum indicate possibilities of history pedagogy as inclusive and democratic, as social and ethical reconstruction, and as a critical project. The PHP sought to reimagine history curriculum and identify spaces of possibility in secondary postgraduate teacher education.

Keywords: curriculum, pedagogy, problematise, reciprocal

Procedia PDF Downloads 153
869 Preventing Violent Extremism in Mozambique and Tanzania: A Survey to Measure Community Resilience

Authors: L. Freeman, D. Bax, V. K. Sapong

Abstract:

Community-based, preventative approaches to violent extremism may be effective yet remain underutilised. In a realm where security approaches dominate, with the focus on countering violent extremism and combatting radicalisation, community resilience programming remains sparse. This paper presents a survey tool that aims to measure the risk and protective factors that can lead to violent extremism in Mozambique and Tanzania. Conducted in four districts in the Cabo Delgado region of Mozambique and one district in Pwani, Tanzania, the survey uses a combination of BRAVE-14, Afrocentric, and context-specific questions in order to more fully understand community resilience opportunities and challenges in preventing and countering violent extremism. Developed in Australia and Canada to measure radicalisation risks in individuals and communities, BRAVE-14 is a tool not yet applied on the African continent. Given the emerging threat of Islamic extremism in Northern Mozambique and Eastern Tanzania, which both experience a combination of socio-political exclusion, resource marginalisation, and religious/ideological motivations, the development of the survey is timely and fills a much-needed information gap in these regions. Not only have these Islamist groups succeeded in tapping into the grievances of communities by radicalising and recruiting individuals, but their presence in these regions has been characterised by extreme forms of violence, leaving isolated communities vulnerable to attack. The findings are expected to facilitate the contextualisation and comparison of the protective and risk factors that inhibit or promote the radicalisation of youth in these communities. In identifying sources of resilience and vulnerability, this study emphasises the implementation of context-specific intervention programming and provides a strong research tool for understanding youth and community resilience to violent extremism.

Keywords: community resilience, Mozambique, preventing violent extremism, radicalisation, Tanzania

Procedia PDF Downloads 124
868 A Survey on Students' Intentions to Dropout and Dropout Causes in Higher Education of Mongolia

Authors: D. Naranchimeg, G. Ulziisaikhan

Abstract:

The student dropout problem has not previously been investigated within Mongolian higher education. Dropping out is a personal decision, but it may cause unemployment and other social problems, including low quality of life, because students who have not completed a degree cannot find better-paid jobs. The research aims to determine the percentage of at-risk students, understand the reasons for dropout, and find a way to predict it. The study is based on students of the Mongolian National University of Education (including its Arkhangai branch school), National University of Mongolia, Mongolian University of Life Sciences, Mongolian University of Science and Technology, Mongolian National University of Medical Science, Ikh Zasag International University, and Dornod University. We conducted a paper survey by random sampling, surveying about 100 students per university. The margin of error was 4%, the confidence level 90%, and the sample size 846, although we excluded 56 students from the study because of missing questionnaire data. The survey comprised 17 questions, 4 of which were demographic. The survey shows that 1.4% of students constantly thought about dropping out, whereas 61.8% thought about it sometimes. The results also suggest that students’ dropout intentions are not related to sex, marital or social status, or peer and faculty climate, whereas they depend slightly on the chosen specialization. Finally, the paper presents the reasons for dropping out provided by the students. The two main reasons are personal reasons related to choosing the wrong study program or not liking the chosen course (50.38%), and financial difficulties (42.66%). These findings reveal the importance of early prevention of dropout where possible, combined with increased attention to helping high school students choose the right study program, and targeted financial support for those at risk.
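The stated margin of error (4%) and confidence level (90%) can be checked against the standard sample-size formula for estimating a proportion; the study's realized sample of 846 exceeds this minimum, presumably reflecting design choices such as the per-university quotas.

```python
import math

# Standard sample-size formula for estimating a proportion:
#   n = z^2 * p * (1 - p) / e^2, with the most conservative p = 0.5.
z = 1.645        # z-score for a 90% confidence level
e = 0.04         # 4% margin of error
p = 0.5          # conservative assumed proportion
n = math.ceil(z**2 * p * (1 - p) / e**2)
print(f"Minimum sample size: {n}")  # 423
```

Sampling roughly 100 students at each of the universities listed easily clears this threshold.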

Keywords: at risk students, dropout, faculty climate, Mongolian universities, peer climate

Procedia PDF Downloads 390
867 Discrimination in Insurance Pricing: A Textual-Analysis Perspective

Authors: Ruijuan Bi

Abstract:

Discrimination in insurance pricing is a topic of increasing concern, particularly in the context of the rapid development of big data and artificial intelligence. There is a need to explore the various forms of discrimination, such as direct and indirect discrimination, proxy discrimination, algorithmic discrimination, and unfair discrimination, and to understand their implications in insurance pricing models. This paper aims to analyze and interpret the definitions of discrimination in insurance pricing and to explore measures to reduce discrimination. It utilizes a textual analysis methodology, gathering qualitative data from the relevant literature on definitions of discrimination. Through textual analysis, the paper identifies the specific characteristics and implications of each form of discrimination in the general insurance industry. This research contributes to the theoretical understanding of discrimination in insurance pricing: by analyzing and interpreting the relevant literature, it provides insights into the definitions of discrimination and the laws and regulations surrounding it. This theoretical foundation can inform future empirical research on discrimination in insurance pricing, drawing on relevant results from probability theory.

Keywords: algorithmic discrimination, direct and indirect discrimination, proxy discrimination, unfair discrimination, insurance pricing

Procedia PDF Downloads 57
866 Examining Relationship between Resource-Curse and Under-Five Mortality in Resource-Rich Countries

Authors: Aytakin Huseynli

Abstract:

The paper reports findings of a study that examined the under-five mortality rate among resource-rich countries. Typically, when countries obtain wealth, citizens gain increased wellbeing, and societies with new wealth create equal opportunities for everyone, including vulnerable groups. But scholars claim that this is not the case for developing resource-rich countries: natural resources become a curse for them rather than a blessing. Spillovers from the natural resource curse negatively affect the social wellbeing of vulnerable people; they are excluded from mainstream society, and their situation becomes precarious. To test this hypothesis, the study compared the under-five mortality rate among resource-rich countries using independent-sample one-way ANOVA. The data on under-five mortality came from the World Bank. The natural resources considered are oil, gas, and minerals. The list of 67 resource-rich countries was taken from the Natural Resource Governance Institute. The sample was categorized into 4 groups (low, lower-middle, upper-middle, and high-income countries) based on the income classification of the World Bank. Results revealed a significant difference in under-five mortality among low, lower-middle, upper-middle, and high-income countries (F(3, 29.01) = 33.70, p < .001). To locate the differences among income groups, the Games-Howell test was performed; it was found that under-five mortality is an issue for low, lower-middle, and upper-middle income countries but not for high-income countries. Results of this study agree with previous research on the resource curse and the negative effects of resource-based development. The policy implications for social workers, policy makers, academicians, and social development specialists are to raise and discuss issues of marginalization and exclusion of vulnerable groups in developing resource-rich countries and to suggest interventions for avoiding them.
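A minimal sketch of the one-way ANOVA comparison described above, on synthetic under-five mortality rates by income group. The values are illustrative, not the World Bank data, and the classical F statistic is computed here; the paper's fractional denominator degrees of freedom suggest a Welch-type variant, and the Games-Howell post-hoc test is omitted.

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic under-five mortality rates (per 1,000 live births) by income
# group -- illustrative values, not the study's data.
groups = {
    "low":          rng.normal(70, 10, 15),
    "lower-middle": rng.normal(50, 10, 18),
    "upper-middle": rng.normal(25, 8, 20),
    "high":         rng.normal(6, 2, 14),
}

# Classical one-way ANOVA F statistic: between-group mean square over
# within-group mean square.
all_vals = np.concatenate(list(groups.values()))
grand_mean = all_vals.mean()
k, n = len(groups), len(all_vals)
ss_between = sum(len(g) * (g.mean() - grand_mean) ** 2 for g in groups.values())
ss_within = sum(((g - g.mean()) ** 2).sum() for g in groups.values())
F = (ss_between / (k - 1)) / (ss_within / (n - k))
print(f"F({k - 1}, {n - k}) = {F:.1f}")
```

With group means this far apart relative to the within-group spread, the F statistic is large and the null hypothesis of equal means is rejected, mirroring the study's result.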

Keywords: children, natural resource, extractive industries, resource-based development, vulnerable groups

Procedia PDF Downloads 243
865 Associations between Physical Activity and Risk Factors for Type II Diabetes in Prediabetic Adults

Authors: Rukia Yosuf

Abstract:

Diabetes is a national healthcare crisis associated with both macrovascular and microvascular complications. We hypothesized that higher levels of physical activity are associated with lower total and visceral fat mass, lower systolic blood pressure, and increased insulin sensitivity. Participant inclusion criteria: 21-50 years old, BMI ≥ 30 kg/m2, hemoglobin A1C 5.7-6.4, fasting glucose 100-125 mg/dL, and HOMA-IR ≥ 2.5. Exclusion criteria: history of diabetes, hypertension, HIV, renal disease, hearing loss, alcohol intake over four drinks daily, use of organic nitrates or PDE5 inhibitors, and decreased cardiac function. Total physical activity was measured using accelerometers, body composition using DXA, and insulin resistance via fsIVGTT. Blood pressure and heart rate were obtained using a calibrated sphygmomanometer; anthropometric measures, fasting glucose, insulin, lipid profile, C-reactive protein, and BMP were analyzed using standard procedures. Within our study, we found correlations between levels of physical activity and cardiometabolic risk factors in a heterogeneous group of prediabetic adults. Participants with more physical activity had a higher degree of insulin sensitivity, lower blood pressure, less visceral adipose tissue, and lower total mass overall. Total physical activity levels showed small but significant correlations with systolic blood pressure, visceral fat, lean mass, and insulin sensitivity. After adjusting for race, age, and gender using multiple regression, these associations were no longer significant, likely reflecting our small sample size. More research into prediabetes may help decrease the diabetic population overall. In the future, we could increase the sample size and conduct cross-sectional and longitudinal studies in various populations with prediabetes.
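The covariate-adjustment step described above can be sketched as a multiple regression in which the physical-activity coefficient, after age enters the model, is the adjusted association. The cohort below is synthetic and all effect sizes are assumed for illustration.

```python
import numpy as np

rng = np.random.default_rng(7)

# Synthetic cohort (illustrative; not the study's data): physical activity,
# age, and an insulin-sensitivity index depending on both.
n = 40
age = rng.uniform(21, 50, n)
activity = rng.normal(300, 80, n)           # accelerometer counts/day (arbitrary units)
insulin_sens = 2.0 + 0.004 * activity - 0.02 * age + rng.normal(0, 0.3, n)

# Crude (unadjusted) association: Pearson r between activity and outcome.
r = np.corrcoef(activity, insulin_sens)[0, 1]

# Covariate-adjusted association: regress the outcome on activity plus age;
# the activity coefficient is the age-adjusted effect.
X = np.column_stack([np.ones(n), activity, age])
beta, *_ = np.linalg.lstsq(X, insulin_sens, rcond=None)
print(f"unadjusted r = {r:.2f}, adjusted activity slope = {beta[1]:.4f}")
```

In the study, the adjusted associations lost significance; with a sample of 40-odd participants, even a real effect can fail to reach significance once covariates absorb degrees of freedom.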

Keywords: diabetes, kidney disease, nephrology, prediabetes

Procedia PDF Downloads 179
864 Semi-Supervised Learning Using Pseudo F Measure

Authors: Mahesh Balan U, Rohith Srinivaas Mohanakrishnan, Venkat Subramanian

Abstract:

Positive and unlabeled (PU) learning has recently gained attention in both academic and industry research literature because of its relevance to existing business problems. Yet there remain challenges in validating the performance of PU learning, as the actual labels of unlabeled data points are unknown, in contrast to binary classification, where ground truth is available. In this study, we propose a novel PU learning technique based on the pseudo-F measure, addressing this research gap. In this approach, we train the PU model to discriminate the probability distributions of the positive and unlabeled points in the validation and spy data. The predicted probabilities of the PU model undergo a two-fold validation: (a) the predicted probabilities of reliable positives and predicted positives should come from the same distribution; (b) the predicted probabilities of predicted positives and predicted unlabeled points should come from different distributions. We experimented with this approach on a credit marketing case study at one of the world’s biggest fintech platforms, benchmarked its performance, and backtested it using historical data. This study contributes to the existing literature on semi-supervised learning.
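The two-fold distributional validation described above can be sketched by comparing score distributions with a two-sample Kolmogorov-Smirnov statistic. The simulated model scores below are illustrative assumptions, not the platform's data, and KS is used here as one concrete choice of distribution-distance.

```python
import numpy as np

rng = np.random.default_rng(3)

def ks_stat(a, b):
    """Two-sample Kolmogorov-Smirnov statistic (maximum ECDF gap)."""
    both = np.sort(np.concatenate([a, b]))
    cdf_a = np.searchsorted(np.sort(a), both, side="right") / len(a)
    cdf_b = np.searchsorted(np.sort(b), both, side="right") / len(b)
    return np.max(np.abs(cdf_a - cdf_b))

# Simulated PU-model scores: spies are held-out known positives hidden in the
# unlabeled set, so a good model should score them like predicted positives
# and unlike predicted negatives.
spies = rng.beta(8, 2, 200)          # scores of hidden known positives
pred_pos = rng.beta(8, 2, 300)       # scores of predicted positives
pred_neg = rng.beta(2, 8, 300)       # scores of predicted unlabeled/negatives

same = ks_stat(spies, pred_pos)      # check (a): should be small
diff = ks_stat(spies, pred_neg)      # check (b): should be large
print(f"KS(spies, predicted positives) = {same:.2f}")
print(f"KS(spies, predicted negatives) = {diff:.2f}")
```

A small gap in check (a) and a large gap in check (b) together indicate that the PU model's score distributions behave as the validation scheme requires.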

Keywords: PU learning, semi-supervised learning, pseudo f measure, classification

Procedia PDF Downloads 222
863 Different Types of Amyloidosis Revealed by Positive Cardiac Scintigraphy with Tc-99m DPD-SPECT

Authors: Ioannis Panagiotopoulos, Efstathios Kastritis, Anastasia Katinioti, Georgios Efthymiadis, Argyrios Doumas, Maria Koutelou

Abstract:

Introduction: Transthyretin amyloidosis (ATTR) is a rare but serious infiltrative disease. Myocardial scintigraphy with DPD has emerged as the most effective, non-invasive, highly sensitive, and highly specific diagnostic method for cardiac ATTR amyloidosis. However, there are cases in which additional laboratory investigations reveal AL amyloidosis or other diseases despite a positive DPD scintigraphy. We describe the experience of the Onassis Cardiac Surgery Center and the monitoring center for infiltrative myocardial diseases of the cardiology clinic at AHEPA. Materials and Methods: All patients with clinical suspicion of cardiac or extracardiac amyloidosis undergo a myocardial scintigraphy scan with Tc-99m DPD; over 500 patients have been examined in this way. Further diagnostic workup based on clinical and imaging findings includes laboratory investigation and invasive techniques (e.g., biopsy). Results: Of 76 patients with positive myocardial scintigraphy (Grade 2 or 3 on the Perugini scale), 8 were proven to have AL amyloidosis during the investigation of paraproteinemia. Among these patients, 3 showed Grade 3 uptake, while the rest were graded as Grade 2 or 2 to 3. Additionally, one patient presented diffuse and unusual radiopharmaceutical uptake in soft tissues throughout the body without cardiac involvement. These findings raised suspicion, leading to analysis of κ and λ light chains in the serum, as well as immunostaining of proteins in the serum and urine of these patients; the final diagnosis was AL amyloidosis. Conclusion: The value of DPD scintigraphy in the diagnosis of cardiac transthyretin amyloidosis is undisputed. However, positive myocardial scintigraphy with DPD should not automatically lead to a diagnosis of ATTR amyloidosis. Laboratory differentiation between ATTR and AL amyloidosis is crucial, as both prognosis and therapeutic strategy change dramatically. Laboratory exclusion of paraproteinemia is a necessary and essential step in the diagnostic algorithm of ATTR amyloidosis for all positive myocardial scintigraphy with diphosphonate tracers, since >20% of patients with Grade 2 or 3 uptake may conceal AL amyloidosis.

Keywords: AL amyloidosis, amyloidosis, ATTR, myocardial scintigraphy, Tc-99m DPD

Procedia PDF Downloads 57
862 Optimization of Process Parameters using Response Surface Methodology for the Removal of Zinc(II) by Solvent Extraction

Authors: B. Guezzen, M.A. Didi, B. Medjahed

Abstract:

A factorial design of experiments and response surface methodology were implemented to investigate the liquid-liquid extraction of zinc(II) from acetate medium using 1-butyl-imidazolium di(2-ethylhexyl) phosphate [BIm+][D2EHP-]. The optimization of extraction parameters, namely initial pH (2.5, 4.5, and 6.6), ionic liquid concentration (1, 5.5, and 10 mM), and salt concentration (0.01, 5, and 10 mM), was carried out using a three-level full factorial design (3³). The results of the factorial design demonstrate that all these factors are statistically significant, including the square effects of pH and ionic liquid concentration, with the order of significance: IL concentration > salt effect > initial pH. Analysis of variance (ANOVA) showed a high coefficient of determination (R² = 0.91) and low probability values (P < 0.05), signifying the validity of the predicted second-order quadratic model for Zn(II) extraction. The optimum conditions for the extraction of zinc(II) at constant temperature (20 °C), initial Zn(II) concentration (1 mM), and an A/O ratio of unity were: initial pH 4.8, extractant concentration 9.9 mM, and NaCl concentration 8.2 mM. Under the optimized conditions, the metal ion could be quantitatively extracted.
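A minimal sketch of the 3³ full factorial design and second-order model fit described above, using coded levels and a synthetic (assumed) response surface rather than the paper's measured extraction data.

```python
import itertools
import numpy as np

rng = np.random.default_rng(5)

# Coded levels (-1, 0, +1) for pH, ionic-liquid concentration, and salt:
# a 3^3 full factorial design (27 runs).
design = np.array(list(itertools.product([-1, 0, 1], repeat=3)), float)
pH, il, salt = design.T

# Synthetic extraction efficiency (%) from an assumed quadratic response
# surface plus noise -- illustrative, not the paper's data.
y = (80 + 5 * il + 3 * salt + 2 * pH - 4 * pH**2 - 3 * il**2
     + 1.5 * il * salt + rng.normal(0, 1.5, len(design)))

# Second-order (quadratic) model with two-factor interactions, fitted by
# ordinary least squares, as in response surface methodology.
X = np.column_stack([np.ones(27), pH, il, salt, pH**2, il**2, salt**2,
                     pH * il, pH * salt, il * salt])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
resid = y - X @ beta
r2 = 1 - resid @ resid / ((y - y.mean()) @ (y - y.mean()))
print(f"R^2 = {r2:.2f}")
```

In practice, the fitted quadratic surface is then optimized (e.g., by stationary-point analysis) to locate optimum settings such as the pH 4.8 and 9.9 mM extractant reported in the abstract.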

Keywords: ionic liquid, response surface methodology, solvent extraction, zinc acetate

Procedia PDF Downloads 362
861 Factors Affecting Students' Performance in the Examination

Authors: Amylyn F. Labasano

Abstract:

A significant number of empirical studies have investigated factors affecting college students’ performance in academic examinations. Against this wide array of literature- and study-supported findings, this study is limited to students’ probability of passing periodical exams as associated with gender, class absences, use of a reference book, and hours of study. Binary logistic regression was the technique used in the analysis. The research is based on students’ records and data collected through a survey. The results reveal that gender, use of a reference book, and hours of study are significant predictors of passing an examination, while absenteeism is an insignificant predictor. Females are 45% more likely to pass the exam than their male classmates. Students who use and read their reference book are 38 times more likely to pass the exam than those who do not. Those who spend more than 3 hours studying are four times more likely to pass the exam than those who spend 3 hours or less.
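A minimal sketch of the binary logistic regression analysis described above, on synthetic student records; the coefficients generating the data are assumed for illustration. Exponentiated coefficients give the odds ratios the abstract reports.

```python
import numpy as np

rng = np.random.default_rng(2)

# Synthetic student records (illustrative): gender (1 = female), reference-
# book use (1 = yes), hours of study; pass/fail drawn from a logistic model.
n = 500
gender = rng.integers(0, 2, n)
ref_book = rng.integers(0, 2, n)
hours = rng.uniform(0, 6, n)
logit = -2.0 + 0.4 * gender + 1.5 * ref_book + 0.5 * hours
passed = (rng.random(n) < 1 / (1 + np.exp(-logit))).astype(float)

# Binary logistic regression fitted by gradient ascent on the log-likelihood.
X = np.column_stack([np.ones(n), gender, ref_book, hours])
beta = np.zeros(4)
for _ in range(20000):
    p = 1 / (1 + np.exp(-X @ beta))
    beta += 0.05 * X.T @ (passed - p) / n

# exp(coefficient) is the odds ratio for each predictor.
odds_ratios = np.exp(beta[1:])
print("odds ratios (gender, ref_book, hours):", np.round(odds_ratios, 2))
```

Note that large reported multipliers (such as "38 times") are odds ratios, which overstate relative probabilities when the outcome is common.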

Keywords: absences, binary logistic regression, gender, hours of study, prediction-causation method, periodical exams, random sampling, reference book

Procedia PDF Downloads 294
860 A Mixture Vine Copula Structures Model for Dependence Wind Speed among Wind Farms and Its Application in Reactive Power Optimization

Authors: Yibin Qiu, Yubo Ouyang, Shihan Li, Guorui Zhang, Qi Li, Weirong Chen

Abstract:

This paper explores the impacts of high-dimensional dependencies of wind speed among wind farms on probabilistic optimal power flow. To obtain the reactive power optimization faster and more accurately, a mixture vine copula structure model combining K-means clustering, C-vine copulas, and D-vine copulas is proposed, through which a more accurate correlation model can be obtained. Moreover, a Modified Backtracking Search Algorithm (MBSA) and the three-point estimate method are applied to probabilistic optimal power flow. The validity of the mixture vine copula structure model and the MBSA are tested on the IEEE 30-node system with measured data from 3 adjacent wind farms in a certain area, and the results indicate the effectiveness of these methods.
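While a full mixture vine copula is beyond a short sketch, the probability integral transform at its core can be illustrated with a plain Gaussian copula linking Weibull-distributed wind speeds at three farms. The correlation matrix and Weibull parameters below are assumptions, not the paper's measured data.

```python
import math
import numpy as np

rng = np.random.default_rng(4)

# Assumed correlation between three neighbouring wind farms and a common
# Weibull marginal (shape k, scale c in m/s) -- illustrative values.
corr = np.array([[1.0, 0.8, 0.6],
                 [0.8, 1.0, 0.7],
                 [0.6, 0.7, 1.0]])
shape_k, scale_c = 2.0, 8.0

# Gaussian copula: correlated standard normals -> uniforms via the normal
# CDF (probability integral transform) -> wind speeds via the Weibull
# inverse CDF  x = c * (-ln(1 - u))^(1/k).
z = rng.multivariate_normal(np.zeros(3), corr, size=10000)
u = 0.5 * (1.0 + np.vectorize(math.erf)(z / math.sqrt(2)))
speeds = scale_c * (-np.log(1.0 - u)) ** (1.0 / shape_k)
print("mean speeds (m/s):", np.round(speeds.mean(axis=0), 2))
print("sample correlation:\n", np.round(np.corrcoef(speeds.T), 2))
```

The sampled speeds keep Weibull margins while reproducing (approximately) the target inter-farm correlation; vine copulas generalize this by chaining bivariate copulas so each pair can have its own dependence shape.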

Keywords: mixture vine copula structure model, three-point estimate method, the probability integral transform, modified backtracking search algorithm, reactive power optimization

Procedia PDF Downloads 241
859 Indigenous Understandings of Climate Vulnerability in Chile: A Qualitative Approach

Authors: Rosario Carmona

Abstract:

This article discusses the importance of indigenous people's participation in climate change mitigation and adaptation. Specifically, it analyses different understandings of climate vulnerability among diverse actors involved in climate change policies in Chile: indigenous people, state officials, and academics. The data were collected through participant observation and interviews conducted between October 2017 and January 2019 in Chile. Following Karen O’Brien, there are two types of vulnerability: outcome vulnerability and contextual vulnerability. How vulnerability to climate change is understood determines the approach taken, which actors are involved, and which knowledge is considered in addressing it. Because climate change is a very complex phenomenon, it is necessary to transform institutions and their responses. To do so, it is fundamental to consider both perspectives and different types of knowledge, particularly those of the most vulnerable, such as indigenous people. Over centuries, thanks to long coexistence with the environment, indigenous societies have elaborated coping strategies, and some are already adapting to climate change; indigenous people in Chile are no exception. Yet indigenous people tend to be excluded from decision-making processes, and indigenous knowledge is frequently seen as subjective and arbitrary relative to science. Nevertheless, in recent years indigenous knowledge has gained particular relevance in the academic world, and indigenous actors are gaining prominence in international negotiations. Some mechanisms promote their participation (e.g., the Cancun safeguards, World Bank operational policies, REDD+), though not without difficulties, and since 2016 parties have been working on a Local Communities and Indigenous Peoples Platform. This paper also explores the incidence of this process in Chile. Although there is progress in the participation of indigenous people, this participation responds to the operational policies of the funding agencies rather than to a real commitment by the state to this sector. The State of Chile omits a review of the structure that promotes inequality and the exclusion of indigenous people. In this way, climate change policies could become a new mechanism of coloniality that validates a single type of knowledge and leads to new territorial control strategies, increasing vulnerability.

Keywords: indigenous knowledge, climate change, vulnerability, Chile

Procedia PDF Downloads 111
858 Life Cycle Cost Evaluation of Structures Retrofitted with Damped Cable System

Authors: Asad Naeem, Mohamed Nour Eldin, Jinkoo Kim

Abstract:

In this study, the seismic performance and life cycle cost (LCC) of a structure retrofitted with the damped cable system (DCS) are evaluated. The DCS is a seismic retrofit system composed of a high-strength steel cable and pressurized viscous dampers. The analysis model of the system is first derived using various link elements in SAP2000, and fragility curves of the structure retrofitted with the DCS and with viscous dampers are obtained using incremental dynamic analyses. The analysis results show that the residual displacements of the structure equipped with the DCS are smaller than those of the structure retrofitted with only conventional viscous dampers, due to the enhanced stiffness/strength and self-centering capability of the damped cable system. The fragility analysis shows that the structure retrofitted with the DCS has the lowest probability of reaching the specified limit states compared with the bare structure and the structure with viscous dampers. It is also observed that the initial cost of the DCS retrofit is smaller than that of the viscous-damper retrofit, and that the LCC of the structure equipped with the DCS is smaller than that of the structure with viscous dampers.
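Fragility curves such as those mentioned above are conventionally lognormal in the intensity measure. The sketch below fits a median and dispersion to hypothetical IDA exceedance fractions (assumed values, not the paper's results) by linear regression on the probit scale.

```python
import math
import numpy as np

# Hypothetical IDA summary (illustrative): spectral-acceleration levels and
# the fraction of ground-motion records driving the structure past a limit state.
im = np.array([0.2, 0.4, 0.6, 0.8, 1.0, 1.2])          # Sa (g)
frac = np.array([0.05, 0.20, 0.45, 0.70, 0.85, 0.95])  # exceedance fraction

# Lognormal fragility: P(exceed | IM) = Phi((ln IM - ln theta) / beta).
# Fit median theta and dispersion beta via regression on the probit scale:
# Phi^{-1}(frac) = (ln IM - ln theta) / beta.
def probit(p):
    """Inverse standard-normal CDF by bisection on erf (stdlib-only)."""
    lo, hi = -8.0, 8.0
    for _ in range(80):
        mid = 0.5 * (lo + hi)
        if 0.5 * (1 + math.erf(mid / math.sqrt(2))) < p:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

y = np.array([probit(p) for p in frac])
slope, intercept = np.polyfit(np.log(im), y, 1)
beta = 1.0 / slope                    # dispersion
theta = math.exp(-intercept / slope)  # median capacity (g)
print(f"median theta = {theta:.2f} g, dispersion beta = {beta:.2f}")
```

A retrofit such as the DCS shifts the fitted median upward (more capacity) and often tightens the dispersion, which is what "least probability of reaching the limit states" expresses on the curve.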

Keywords: damped cable system, fragility curve, life cycle cost, seismic retrofit, self-centering

Procedia PDF Downloads 538
857 The Efficacy of Methylphenidate vs Atomoxetine in Treating Attention Deficit/Hyperactivity Disorder in Child and Adolescent

Authors: Gadia Duhita, Noorhana, Tjhin Wiguna

Abstract:

Background: ADHD is the most common behavioural disorder in Indonesia. A stimulant, specifically methylphenidate, has been the first drug of choice for ADHD treatment for more than half a century. During the last decade, non-stimulant therapy (atomoxetine) for ADHD treatment has been developing. Growing evidence of its efficacy and the difference in its side-effect profile from stimulant therapy have made methylphenidate's position as a first-line therapy for ADHD in need of re-evaluation. Both methylphenidate and atomoxetine have proven themselves against placebos in reducing core symptoms of ADHD. More recent studies directly compare the efficacy of methylphenidate and atomoxetine. Objective: The objective of this paper is to find out whether either methylphenidate or atomoxetine is superior to the other. This paper will assess the validity, importance, and applicability of currently available evidence comparing the effectiveness, efficacy, and safety of methylphenidate and atomoxetine for treatment of children and adolescents with ADHD. Method: The articles were searched for through the PubMed and Cochrane databases with "attention deficit/hyperactivity disorder OR adhd", "methylphenidate", and "atomoxetine" as the search keywords. Two relevant and eligible articles were chosen using inclusion and exclusion criteria and were critically appraised. Result: The study by Hazel et al. showed that the efficacy of methylphenidate and atomoxetine is comparable for the treatment of child and adolescent ADHD. The result shows 53.6% (95% CI 48.5%-58.4%) of patients responded to treatment with atomoxetine and 54.4% (95% CI 47.6%-61.1%) responded to methylphenidate, with a difference in proportions of -0.9% (95% CI -9.2%-7.5%). The other study, by Hanwella et al., also showed that the efficacy of atomoxetine was not inferior to methylphenidate (SMD = 0.09, 95% CI -0.08-0.26) (Z = 1.06, p = 0.29). However, the sub-group analysis showed that OROS methylphenidate is more effective than atomoxetine (SMD = 0.32, 95% CI 0.12-0.53) (Z = 3.05, p < 0.02). Conclusion: The efficacy of methylphenidate and atomoxetine in reducing symptoms of ADHD is comparable; neither is proven inferior to the other. The choice of pharmacological treatment for children and adolescents with ADHD should be made based on the contraindications and side-effect profile of each drug.
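
The two comparative statistics quoted above, a difference between response proportions with its confidence interval and a standardized mean difference, can be sketched in a few lines of Python. The sample sizes in the demo call are hypothetical, since the abstract does not report them:

```python
import math

def proportion_diff_ci(p1, n1, p2, n2, z=1.96):
    """Difference between two independent response proportions with a 95% CI."""
    diff = p1 - p2
    se = math.sqrt(p1 * (1 - p1) / n1 + p2 * (1 - p2) / n2)
    return diff, diff - z * se, diff + z * se

def cohens_d(mean1, sd1, n1, mean2, sd2, n2):
    """Standardized mean difference (Cohen's d) using a pooled standard deviation."""
    pooled = math.sqrt(((n1 - 1) * sd1 ** 2 + (n2 - 1) * sd2 ** 2) / (n1 + n2 - 2))
    return (mean1 - mean2) / pooled

# hypothetical group sizes; the proportions are the ones quoted in the abstract
diff, ci_lo, ci_hi = proportion_diff_ci(0.536, 300, 0.544, 200)
```

A CI for the difference that straddles zero, as in both cited trials, is what "neither is proven inferior" amounts to numerically.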

Keywords: attention deficit/hyperactivity disorder, ADHD, atomoxetine, methylphenidate

Procedia PDF Downloads 464
856 Evaluation and Fault Classification for Healthcare Robot during Sit-To-Stand Performance through Center of Pressure

Authors: Tianyi Wang, Hieyong Jeong, An Guo, Yuko Ohno

Abstract:

Healthcare robots for assisting sit-to-stand (STS) performance have attracted considerable research interest. To the authors' best knowledge, however, how to evaluate such a healthcare robot remains unknown. A robot should be labeled as faulty if users find the robot-assisted STS demanding. In this research, we aim to propose a method to evaluate a sit-to-stand assist robot through the center of pressure (CoP) and then to classify different STS performances. Experiments were executed five times with ten healthy subjects under four conditions: two self-performed STSs with chair heights of 62 cm and 43 cm, and two robot-assisted STSs with a chair height of 43 cm and robot end-effector speeds of 2 s and 5 s. CoP was measured using a Wii Balance Board (WBB). Bayesian classification was utilized to classify STS performance. The results showed that faults occurred when the chair height was decreased and the robot assist speed was slowed. The proposed method for fault classification showed a high probability of distinguishing fault classes from others. It was concluded that faults for an STS assist robot can be detected by inspecting the center of pressure and classified through the proposed classification algorithm.
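
As a rough illustration of the Bayesian classification step, a one-dimensional Gaussian classifier over a CoP-derived feature might look as follows. The class statistics, priors, and feature value are invented for the sketch and are not the paper's data:

```python
import math

def gaussian_pdf(x, mu, sigma):
    """Density of a univariate normal distribution."""
    return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))

def classify(feature, classes, priors):
    """Pick the class maximizing prior * likelihood (Bayes rule; evidence cancels)."""
    scores = {c: priors[c] * gaussian_pdf(feature, mu, sd)
              for c, (mu, sd) in classes.items()}
    total = sum(scores.values())
    posteriors = {c: s / total for c, s in scores.items()}
    return max(posteriors, key=posteriors.get), posteriors

# hypothetical class-conditional (mean, sd) for a scalar CoP feature, e.g. peak sway
classes = {"normal": (0.30, 0.05), "fault": (0.50, 0.08)}
priors = {"normal": 0.5, "fault": 0.5}
label, post = classify(0.48, classes, priors)
```

A real implementation would fit these class statistics from the WBB recordings rather than hard-coding them.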

Keywords: center of pressure, fault classification, healthcare robot, sit-to-stand movement

Procedia PDF Downloads 188
855 Simulation as a Problem-Solving Spotter for System Reliability

Authors: Wheyming Tina Song, Chi-Hao Hong, Peisyuan Lin

Abstract:

An important performance measure for stochastic manufacturing networks is the system reliability, defined as the probability that the production output meets or exceeds a specified demand. The system parameters include the capacity of each workstation and the number of conforming parts produced in each workstation. We establish that eighteen archival publications, containing twenty-one examples, provide incorrect values of the system reliability. The author recently published the Song Rule, which provides the correct analytical system-reliability value; it is, however, computationally inefficient for large networks. In this paper, we use Monte Carlo simulation (implemented in C and Flexsim) to provide estimates for the above-mentioned twenty-one examples. The simulation estimates are consistent with the analytical solutions for small networks and remain computationally efficient for large networks. We argue here for three advantages of Monte Carlo simulation: (1) understanding stochastic systems, (2) validating analytical results, and (3) providing estimates even when analytical and numerical approaches are overly expensive in computation. Monte Carlo simulation could have detected the published analysis errors.
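
The reliability estimate can be illustrated with a small Monte Carlo sketch. The serial-line topology, capacity ranges, and yield probabilities below are hypothetical and stand in for the paper's C/Flexsim models:

```python
import random

def simulate_output(stations, yields, rng):
    """Output of a serial line: each station processes what the previous one passed on."""
    flow = None
    for (lo, hi), y in zip(stations, yields):
        cap = rng.randint(lo, hi)                 # random station capacity this run
        feed = cap if flow is None else min(cap, flow)
        flow = sum(rng.random() < y for _ in range(feed))  # conforming parts produced
    return flow

def reliability(demand, n_reps=20000, seed=1):
    """Monte Carlo estimate of P(production output >= demand)."""
    rng = random.Random(seed)
    stations = [(8, 12), (9, 13), (7, 11)]  # hypothetical capacity ranges per station
    yields = [0.95, 0.90, 0.97]             # per-part conforming probabilities
    hits = sum(simulate_output(stations, yields, rng) >= demand
               for _ in range(n_reps))
    return hits / n_reps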

Keywords: Monte Carlo simulation, analytical results, leading digit rule, standard error

Procedia PDF Downloads 349
854 Assessing Solid Waste Management Practices and Health Impacts in Port Harcourt City, Nigeria

Authors: Perpetual Onyejelem, Kenichi Matsui

Abstract:

Solid waste management has recently posed urgent challenges to environmental sustainability and public health in emerging Sub-Saharan urban centers. This paper examines solid waste management in Port Harcourt, the rapidly growing city in Nigeria, with a focus on current solid waste management practices and its health implications. To do so we analyzed past academic papers and official documents. The Preferred Reporting Items for Systematic Review and Meta-Analysis (PRISMA) statement and its four-stage inclusion/exclusion criteria were utilized as part of a systematic literature review technique to identify papers related to solid waste management practices (Scopus and Google Scholar). In terms of policy documents, we focused on information about the implementation between 2014 and 2023. We found that the Rivers State Waste Management Policy and the National Policy on Solid Waste Management were the two most important documents to understand Port Harcourt’s practices. Past studies, however, highlighted that residents continued to dump waste in drainages as they were largely unaware of the policies that encourage them to sort waste. The studies tend to blame the city of its lack of political commitment to monitoring waste sites. Another study highlighted inefficient waste collection practices, the absence of community participation and poor resident awareness of 3R practices. Government documents and past studies tend to agree that an increase in disorderly waste management practices and the emergence of vector-borne diseases (e.g., malaria, lassa fever, cholera) co-incided in Port Harcourt. This led to increased spending for healthcare for locals, particularly low-income households. This study concludes by making some remedial recommendations.

Keywords: health effects, solid waste management practices, environmental pollution, Port Harcourt

Procedia PDF Downloads 12
853 Performance Evaluation of MIMO-OFDM Communication Systems

Authors: M. I. Youssef, A. E. Emam, M. Abd Elghany

Abstract:

This paper evaluates the bit error rate (BER) performance of MIMO-OFDM communication system. MIMO system uses multiple transmitting and receiving antennas with different coding techniques to either enhance the transmission diversity or spatial multiplexing gain. Utilizing alamouti algorithm were the same information transmitted over multiple antennas at different time intervals and then collected again at the receivers to minimize the probability of error, combat fading and thus improve the received signal to noise ratio. While utilizing V-BLAST algorithm, the transmitted signals are divided into different transmitting channels and transferred over the channel to be received by different receiving antennas to increase the transmitted data rate and achieve higher throughput. The paper provides a study of different diversity gain coding schemes and spatial multiplexing coding for MIMO systems. A comparison of various channels' estimation and equalization techniques are given. The simulation is implemented using MATLAB, and the results had shown the performance of transmission models under different channel environments.

Keywords: MIMO communication, BER, space codes, channels, alamouti, V-BLAST

Procedia PDF Downloads 167
852 The Analysis of Application of Green Bonds in New Energy Vehicles in China: From Evolutionary Game Theory

Authors: Jing Zhang

Abstract:

Sustainable development in the new energy vehicles field is the requirement of the net zero aim. Green bonds are accepted as a practical financial tool to boost the transformation of relevant enterprises. The paper analyzes the interactions among governments, enterprises of new energy vehicles, and financial institutions by an evolutionary game theory model and offers advice to stakeholders in China. The decision-making subjects of green behavior are affected by experiences, interests, perception ability, and risk preference, so it is difficult for them to be completely rational. Based on the bounded rationality hypothesis, this paper applies prospect theory in the evolutionary game analysis framework and analyses the costs of government regulation of enterprises adopting green bonds. The influence of the perceived value of revenue prospect and the probability and risk transfer coefficient of the government's active regulation on the decision-making agent's strategy is verified by numerical simulation. Finally, according to the research conclusions, policy suggestions are given to promote green bonds.

Keywords: green bonds, new energy vehicles, sustainable development, evolutionary Game Theory model

Procedia PDF Downloads 72
851 Weighted Risk Scores Method Proposal for Occupational Safety Risk Assessment

Authors: Ulas Cinar, Omer Faruk Ugurlu, Selcuk Cebi

Abstract:

Occupational safety risk management is the most important element of a safe working environment. Effective risk management can only be possible with accurate analysis and evaluations. Scoring-based risk assessment methods offer considerable ease of application as they convert linguistic expressions into numerical results. It can also be easily adapted to any field. Contrary to all these advantages, important problems in scoring-based methods are frequently discussed. Effective measurability is one of the most critical problems. Existing methods allow experts to choose a score equivalent to each parameter. Therefore, experts prefer the score of the most likely outcome for risk. However, all other possible consequences are neglected. Assessments of the existing methods express the most probable level of risk, not the real risk of the enterprises. In this study, it is aimed to develop a method that will present a more comprehensive evaluation compared to the existing methods by evaluating the probability and severity scores, all sub-parameters, and potential results, and a new scoring-based method is proposed in the literature.

Keywords: occupational health and safety, risk assessment, scoring based risk assessment method, underground mining, weighted risk scores

Procedia PDF Downloads 127
850 Proliferative Effect of Some Calcium Channel Blockers on the Human Embryonic Kidney Cell Line

Authors: Lukman Ahmad Jamil, Heather M. Wallace

Abstract:

Introduction: Numerous epidemiological studies have shown a positive as well as negative association and no association in some cases between chronic use of calcium channel blockers and the increased risk of developing cancer. However, these associations were enmeshed with controversies in the absence of laboratory based studies to back up those claims. Aim: The aim of this study was to determine in mechanistic terms the association between the long-term administration of nifedipine and diltiazem and increased risk of developing cancer using the human embryonic kidney (HEK293) cell line. Methods: Cell counting using the Trypan blue dye exclusion and 3-4, 5-Dimethylthiazol-2-yl-2, 5-diphenyl-tetrazolium bromide (MTT) assays were used to investigate the effect of nifedipine and diltiazem on the growth pattern of HEK293 cells. Protein assay using modified Lowry method and analysis of intracellular polyamines concentration using Liquid Chromatography – Tandem Mass Spectrometry (LC-MS) were performed to ascertain the mechanism through which chronic use of nifedipine increases the risk of developing cancer. Results: Both nifedipine and diltiazem significantly increased the proliferation of HEK293 cells dose and time dependently. This proliferative effect after 24, 48 and 72-hour incubation period was observed at 0.78, 1.56 and 25 µM for nifedipine and 0.39, 1.56 and 25 µM for diltiazem, respectively. The increased proliferation of the cells was found to be statistically significantly (p<0.05). Furthermore, the increased proliferation of the cells induced by nifedipine was associated with the increase in the protein content and elevated intracellular polyamines concentration level. 
Conclusion: The chronic use of nifedipine is associated with increased proliferation of cells with concomitant elevation of polyamines concentration and elevated polyamine levels have been implicated in many malignant transformations and hence, these provide a possible explanation on the link between long term use of nifedipine and development of some human cancers. Further studies are needed to evaluate the cause of this association.

Keywords: cancer, nifedipine, polyamine, proliferation

Procedia PDF Downloads 185
849 Local Spectrum Feature Extraction for Face Recognition

Authors: Muhammad Imran Ahmad, Ruzelita Ngadiran, Mohd Nazrin Md Isa, Nor Ashidi Mat Isa, Mohd ZaizuIlyas, Raja Abdullah Raja Ahmad, Said Amirul Anwar Ab Hamid, Muzammil Jusoh

Abstract:

This paper presents two technique, local feature extraction using image spectrum and low frequency spectrum modelling using GMM to capture the underlying statistical information to improve the performance of face recognition system. Local spectrum features are extracted using overlap sub block window that are mapping on the face image. For each of this block, spatial domain is transformed to frequency domain using DFT. A low frequency coefficient is preserved by discarding high frequency coefficients by applying rectangular mask on the spectrum of the facial image. Low frequency information is non Gaussian in the feature space and by using combination of several Gaussian function that has different statistical properties, the best feature representation can be model using probability density function. The recognition process is performed using maximum likelihood value computed using pre-calculate GMM components. The method is tested using FERET data sets and is able to achieved 92% recognition rates.

Keywords: local features modelling, face recognition system, Gaussian mixture models, Feret

Procedia PDF Downloads 650
848 Functional Vision of Older People in Galician Nursing Homes

Authors: C. Vázquez, L. M. Gigirey, C. P. del Oro, S. Seoane

Abstract:

Early detection of visual problems plays a key role in the aging process. However, although vision problems are common among older people, the percentage of aging people who perform regular optometric exams is low. In fact, uncorrected refractive errors are one of the main causes of visual impairment in this group of the population. Purpose: To evaluate functional vision of older residents in order to show the urgent need of visual screening programs in Galician nursing homes. Methodology: We examined 364 older adults aged 65 years and over. To measure vision of the daily living, we tested distance and near presenting visual acuity (binocular visual acuity with habitual correction if warn, directional E-Snellen) Presenting near vision was tested at the usual working distance. We defined visual impairment (distance and near) as a presenting visual acuity less than 0.3. Exclusion criteria included immobilized residents unable to reach the USC Dual Sensory Loss Unit for visual screening. Association between categorical variables was performed using chi-square tests. We used Pearson and Spearman correlation tests and the variance analysis to determine differences between groups of interest. Results: 23,1% of participants have visual impairment for distance vision and 16,4% for near vision. The percentage of residents with far and near visual impairment reaches 8,2%. As expected, prevalence of visual impairment increases with age. No differences exist with regard to the level of functional vision between gender. Differences exist between age group respect to distance vision, but not in case of near vision. Conclusion: prevalence of visual impairment is high among the older people tested in this pilot study. This means a high percentage of older people with limitations in their daily life activities. It is necessary to develop an effective vision screening program for early detection of vision problems in Galician nursing homes.

Keywords: functional vision, elders, aging, nursing homes

Procedia PDF Downloads 398
847 Constant Factor Approximation Algorithm for p-Median Network Design Problem with Multiple Cable Types

Authors: Chaghoub Soraya, Zhang Xiaoyan

Abstract:

This research presents the first constant approximation algorithm to the p-median network design problem with multiple cable types. This problem was addressed with a single cable type and there is a bifactor approximation algorithm for the problem. To the best of our knowledge, the algorithm proposed in this paper is the first constant approximation algorithm for the p-median network design with multiple cable types. The addressed problem is a combination of two well studied problems which are p-median problem and network design problem. The introduced algorithm is a random sampling approximation algorithm of constant factor which is conceived by using some random sampling techniques form the literature. It is based on a redistribution Lemma from the literature and a steiner tree problem as a subproblem. This algorithm is simple, and it relies on the notions of random sampling and probability. The proposed approach gives an approximation solution with one constant ratio without violating any of the constraints, in contrast to the one proposed in the literature. This paper provides a (21 + 2)-approximation algorithm for the p-median network design problem with multiple cable types using random sampling techniques.

Keywords: approximation algorithms, buy-at-bulk, combinatorial optimization, network design, p-median

Procedia PDF Downloads 186
846 Probabilistic Modeling of Post-Liquefaction Ground Deformation

Authors: Javad Sadoghi Yazdi, Robb Eric S. Moss

Abstract:

This paper utilizes a probabilistic liquefaction triggering method for modeling post-liquefaction ground deformation. This cone penetration test CPT-based liquefaction triggering is employed to estimate the factor of safety against liquefaction (FSL) and compute the maximum cyclic shear strain (γmax). The study identifies a maximum PL value of 90% across various relative densities, which challenges the decrease from 90% to 70% as relative density decreases. It reveals that PL ranges from 5% to 50% for volumetric strain (εvol) less than 1%, while for εvol values between 1% and 3.2%, PL spans from 50% to 90%. The application of the CPT-based simplified liquefaction triggering procedures has been employed in previous researches to estimate liquefaction ground-failure indices, such as the Liquefaction Potential Index (LPI) and Liquefaction Severity Number (LSN). However, several studies have been conducted to highlight the variability in liquefaction probability calculations, suggesting a more accurate depiction of liquefaction likelihood. Consequently, the utilization of these simplified methods may not offer practical efficiency. This paper further investigates the efficacy of various established liquefaction vulnerability parameters, including LPI and LSN, in explaining the observed liquefaction-induced damage within residential zones of Christchurch, New Zealand using results from CPT database.

Keywords: cone penetration test (CPT), liquefaction, postliquefaction, ground failure

Procedia PDF Downloads 52
845 Strengths and Weaknesses of Tally, an LCA Tool for Comparative Analysis

Authors: Jacob Seddlemeyer, Tahar Messadi, Hongmei Gu, Mahboobeh Hemmati

Abstract:

The main purpose of this first tier of the study is to quantify and compare the embodied environmental impacts associated with alternative materials applied to Adohi Hall, a residence building at the University of Arkansas campus, Fayetteville, AR. This 200,000square foot building has5 stories builtwith mass timber and is compared to another scenario where the same edifice is built with a steel frame. Based on the defined goal and scope of the project, the materials respectivetothe respective to the two building options are compared in terms of Global Warming Potential (GWP), starting from cradle to the construction site, which includes the material manufacturing stage (raw material extract, process, supply, transport, and manufacture) plus transportation to the site (module A1-A4, based on standard EN 15804 definition). The consumedfossil fuels and emitted CO2 associated with the buildings are the major reason for the environmental impacts of climate change. In this study, GWP is primarily assessed to the exclusion of other environmental factors. The second tier of this work is to evaluate Tally’s performance in the decision-making process through the design phases, as well as determine its strengths and weaknesses. Tally is a Life Cycle Assessment (LCA) tool capable of conducting a cradle-to-grave analysis. As opposed to other software applications, Tally is specifically targeted at buildings LCA. As a peripheral application, this software tool is directly run within the core modeling application platform called Revit. This unique functionality causes Tally to stand out from other similar tools in the building sector LCA analysis. The results of this study also provide insights for making more environmentally efficient decisions in the building environment and help in the move forward to reduce Green House Gases (GHGs) emissions and GWP mitigation.

Keywords: comparison, GWP, LCA, materials, tally

Procedia PDF Downloads 214