Search results for: advanced diagnostic obesity notation model assessment cardiac index
24775 Rapid and Cheap Test for Detection of Streptococcus pyogenes and Streptococcus pneumoniae with Antibiotic Resistance Identification
Authors: Marta Skwarecka, Patrycja Bloch, Rafal Walkusz, Oliwia Urbanowicz, Grzegorz Zielinski, Sabina Zoledowska, Dawid Nidzworski
Abstract:
Upper respiratory tract infections are among the most common reasons for visiting a general practitioner, and streptococci are their most frequent bacterial etiological agents. There are many types of streptococci, and infections vary in severity from mild throat infections to pneumonia. For example, S. pyogenes mainly causes acute pharyngitis, tonsillitis, and scarlet fever, whereas S. pneumoniae is responsible for several invasive diseases such as sepsis, meningitis, and pneumonia, with high mortality and dangerous complications. Only a few diagnostic tests are designed to detect streptococci from the infected throat of patients; they are mostly based on lateral flow techniques and are not used as a standard because of their low sensitivity. The diagnostic standard is to culture a patient's throat swab on semi-selective media in order to isolate the pure etiological agent of infection and subsequently perform an antibiogram, which takes several days from the patient's visit to the clinic. Therefore, the aim of our studies is to develop and bring to market a point-of-care device for the rapid identification of Streptococcus pyogenes and Streptococcus pneumoniae with simultaneous identification of antibiotic resistance genes. In the course of our research, we successfully selected genes for species-level identification of streptococci and genes encoding antibiotic resistance proteins. We developed an amplification reaction for these genes that detects the presence of S. pyogenes or S. pneumoniae and then tests their resistance to erythromycin, chloramphenicol and tetracycline. In addition, detection of β-lactamase-encoding genes, which could protect streptococci against antibiotics of the ampicillin group widely used in the treatment of this type of infection, has also been developed. The test is carried out directly from the patient's swab, and results are available 20 to 30 minutes after sample submission, so it can be performed during the medical visit.
Keywords: antibiotic resistance, Streptococci, respiratory infections, diagnostic test
24774 Clinical Efficacy of Nivolumab and Ipilimumab Combination Therapy for the Treatment of Advanced Melanoma: A Systematic Review and Meta-Analysis of Clinical Trials
Authors: Zhipeng Yan, Janice Wing-Tung Kwong, Ching-Lung Lai
Abstract:
Background: Advanced melanoma accounts for the majority of skin cancer deaths due to its poor prognosis. Nivolumab and ipilimumab are monoclonal antibodies targeting programmed cell death protein 1 (PD-1) and cytotoxic T-lymphocyte antigen 4 (CTLA-4), respectively. Nivolumab and ipilimumab combination therapy has been proven to be effective for advanced melanoma. This systematic review and meta-analysis evaluates its clinical efficacy and adverse events. Method: A systematic search was done on databases (Pubmed, Embase, Medline, Cochrane) on 21 June 2020. Search keywords were nivolumab, ipilimumab, melanoma, and randomised controlled trials. Clinical trials fulfilling the inclusion criteria were selected to evaluate the efficacy of combination therapy in terms of prolongation of progression-free survival (PFS), overall survival (OS), and objective response rate (ORR). The odds ratios and distributions of grade 3 or above adverse events were documented. Subgroup analysis was performed based on PD-L1 expression status and BRAF mutation status. Results: Compared with nivolumab monotherapy, the hazard ratios of PFS and OS and the odds ratio of ORR in combination therapy were 0.64 (95% CI, 0.48-0.85; p=0.002), 0.84 (95% CI, 0.74-0.95; p=0.007) and 1.76 (95% CI, 1.51-2.06; p < 0.001), respectively. Compared with ipilimumab monotherapy, the hazard ratios of PFS and OS and the odds ratio of ORR were 0.46 (95% CI, 0.37-0.57; p < 0.001), 0.54 (95% CI, 0.48-0.61; p < 0.001) and 6.18 (95% CI, 5.19-7.36; p < 0.001), respectively. In combination therapy, the odds ratios of grade 3 or above adverse events were 4.71 (95% CI, 3.57-6.22; p < 0.001) compared with nivolumab monotherapy and 3.44 (95% CI, 2.49-4.74; p < 0.001) compared with ipilimumab monotherapy. High PD-L1 expression level and BRAF mutation were associated with better clinical outcomes in patients receiving combination therapy. Conclusion: Combination therapy is effective for the treatment of advanced melanoma. Adverse events were common but manageable. Better clinical outcomes were observed in patients with high PD-L1 expression levels and BRAF mutation.
Keywords: nivolumab, ipilimumab, advanced melanoma, systematic review, meta-analysis
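The pooled hazard ratios reported above follow the standard inverse-variance approach used in meta-analysis. A rough Python sketch of that pooling step is given below; the study-level hazard ratios and confidence intervals are invented placeholders, not the trials analysed in this review.

```python
import math

# Hypothetical per-trial hazard ratios with 95% confidence intervals
# (placeholder values, NOT the trials analysed in this review).
studies = [
    {"hr": 0.58, "ci": (0.45, 0.75)},
    {"hr": 0.70, "ci": (0.55, 0.89)},
    {"hr": 0.66, "ci": (0.50, 0.87)},
]

weights, weighted_log_hrs = [], []
for s in studies:
    log_hr = math.log(s["hr"])
    lo, hi = s["ci"]
    # Standard error recovered from the 95% CI of the log hazard ratio.
    se = (math.log(hi) - math.log(lo)) / (2 * 1.96)
    w = 1.0 / se ** 2          # inverse-variance (fixed-effect) weight
    weights.append(w)
    weighted_log_hrs.append(w * log_hr)

pooled_log_hr = sum(weighted_log_hrs) / sum(weights)
pooled_se = (1.0 / sum(weights)) ** 0.5
ci_low = math.exp(pooled_log_hr - 1.96 * pooled_se)
ci_high = math.exp(pooled_log_hr + 1.96 * pooled_se)
print(f"Pooled HR = {math.exp(pooled_log_hr):.2f} (95% CI {ci_low:.2f}-{ci_high:.2f})")
```

A random-effects version would additionally estimate between-study heterogeneity (e.g., via DerSimonian-Laird) before weighting.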
24773 Assessment of Agricultural Damage under Different Simulated Flood Conditions
Authors: M. N. Kadir, M. M. H. Oliver, T. Naher
Abstract:
The study assesses the areal extent of riverine flooding in the flood-prone area of Faridpur District of Bangladesh using a hydrological model and a Geographic Information System (GIS). To prepare the inundation maps, flood frequency analysis was carried out to assess flooding for different flood magnitudes. Flood inundation maps were prepared based on a DEM and river discharge using the Delft3D model. LANDSAT satellite images were used to develop a land cover map of the study area, which was then used to map the cropland area. By overlaying the inundation maps on the land cover map, agricultural damage was assessed. Present monetary values of crop damage were collected through a field survey of an actual flood in the study area. Two different inundation maps were produced from the model, for the years 2000 and 2016. In 2000 the flood began in July, whereas in 2016 it started in August. In both cases, most of the area was found to have been flooded in September, followed by flood recession. To prepare the land cover maps, four land cover categories were considered: cropland, water body, trees, and rivers. Of the 755,791 acres of Faridpur District, cropland accounted for 334,589 acres, followed by water bodies (279,900 acres), trees (101,930 acres) and rivers (39,372 acres). Damage assessment data revealed that 40% of the total cropland area was affected by the flood in 2000, whereas only 19% was affected by the 2016 flood. The study concluded that September is the critical month for cropland protection, since the highest flood is expected at this time of the year in Faridpur. The northwestern and southwestern parts of the district were categorized as most vulnerable to flooding.
Keywords: agricultural damage, Delft3D, flood management, land cover map
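The damage estimate comes from overlaying the inundation raster on the cropland class of the land cover map. A minimal sketch of that overlay step is shown below, assuming both layers are already co-registered NumPy arrays; the class codes, cell size and depth threshold are illustrative assumptions, not the study's values.

```python
import numpy as np

# Toy co-registered rasters (same shape, same cell size).
CROPLAND, WATER, TREES, RIVER = 1, 2, 3, 4            # illustrative class codes
land_cover = np.array([[1, 1, 2, 3],
                       [1, 1, 1, 4],
                       [3, 1, 1, 1],
                       [2, 1, 3, 1]])
flood_depth = np.array([[0.0, 0.6, 1.2, 0.0],         # metres, from the hydrodynamic model
                        [0.3, 0.9, 0.0, 2.0],
                        [0.0, 0.0, 0.4, 0.0],
                        [1.5, 0.0, 0.0, 0.7]])

cell_area_acres = 0.22                                 # assumed cell size, for illustration only
flooded = flood_depth > 0.25                           # assumed depth threshold for "damaged"

cropland_cells = land_cover == CROPLAND
damaged_cropland = cropland_cells & flooded

total_cropland = cropland_cells.sum() * cell_area_acres
damaged_area = damaged_cropland.sum() * cell_area_acres
print(f"Cropland affected: {damaged_area:.2f} of {total_cropland:.2f} acres "
      f"({100 * damaged_area / total_cropland:.0f}%)")
```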
24772 Model Driven Architecture Methodologies: A Review
Authors: Arslan Murtaza
Abstract:
Model Driven Architecture (MDA) is a technique presented by the OMG (Object Management Group) for software development in which different models are proposed and then converted into code. The main idea is to specify the task with a PIM (Platform Independent Model), transform it into a PSM (Platform Specific Model), and then convert the PSM into code. This review paper describes some challenges and issues faced in MDA, the types and transformations of models (e.g., CIM, PIM and PSM), and an evaluation of MDA-based methodologies.
Keywords: OMG, model driven architecture (MDA), computation independent model (CIM), platform independent model (PIM), platform specific model (PSM), MDA-based methodologies
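As a toy illustration of the PIM-to-PSM-to-code chain described above (not taken from the paper, and not tied to any particular MDA toolchain), a platform-independent entity model can be mechanically specialised for an assumed relational target platform and then emitted as code:

```python
# Platform Independent Model (PIM): entities and abstract attribute types only.
pim = {"entity": "Patient", "attributes": {"id": "int", "name": "string", "age": "int"}}

# PIM -> PSM transformation: map abstract types onto a relational platform (illustrative mapping).
SQL_TYPES = {"int": "INTEGER", "string": "VARCHAR(255)"}

def pim_to_psm(model):
    return {"table": model["entity"].lower() + "s",
            "columns": {name: SQL_TYPES[t] for name, t in model["attributes"].items()}}

# PSM -> code transformation: emit DDL for the chosen platform.
def psm_to_code(psm):
    cols = ",\n  ".join(f"{name} {sql_type}" for name, sql_type in psm["columns"].items())
    return f"CREATE TABLE {psm['table']} (\n  {cols}\n);"

print(psm_to_code(pim_to_psm(pim)))
```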
24771 Students' Learning Effects in Physical Education between Sport Education Model with TPSR and Traditional Teaching Model with TPSR
Authors: Yi-Hsiang Pan, Chen-Hui Huang, Ching-Hsiang Chen, Wei-Ting Hsu
Abstract:
The purposes of the study were to explore students' learning effects in physical education curricula that merge Teaching Personal and Social Responsibility (TPSR) with either the sport education model or the traditional teaching model; these learning effects included sport self-efficacy, sport enthusiasm, group cohesion, responsibility and game performance. The participants included three high school physical education teachers and six physical education classes, with 133 students in total (75 in the experimental group and 58 in the control group); each teacher taught one experimental group and one control group for 16 weeks. The research methods included questionnaire investigation, interviews, and focus group meetings. The research instruments included a personal and social responsibility questionnaire, a sport enthusiasm scale, a group cohesion scale, a sport self-efficacy scale and a game performance assessment instrument. Multivariate analysis of covariance and repeated-measures ANOVA were used to test differences in students' learning effects between TPSR merged with the sport education model and TPSR merged with the traditional teaching model. The findings were: 1) The sport education model with TPSR improved students' learning effects, including sport self-efficacy, game performance, sport enthusiasm, group cohesion and responsibility. 2) The traditional teaching model with TPSR improved students' learning effects, including sport self-efficacy, responsibility and game performance. 3) The sport education model with TPSR improved more learning effects than the traditional teaching model with TPSR, including sport self-efficacy, sport enthusiasm, responsibility and game performance. 4) Based on qualitative data about the learning experience of teachers and students, the sport education model with TPSR significantly improved learning motivation, group interaction and game sense. The conclusions indicated that the sport education model with TPSR could improve more learning effects in the physical education curriculum. On the other hand, the curricular projects of the hybrid TPSR-Sport Education model and the TPSR-Traditional Teaching model are both good curricular projects for moral character education and may be applied in school physical education.
Keywords: character education, sport season, game performance, sport competence
24770 A Review on Using Executive Function to Understand the Limited Efficacy of Weight-Loss Interventions
Authors: H. Soltani, Kevin Laugero
Abstract:
Obesity is becoming an increasingly critical issue in the United States due to the steady and substantial increase in prevalence over the last 30 years. Existing interventions have been able to help participants achieve short-term weight loss, but have failed to show long-term results. The complex nature of behavioral change remains one of the most difficult barriers in promoting sustainable weight-loss in overweight individuals. Research suggests that the 'intention-behavior gap' can be explained by a person's ability to regulate higher-order thinking, or Executive Function (EF). A review of 63 research articles was completed in fall of 2017 to identify the role of EF in regulating eating behavior and to identify whether there is a potential for improving dietary quality by enhancing EF. Results showed that poor EF is positively associated with obesogenic behavior, namely increased consumption of highly palatable foods, eating in the absence of hunger, high saturated fat intake and low fruit and vegetable consumption. Recent research has indicated that interventions targeting an improvement in EF can be successful in helping promote healthy behaviors. Furthermore, interventions of longer duration have a more lasting and versatile effect on weight loss and maintenance. This may present an opportunity for the increasingly ubiquitous use of mobile application technology.
Keywords: eating behavior, executive function, nutrition, obesity, weight-loss
24769 A Clinical Cutoff to Identify Metabolically Unhealthy Obese and Normal-Weight Phenotype in Young Adults
Authors: Lívia Pinheiro Carvalho, Luciana Di Thommazo-Luporini, Rafael Luís Luporini, José Carlos Bonjorno Junior, Renata Pedrolongo Basso Vanelli, Manoel Carneiro de Oliveira Junior, Rodolfo de Paula Vieira, Renata Trimer, Renata G. Mendes, Mylène Aubertin-Leheudre, Audrey Borghi-Silva
Abstract:
Rationale: Cardiorespiratory fitness (CRF) and functional capacity in young obese and normal-weight people are associated with metabolic and cardiovascular diseases and mortality. However, it remains unclear whether the metabolically healthy (MH) or at-risk (AR) phenotype influences cardiorespiratory fitness in vulnerable populations such as obese adults, and also in normal-weight people. The HOMA insulin resistance index (HI) and the leptin-adiponectin ratio (LA) are strong markers for characterizing those phenotypes, which we hypothesized to be associated with physical fitness. We also hypothesized that an easy and feasible exercise test could identify a subpopulation at risk of developing metabolic and related disorders. Methods: Thirty-nine sedentary men and women (20-45 y; 18.5 …)
Keywords: aerobic capacity, exercise, fitness, metabolism, obesity, 6MST
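The HOMA insulin resistance index and leptin-adiponectin ratio mentioned above are simple ratios of laboratory values. A short sketch of how they are commonly computed is given below; the constant 22.5 is the usual HOMA normalisation, and the sample values are invented.

```python
def homa_ir(fasting_glucose_mmol_l: float, fasting_insulin_uU_ml: float) -> float:
    """Homeostasis Model Assessment of insulin resistance (standard formulation)."""
    return fasting_glucose_mmol_l * fasting_insulin_uU_ml / 22.5

def leptin_adiponectin_ratio(leptin_ng_ml: float, adiponectin_ug_ml: float) -> float:
    """Leptin-adiponectin ratio; units follow whatever the assays report."""
    return leptin_ng_ml / adiponectin_ug_ml

# Invented example values for one participant.
print(f"HOMA-IR  = {homa_ir(5.4, 12.0):.2f}")
print(f"L/A ratio = {leptin_adiponectin_ratio(8.5, 6.2):.2f}")
```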
24768 Derivation of a Risk-Based Level of Service Index for Surface Street Network Using Reliability Analysis
Authors: Chang-Jen Lan
Abstract:
The current Level of Service (LOS) index adopted in the Highway Capacity Manual (HCM) for signalized intersections on surface streets is based on the intersection average delay. The delay thresholds for defining LOS grades are subjective and unrelated to critical traffic conditions. For example, an intersection delay of 80 sec per vehicle for the failing LOS grade F does not necessarily correspond to the intersection capacity. Also, a specific value of average delay may result from delay minimization, delay equalization, or other meaningful optimization criteria. To that end, a reliability version of the intersection critical degree of saturation (v/c) is introduced as the LOS index. Traditionally, the degree of saturation at a signalized intersection is defined as the ratio of the critical volume sum (per lane) to the average saturation flow (per lane) during all available effective green time within a cycle. The critical sum is the sum of the maximal conflicting movement-pair volumes in the northbound-southbound and eastbound-westbound rights of way. In this study, both movement volume and saturation flow are assumed to follow log-normal distributions. Because the product of independent, positive random variables tends toward a log-normal distribution in the limit when the conditions of the central limit theorem hold, the critical degree of saturation is expected to follow a log-normal distribution as well. Derivation of the risk index predictive limits is complex due to the maximum and absolute value operators, as well as the ratio of random variables. A fairly accurate functional form for the predictive limit at a user-specified significance level is obtained. The predictive limit is then compared with the designated LOS thresholds for the intersection critical degree of saturation (denoted as X…)
Keywords: reliability analysis, level of service, intersection critical degree of saturation, risk based index
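Although the paper derives an analytical predictive limit, the same risk index can be approximated numerically. A minimal Monte Carlo sketch is given below; the log-normal parameters for the movement volumes, saturation flow and effective green ratio are invented for illustration, not calibrated values from the study.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100_000

# Illustrative log-normal parameters (mean and sigma of the underlying normal) — not calibrated data.
ns_volume = rng.lognormal(mean=np.log(650), sigma=0.15, size=n)   # critical NS movement sum (veh/h/lane)
ew_volume = rng.lognormal(mean=np.log(550), sigma=0.15, size=n)   # critical EW movement sum (veh/h/lane)
sat_flow = rng.lognormal(mean=np.log(1800), sigma=0.05, size=n)   # saturation flow (veh/h/lane)
effective_green_ratio = 0.55                                       # assumed share of the cycle that is effective green

# Critical degree of saturation: critical volume sum over the available capacity.
x_c = (ns_volume + ew_volume) / (sat_flow * effective_green_ratio)

alpha = 0.05
upper_limit = np.quantile(x_c, 1 - alpha)
print(f"Mean v/c = {x_c.mean():.2f}, 95% predictive limit = {upper_limit:.2f}")
print(f"P(v/c > 1) = {(x_c > 1).mean():.3f}")
```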
24767 Establishment of a Classifier Model for Early Prediction of Acute Delirium in Adult Intensive Care Unit Using Machine Learning
Authors: Pei Yi Lin
Abstract:
Objective: The objective of this study is to use machine learning methods to build an early-prediction classifier model for acute delirium in order to improve the quality of medical care for intensive care patients. Background: Delirium is a common acute and sudden disturbance of consciousness in critically ill patients. Once it occurs, it tends to prolong the length of hospital stay and increase medical costs and mortality. In 2021, the incidence of delirium in the internal medicine intensive care unit was as high as 59.78%, which indirectly prolonged the average length of hospital stay by 8.28 days, and the mortality rate has been about 2.22% over the past three years. Therefore, this study aims to build a delirium prediction classifier through big data analysis and machine learning methods to detect delirium early. Method: This is a retrospective study using an artificial intelligence big data database to extract the factors related to delirium in intensive care unit patients for machine learning. The study included patients aged over 20 years who were admitted to the intensive care unit between May 1, 2022, and December 31, 2022, excluding patients with a GCS assessment below 4 points, ICU admissions shorter than 24 hours, and cases without CAM-ICU evaluation. Each CAM-ICU delirium assessment, performed every 8 hours within 30 days of hospitalization, is regarded as an event, and the cumulative data from ICU admission to the prediction time point are extracted to predict the possibility of delirium occurring in the next 8 hours. A total of 63,754 case records were collected, and 12 features were selected to train the model, including age, sex, average ICU stay hours, visual and auditory abnormalities, RASS assessment score, APACHE-II score, number of indwelling invasive catheters, physical restraint, and use of sedative and hypnotic drugs. After feature data cleaning, processing and supplementation with KNN imputation, a total of 54,595 events were available for machine learning analysis. Events from May 1 to November 30, 2022, were used as model training data, of which 80% formed the training set and 20% the internal validation set; ICU events from December 1 to December 31, 2022, formed the external validation set. Finally, model inference and performance evaluation were performed, and the model was retrained by adjusting its parameters. Results: In this study, XGBoost, Random Forest, Logistic Regression, and Decision Tree models were analyzed and compared. The average internal validation accuracy was highest for Random Forest (AUC = 0.86); in external validation, Random Forest and XGBoost tied for the highest AUC of 0.86; and the average cross-validation accuracy was highest for Random Forest (ACC = 0.77). Conclusion: Clinically, medical staff usually conduct CAM-ICU assessments at the bedside of critically ill patients, but there is a lack of machine learning classification methods to assist in real-time assessment, so more objective and continuous monitoring data are not available to help clinical staff accurately identify and predict the occurrence of delirium. It is hoped that the development of predictive models through machine learning can predict delirium early and immediately, support clinical decisions at the best time, and, together with PADIS delirium care measures, provide individualized non-drug interventions to maintain patient safety and improve the quality of care.
Keywords: critically ill patients, machine learning methods, delirium prediction, classifier model
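A minimal scikit-learn sketch of the workflow described above (a split into training and internal validation sets followed by AUC evaluation) is shown below. The synthetic data and feature layout are placeholders, not the hospital dataset, and the external-validation month is omitted for brevity.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(42)

# Synthetic stand-in for the 12 selected features (age, sex, ICU hours, RASS, APACHE-II, ...).
n_events, n_features = 5000, 12
X = rng.normal(size=(n_events, n_features))
# Synthetic label: delirium within the next 8 hours (imbalanced, as in practice).
y = (X[:, 0] + 0.5 * X[:, 3] + rng.normal(scale=1.5, size=n_events) > 1.5).astype(int)

# 80/20 split for training and internal validation (the paper uses a later month as external validation).
X_train, X_valid, y_train, y_valid = train_test_split(
    X, y, test_size=0.2, random_state=0, stratify=y)

model = RandomForestClassifier(n_estimators=300, class_weight="balanced", random_state=0)
model.fit(X_train, y_train)

auc = roc_auc_score(y_valid, model.predict_proba(X_valid)[:, 1])
print(f"Internal validation AUC = {auc:.2f}")
```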
24766 Evaluation of Public Library Adult Programs: Use of ServQUAL and NIPPA Assessment Standards
Authors: Anna Ching-Yu Wong
Abstract:
This study aims to identify the quality and effectiveness of the adult programs provided by a public library using the ServQUAL method and the National Library Public Programs Assessment guidelines (NIPPA, June 2019). ServQUAL covers several variables, namely tangibles, reliability, responsiveness, assurance, and empathy. The NIPPA guidelines focus on program characteristics, particularly on the outcomes – the level of satisfaction of program participants. The population reached consisted of adults who participated in library adult programs at a small-town public library in Kansas. The study was designed as quantitative evaluative research that analyzed the quality and effectiveness of the library adult programs by analyzing the role of each factor based on ServQUAL and the NIPPA library program assessment guidelines. Data were collected from November 2019 to January 2020 using a questionnaire with a Likert scale, and the data obtained were analyzed in a descriptive quantitative manner. This research can provide information about the quality and effectiveness of existing programs and can be used as input to develop strategies for future adult programs. Overall, the ServQUAL measurement indicates very good quality, but each variable still has areas needing improvement and emphasis: the tangibles variable in the indicators of temperature and space of the meeting room; the reliability variable in the timely delivery of the programs; the responsiveness variable in the ability of the presenters to convey trust and confidence to participants; the assurance variable in the indicator of knowledge and skills of program presenters; and the empathy variable in the presenters' willingness to provide extra assistance. The measurement of program outcomes based on the NIPPA guidelines is very positive: over 96% of participants indicated that the programs were informative and fun, that they learned new knowledge and new skills, and that they would recommend the programs to their friends and families. They believed that, together, the library and participants build stronger and healthier communities.
Keywords: ServQUAL model, ServQUAL in public libraries, library program assessment, NIPPA library programs assessment
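ServQUAL scores are usually computed as perception-minus-expectation gaps averaged per dimension. A small sketch of that computation is shown below, with invented Likert responses and the assumption that each questionnaire item maps to exactly one of the five dimensions.

```python
from statistics import mean

# Invented 5-point Likert responses for a handful of respondents (not the survey data).
dimensions = {
    "tangibles":      {"expectation": [5, 5, 4], "perception": [4, 3, 4]},
    "reliability":    {"expectation": [5, 4, 5], "perception": [4, 4, 4]},
    "responsiveness": {"expectation": [4, 5, 4], "perception": [4, 4, 3]},
    "assurance":      {"expectation": [5, 5, 5], "perception": [4, 5, 4]},
    "empathy":        {"expectation": [4, 4, 5], "perception": [4, 3, 4]},
}

for name, scores in dimensions.items():
    # Negative gap = perceptions fall short of expectations for that dimension.
    gap = mean(scores["perception"]) - mean(scores["expectation"])
    print(f"{name:15s} gap = {gap:+.2f}")
```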
24765 Using the Smith-Waterman Algorithm to Extract Features in the Classification of Obesity Status
Authors: Rosa Figueroa, Christopher Flores
Abstract:
Text categorization is the problem of assigning a new document to a set of predetermined categories on the basis of a training set of free-text data containing documents whose category membership is known. To train a classification model, it is necessary to extract characteristics in the form of tokens that facilitate the learning and classification process. In text categorization, the feature extraction process typically involves the use of word sequences, also known as N-grams. In general, it is expected that documents belonging to the same category share similar features. The Smith-Waterman (SW) algorithm is a dynamic programming algorithm that performs a local sequence alignment in order to determine similar regions between two strings or protein sequences. This work explores the use of the SW algorithm as an alternative to N-gram feature extraction in text categorization. The dataset used for this purpose contains 2,610 annotated documents with the classes Obese/Non-Obese. This dataset was represented in matrix form using the bag-of-words approach. The score selected to represent the occurrence of tokens in each document was the term frequency-inverse document frequency (TF-IDF). In order to extract features for classification, four experiments were conducted: the first experiment used SW to extract features, the second used unigrams (single words), the third used bigrams (two-word sequences), and the last used a combination of unigrams and bigrams. To test the effectiveness of the extracted feature sets, a Support Vector Machine (SVM) classifier was tuned using 20% of the dataset. The remaining 80% of the dataset, together with 5-fold cross-validation, was used to evaluate and compare the performance of the four feature extraction experiments. Results from the tuning process suggest that SW performs better than the N-gram-based feature extraction. These results were confirmed on the remaining 80% of the dataset, where SW performed best (accuracy = 97.10%, weighted average F-measure = 97.07%). The second best was obtained by the combination of unigrams and bigrams (accuracy = 96.04%, weighted average F-measure = 95.97%), closely followed by bigrams (accuracy = 94.56%, weighted average F-measure = 94.46%) and finally unigrams (accuracy = 92.96%, weighted average F-measure = 92.90%).
Keywords: comorbidities, machine learning, obesity, Smith-Waterman algorithm
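The core of the approach is a local alignment score between token sequences rather than exact N-gram matching. A compact, self-contained Smith-Waterman sketch over word tokens is shown below; the match, mismatch and gap scores are illustrative choices, not the paper's settings.

```python
def smith_waterman(tokens_a, tokens_b, match=2, mismatch=-1, gap=-1):
    """Return the best local alignment score between two token sequences."""
    rows, cols = len(tokens_a) + 1, len(tokens_b) + 1
    score = [[0] * cols for _ in range(rows)]
    best = 0
    for i in range(1, rows):
        for j in range(1, cols):
            diag = score[i - 1][j - 1] + (match if tokens_a[i - 1] == tokens_b[j - 1] else mismatch)
            score[i][j] = max(0,                      # local alignment: never below zero
                              diag,
                              score[i - 1][j] + gap,  # gap in sequence B
                              score[i][j - 1] + gap)  # gap in sequence A
            best = max(best, score[i][j])
    return best

doc = "patient with morbid obesity and type 2 diabetes".split()
feature = "morbid obesity and diabetes".split()
print(smith_waterman(doc, feature))   # higher scores indicate similar local regions
```

In the paper's setting, such alignment scores would then feed the SVM in place of (or alongside) the TF-IDF N-gram features.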
24764 The Budget Impact of the DISCERN™ Diagnostic Test for Alzheimer's Disease in the United States
Authors: Frederick Huie, Lauren Fusfeld, William Burchenal, Scott Howell, Alyssa McVey, Thomas F. Goss
Abstract:
Alzheimer's Disease (AD) is a degenerative brain disease characterized by memory loss and cognitive decline that presents a substantial economic burden for patients and health insurers in the US. This study evaluates the payer budget impact of the DISCERN™ test in the diagnosis and management of patients with symptoms of dementia evaluated for AD. DISCERN™ comprises three assays assessing critical AD-related factors that regulate memory, the formation of synaptic connections among neurons, and the levels of amyloid plaques and neurofibrillary tangles in the brain, and it can provide a quicker, more accurate diagnosis than tests in the current diagnostic pathway (CDP). An Excel-based model with a three-year horizon was developed to assess the budget impact of DISCERN™ compared with the CDP in a Medicare Advantage plan with 1M beneficiaries. Model parameters were identified through a literature review and verified through consultation with clinicians experienced in the diagnosis and management of AD. The model assesses direct medical costs/savings for patients based on the following categories: • Diagnosis: costs of diagnosis using DISCERN™ and the CDP. • False negative (FN) diagnosis: incremental cost of care avoidable with a correct AD diagnosis and appropriately directed medication. • True positive (TP) diagnosis: AD medication costs; the cost of a later TP diagnosis with the CDP versus DISCERN™ in the year of diagnosis; and savings from the delay in AD progression due to appropriate AD medication in patients who are correctly diagnosed after a FN diagnosis. • False positive (FP) diagnosis: cost of AD medication for patients who do not have AD. A one-way sensitivity analysis was conducted to assess the effect of varying key clinical and cost parameters ±10%, and an additional scenario analysis was developed to evaluate the impact of individual inputs. In the base scenario, DISCERN™ is estimated to decrease costs by $4.75M over three years, equating to approximately $63.11 saved per test per year for a cohort followed over three years. While the diagnosis cost is higher with DISCERN™ than with CDP modalities, this cost is offset by the higher overall costs associated with the CDP due to the longer time needed to receive a TP diagnosis and the larger number of patients who receive a FN diagnosis and progress more rapidly than if they had received appropriate AD medication. The sensitivity analysis shows that the three parameters with the greatest impact on savings are reduced sensitivity of DISCERN™, improved sensitivity of the CDP, and a reduction in the percentage of disease progression that is avoided with appropriate AD medication. A scenario analysis in which DISCERN™ reduces the utilization of computed tomography from 21% in the base case to 16%, magnetic resonance imaging from 37% to 27%, and cerebrospinal fluid biomarker testing, positron emission tomography, electroencephalograms, and polysomnography from 4%, 5%, 10%, and 8%, respectively, in the base case to 0%, results in an overall three-year net savings of $14.5M. DISCERN™ improves the rate of accurate, definitive diagnosis of AD earlier in the disease and may generate savings for Medicare Advantage plans.
Keywords: Alzheimer's disease, budget, dementia, diagnosis
24763 The Effect of Sumatra Fault Earthquakes on West Malaysia
Authors: Noushin Naraghi Araghi, M. Nawawi, Syed Mustafizur Rahman
Abstract:
This paper presents the effect of Sumatra fault earthquakes on west Malaysia by calculating the peak horizontal ground acceleration (PGA). PGA is calculated through a probabilistic seismic hazard assessment (PSHA). A uniform catalog of earthquakes for the region of interest was compiled, and empirical relations were used to convert all magnitudes to moment magnitude. After eliminating foreshocks and aftershocks in order to achieve more reliable results, the completeness of the catalog and the uncertainty of the magnitudes were estimated, and seismicity parameters were calculated. Our seismic source model considers the Sumatran strike-slip fault, which is known historically to generate large earthquakes. The calculations were done using the logic tree method with four attenuation relationships and slip rates for different parts of this fault. The seismic hazard assessment was carried out for 48 grid points. Finally, two PGA-based seismic hazard maps for 5% and 10% probability of exceedance in 50 years are presented.
Keywords: Sumatra fault, west Malaysia, PGA, seismic parameters
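At its simplest, the PSHA output combines an annual rate of exceedance (from the source and ground-motion integration) with the Poisson assumption to convert between return period and probability of exceedance. A small sketch of that conversion and of a crude hazard-curve lookup is given below; the hazard-curve values are invented, not the Sumatran fault results.

```python
import math

# Invented hazard curve for one grid point: PGA level (g) -> annual rate of exceedance.
hazard_curve = [(0.05, 0.050), (0.10, 0.020), (0.20, 0.0040), (0.40, 0.00050), (0.80, 0.00004)]

def prob_exceedance(annual_rate: float, years: float) -> float:
    """Poisson probability of at least one exceedance within the exposure time."""
    return 1.0 - math.exp(-annual_rate * years)

def pga_for_return_period(curve, return_period_years: float) -> float:
    """Pick the smallest tabulated PGA whose annual rate falls below 1/T (crude lookup, no interpolation)."""
    target_rate = 1.0 / return_period_years
    for pga, rate in curve:
        if rate <= target_rate:
            return pga
    return curve[-1][0]

# 10% probability of exceedance in 50 years corresponds to a ~475-year return period: T = -50 / ln(1 - 0.10).
t_475 = -50.0 / math.log(1.0 - 0.10)
print(f"Return period: {t_475:.0f} years")
print(f"Design PGA at this site (toy curve): {pga_for_return_period(hazard_curve, t_475)} g")
print(f"P(exceedance of 0.20 g in 50 yr) = {prob_exceedance(0.0040, 50):.2%}")
```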
24762 Preliminary Seismic Hazard Mapping of Papua New Guinea
Authors: Hadi Ghasemi, Mark Leonard, Spiliopoulos Spiro, Phil Cummins, Mathew Moihoi, Felix Taranu, Eric Buri, Chris Mckee
Abstract:
In this study, the level of seismic hazard in terms of Peak Ground Acceleration (PGA) was calculated for a return period of 475 years, using modeled seismic sources and assigned ground-motion equations. The calculations were performed for bedrock site conditions (Vs30 = 760 m/s). From the results it is evident that the seismic hazard reaches its maximum level (i.e., PGA ≈ 1 g for the 475-year return period) in the Huon Peninsula and southern New Britain regions. Disaggregation analysis revealed that moderate to large earthquakes occurring along the New Britain Trench mainly control the level of hazard at these locations. The open-source computer program OpenQuake, developed by the Global Earthquake Model foundation, was used for the seismic hazard computations. It should be emphasized that the presented results are still preliminary and should not be interpreted as our final assessment of seismic hazard in PNG.
Keywords: probabilistic seismic hazard assessment, Papua New Guinea, building code, OpenQuake
24761 Screening Methodology for Seismic Risk Assessment of Aging Structures in Oil and Gas Plants
Authors: Mohammad Nazri Mustafa, Pedram Hatami Abdullah, M. Fakhrur Razi Ahmad Faizul
Abstract:
With the issuance of the Malaysian National Annex 2017 as a part of MS EN 1998-1:2015, the seismic mapping of Peninsular Malaysia as well as Sabah and Sarawak has undergone some changes in terms of the Peak Ground Acceleration (PGA) values. The revision to the PGA has raised concern about the safety of onshore oil and gas structures, as these structures were not designed to accommodate the new PGA values, which are much higher than those used in the original design. In view of the high number of structures and buildings to be re-assessed, a risk assessment methodology has been developed to prioritize and rank the assets in terms of their criticality against the new seismic loading. To date, such a risk assessment method for onshore oil and gas structures has been lacking, and it is the main intention of this technical paper to share the risk assessment methodology and the risk element scoring finalized via the Delphi method. The finalized methodology and the values used to rank the risk elements have been established based on years of relevant experience on the subject matter and a series of rigorous discussions with professionals in the industry. The risk scoring is mapped against the risk matrix (i.e., likelihood of failure, LOF, versus consequence of failure, COF), and hence the overall risk for the assets can be obtained. The overall risk can be used to prioritize and optimize integrity assessment, repair and strengthening work against the new seismic mapping of the country.
Keywords: methodology, PGA, risk, seismic
24760 Gradient Index Metalens for WLAN Applications
Authors: Akram Boubakri, Fethi Choubeni, Tan Hoa Vuong, Jacques David
Abstract:
The control of electromagnetic waves has been a key aim of much research over the past decade. In this regard, metamaterials have shown a strong ability to manipulate electromagnetic waves on subwavelength scales thanks to unconventional properties that are not available in natural materials, such as a negative refractive index, super-imaging and invisibility cloaking. Metalenses can avoid some drawbacks of conventional lenses: focusing with conventional lenses suffers from limited resolution because they are only able to focus the propagating wave component. Metalenses, in contrast, are able to go beyond the diffraction limit and enhance resolution not only by collecting the propagating waves but also by restoring the amplitude of the evanescent waves, which decay rapidly away from the source and contain the finest details of the image. Metasurfaces have many practical advantages over three-dimensional metamaterial structures, especially ease of fabrication and a smaller required volume, and they have been widely used for antenna performance improvement and to build flat metalenses. In this work, we show that a well-designed metasurface lens operating at a frequency of 5.9 GHz efficiently enhances the radiation characteristics of a patch antenna and can be used for WLAN applications (IEEE 802.11a). The proposed metasurface lens is built with geometrically modified unit cells, which change the response of the lens at different positions and allow control of the wavefront of the incident wave thanks to the gradient refractive index.
Keywords: focusing, gradient index, metasurface, metalens, WLAN applications
24759 Maximizing the Aerodynamic Performance of Wind and Water Turbines by Utilizing Advanced Flow Control Techniques
Authors: Edwin Javier Cortes, Surupa Shaw
Abstract:
In recent years, there has been a growing emphasis on enhancing the efficiency and performance of wind and water turbines to meet the increasing demand for sustainable energy sources. One promising approach is the utilization of advanced flow control techniques to optimize aerodynamic performance. This paper explores the application of advanced flow control techniques in both wind and water turbines, aiming to maximize their efficiency and output. By manipulating the flow of air or water around the turbine blades, these techniques offer the potential to improve energy capture, reduce drag, and minimize turbulence-induced losses. The paper will review various flow control strategies, including passive and active techniques such as vortex generators, boundary layer suction, and plasma actuators. It will examine their effectiveness in optimizing turbine performance under different operating conditions and environmental factors. Furthermore, the paper will discuss the challenges and opportunities associated with implementing these techniques in practical turbine designs. It will consider factors such as cost-effectiveness, reliability, and scalability, as well as the potential impact on overall turbine efficiency and lifecycle. Through a comprehensive analysis of existing research and case studies, this paper aims to provide insights into the potential benefits and limitations of advanced flow control techniques for wind and water turbines. It will also highlight areas for future research and development, with the ultimate goal of advancing the state-of-the-art in turbine technology and accelerating the transition towards a more sustainable energy future.
Keywords: flow control, efficiency, passive control, active control
24758 Assessment of Estrogenic Contamination and Potential Risk in Taihu Lake, China
Authors: Guanghua Lu, Zhenhua Yan
Abstract:
To investigate the estrogenic contamination and potential risk of Taihu Lake, eight active biomonitoring points were set up in the northern section of Taihu Lake, located at the Wangyuhe River outlet (P1), Gonghu Bay (P2 and P3), Meiliang Bay (P4 and P5), Zhushan Bay (P6 and P7) and the lake centre (P8). A suite of biomarkers in caged fish after in situ exposure for 28 days, coupled with six selected exogenous estrogens in water, was determined in May and December 2011. The six target estrogens, namely estrone (E1), 17β-estradiol (E2), ethinylestradiol (EE2), estriol (E3), diethylstilbestrol (DES) and bisphenol A (BPA), were quantified using UPLC/MS/MS. The concentrations of E1, E2, E3, EE2, DES and BPA ranged from ND to 3.61 ng/L, ND to 17.3 ng/L, ND to 1.65 ng/L, ND to 10.2 ng/L, ND to 34.6 ng/L, and 3.95 to 207 ng/L, respectively. BPA was detected at all sampling points in all test periods, E2 was detected in 95% of samples, E1 and EE2 were detected in 75% of samples, and E3 was detected only in December 2011 at quite low concentrations. Each individual estrogen concentration measured at each sampling point was multiplied by its relative potency to obtain the estradiol equivalent (EEQ). The total EEQ values at all monitoring points ranged from 5.69 to 17.8 ng/L in May 2011 and from 4.46 to 21.1 ng/L in December 2011. E2 and EE2 were thought to be the major agents responsible for the estrogenic activities. Serum vitellogenin and E2 levels, gonadal DNA damage, and the gonadosomatic index were measured in the in situ exposed fish. An enhanced integrated biomarker response (EIBR) was calculated and used to evaluate the potential feminization risk of fish in the polluted area of Taihu Lake. The EIBR index showed good agreement with the observed total EEQ levels in water. Our results indicated that Gonghu Bay and the lake centre had a low estrogenic risk, whereas the Wangyuhe River, Meiliang Bay, and Zhushan Bay might present a higher risk to fish.
Keywords: active biomonitoring, estrogen, feminization risk, Taihu Lake
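The estradiol equivalent is simply the potency-weighted sum of the measured concentrations. A short sketch of the calculation is shown below; the relative potency factors and concentrations are illustrative placeholders, and the study's own factors should be substituted for real use.

```python
# Illustrative relative potencies versus 17β-estradiol (E2 = 1); placeholders, not the study's factors.
relative_potency = {"E1": 0.2, "E2": 1.0, "EE2": 1.2, "E3": 0.01, "DES": 1.0, "BPA": 0.0001}

# Measured concentrations at one sampling point in ng/L (invented example values).
concentrations = {"E1": 1.5, "E2": 3.0, "EE2": 0.8, "E3": 0.0, "DES": 2.0, "BPA": 50.0}

# EEQ: sum over compounds of concentration x relative potency.
eeq = sum(concentrations[c] * relative_potency[c] for c in concentrations)
print(f"Total EEQ = {eeq:.2f} ng E2-eq/L")
```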
24757 An Education Profile for Indonesian Youth Development
Authors: Titia Izzati, Pebri Hastuti, Gusti Ayu Arwati
Abstract:
Based on a program of the Ministry of Youth and Sports of the Republic of Indonesia, this study compares statistical data on educational factors and the number of young people with a survey conducted over five years, 2009-2013. As a result, significant trends are traced through an era filled with events that deeply affected the lives of young people, such as the peak and ending of the political issues of the period. The changing values under examination include attitudes toward authority and obligations toward others; social values dealing with attitudes toward the work ethic; marriage, family, and the importance of money in defining the meaning of success; and self-fulfillment. While the largest portion of the sample consists of college youth, other people between the ages of 16 and 30 are considered, including high school students, blue-collar workers, housewives, and high school dropouts. The report provides an overview and interpretation of the data and presents research contrasting the values of college and non-college youth. In addition, the youth education profile data can be used to construct a youth development index, especially in its educational dimension. To formulate this youth development index, the basic needs of youth in Indonesia have to be listed as variables, so that the indicators of the index are really in accordance with the actual conditions of Indonesian youth. The indicators are the average years of schooling of youth, the youth illiteracy rate, the number of youth who are continuing their studies or who have completed studies in college, and the number of high school/vocational or college graduates engaged in the labor force. The formula for the youth development index is constructed for the educational dimension from all of these indicators.
Keywords: education, young people, Indonesia, ministry programs, youth index development
24756 Paradigm Shift in Classical Drug Research: Challenges to Modern Pharmaceutical Sciences
Authors: Riddhi Shukla, Rajeshri Patel, Prakruti Buch, Tejas Sharma, Mihir Raval, Navin Sheth
Abstract:
Many classical drugs are claimed to have blood-sugar-lowering properties that make them valuable for people with, or at high risk of, type 2 diabetes. Vijaysar (Pterocarpus marsupium) and Gaumutra (Indian cow urine) have both been credited with antidiabetic properties since ancient times, and in combination they show a synergistic hypoglycaemic effect. This study was undertaken to investigate the hypoglycaemic and anti-diabetic effects of the combination of Vijaysar and Gaumutra, a classical preparation mentioned in Ayurveda named Pramehari ark. Type 2 diabetes was induced with streptozotocin (STZ, 35 mg/kg) in rats given a high-fat diet for one month, and these rats were compared with normal rats. Diabetic rats showed raised body weight, triglyceride (TG), total cholesterol, HDL, LDL, and D-glucose concentrations, as well as altered serum, cardiac and hypertrophic parameters, in comparison with normal rats. After treatment with different doses of the drug, parameters such as TG, total cholesterol, HDL, LDL, and D-glucose concentration were found to decrease in the standard as well as the treatment groups. In addition, the levels of serum markers, cardiac markers, and hypertrophic parameters also decreased in the treatment groups. The findings demonstrated that Pramehari ark prevented the pathological progression of type 2 diabetes in rats.
Keywords: cow urine, hypoglycemic effect, synergic effect, type 2 diabetes, vijaysar
24755 Research on the Evaluation of Enterprise-University-Research Cooperation Ability in Hubei Province
Authors: Dongfang Qiu, Yilin Lu
Abstract:
The measurement of enterprise-university-research cooperative efficiency is important for improving cooperative efficiency, strengthening the effective integration of regional resources, enhancing regional innovation capability and promoting the development of the regional economy. This paper applies the DEA method and the DEA-Malmquist productivity index method to study the cooperation efficiency of Hubei in comparison with other provinces in China. The study found that the technical efficiency index is 0.52 and that the enterprise-university-research cooperation of Hubei is not DEA-efficient. To reach DEA efficiency, Hubei province would need to reduce its R&D employees by 1,652.596 persons and its full-time-equivalent R&D staff by 638.368, or increase new-product sales income by 137.89 billion yuan. Finally, the paper puts forward policy recommendations to address existing problems, strengthen the standing of the cooperation, realize the effective application of research results, and improve the management of enterprise-university-research cooperation efficiency.
Keywords: cooperation ability, DEA method, enterprise-university-research cooperation, Malmquist efficiency index
24754 Slow Pyrolysis of Bio-Wastes: Environmental, Exergetic, and Energetic (3E) Assessment
Authors: Daniela Zalazar-Garcia, Erick Torres, German Mazza
Abstract:
Slow pyrolysis of a pellet of pistachio waste was studied using a lab-scale stainless-steel reactor. Experiments were conducted at different heating rates (5, 10, and 15 K/min). A 3E (environmental, exergetic, and energetic) analysis for the processing of 20 kg/h of bio-waste was carried out. Experimental results showed that the biochar and gas yields decreased with an increase in the heating rate (from 43 to 36% and from 28 to 24%, respectively), while the bio-oil yield increased (from 29 to 40%). Finally, from the 3E analysis and the experimental results, it can be suggested that an increase in the heating rate results in a higher pyrolysis exergetic efficiency (70%) due to the increased yield of bio-oil with a high energy content.
Keywords: 3E assessment, bio-waste pellet, life cycle assessment, slow pyrolysis
24753 Pavement Maintenance and Rehabilitation Scheduling Using Genetic Algorithm Based Multi Objective Optimization Technique
Authors: Ashwini Gowda K. S., Archana M. R., Anjaneyappa V.
Abstract:
This paper presents a pavement maintenance and management system (PMMS) to obtain optimum pavement maintenance and rehabilitation strategies and maintenance scheduling for a network using a multi-objective genetic algorithm (MOGA). The optimal pavement maintenance and rehabilitation strategy maximizes the pavement condition index of the road sections in the network at minimum maintenance and rehabilitation cost over the planning period. In this paper, NSGA-II is applied to perform the maintenance optimization; this maintenance approach is expected to preserve and improve the existing condition of the highway network in a cost-effective way. The proposed PMMS is applied to a network whose pavements were assessed using the pavement condition index (PCI). The maintenance costs for a planning period of 20 years obtained from the non-dominated solutions ranged from 4.81x10¹⁰ ₹ to 5.190x10¹⁰ ₹.
Keywords: genetic algorithm, maintenance and rehabilitation, optimization technique, pavement condition index
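NSGA-II ranks candidate maintenance plans by Pareto dominance over the two objectives (maximise network condition, minimise cost). As a minimal illustration of that ranking step only, not a full NSGA-II implementation, the non-dominated set of some invented plans can be extracted as follows:

```python
# Invented candidate plans: (average pavement condition index, total cost in 1e10 INR).
plans = {
    "do-minimum": (62.0, 4.20),
    "plan A":     (71.5, 4.81),
    "plan B":     (74.0, 5.19),
    "plan C":     (70.0, 5.00),   # dominated by plan A (worse in both objectives)
    "plan D":     (68.0, 4.95),   # also dominated by plan A
}

def dominates(p, q):
    """p dominates q if it is no worse in both objectives and strictly better in at least one."""
    pci_p, cost_p = p
    pci_q, cost_q = q
    return (pci_p >= pci_q and cost_p <= cost_q) and (pci_p > pci_q or cost_p < cost_q)

pareto_front = [name for name, obj in plans.items()
                if not any(dominates(other, obj)
                           for o_name, other in plans.items() if o_name != name)]
print("Non-dominated plans:", pareto_front)
```

A full NSGA-II run would additionally apply crowding-distance sorting, selection, crossover and mutation over many generations.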
24752 Risk Assessment of Oil Spill Pollution by Integration of GNOME, ALOHA and GIS in Bandar Abbas Coast, Iran
Authors: Mehrnaz Farzingohar, Mehran Yasemi, Ahmad Savari
Abstract:
Oil products are imported and exported via the Rajaee tanker terminal. During loading and discharging, oil has in several cases been released into the berths, creating oil spills. The spills spread within a short time and seriously affect the environment of Rajaee Port and even areas beyond it. The trajectory and fate of the oil spills were investigated by modeling and divided into three risk levels based on the modeling results. First, GNOME (General NOAA Operational Modeling Environment) was applied to model the trajectory of the liquid oil. Second, the ALOHA (Areal Locations of Hazardous Atmospheres) air quality model was integrated to predict the path of evaporated oil in the air. Based on the identified zones, high-risk areas were marked by colored dots whose densities were calculated and displayed on a map of the affected places. Wind and water circulation moved the pollution to the east of Rajaee Port, where it accumulated along about 12 km of coastline. Approximately 20 km of the northeastern shore of Qeshm Island is covered by the three risk levels. Since the main wind direction is SSW, the pollution was pushed to the east; the highest-risk zones formed on the crest edges, whereas the low-risk zones appeared in the concavities. This assessment helps management and emergency systems monitor exposed places based on priority factors and find the best approaches to protect the environment.
Keywords: oil spill, modeling, pollution, risk assessment
24751 Influence of Pretreatment Magnetic Resonance Imaging on Local Therapy Decisions in Intermediate-Risk Prostate Cancer Patients
Authors: Christian Skowronski, Andrew Shanholtzer, Brent Yelton, Muayad Almahariq, Daniel J. Krauss
Abstract:
Prostate cancer has the third highest incidence rate and is the second leading cause of cancer death for men in the United States. Of the diagnostic tools available for intermediate-risk prostate cancer, magnetic resonance imaging (MRI) provides superior soft tissue delineation, serving as a valuable tool for both diagnosis and treatment planning. Currently, there is minimal data regarding the practical utility of MRI for the evaluation of intermediate-risk prostate cancer; as such, the National Comprehensive Cancer Network's guidelines list MRI as optional in intermediate-risk prostate cancer evaluation. This project aims to elucidate whether MRI affects radiation treatment decisions for intermediate-risk prostate cancer. This was a retrospective study evaluating 210 patients with intermediate-risk prostate cancer treated with definitive radiotherapy at our institution between 2019 and 2020. NCCN risk stratification criteria were used to define intermediate-risk prostate cancer. Patients were divided into two groups: those with a pretreatment prostate MRI and those without. We compared the use of external beam radiotherapy, brachytherapy alone, brachytherapy boost, and androgen deprivation therapy between the two groups. Inverse probability of treatment weighting was used to match the two groups for age, comorbidity index, American Urological Association symptom index, pretreatment PSA, grade group, and percent core involvement on prostate biopsy. Wilcoxon rank-sum and chi-squared tests were used to compare continuous and categorical variables. Of the patients who met the study's eligibility criteria, 133 had a prostate MRI and 77 did not. Following propensity matching, there were no differences in baseline characteristics between the two groups. There were no statistically significant differences in the treatments pursued: 42% vs 47% were treated with brachytherapy alone, 40% vs 42% with external beam radiotherapy alone, 18% vs 12% with external beam radiotherapy with a brachytherapy boost, and 24% vs 17% received androgen deprivation therapy in the non-MRI and MRI groups, respectively. This analysis suggests that pretreatment MRI does not significantly impact radiation therapy or androgen deprivation therapy decisions in patients with intermediate-risk prostate cancer. A pretreatment prostate MRI should be obtained judiciously and pursued only to answer a specific question whose answer is likely to impact the treatment decision. Further follow-up is needed to correlate MRI findings with their impact on specific oncologic outcomes.
Keywords: magnetic resonance imaging, prostate cancer, definitive radiotherapy, Gleason score 7
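Inverse probability of treatment weighting fits a propensity model for "had a pretreatment MRI" and then weights each patient by the inverse probability of the group they actually belong to. A compact sketch with synthetic covariates (not the study data; the feature layout and placeholder outcome are assumptions) is shown below.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(1)
n = 210
# Synthetic covariates standing in for age, comorbidity index, AUA score, PSA, grade group, core involvement.
X = rng.normal(size=(n, 6))
# Synthetic treatment indicator: 1 = pretreatment MRI obtained, influenced by the covariates.
mri = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(scale=1.0, size=n) > 0).astype(int)

# Propensity score: estimated probability of receiving MRI given the covariates.
propensity = LogisticRegression(max_iter=1000).fit(X, mri).predict_proba(X)[:, 1]

# IPTW weights: 1/p for MRI patients, 1/(1-p) for non-MRI patients.
weights = np.where(mri == 1, 1.0 / propensity, 1.0 / (1.0 - propensity))

# Weighted comparison of one outcome (e.g., received brachytherapy alone) between groups.
brachy = rng.integers(0, 2, size=n)      # placeholder outcome for illustration only
for group in (0, 1):
    mask = mri == group
    rate = np.average(brachy[mask], weights=weights[mask])
    print(f"MRI group {group}: weighted brachytherapy-alone rate = {rate:.2f}")
```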
24750 Preventive Effect of Stem Bark Extracts of Coula edulis Baill. against High-Fat / High Sucrose Diet-Induced Insulin Resistance and Oxidative Stress in Rats
Authors: Eric Beyegue, Boris Azantza, Judith Laure Ngondi, Julius E. Oben
Abstract:
Background: Insulin resistance (IR) and oxidative stress are associated with obesity, diabetes mellitus, and other cardiometabolic disorders. The aim of this study was to investigate the effect of Coula edulis extracts (CEE) on insulin resistance and oxidative stress markers in rats with high-fat/high-sucrose diet-induced insulin resistance. Materials and Methods: Thirty male rats were divided into six groups of five rats each and received daily oral administration of C. edulis extracts for 8 weeks as follows: Group 1, the negative control group, fed a standard diet (SD); Group 2 fed a high-fat/high-sucrose diet (HFHS) only; Group 3 fed HFHS + CEAq 200; Group 4 fed HFHS + CEAq 400; Group 5 fed HFHS + CEEt 200; Group 6 fed HFHS + CEEt 400. At the end of the experiment (8 weeks), the animals were sacrificed, and the plasma lipid profile, glucose, insulin, oxidative stress markers and digestive enzyme activities were measured. The homeostasis model assessment for insulin resistance (HOMA-IR) was determined. Results: Feeding with HFHS significantly (p < 0.01) induced plasma hyperglycaemia and hyperinsulinaemia, increased triglyceride, total cholesterol, and low-density lipoprotein levels, decreased high-density lipoprotein levels, altered α-amylase and glucose-6-phosphatase activities, and caused oxidative stress. Daily oral administration of CEE for eight weeks after insulin-resistance induction had a hypolipidaemic action, showed antioxidative activities and modulated metabolic markers. The ethanolic extract at the higher dose had the best effect on body weight gain and insulin resistance, whereas the aqueous extract showed better activity against hyperlipidemia. Conclusion: These results suggest that CEAq and CEEt at 400 mg/kg are promising complementary supplements that can be used to better protect against the metabolic disorders associated with an HFHS diet.
Keywords: Coula edulis Baill, high-fat / high sucrose diet, insulin resistance, oxidative stress
24749 Using LMS as an E-Learning Platform in Higher Education
Authors: Mohammed Alhawiti
Abstract:
Assessment of learning management systems has received less attention than it deserves. This paper investigates the evaluation of learning management systems (LMS) within an educational setting, both as an online learning system and as a helpful tool for a multidisciplinary learning environment. The study proposes a theoretical e-learning evaluation model covering multiple dimensions: system, service and content quality, the learner's perspective, and the attitudes of the instructor. A survey was conducted among 105 e-learners. The sample consisted of students at both undergraduate and master's levels. Content validity and reliability of the instrument were tested, and the findings suggested the suitability of the proposed model for evaluating learner satisfaction through the LMS. The results of this study would be valuable for both instructors and users of e-learning systems.
Keywords: e-learning, LMS, higher education, management systems
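Reliability of a Likert-scale instrument such as the one used here is typically reported with Cronbach's alpha. A short sketch of that computation on invented survey responses (items in columns, respondents in rows) follows; it is a generic illustration, not the study's instrument or data.

```python
import numpy as np

# Invented 5-point Likert responses: 8 respondents x 5 items.
responses = np.array([
    [4, 5, 4, 4, 5],
    [3, 4, 3, 4, 4],
    [5, 5, 5, 4, 5],
    [2, 3, 2, 3, 3],
    [4, 4, 4, 5, 4],
    [3, 3, 4, 3, 3],
    [5, 4, 5, 5, 5],
    [4, 4, 4, 4, 4],
])

k = responses.shape[1]                              # number of items
item_variances = responses.var(axis=0, ddof=1)      # variance of each item
total_variance = responses.sum(axis=1).var(ddof=1)  # variance of the summed scale
alpha = (k / (k - 1)) * (1 - item_variances.sum() / total_variance)
print(f"Cronbach's alpha = {alpha:.2f}")
```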
24748 The Influence of the Concentration and Temperature on the Rheological Behavior of Carbonyl-Methylcellulose
Authors: Mohamed Rabhi, Kouider Halim Benrahou
Abstract:
The rheological properties of carbonyl-methylcellulose (CMC) solutions at different concentrations (25,000, 50,000, 60,000, 80,000 and 100,000 ppm) and different temperatures were studied. We found that all CMC solutions exhibit pseudo-plastic behavior and follow the Ostwald-de Waele model. The objective of this work is to model the flow of CMC with the Cross model, which gives the variation of viscosity with shear rate and allowed us to fit the rheological characteristics of the CMC solutions more closely. A comparison between the Cross model and the Ostwald model was made: the Cross model fitting parameters were determined by numerical simulation so as to bring the experimental curve and the curves given by the two models into agreement. Our study has shown that the Cross model describes the flow of CMC well at low concentrations.
Keywords: CMC, rheological modeling, Ostwald model, Cross model, viscosity
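The Cross model expresses the apparent viscosity as a function of shear rate, η(γ̇) = η∞ + (η0 − η∞) / (1 + (λγ̇)^m), and its parameters are obtained by fitting the measured flow curve. A minimal fitting sketch with synthetic data is shown below; the parameter values are illustrative, not the measured CMC values.

```python
import numpy as np
from scipy.optimize import curve_fit

def cross_model(shear_rate, eta_0, eta_inf, lam, m):
    """Cross model: viscosity between zero-shear (eta_0) and infinite-shear (eta_inf) plateaus."""
    return eta_inf + (eta_0 - eta_inf) / (1.0 + (lam * shear_rate) ** m)

# Synthetic "measured" flow curve generated from assumed parameters plus noise.
rng = np.random.default_rng(0)
shear = np.logspace(-1, 3, 40)                       # shear rate, 1/s
true_params = (2.0, 0.01, 0.5, 0.8)                  # Pa.s, Pa.s, s, dimensionless (illustrative)
viscosity = cross_model(shear, *true_params) * (1 + 0.03 * rng.normal(size=shear.size))

popt, _ = curve_fit(cross_model, shear, viscosity,
                    p0=(1.0, 0.005, 1.0, 1.0), bounds=(0, np.inf))
print("Fitted (eta_0, eta_inf, lambda, m):", np.round(popt, 3))
```

The Ostwald-de Waele (power-law) fit would replace `cross_model` with η = K·γ̇^(n−1) and compare the residuals of the two fits.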
24747 Measuring the Economic Empowerment of Women Using an Index: An Application to Small-Scale Fisheries and Agriculture in Sebaste, Antique
Authors: Ritchie Ann Dionela, Jorilyn Tabuena
Abstract:
This study measured the economic empowerment of women in the small-scale fisheries and agriculture sectors of Sebaste, Antique. A total of 199 respondents were selected using stratified random sampling. The Five Domains of Empowerment (5DE) Index was used to measure the economic empowerment of the study participants. Through this composite index, it was determined how women scored in the five domains of empowerment, namely production, resources, income, leadership, and time. The results show that women fishers are more economically empowered than women farmers. The two sectors showed a high disparity in their scores on input into productive decisions, autonomy in production, ownership of assets, control over use of income, group membership, speaking in public, workload, and leisure. The group membership indicator contributed most to the disempowered population in both sectors. Although the income of women farmers is higher than that of women fishers, the latter are still economically empowered, which suggests that economic empowerment does not depend on income alone. The study recommends that fisheries and agriculture organizations for women be established so that their needs and concerns will be heard and addressed. It is further recommended that government projects focused on enhancing women's empowerment also give importance to other factors, such as organization and leisure, and not just income, in order to fully promote women's empowerment. Further studies measuring women's empowerment using other methods should be pursued to provide more information on women's well-being.
Keywords: agriculture, composite index, fisheries, women economic empowerment
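Composite indices of this kind typically score each woman on a set of weighted indicators, classify her as empowered if her weighted adequacy score clears a threshold, and then combine the empowered headcount with the average adequacy of those not yet empowered. A rough sketch of that logic follows, with invented weights, threshold and data; the official 5DE weighting scheme should be consulted for real use.

```python
# Invented indicator weights summing to 1 (placeholders, not the official 5DE weights).
weights = {"production": 0.2, "resources": 0.2, "income": 0.2, "leadership": 0.2, "time": 0.2}
threshold = 0.80                       # assumed adequacy share needed to count as empowered

# 1 = adequate, 0 = inadequate on each indicator, for a few invented respondents.
respondents = [
    {"production": 1, "resources": 1, "income": 1, "leadership": 0, "time": 1},
    {"production": 1, "resources": 0, "income": 1, "leadership": 0, "time": 1},
    {"production": 1, "resources": 1, "income": 1, "leadership": 1, "time": 1},
]

scores = [sum(weights[k] * r[k] for k in weights) for r in respondents]
empowered = [s >= threshold for s in scores]

headcount = sum(empowered) / len(respondents)                      # share of empowered women
shortfalls = [1 - s for s, e in zip(scores, empowered) if not e]
avg_adequacy_disempowered = 1 - (sum(shortfalls) / len(shortfalls)) if shortfalls else 1.0
five_de = headcount + (1 - headcount) * avg_adequacy_disempowered  # 5DE = H_e + H_n * A_a
print(f"5DE index = {five_de:.2f}")
```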
24746 Diagnosis of Alzheimer's Disease at an Early Stage Using Support Vector Machine (SVM)
Authors: Amira Ben Rabeh, Faouzi Benzarti, Hamid Amiri, Mouna Bouaziz
Abstract:
Alzheimer's is a disease that affects the brain. It causes degeneration of nerve cells (neurons), in particular the cells involved in memory and intellectual functions. Early diagnosis of Alzheimer's Disease (AD) raises ethical questions, since there is at present no cure to offer patients, and medicines from therapeutic trials appear to slow the progression of the disease only moderately, with side effects that are sometimes severe. In this context, the analysis of medical images has become an essential tool for clinical applications, because it provides effective assistance both at diagnosis and during therapeutic follow-up. Computer Assisted Diagnostic (CAD) systems are one of the possible solutions for managing these images efficiently. In our work, we propose an application to detect Alzheimer's disease. To detect the disease at an early stage, we used three sections: frontal to extract the hippocampus (H), sagittal to analyze the corpus callosum (CC), and axial to work with the variation features of the cortex (C). Our classification method is based on the Support Vector Machine (SVM). The proposed system yields 90.66% accuracy in the early diagnosis of AD.
Keywords: Alzheimer's Disease (AD), Computer Assisted Diagnostic (CAD), hippocampus, Corpus Callosum (CC), cortex, Support Vector Machine (SVM)