Search results for: data sensitivity
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 26509

26329 Moving toward Language Acquisition: A Case Study Adapting and Applying Laban Movement Analysis in the International English as an Additional Language Classroom

Authors: Andra Yount

Abstract:

The purpose of this research project is to understand how focusing on movement can help English language learners acquire better reading, writing, and speaking skills. More specifically, this case study tests how Laban movement analysis, a tool often used in dance and physical education classes, contributes to advanced-level high school students’ English language acquisition at an international Swiss boarding school. This article shares theoretical bases for and findings from a teaching experiment in which LMA categories (body, effort, space, and shape) were adapted and introduced to students to encourage basic language acquisition as well as cultural awareness and sensitivity. As part of the participatory action research process, data collection included pseudonym-protected questionnaires and written/video-taped responses to LMA language and task prompts. Responses from 43 participants were evaluated to determine the efficacy of using this system. Participants (ages 16-19) were enrolled in advanced English as an Additional Language (EAL) courses at a private, co-educational Swiss international boarding school. Final data analysis revealed that drawing attention to movement using LMA language as a stimulus creates better self-awareness and understanding/retention of key literary concepts and vocabulary but does not necessarily contribute to greater cultural sensitivity or eliminate the use of problematic (sexist, racist, or classist) language. Possibilities for future development are also discussed.

Keywords: dance, English, Laban, pedagogy

Procedia PDF Downloads 152
26328 Contribution to the Study of the Microbiological Quality of Chawarma Sold in Biskra

Authors: Sara Boulmaiz

Abstract:

In order to study the microbiological quality of chawarma sold in Biskra, samples were collected from several fast-food outlets in the city; the parameters studied were assessed according to the criteria required by the country's trade regulations. Microbiological analyses revealed different levels of contamination by microorganisms. Overall, the 10 samples were of unsatisfactory quality; according to the standards, no sample was satisfactory. The total aerobic mesophilic flora ranged between 10⁵ and 1.2 × 10⁷ CFU/g, and fecal coliforms from 10⁴ to 2.4 × 10⁵ CFU/g. Suspected pathogenic staphylococci were between 3 × 10³ and 2.7 × 10⁶ CFU/g. Salmonellae were absent in all samples, whereas sulphite-reducing anaerobes were present in a single sample. The level of E. cloacae was between 10³ and 6 × 10⁴ CFU/g. As for yeasts and moulds, their level was 10³ to 10⁷ CFU/g. Antibiotic susceptibility testing showed multi-resistance to several of the antibiotics tested, alongside sensitivity to others. All strains of Staphylococcus aureus tested were resistant to erythromycin, 30% to streptomycin, and 10% to tetracycline. All strains of E. cloacae were resistant to amoxicillin, ceftazidime, cefotaxime, and erythromycin, while they were sensitive to fosfomycin, trimethoprim-sulfamethoxazole, ciprofloxacin, and tetracycline; against chloramphenicol and ofloxacin, sensitivity was dominant, although intermediate resistance was observed. This study demonstrates that foodborne illness remains a problem, compounded by increasingly observed bacterial resistance, and that, after all, healthy eating is a right.

Keywords: chawarma, microbiological quality, pathogens, street food

Procedia PDF Downloads 111
26327 Cross-Layer Design of Event-Triggered Adaptive OFDMA Resource Allocation Protocols with Application to Vehicle Clusters

Authors: Shaban Guma, Naim Bajcinca

Abstract:

We propose an event-triggered algorithm for the solution of a distributed optimization problem by means of the projected subgradient method. Thereby, we invoke an OFDMA resource allocation scheme by applying an event-triggered sensitivity analysis at the access point. The optimal resource assignment of the subcarriers to the involved wireless nodes is carried out by considering the sensitivity analysis of the overall objective function as defined by the control of vehicle clusters with respect to the information exchange between the nodes.
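The event-triggered projected subgradient scheme described above can be sketched in miniature. The following is a hedged illustration, not the authors' algorithm: two nodes minimize the sum of local quadratic costs over a box constraint, and each node re-broadcasts its state only when it has drifted from its last broadcast value by more than a threshold (the "event"). All constants and cost functions are invented for illustration.

```python
def project(x, lo=0.0, hi=10.0):
    """Project a scalar onto the feasible interval [lo, hi]."""
    return max(lo, min(hi, x))

# Two nodes with local costs f_i(x) = (x - c_i)^2; the global optimum
# of f_1 + f_2 over [0, 10] is at x = 3 for c = (1, 5).
centers = (1.0, 5.0)
grad = lambda x, c: 2.0 * (x - c)      # (sub)gradient of (x - c)^2

x = [0.0, 10.0]           # local estimates
last = list(x)            # last *broadcast* values (what neighbors see)
events = 0                # number of triggered broadcasts

for k in range(1, 2001):
    step = 0.5 / k                    # diminishing step size
    mix = sum(last) / len(last)       # consensus on broadcast states
    for i in range(2):
        x[i] = project(mix - step * grad(x[i], centers[i]))
        if abs(x[i] - last[i]) > 0.01:   # event-trigger condition
            last[i] = x[i]               # broadcast only on an event
            events += 1
```

Both estimates settle near the optimum x = 3, and broadcasts stop once the estimates drift less than the threshold, so communication is typically far sparser than periodic updating would require.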

Keywords: consensus, cross-layer, distributed, event-triggered, multi-vehicle, protocol, resource, OFDMA, wireless

Procedia PDF Downloads 331
26326 The Trigger-DAQ System in the Mu2e Experiment

Authors: Antonio Gioiosa, Simone Doanti, Eric Flumerfelt, Luca Morescalchi, Elena Pedreschi, Gianantonio Pezzullo, Ryan A. Rivera, Franco Spinella

Abstract:

The Mu2e experiment at Fermilab aims to measure the charged-lepton flavour violating neutrino-less conversion of a negative muon into an electron in the field of an aluminum nucleus. With the expected experimental sensitivity, Mu2e will improve the previous limit by four orders of magnitude. The Mu2e data acquisition (DAQ) system provides hardware and software to collect digitized data from the tracker, calorimeter, cosmic ray veto, and beam monitoring systems. Mu2e’s trigger and data acquisition system (TDAQ) uses otsdaq as its solution. Developed at Fermilab, otsdaq uses the artdaq DAQ framework and the art analysis framework under the hood for event transfer, filtering, and processing. Otsdaq is an online DAQ software suite with a focus on flexibility and scalability that provides a multi-user, web-based interface accessible through the Chrome or Firefox web browsers. The detector readout controllers (ROCs) of the tracker and calorimeter stream zero-suppressed data continuously to the data transfer controller (DTC). Data are then read over the PCIe bus by a software filter algorithm that selects events, which are finally combined with the data flux coming from the cosmic ray veto system (CRV).

Keywords: trigger, DAQ, Mu2e, Fermilab

Procedia PDF Downloads 155
26325 Economic Assessment of the Fish Solar Tent Dryers

Authors: Collen Kawiya

Abstract:

In an effort to reduce post-harvest losses and improve the supply of quality fish products in Malawi, fish solar tent dryers have been designed in the southern part of Lake Malawi for processing small fish species under the Cultivate Africa’s Future (CultiAF) project. This study was done to promote the adoption of fish solar tent dryers by the many small-scale fish processors in Malawi through an assessment of the economic viability of these dryers. Using the project’s baseline survey data, a business model for a constructed ‘ready for use’ solar tent dryer was developed, and investment appraisal techniques were applied in addition to a sensitivity analysis. The study also conducted a risk analysis using the Monte Carlo simulation technique, from which a probabilistic net present value was obtained. The investment appraisal results showed that the net present value was US$8,756.85, the internal rate of return was 62%, higher than the 16.32% cost of capital, and the payback period was 1.64 years. The sensitivity analysis results showed that only two input variables influenced the fish solar dryer investment’s net present value: the dried fish selling prices, which correlated positively with the net present value, and the fresh fish buying prices, which correlated negatively with it. Risk analysis results showed that the chance that fish processors will make a loss from this type of investment is 17.56%. It was also observed that there is only a 0.20 probability of experiencing a negative net present value from this type of investment. Lastly, the study found that the net present value of the fish solar tent dryer investment is robust to changes in the level of investors' risk preferences.
With these results, it is concluded that the fish solar tent dryers in Malawi are an economically viable investment because they improve the returns of the fish processing activity. As such, fish processors should invest in constructing and using them.
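The appraisal pipeline the abstract describes (a deterministic net present value, then a Monte Carlo risk analysis over the two influential price variables) can be sketched as follows. All cash flow figures, price distributions, investment cost, and horizon below are illustrative assumptions, not the study's data; only the 16.32% cost of capital is taken from the abstract.

```python
import random

def npv(rate, cashflows):
    """Net present value; cashflows[0] is the time-0 investment (negative)."""
    return sum(cf / (1.0 + rate) ** t for t, cf in enumerate(cashflows))

# Deterministic sanity check: invest 1000, receive 600 twice, at 10%.
base = npv(0.10, [-1000.0, 600.0, 600.0])   # about 41.32

# Monte Carlo risk analysis: resample the two influential inputs
# (dried-fish selling price, fresh-fish buying price) and count
# how often the resulting NPV is negative.
random.seed(42)
COST_OF_CAPITAL = 0.1632      # from the abstract
INVESTMENT = 5000.0           # hypothetical construction cost
VOLUME = 1000.0               # hypothetical kg of fish dried per year

losses = 0
TRIALS = 10_000
for _ in range(TRIALS):
    sell = random.gauss(4.5, 0.6)   # selling price per kg (assumed)
    buy = random.gauss(2.0, 0.4)    # buying price per kg (assumed)
    annual = (sell - buy) * VOLUME - 500.0   # assumed operating cost
    flows = [-INVESTMENT] + [annual] * 5     # five-year horizon
    if npv(COST_OF_CAPITAL, flows) < 0:
        losses += 1

p_negative = losses / TRIALS   # analogue of the study's 0.20 probability
```

The estimated probability of a negative NPV plays the same role as the study's 0.20 figure; with different assumed distributions the number would of course differ.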

Keywords: investment appraisal, risk analysis, sensitivity analysis, solar tent drying

Procedia PDF Downloads 278
26324 Comparison of Sensitivity and Specificity of Pap Smear and Polymerase Chain Reaction Methods for Detection of Human Papillomavirus: A Review of Literature

Authors: M. Malekian, M. E. Heydari, M. Irani Estyar

Abstract:

Human papillomavirus (HPV) is one of the most common sexually transmitted infections and the main cause of cervical cancer, to which it may lead. With early diagnosis and treatment in health care services, cervical cancer and its complications are considered preventable. This study aimed to compare the efficiency, sensitivity, and specificity of the Pap smear and polymerase chain reaction (PCR) in detecting HPV. A literature search was performed in the Google Scholar, PubMed, and SID databases using the keywords 'human papillomavirus', 'pap smear', and 'polymerase chain reaction' to identify studies comparing the Pap smear and PCR methods for detection. No restrictions were applied. Ten studies were included in this review. All samples that were positive by Pap smear were also positive by PCR. However, there were positive samples detected by PCR that were negative by Pap smear, and in all studies many positive samples were missed by the Pap smear technique. Although the Pap smear had high specificity, PCR-based HPV detection was the more sensitive method and had the highest sensitivity. To promote the quality of detection and achieve the maximum results, PCR diagnostic methods are needed in addition to the Pap smear, and the Pap smear method should be combined with PCR techniques given the high error rate of the Pap smear in detection.
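The comparison above rests on four standard quantities computed from a 2×2 table of index-test results against a reference standard. A minimal sketch (the counts below are invented for illustration, not taken from the reviewed studies):

```python
def diagnostic_metrics(tp, fp, fn, tn):
    """Sensitivity, specificity, PPV and NPV from a 2x2 confusion table."""
    return {
        "sensitivity": tp / (tp + fn),   # true positives among diseased
        "specificity": tn / (tn + fp),   # true negatives among healthy
        "ppv": tp / (tp + fp),           # precision of a positive result
        "npv": tn / (tn + fn),           # reliability of a negative result
    }

# Hypothetical example: PCR as reference, Pap smear as index test.
# 45 of 60 HPV-positive samples detected; 5 of 40 negatives flagged.
m = diagnostic_metrics(tp=45, fp=5, fn=15, tn=35)
```

On these invented counts the Pap smear would show high specificity (0.875) but lower sensitivity (0.75), mirroring the pattern the review reports.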

Keywords: human papillomavirus, cervical cancer, pap smear, polymerase chain reaction

Procedia PDF Downloads 131
26323 Accurate Calculation of the Penetration Depth of a Bullet Using ANSYS

Authors: Eunsu Jang, Kang Park

Abstract:

In developing an armored ground combat vehicle (AGCV), it is a very important step to analyze the vulnerability (or survivability) of the AGCV against an enemy’s attack. In vulnerability analysis, penetration equations are usually used to get the penetration depth and check whether a bullet can penetrate the armor of the AGCV, which causes damage to internal components or crews. The penetration equations are derived from penetration experiments, which require a long time and great effort. However, they usually hold only for the specific material of the target and the specific type of bullet used in the experiments. Thus, penetration simulation using ANSYS can be another option to calculate penetration depth. However, it is very important to model the targets and select the input parameters properly in order to get an accurate penetration depth. This paper performed a sensitivity analysis of the input parameters of ANSYS on the accuracy of the calculated penetration depth. Two conflicting objectives need to be achieved in adopting ANSYS in penetration analysis: maximizing the accuracy of the calculation and minimizing the calculation time. To maximize the calculation accuracy, a sensitivity analysis of the input parameters for ANSYS was performed and the RMS error with respect to the experimental data was calculated. The input parameters, which include mesh size, boundary condition, material properties, and target diameter, were tested and selected to minimize the error between the calculated result from the simulation and the experimental data from papers on the penetration equation. To minimize the calculation time, the parameter values obtained from the accuracy analysis were adjusted to get optimized overall performance. As a result of the analysis, the following was found: 1) As the mesh size gradually decreases from 0.9 mm to 0.5 mm, both the penetration depth and calculation time increase. 2) As the diameter of the target decreases from 250 mm to 60 mm, both the penetration depth and calculation time decrease. 3) As the yield stress, one of the material properties of the target, decreases, the penetration depth increases. 4) The boundary condition with the fixed side surface of the target gives more penetration depth than that with the fixed side and rear surfaces. Using the above findings, the input parameters can be tuned to minimize the error between simulation and experiments. By using the simulation tool ANSYS with delicately tuned input parameters, penetration analysis can be done on a computer without actual experiments. The data of penetration experiments are usually hard to get because of security reasons, and published papers provide them only for a limited set of target materials. The next step of this research is to generalize this approach to anticipate the penetration depth by interpolating the known penetration experiments. This result may not be accurate enough to replace the penetration experiments, but such simulations can be used in the modelling and simulation stage early in the design process of an AGCV.
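The accuracy criterion used in the parameter study, the RMS error between simulated and experimental penetration depths, is simple to state. The sketch below uses invented depth values purely to show the selection logic (pick the mesh size whose simulated depths minimize RMS error against experiment); it is not the paper's data.

```python
import math

def rms_error(simulated, experimental):
    """Root-mean-square error between two equal-length depth series (mm)."""
    n = len(simulated)
    return math.sqrt(sum((s - e) ** 2 for s, e in zip(simulated, experimental)) / n)

# Hypothetical experimental penetration depths (mm) for three shots.
experiment = [12.0, 18.5, 25.0]

# Hypothetical simulation results for candidate mesh sizes (mm -> depths).
simulations = {
    0.9: [10.1, 16.0, 22.4],
    0.7: [11.2, 17.6, 24.1],
    0.5: [11.9, 18.3, 24.8],
}

# Choose the mesh size that best reproduces the experiment.
best_mesh = min(simulations, key=lambda m: rms_error(simulations[m], experiment))
```

In practice the same comparison would be repeated for boundary conditions and material properties, trading the gain in accuracy against the growth in calculation time.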

Keywords: ANSYS, input parameters, penetration depth, sensitivity analysis

Procedia PDF Downloads 401
26322 Clinical Trial of VEUPLEXᵀᴹ TBI Assay to Help Diagnose Traumatic Brain Injury by Quantifying Glial Fibrillary Acidic Protein and Ubiquitin Carboxy-Terminal Hydrolase L1 in the Serum of Patients Suspected of Mild TBI by Fluorescence Immunoassay

Authors: Moon Jung Kim, Guil Rhim

Abstract:

The clinical sensitivity of the VEUPLEX™ TBI assay, a clinical trial medical device, in mild traumatic brain injury was 28.6% (95% CI, 19.7%-37.5%), and the clinical specificity was 94.0% (95% CI, 89.3%-98.7%). In addition, when the results analyzed by marker were combined, sensitivity was higher when the two tests, UCHL1 and GFAP, were interpreted together than for either test alone. Additionally, when sensitivity and specificity were analyzed based on CT results for the mild traumatic brain injury patient group, the clinical sensitivity for the 2 CT-positive cases was 50.0% (95% CI: 1.3%-98.7%), and the clinical specificity for the 19 CT-negative cases was 68.4% (95% CI: 43.5%-87.4%). Since the low clinical sensitivity for the two CT-positive cases was not statistically meaningful due to the small number of samples analyzed, it was judged necessary to secure and analyze more samples in the future. Regarding the clinical specificity results for the 19 CT-negative cases, a large number of patients were clinically diagnosed with mild traumatic brain injury but received a CT-negative result, and about 31.6% of them showed abnormal results on the VEUPLEX™ TBI assay. Although traumatic brain injury could not be detected in these CT scans, the possibility of an actual mild brain injury could not be ruled out, so it was judged that this could be confirmed through follow-up observation of the patients. In addition, among patients with mild traumatic brain injury, CT examinations were often not performed because the symptoms were very mild, but among these patients about 25% or more showed abnormal results on the VEUPLEX™ TBI assay. In fact, no damage is observed with the naked eye immediately after traumatic brain injury, and traumatic brain injury may not be observed even on CT.
However, in some cases brain hemorrhage may occur after a certain period of time (delayed cerebral hemorrhage), so patients who showed abnormal results on the VEUPLEX™ TBI assay should be followed up for delayed cerebral hemorrhage. In conclusion, it was judged difficult to diagnose mild traumatic brain injury with the VEUPLEX™ TBI assay through clinical findings alone, that is, based on the GCS value, without CT results. Even CT does not detect all mild traumatic brain injuries, so the absence of evidence of traumatic brain injury on CT does not necessarily mean there is no traumatic brain injury. In the long term, more patients should be included to evaluate the usefulness of the VEUPLEX™ TBI assay in the detection of microscopic traumatic brain injuries without using CT.
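The confidence intervals quoted above are the usual normal-approximation (Wald) intervals for a proportion. As a sketch: assuming roughly 28 detections among 98 mild-TBI patients (our guess at counts consistent with the reported 28.6%, not a figure stated in the abstract), the interval below lands close to the reported 19.7%-37.5%.

```python
import math

def wald_ci(successes, n, z=1.96):
    """95% normal-approximation (Wald) CI for a proportion."""
    p = successes / n
    half = z * math.sqrt(p * (1.0 - p) / n)  # z * standard error
    return p - half, p + half

lo, hi = wald_ci(28, 98)   # roughly (0.196, 0.375)
```

For very small samples, such as the 2 CT-positive cases, this approximation breaks down, which is why the reported 1.3%-98.7% interval is so wide and an exact or Wilson interval would normally be preferred.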

Keywords: brain injury, traumatic brain injury, GFAP, UCHL1

Procedia PDF Downloads 99
26321 Predicting Resistance of Commonly Used Antimicrobials in Urinary Tract Infections: A Decision Tree Analysis

Authors: Meera Tandan, Mohan Timilsina, Martin Cormican, Akke Vellinga

Abstract:

Background: In general practice, many infections are treated empirically without microbiological confirmation. Understanding the susceptibility of antimicrobials during empirical prescribing can help reduce inappropriate prescribing. This study aims to apply a prediction model using a decision tree approach to predict the antimicrobial resistance (AMR) of urinary tract infections (UTI) based on non-clinical features of patients over 65 years. Decision tree models are a novel way to predict the outcome of AMR at an initial stage. Method: Data were extracted from the database of the microbiological laboratory of the University Hospitals Galway on all antimicrobial susceptibility testing (AST) of urine specimens from patients over the age of 65 from January 2011 to December 2014. The primary endpoint was resistance to common antimicrobials (nitrofurantoin, trimethoprim, ciprofloxacin, co-amoxiclav, and amoxicillin) used to treat UTI. A classification and regression tree (CART) model was generated with the outcome ‘resistant infection’. The importance of each predictor (the number of previous samples, age, gender, location (nursing home, hospital, community), and causative agent) for antimicrobial resistance was estimated. Sensitivity, specificity, negative predictive value (NPV), and positive predictive value (PPV) were used to evaluate the performance of the model. Seventy-five percent (75%) of the data were used as a training set, and validation of the model was performed with the remaining 25% of the dataset. Results: A total of 9805 UTI patients over 65 years had a urine sample submitted for AST at least once over the four years. E. coli, Klebsiella, and Proteus species were the most commonly identified pathogens among the UTI patients without a catheter, whereas Serratia, Staphylococcus aureus, and Enterobacter were common with a catheter.
The validated CART model shows slight differences in sensitivity, specificity, PPV, and NPV between the models with and without the causative organisms. The sensitivity, specificity, PPV, and NPV for the model with non-clinical predictors were between 74% and 88%, depending on the antimicrobial. Conclusion: The CART models developed using non-clinical predictors performed well when predicting antimicrobial resistance. These models predict which antimicrobial may be the most appropriate based on non-clinical factors. Further CART models, prospective data collection and validation, and a larger number of non-clinical factors will improve model performance. The presented model provides an alternative approach to decision making on antimicrobial prescribing for UTIs in older patients.
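At the heart of the CART approach described above is the choice of the split that minimizes node impurity. A minimal sketch of that single step (Gini impurity over one numeric predictor; the toy data are invented for illustration, not the Galway dataset):

```python
def gini(labels):
    """Gini impurity of a list of 0/1 outcomes (1 = resistant isolate)."""
    if not labels:
        return 0.0
    p = sum(labels) / len(labels)
    return 1.0 - p * p - (1.0 - p) * (1.0 - p)

def best_split(xs, ys):
    """Return (threshold, weighted_gini) of the best binary split on xs."""
    best_t, best_g = None, float("inf")
    vals = sorted(set(xs))
    for a, b in zip(vals, vals[1:]):
        t = (a + b) / 2.0                     # candidate midpoint threshold
        left = [y for x, y in zip(xs, ys) if x <= t]
        right = [y for x, y in zip(xs, ys) if x > t]
        g = (len(left) * gini(left) + len(right) * gini(right)) / len(ys)
        if g < best_g:
            best_t, best_g = t, g
    return best_t, best_g

# Toy predictor: number of previous urine samples per patient;
# outcome: 1 if the cultured isolate was resistant (invented data).
prev_samples = [0, 1, 2, 6, 7, 9]
resistant = [0, 0, 0, 1, 1, 1]
threshold, impurity = best_split(prev_samples, resistant)
```

A full CART implementation applies this split search recursively over all predictors and then prunes; in practice a library routine would be used rather than hand-rolled code.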

Keywords: antimicrobial resistance, urinary tract infection, prediction, decision tree

Procedia PDF Downloads 255
26320 Effect of 8 Weeks of Intervention on Physical Fitness, Hepatokines, and Insulin Resistance in Obese Subjects

Authors: Adela Penesova, Zofia Radikova, Boris Bajer, Andrea Havranova, Miroslav Vlcek

Abstract:

Background: The aim of our study was to compare the effect of an intensified lifestyle intervention on insulin resistance (HOMA-IR), alanine aminotransferase (ALT), aspartate aminotransferase (AST), and fibroblast growth factor (FGF) 21 after 8 weeks of lifestyle intervention. Methods: A group of 43 obese patients (13M/30F; 43.0±12.4 years; body mass index (BMI) 31.2±6.3 kg/m²) participated in a weight loss interventional program (NCT02325804) following an 8-week hypocaloric diet (-30% of energy expenditure) and physical activity of 150 minutes/week. Insulin sensitivity was evaluated according to the homeostasis model assessment of insulin resistance (HOMA-IR), and insulin sensitivity indices according to Matsuda and Cederholm were calculated (ISImat and ISIced). Plasma ALT, AST, fetuin-A, FGF 21, and physical fitness were measured. Results: The average reduction in body weight was 6.8±4.9 kg (0-15 kg; p=0.0006), accompanied by a significant reduction in fat mass (p=0.03) and waist circumference (p=0.02). Insulin sensitivity improved (HOMA-IR 2.71±3.90 vs 1.24±0.83, p=0.01; ISImat 6.64±4.38 vs 8.93±5.36, p ≤ 0.001). Total cholesterol, LDL cholesterol, and triglycerides decreased (p=0.05, p=0.04, p=0.04, respectively). Physical fitness, as measured by VO2 max (maximal oxygen uptake), significantly improved after the intervention (p ≤ 0.001). ALT decreased significantly (pre 0.44±0.26 vs post 0.33±0.18 µkat/l, p=0.004); however, AST did not (pre 0.40±0.15 vs 0.35±0.09 µkat/l, p=0.07). The hepatokine fetuin-A significantly decreased after the intervention (43.1±10.8 vs 32.6±8.6 ng/ml, p < 0.001), whereas FGF 21 levels only tended to decrease (146±152 vs 132±164 pg/ml, p=0.07). Conclusion: An 8-week diet and physical activity intervention program in obese, otherwise healthy subjects led to an improvement of insulin resistance parameters and liver marker profiles, as well as increased physical fitness. This study was supported by grants APVV 15-0228 and VEGA 2/0161/16.

Keywords: obesity, diet, exercise, insulin sensitivity

Procedia PDF Downloads 201
26319 Effectiveness of the Lacey Assessment of Preterm Infants to Predict Neuromotor Outcomes of Premature Babies at 12 Months Corrected Age

Authors: Thanooja Naushad, Meena Natarajan, Tushar Vasant Kulkarni

Abstract:

Background: The Lacey Assessment of Preterm Infants (LAPI) is used in clinical practice to identify premature babies at risk of neuromotor impairments, especially cerebral palsy. This study attempted to determine the validity of the Lacey assessment in predicting neuromotor outcomes of premature babies at 12 months corrected age and to compare its predictive ability with that of brain ultrasound. Methods: This prospective cohort study included 89 preterm infants (45 females and 44 males) born below 35 weeks gestation who were admitted to the neonatal intensive care unit of a government hospital in Dubai. The initial assessment was done using the Lacey assessment after the babies reached 33 weeks postmenstrual age. Follow-up assessment of neuromotor outcomes was done at 12 months (±1 week) corrected age using two standardized outcome measures, i.e., the Infant Neurological International Battery and the Alberta Infant Motor Scale. Brain ultrasound data were collected retrospectively. Data were statistically analyzed, and the diagnostic accuracy of the Lacey Assessment of Preterm Infants (LAPI) was calculated both when used alone and in combination with brain ultrasound. Results: In comparison with brain ultrasound, the Lacey assessment showed superior specificity (96% vs. 77%), a higher positive predictive value (57% vs. 22%), and a higher positive likelihood ratio (18 vs. 3) for predicting neuromotor outcomes at one year of age. The sensitivity of the Lacey assessment was lower than that of brain ultrasound (66% vs. 83%), whereas specificity was similar (97% vs. 98%). A combination of the Lacey assessment and brain ultrasound results showed higher sensitivity (80%), higher positive (66%) and negative (98%) predictive values, a higher positive likelihood ratio (24), and higher test accuracy (95%) than the Lacey assessment alone in predicting neurological outcomes. The negative predictive value of the Lacey assessment was similar to that of its combination with brain ultrasound (96%).
Conclusion: The results of this study suggest that the Lacey assessment of preterm infants can be used as a supplementary assessment tool for premature babies in the neonatal intensive care unit. Due to its high specificity, the Lacey assessment can be used to identify babies at low risk of abnormal neuromotor outcomes at a later age. When used along with the findings of brain ultrasound, the Lacey assessment has better sensitivity to identify preterm babies at particular risk. These findings have applications in identifying premature babies who may benefit from early intervention services.
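The positive likelihood ratios quoted above follow directly from sensitivity and specificity. A small sketch (the 66%/96% pair is taken from the abstract; the arithmetic is the standard definition, and small differences from the reported LR+ of 18 would come from rounding of the published percentages):

```python
def likelihood_ratios(sensitivity, specificity):
    """LR+ = sens / (1 - spec); LR- = (1 - sens) / spec."""
    lr_pos = sensitivity / (1.0 - specificity)
    lr_neg = (1.0 - sensitivity) / specificity
    return lr_pos, lr_neg

# Lacey assessment: sensitivity 66%, specificity 96% (from the abstract).
lr_pos, lr_neg = likelihood_ratios(0.66, 0.96)   # LR+ about 16.5
```

A large LR+ means a positive Lacey result strongly raises the post-test probability of an abnormal neuromotor outcome, which is exactly why the assessment is proposed as a rule-in supplement to ultrasound.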

Keywords: brain ultrasound, lacey assessment of preterm infants, neuromotor outcomes, preterm

Procedia PDF Downloads 138
26318 Carbon Nanofilms on Diamond for All-Carbon Chemical Sensors

Authors: Vivek Kumar, Alexander M. Zaitsev

Abstract:

A study of the chemical sensing properties of carbon nanofilms on diamond for developing all-carbon chemical sensors is presented. The films were obtained by high-temperature graphitization of diamond followed by successive plasma etchings. Characterization of the films was done by Raman spectroscopy, atomic force microscopy, and electrical measurements. A fast and selective response to common organic vapors, observed as changes in electrical conductance, was found. A phenomenological description of the chemical sensitivity is proposed as a function of the surface and bulk material properties of the films.

Keywords: chemical sensor, carbon nanofilm, graphitization of diamond, plasma etching, Raman spectroscopy, atomic force microscopy

Procedia PDF Downloads 446
26317 Correlation between Different Radiological Findings and Histopathological Diagnosis of Breast Diseases: Retrospective Review Conducted over Six Years in King Fahad University Hospital in Eastern Province, Saudi Arabia

Authors: Sadeem Aljamaan, Reem Hariri, Rahaf Alghamdi, Batool Alotaibi, Batool Alsenan, Lama Althunayyan, Areej Alnemer

Abstract:

The aim of this study is to correlate radiological findings with histopathological results with regard to Breast Imaging-Reporting and Data System (BI-RADS) scores, the size of breast masses, molecular subtypes, and suspicious radiological features, as well as to assess the concordance in histological grade between core biopsy and surgical excision among breast cancer patients, followed by analyzing the change in concordance rate in relation to neoadjuvant chemotherapy in a Saudi population. A retrospective review was conducted over a 6-year period (2017-2022) on all breast core biopsies of women preceded by radiological investigation. The chi-squared test (χ2) was performed on qualitative data, the Mann-Whitney test on quantitative non-parametric variables, and the Kappa test for grade agreement. A total of 641 cases were included. Ultrasound, mammography, and magnetic resonance imaging demonstrated diagnostic accuracies of 85%, 77.9%, and 86.9%, respectively. Magnetic resonance imaging manifested the highest sensitivity (72.2%), and the lowest was for ultrasound (61%). Concordance in tumor size with final excisions was best for magnetic resonance imaging, while mammography demonstrated a higher tendency of overestimation (41.9%), and ultrasound showed the highest underestimation (67.7%). The association between basal-like molecular subtypes and a BI-RADS score 5 classification was statistically significant only for magnetic resonance imaging (p=0.04). Luminal subtypes demonstrated a significantly higher percentage of spiculation on mammography. BI-RADS score 4 manifested a substantial number of benign pathologies in all 3 modalities. A fair concordance rate (k=0.212 and 0.379) was demonstrated between excision and the preceding core biopsy grading with and without neoadjuvant therapy, respectively. The results demonstrated a downgrading in cases after neoadjuvant therapy.
In cases that did not receive neoadjuvant therapy, underestimation of tumor grade in biopsy was evident. In summary, magnetic resonance imaging had the highest sensitivity, specificity, positive predictive value, and accuracy for both diagnosis and estimation of tumor size. Mammography demonstrated better sensitivity than ultrasound and had the highest negative predictive value, but ultrasound had better specificity, positive predictive value, and accuracy. Therefore, the combination of different modalities is advantageous. The concordance rate of core biopsy grading with excision was not impacted by neoadjuvant therapy.

Keywords: breast cancer, mammography, MRI, neoadjuvant, pathology, US

Procedia PDF Downloads 82
26316 Exponential Spline Solution for Singularly Perturbed Boundary Value Problems with an Uncertain-But-Bounded Parameter

Authors: Waheed Zahra, Mohamed El-Beltagy, Ashraf El Mhlawy, Reda Elkhadrawy

Abstract:

In this paper, we consider singularly perturbed reaction-diffusion boundary value problems, which contain a small uncertain perturbation parameter. To solve these problems, we propose a numerical method based on an exponential spline and a Shishkin mesh discretization. While the interval analysis principle is used to deal with the uncertain parameter, sensitivity analysis has been conducted using different methods. Numerical results are provided to show the applicability and efficiency of our method, which achieves ε-uniform convergence of almost second order.
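A Shishkin mesh of the kind the method is built on is piecewise uniform, with a transition point tied to the boundary-layer width. The sketch below constructs such a mesh for a reaction-diffusion problem with layers at both ends of [0, 1]; the transition-point formula τ = min(1/4, 2√ε ln N) is a standard textbook choice, and the constants are assumptions rather than the authors' exact parameters.

```python
import math

def shishkin_mesh(N, eps, tau0=2.0):
    """Piecewise-uniform Shishkin mesh on [0, 1] with N subintervals
    (N divisible by 4) for reaction-diffusion layers of width ~sqrt(eps)."""
    assert N % 4 == 0
    tau = min(0.25, tau0 * math.sqrt(eps) * math.log(N))  # transition point
    q = N // 4
    pts = [tau * i / q for i in range(q + 1)]                     # left layer
    pts += [tau + (1 - 2 * tau) * i / (2 * q) for i in range(1, 2 * q + 1)]
    pts += [1 - tau + tau * i / q for i in range(1, q + 1)]       # right layer
    return pts

mesh = shishkin_mesh(16, 1e-6)   # 17 points, clustered near 0 and 1
```

On this mesh, a quarter of the subintervals are packed into each layer region of width τ, which is what allows the spline scheme to converge uniformly in ε.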

Keywords: singular perturbation problem, shishkin mesh, two small parameters, exponential spline, interval analysis, sensitivity analysis

Procedia PDF Downloads 274
26315 Assessing an Instrument Usability: Response Interpolation and Scale Sensitivity

Authors: Betsy Ng, Seng Chee Tan, Choon Lang Quek, Peter Looker, Jaime Koh

Abstract:

The purpose of the present study was to determine which scale rating stands out for an instrument designed to assess student perceptions of various learning environments, namely face-to-face, online, and blended. The original instrument had 5-point Likert items (1 = strongly disagree and 5 = strongly agree). Alternate versions were modified with a 6-point Likert scale and a bar scale rating. Participants, undergraduates at a local university, were involved in usability testing of the instrument in an electronic setting. They were presented with the 5-point, 6-point, and percentage-bar (100-point) scale ratings in response to their perceptions of learning environments. The 5-point and 6-point Likert scales were presented in the form of radio button controls for each number, while the percentage-bar scale was presented with a sliding selection. Among these responses, the 6-point Likert scale emerged as the best overall. When participants were confronted with the 5-point items, they tended to choose either 3 or 4, suggesting that data loss could occur due to the insensitivity of the instrument. This insensitivity could be due to the discrete options, as evidenced by response interpolation. To avoid the constraint of discrete options, the percentage-bar scale rating was tested, but the participant responses were not well interpolated. The bar scale might have allowed a variety of responses without the constraint of a set of categorical options, but it seemed to reflect a lack of perceived and objective accuracy. The 6-point Likert scale was more likely to reflect a respondent’s perceived and objective accuracy as well as higher sensitivity. This finding supported the conclusion that 6-point Likert items provide a more accurate measure of the participant’s evaluation, whereas the 5-point and bar scale ratings might not accurately measure participants’ responses.
This study highlighted the importance of the respondent’s perception of accuracy, the respondent’s true evaluation, and the scale’s ease of use. Implications and limitations of this study are also discussed.

Keywords: usability, interpolation, sensitivity, Likert scales, accuracy

Procedia PDF Downloads 406
26314 Effect of Correlation of Random Variables on Structural Reliability Index

Authors: Agnieszka Dudzik

Abstract:

The problem of correlation between random variables in structural reliability analysis has been extensively discussed in the literature. The cases considered were usually related to correlation between random variables from one side of the ultimate limit state: correlation between particular loads applied to a structure, or correlation between the resistances of particular members of a structure treated as a system. It has been proved that positive correlation between these random variables reduces the reliability of the structure and increases the probability of failure. In this paper, the problem of correlation between random variables from both sides of the limit state equation will be considered. The simplest case, where these random variables have normal distributions, will be analyzed, and the degree of correlation will be described by the covariance or the coefficient of correlation. Special attention will be paid to the questions of how much the correlation changes the reliability level and whether it can be ignored. Well-known methods for assessing the failure probability will be used in the reliability analysis: the Hasofer-Lind reliability index and the Monte Carlo method adapted to the problem of correlation. The main purpose of this work is to present how the correlation of random variables influences the reliability index of steel bar structures. Structural design parameters will be defined as deterministic values and as random variables; the latter will be correlated. The criterion of structural failure will be expressed by limit functions related to the ultimate and serviceability limit states. Only the normal distribution will be used in the description of the random variables. The sensitivity of the reliability index to the random variables will be determined.
If the sensitivity of the reliability index to a random variable X is low compared with the other variables, it can be stated that the impact of this variable on the failure probability is small; therefore, in successive computations, it can be treated as a deterministic parameter. Sensitivity analysis leads to a simplified description of the mathematical model and determines new limit functions and values of the Hasofer-Lind reliability index. In the examples, the NUMPRESS software will be used in the reliability analysis.
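As a rough illustration of the effect discussed above (and not the NUMPRESS analysis itself), the following Python sketch computes the Hasofer-Lind index for the simplest linear limit state g = R - S with jointly normal, correlated resistance R and load S, and cross-checks it against a crude Monte Carlo estimate of the failure probability. All parameter values are hypothetical.

```python
import numpy as np

def reliability_index(mu_r, sig_r, mu_s, sig_s, rho):
    """Exact Hasofer-Lind index for the linear limit state g = R - S
    with jointly normal, correlated R (resistance) and S (load)."""
    var_g = sig_r**2 + sig_s**2 - 2.0 * rho * sig_r * sig_s
    return (mu_r - mu_s) / np.sqrt(var_g)

def mc_failure_prob(mu_r, sig_r, mu_s, sig_s, rho, n=1_000_000, seed=0):
    """Crude Monte Carlo estimate of P(g < 0), sampling correlated
    normals via a Cholesky factor of the 2x2 covariance matrix."""
    rng = np.random.default_rng(seed)
    cov = np.array([[sig_r**2, rho * sig_r * sig_s],
                    [rho * sig_r * sig_s, sig_s**2]])
    L = np.linalg.cholesky(cov)
    rs = rng.standard_normal((n, 2)) @ L.T + np.array([mu_r, mu_s])
    return np.mean(rs[:, 0] - rs[:, 1] < 0.0)

# Illustrative (hypothetical) parameters: note that positive correlation
# between variables on opposite sides of the limit state raises beta.
for rho in (-0.5, 0.0, 0.5):
    beta = reliability_index(10.0, 1.0, 6.0, 1.5, rho)
    pf = mc_failure_prob(10.0, 1.0, 6.0, 1.5, rho)
    print(f"rho={rho:+.1f}  beta={beta:.3f}  Pf(MC)={pf:.2e}")
```

For this linear limit state the index is available in closed form, so the Monte Carlo run serves only as a consistency check on the correlated sampling.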

Keywords: correlation of random variables, reliability index, sensitivity of reliability index, steel structure

Procedia PDF Downloads 237
26313 A Sensitivity Analysis on the Production of Potable Water, Green Hydrogen and Derivatives from South-West African Seawater

Authors: Shane David van Zyl, A. J. Burger

Abstract:

The global green energy shift has placed significant value on the production of green hydrogen and its derivatives. This study examines how various production capacities of potable water, green hydrogen, and green ammonia affect capital expenditure (CAPEX), operational expenditure (OPEX), levelized cost, and environmental impact. A model-based sensitivity analysis approach was used to determine the relevance of various process parameters in the production of potable water combined with green hydrogen or green ammonia production. The effects of parameter changes on CAPEX, OPEX, and the levelized costs of the products were determined. Furthermore, a qualitative environmental impact analysis was done to determine the effect on the environment. The findings indicate each process unit's contribution to the overall CAPEX and OPEX and identify the major contributors to changes in the levelized costs of the products. The results emphasize the difference in costs associated with potable water, green hydrogen, and green ammonia production, showing the extent to which potable water production costs become insignificant within the complete process; increased potable water production can therefore yield a large social benefit by decreasing water scarcity in the south-west African region.

Keywords: CAPEX and OPEX, desalination, green hydrogen and green ammonia, sensitivity analysis

Procedia PDF Downloads 39
26312 Variance-Aware Routing and Authentication Scheme for Harvesting Data in Cloud-Centric Wireless Sensor Networks

Authors: Olakanmi Oladayo Olufemi, Bamifewe Olusegun James, Badmus Yaya Opeyemi, Adegoke Kayode

Abstract:

The wireless sensor network (WSN) has made a significant contribution to the emergence of various intelligent services and cloud-based applications. Most of the time, the data harvested by a WSN are stored on a cloud platform for efficient management and sharing among different services or users. However, the sensitivity of the data makes them prone to various confidentiality and performance-related attacks during and after harvesting. Various security schemes have been developed to ensure the integrity and confidentiality of WSN data. However, their specificity toward particular attacks, together with the resource constraints and heterogeneity of WSNs, makes most of these schemes imperfect. In this paper, we propose a secure variance-aware routing and authentication scheme with two-tier verification to collect, share, and manage WSN data. The scheme is capable of classifying a WSN into different subnets, detecting any attempt at a wormhole or black hole attack during harvesting, and enforcing access control on the harvested data stored in the cloud. The results of the analysis showed that the proposed scheme has more security functionalities than other related schemes, solves most WSN and cloud security issues, prevents wormhole and black hole attacks, identifies attackers during data harvesting, and enforces access control on the harvested data stored in the cloud at low computational, storage, and communication overheads.

Keywords: data block, heterogeneous IoT network, data harvesting, wormhole attack, blackhole attack, access control

Procedia PDF Downloads 84
26311 Exploring Data Stewardship in Fog Networking Using Blockchain Algorithm

Authors: Ruvaitha Banu, Amaladhithyan Krishnamoorthy

Abstract:

IoT networks today solve various consumer problems, from home automation systems to aiding autonomous vehicles, through the deployment of multiple devices. For example, in an autonomous vehicle environment, multiple sensors are available on roads to monitor weather and road conditions and interact with each other to aid the vehicle in reaching its destination safely and in a timely manner. IoT systems are predominantly dependent on the cloud environment for their data storage and computing needs, which results in latency problems. With the advent of fog networks, some of this storage and computing is pushed to the edge/fog nodes, saving network bandwidth and reducing latency proportionally. Managing the data stored in these fog nodes becomes crucial, as they may also store sensitive information required for certain applications. Data management in fog nodes is strenuous because fog networks are dynamic in terms of availability and hardware capability. It becomes more challenging when the nodes in the network have short lifespans, detaching and joining frequently. When an end user or fog node wants to access, read, or write data stored in another fog node, a new protocol becomes necessary, because the conventional static way of managing data does not work in fog networks. The proposed solution discusses a protocol that defines sensitivity levels for the data being written and read. Additionally, a distinct data distribution and replication model among the fog nodes is established to decentralize the access mechanism. In this paper, the proposed model implements stewardship of the data stored in the fog node by applying reinforcement learning, so that access to the data is determined dynamically based on the requests.

Keywords: IoT, fog networks, data stewardship, dynamic access policy

Procedia PDF Downloads 59
26310 A Theoretical Modelling and Simulation of a Surface Plasmon Resonance Biosensor for the Detection of Glucose Concentration in Blood and Urine

Authors: Natasha Mandal, Rakesh Singh Moirangthem

Abstract:

The present work reports a theoretical model to develop a plasmonic biosensor for the detection of glucose concentrations in human blood and urine as the abnormality of glucose label is the major cause of diabetes which becomes a life-threatening disease worldwide. This study is based on the surface plasmon resonance (SPR) sensor applications which is a well-established, highly sensitive, label-free, rapid optical sensing tool. Here we have introduced a sandwich assay of two dielectric spacer layers of MgF2 and BaTiO3which gives better performance compared to commonly used SiO2 and TiO2 dielectric spacers due to their low dielectric loss and higher refractive index. The sensitivity of our proposed sensor was found as 3242 nm/RIU approximately, with an excellent linear response of 0.958, which is higher than the conventional single-layer Au SPR sensor. Further, the sensitivity enhancement is also optimized by coating a few layers of two-dimensional (2D) nanomaterials (e.g., Graphene, h-BN, MXene, MoS2, WS2, etc.) on the sensor chip. Hence, our proposed SPR sensor has the potential for the detection of glucose concentration in blood and urine with enhanced sensitivity and high affinity and could be utilized as a reliable platform for the optical biosensing application in the field of medical diagnosis.
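For context, the bulk sensitivity quoted above (in nm/RIU) is simply the slope of resonance wavelength versus analyte refractive index. The Python sketch below illustrates the calculation on entirely hypothetical shift data, chosen so the slope lands near the reported 3242 nm/RIU; it is not the authors' simulation.

```python
import numpy as np

# Hypothetical resonance wavelengths (nm) measured at several analyte
# refractive indices (RIU); the values are illustrative only.
n_analyte = np.array([1.330, 1.335, 1.340, 1.345, 1.350])
lam_res = np.array([620.0, 636.4, 652.1, 668.9, 684.8])  # nm

# Bulk sensitivity S = d(lambda_res)/d(n), from a least-squares line.
slope, intercept = np.polyfit(n_analyte, lam_res, 1)

# Linearity: coefficient of determination R^2 of the fit.
pred = slope * n_analyte + intercept
ss_res = np.sum((lam_res - pred) ** 2)
ss_tot = np.sum((lam_res - lam_res.mean()) ** 2)
r2 = 1.0 - ss_res / ss_tot

print(f"sensitivity = {slope:.0f} nm/RIU, R^2 = {r2:.4f}")
```

The same slope-of-the-fit definition applies whether the shifts come from measurement or, as in the paper, from a theoretical model of the layered sensor.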

Keywords: biosensor, surface plasmon resonance, dielectric spacer, 2D nanomaterials

Procedia PDF Downloads 106
26309 Inversion of Electrical Resistivity Data: A Review

Authors: Shrey Sharma, Gunjan Kumar Verma

Abstract:

High-density electrical prospecting has been widely used in groundwater investigation, civil engineering, and environmental surveys. For efficient inversion, the forward modeling routine, sensitivity calculation, and inversion algorithm must all be efficient. This paper attempts to provide a brief summary of the past and ongoing developments of the method. It includes reviews of the procedures used for the acquisition, processing, and inversion of electrical resistivity data, based on a compilation of the academic literature. In recent times there has been a significant evolution in field survey designs and data inversion techniques for the resistivity method. In general, 2-D inversion of resistivity data is carried out using the linearized least-squares method with a local optimization technique. Multi-electrode and multi-channel systems have made it possible to conduct large 2-D, 3-D, and even 4-D surveys efficiently to resolve complex geological structures that were not possible with traditional 1-D surveys. 3-D surveys play an increasingly important role in very complex areas where 2-D models suffer from artifacts due to off-line structures. Continued developments in computation technology, as well as fast data inversion techniques and software, have made it possible to use optimization techniques to obtain model parameters to a higher accuracy. A brief discussion of the limitations of the electrical resistivity method is also presented.
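To illustrate the linearized least-squares update used in such inversions, the following Python toy iterates the damped (Marquardt-style) step dm = (JᵀJ + λI)⁻¹ Jᵀ r. A simple stand-in nonlinear forward model is used rather than a real 2-D resistivity kernel, and the sensitivity (Jacobian) matrix is built by finite differences; all numbers are illustrative.

```python
import numpy as np

def forward(m, x):
    # Hypothetical smooth nonlinear forward model d = m0 * exp(-m1 * x),
    # standing in for a resistivity forward-modeling routine.
    return m[0] * np.exp(-m[1] * x)

def jacobian(m, x, eps=1e-6):
    # Finite-difference sensitivity matrix J[i, j] = d(d_i)/d(m_j).
    J = np.empty((x.size, m.size))
    for j in range(m.size):
        dm = np.zeros_like(m)
        dm[j] = eps
        J[:, j] = (forward(m + dm, x) - forward(m - dm, x)) / (2 * eps)
    return J

x = np.linspace(0.1, 2.0, 20)        # measurement positions (arbitrary)
m_true = np.array([100.0, 1.3])      # "true" model parameters
d_obs = forward(m_true, x)           # noise-free synthetic data

m = np.array([80.0, 1.0])            # starting model
lam = 1e-3                           # damping factor
for _ in range(15):
    r = d_obs - forward(m, x)        # data residual
    J = jacobian(m, x)
    dm = np.linalg.solve(J.T @ J + lam * np.eye(m.size), J.T @ r)
    m = m + dm

print("recovered model:", m)
```

In practice the damping term λI is often replaced by a roughness (smoothness) matrix, which is what distinguishes smoothness-constrained resistivity inversion from this bare Marquardt sketch.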

Keywords: inversion, limitations, optimization, resistivity

Procedia PDF Downloads 364
26308 Biopsy or Biomarkers: Which Is the Sample of Choice in Assessment of Liver Fibrosis?

Authors: S. H. Atef, N. H. Mahmoud, S. Abdrahman, A. Fattoh

Abstract:

Background: The aim of the study is to assess the diagnostic value of fibrotest and hyaluronic acid in discriminating between insignificant and significant fibrosis, and to find out whether these parameters could replace liver biopsy, which is currently used for selecting chronic hepatitis C patients eligible for antiviral therapy. Study design: This study was conducted on 52 patients with HCV RNA detected by polymerase chain reaction (PCR) who had undergone liver biopsy and were attending the internal medicine clinic at Ain Shams University Hospital. Liver fibrosis was evaluated according to the METAVIR scoring system on a scale of F0 to F4. The biochemical markers assessed were alpha-2 macroglobulin (α2-MG), apolipoprotein A1 (Apo-A1), haptoglobin, gamma-glutamyl transferase (GGT), total bilirubin (TB), and hyaluronic acid (HA). The fibrotest score was computed after adjusting for age and gender. Predictive values and ROC curves were used to assess the accuracy of the fibrotest and HA results. Results: For the fibrotest, the observed area under the curve for the discrimination between minimal or no fibrosis (F0-F1) and significant fibrosis (F2-F4) was 0.6736 at a cutoff value of 0.19, with a sensitivity of 84.2% and a specificity of 85.7%. For HA, the sensitivity was 89.5%, the specificity was 85.7%, and the area under the curve was 0.540 at the best cutoff value of 71 mg/dL. Combined use of both parameters, HA at 71 mg/dL with a fibrotest score of 0.22, gave a sensitivity of 89.5%, a specificity of 100%, and an efficacy of 92.3% (AUC 0.895). Conclusion: The combination of the fibrotest score and HA could serve as an alternative to biopsy in most patients with chronic hepatitis C, taking into consideration some limitations of the proposed markers in evaluating liver fibrosis.
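The cutoff-based sensitivity and specificity figures reported above follow the standard confusion-matrix definitions. A minimal Python sketch, using entirely hypothetical fibrotest scores rather than the study's data, shows how a best cutoff can be chosen by maximizing Youden's J statistic:

```python
import numpy as np

def sens_spec(marker, diseased, cutoff):
    """Sensitivity and specificity of the rule `marker >= cutoff`
    for predicting disease (here: significant fibrosis, F2-F4)."""
    pred = marker >= cutoff
    tp = np.sum(pred & diseased)
    fn = np.sum(~pred & diseased)
    tn = np.sum(~pred & ~diseased)
    fp = np.sum(pred & ~diseased)
    return tp / (tp + fn), tn / (tn + fp)

# Hypothetical fibrotest scores; fib: 0 = F0-F1, 1 = F2-F4.
score = np.array([0.08, 0.12, 0.15, 0.18, 0.25, 0.31, 0.22, 0.40, 0.55, 0.17])
fib = np.array([0, 0, 0, 0, 1, 1, 1, 1, 1, 0], dtype=bool)

# Youden's J = sensitivity + specificity - 1, maximized over cutoffs.
cutoffs = np.unique(score)
j_stats = [sum(sens_spec(score, fib, c)) - 1 for c in cutoffs]
best = cutoffs[int(np.argmax(j_stats))]
se, sp = sens_spec(score, fib, best)
print(f"best cutoff = {best:.2f}, sensitivity = {se:.2f}, specificity = {sp:.2f}")
```

The toy data here happen to be perfectly separable; with real overlapping marker distributions the maximized J trades sensitivity against specificity, as seen in the study's reported values.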

Keywords: fibrotest, liver fibrosis, HCV RNA, biochemical markers

Procedia PDF Downloads 287
26307 Effect of Removing Hub Domain on Human CaMKII Isoforms Sensitivity to Calcium/Calmodulin

Authors: Ravid Inbar

Abstract:

CaMKII (calcium/calmodulin-dependent protein kinase II) makes up 2% of the protein in our brain and has a critical role in memory formation and long-term potentiation of neurons. Despite this, research has yet to uncover the role of one of its domains in the activation of this kinase. The following proposes to express the protein without the hub domain in E. coli, leaving only the kinase domain and regulatory segment of the protein. Next, a series of kinase assays will be conducted to elucidate the role the hub domain plays in CaMKII sensitivity to calcium/calmodulin activation. The hub domain may be important for activation; however, it may also be that a variety of domains work together to influence protein activation, rather than the hub alone. Characterization of a protein is critical to the future understanding of the protein's function, as well as to producing pharmacological targets for patients with disease.

Keywords: CaMKII, hub domain, kinase assays, kinase and regulatory segment

Procedia PDF Downloads 89
26306 Fatigue Analysis of Spread Mooring Line

Authors: Chanhoe Kang, Changhyun Lee, Seock-Hee Jun, Yeong-Tae Oh

Abstract:

An offshore floating structure under various environmental conditions maintains a fixed position by means of a mooring system. Environmental conditions, vessel motions, and mooring loads are applied to the mooring lines as dynamic tension. Because the global responses of a mooring system in deep water comprise wave-frequency and low-frequency responses, they should be calculated from time-domain analysis due to their non-linear dynamic characteristics. To take into account all mooring loads, environmental conditions, and added mass and damping terms at each time step, considerable computation time and capacity are required. Thus, on the premise that reliable fatigue damage can be derived through a reasonable analysis method, it is necessary to reduce the number of analysis cases through sensitivity studies and appropriate assumptions. In this paper, fatigue effects are studied for a spread mooring system connected to an oil FPSO positioned in the deep water of offshore West Africa. The target FPSO, with two Mbbls of storage, has 16 spread mooring lines (4 bundles x 4 lines). Various sensitivity studies are performed for environmental loads, type of response, vessel offsets, mooring position, loading conditions, and riser behavior. Each parameter applied in the sensitivity studies is investigated for its effect on fatigue damage through fatigue analysis. Based on the sensitivity studies, the following results are presented. Wave loads are more dominant in terms of fatigue than other environmental conditions. The wave-frequency response causes higher fatigue damage than the low-frequency response. A larger vessel offset increases the mean tension and so results in increased fatigue damage. The external line of each bundle shows the highest fatigue damage, governed by the vessel pitch motion due to swell wave conditions. Among the three loading conditions, the ballast condition has the highest fatigue damage due to higher tension.
The riser damping caused by riser behavior tends to reduce the fatigue damage. The various analysis results obtained from these sensitivity studies can be used as a reference for a simplified fatigue analysis of spread mooring lines.

Keywords: mooring system, fatigue analysis, time domain, non-linear dynamic characteristics

Procedia PDF Downloads 334
26305 Comparison of Receiver Operating Characteristic Curve Smoothing Methods

Authors: D. Sigirli

Abstract:

The Receiver Operating Characteristic (ROC) curve is a commonly used statistical tool for evaluating the diagnostic performance of screening and diagnostic tests with continuous or ordinal scale results that aim to predict the probability of the presence or absence of a condition, usually a disease. When the test results are measured as numeric values, sensitivity and specificity can be computed across all possible threshold values which discriminate the subjects as diseased or non-diseased. There are infinitely many possible decision thresholds along the continuum of the test results. The ROC curve presents the trade-off between sensitivity and 1-specificity as the threshold changes. The empirical ROC curve, a non-parametric estimator of the ROC curve, is robust and represents the data accurately. However, especially for small sample sizes, it suffers from variability, and as it is a step function, there can be different false positive rates for a given true positive rate value and vice versa. Besides, since the estimated ROC curve has a jagged form while the true ROC curve is smooth, it underestimates the true ROC curve. Since the true ROC curve is assumed to be smooth, several smoothing methods have been explored. These include using kernel estimates, using log-concave densities, fitting the parameters of a specified density function to the data by maximum-likelihood fitting of univariate distributions, or creating a probability distribution by fitting the specified distribution to the data and using smooth versions of the empirical distribution functions. In the present paper, we aimed to propose a smooth ROC curve estimation based on a boundary-corrected kernel function and to compare the performances of ROC curve smoothing methods for diagnostic test results coming from different distributions in different sample sizes.
We performed a simulation study to compare the performances of the different methods for different scenarios with 1000 repetitions. It is seen that the performance of the proposed method was typically better than that of the empirical ROC curve and only slightly worse than that of the binormal model when the underlying samples were in fact generated from the normal distribution.
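As a minimal illustration of the kernel-smoothing idea (not the boundary-corrected estimator proposed in the paper), the following Python sketch builds a smoothed ROC curve from Gaussian-kernel CDF estimates of simulated diseased and non-diseased test scores and integrates it for the AUC. The bandwidths use a common Silverman-type rule; all data are simulated.

```python
import numpy as np
from math import erf, sqrt

def kernel_cdf(x, data, h):
    """Gaussian-kernel estimate of the sample CDF, evaluated at each x."""
    return np.array([
        np.mean([0.5 * (1.0 + erf((xi - d) / (h * sqrt(2.0)))) for d in data])
        for xi in x
    ])

rng = np.random.default_rng(1)
healthy = rng.normal(0.0, 1.0, 80)    # simulated scores, non-diseased
diseased = rng.normal(1.5, 1.0, 80)   # simulated scores, diseased

# Silverman-type bandwidths (a common default choice).
h0 = 1.06 * healthy.std() * healthy.size ** (-0.2)
h1 = 1.06 * diseased.std() * diseased.size ** (-0.2)

# Smoothed ROC: for each threshold c, FPR = 1 - F0(c), TPR = 1 - F1(c),
# where F0, F1 are the smoothed CDFs of the two groups.
c = np.linspace(-4.0, 6.0, 400)
fpr = 1.0 - kernel_cdf(c, healthy, h0)
tpr = 1.0 - kernel_cdf(c, diseased, h1)

# AUC by the trapezoidal rule (fpr decreases as c increases).
auc = float(np.sum(0.5 * (tpr[1:] + tpr[:-1]) * -np.diff(fpr)))
print(f"smoothed AUC ~ {auc:.3f}")
```

Unlike the empirical step-function ROC, this curve is smooth and monotone; the boundary-correction issue the paper addresses arises when the smoothed CDFs leak probability past the support boundaries of the test scores.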

Keywords: empirical estimator, kernel function, smoothing, receiver operating characteristic curve

Procedia PDF Downloads 152
26304 Investigation of Airship Motion Sensitivity to Geometric Parameters

Authors: Han Ding, Wang Xiaoliang, Duan Dengping

Abstract:

During the process of airship design, the layout and geometric shape of the hull and fins are crucial to the motion characteristics of the airship. In this paper, we quantified the sensitivity of airship motion to geometric parameters through turning circles and horizontal/vertical zigzag maneuvers, by parameterizing the airship shape and building the dynamic model using the Lagrangian approach and a MATLAB Simulink program. In the dynamics simulation program, the effects of the geometric parameters on the mass, center of gravity, moments of inertia, products of inertia, added mass, and the aerodynamic forces and moments have been considered.

Keywords: airship, Lagrangian approach, turning circles, horizontal/vertical zigzag maneuvers

Procedia PDF Downloads 424
26303 Premalignant and Malignant Lesions of Uterine Polyps: Analysis at a University Hospital

Authors: Manjunath A. P., Al-Ajmi G. M., Al Shukri M., Girija S

Abstract:

Introduction: This study aimed to compare the ability of hysteroscopy and ultrasonography to diagnose uterine polyps, and to correlate the ultrasonographic and hysteroscopic findings with various clinical factors and the histopathology of uterine polyps. Methods: This is a retrospective study conducted at the Department of Obstetrics and Gynaecology at Sultan Qaboos University Hospital from 2014 to 2019. All women undergoing hysteroscopy for suspected uterine polyps were included. All relevant data were obtained from the electronic patient record and analysed using SPSS. Results: A total of 77 eligible women were analysed. The mean age of the patients was 40 years. The clinical risk factors obesity, hypertension, and diabetes mellitus showed no statistically significant association with the presence of uterine polyps (p-value > 0.05). Although 20 women (52.6%) with uterine polyps had a thickened endometrium (>11 mm), there was no statistical association (p-value > 0.05). The sensitivity and specificity of ultrasonography in the detection of uterine polyps were 39% and 65%, respectively, whereas for hysteroscopy they were 89% and 20%, respectively. The prevalence of malignant and premalignant lesions was 1.85% and 7.4%, respectively. Conclusion: This study found that obesity, hypertension, and diabetes mellitus were not associated with the presence of uterine polyps. There was no association between a thick endometrium and uterine polyps. The sensitivity was higher for hysteroscopy, whereas the specificity was higher for sonography in detecting uterine polyps. The prevalence of malignancy in uterine polyps was very low.

Keywords: endometrial polyps, hysteroscopy, ultrasonography, premalignant, malignant

Procedia PDF Downloads 129
26302 Ethylene Sensitivity in Orchids and Its Control Using 1-MCP: A Review

Authors: Parviz Almasi

Abstract:

Ethylene is produced as a gaseous growth regulator in all plants and in all their constituent parts, such as roots, stems, leaves, flowers, and fruits. It is considered a multifunctional phytohormone that regulates both growth processes, including flowering, fruit ripening, and inhibition of root growth, and senescence processes, such as the senescence of leaves and flowers. In addition, exposure to external ethylene causes changes that are often undesirable and harmful. Some flowers are more sensitive than others, and when exposed to ethylene, their aging process is hastened. 1-MCP is an inhibitor of both exogenous and endogenous ethylene action, which binds to the ethylene receptors in plants and prevents ethylene-dependent reactions. The binding affinity of 1-MCP for the receptors is about 10 times greater than that of ethylene. Hence, 1-MCP is a potential candidate for controlling ethylene injury in horticultural crops. This review integrates knowledge of ethylene biosynthesis in plants with the mode of action of 1-MCP in preventing ethylene injury.

Keywords: ethylene injury, biosynthesis, ethylene sensitivity, 1-MCP

Procedia PDF Downloads 100
26301 Perspective of Community Health Workers on The Sustainability of Primary Health Care

Authors: Dan Richard D. Fernandez

Abstract:

This study determined community health workers’ perspectives on the sustainability of primary health care. Eight community health workers, two community officials, and a rural health midwife in a rural community in the Philippines were invited to share their perspectives on the sustainability of primary health care. The study utilized the critical research method. Critical research assumes that there are ‘dominated’ or ‘marginalized’ groups whose interests are not best served by existing societal structures. The participants' experiences highlighted that the challenges of their role include unkind and uncooperative patients, the lack of institutional support mechanisms, and the conflict of their roles with their family responsibilities. Their most revealing insight is the belief that primary health care is within their grasp. Finally, they believe that the burden of sustaining primary health care rests on their shoulders alone. This study establishes that multi-stakeholder participation and gender sensitivity are integral to the sustainability of primary health care. It also observed that the ingrained expert-novice or top-down management culture and the marginalisation of BHWs within the system are threats to PHC sustainability. This study recommends expanding the study and involving local government units and academe in lobbying for the integration of gender-sensitive and multi-stakeholder participatory approaches into health workforce policies. Finally, this study recognised that the CHWs’ role is indispensable to the sustainability of primary health care.

Keywords: community health workers, multi-stakeholder participation, sustainability, gender-sensitivity

Procedia PDF Downloads 544
26300 Content Analysis of Video Translations: Examining the Linguistic and Thematic Approach by Translator Abdullah Khrief on the X Platform

Authors: Easa Almustanyir

Abstract:

This study investigates the linguistic and thematic approach of translator Abdullah Khrief in the context of video translations on the X platform. The sample comprises 15 videos from Khrief's account, covering diverse content categories like science, religion, social issues, personal experiences, lifestyle, and culture. The analysis focuses on two aspects: language usage and thematic representation. Regarding language, the study examines the prevalence of English while considering the inclusion of French and German content, highlighting Khrief's multilingual versatility and ability to navigate cultural nuances. Thematically, the study explores the diverse range of topics covered, encompassing scientific, religious, social, and personal narratives, underscoring Khrief's broad subject matter expertise and commitment to knowledge dissemination. The study employs a mixed-methods approach, combining quantitative data analysis with qualitative content analysis. Statistical data on video languages, presenter genders, and content categories are analyzed, and a thorough content analysis assesses translation accuracy, cultural appropriateness, and overall quality. Preliminary findings indicate a high level of professionalism and expertise in Khrief's translations. The absence of errors across the diverse range of videos establishes his credibility and trustworthiness. Furthermore, the accurate representation of cultural nuances and sensitive topics highlights Khrief's cultural sensitivity and commitment to preserving intended meanings and emotional resonance.

Keywords: audiovisual translation, linguistic versatility, thematic diversity, cultural sensitivity, content analysis, mixed-methods approach

Procedia PDF Downloads 17