Search results for: search algorithms
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 3766

406 Observational Versus Angioembolisation in Blunt Splenic Trauma: A Systematic Review

Authors: E. Gopi, E. Devaindran

Abstract:

Objective: Non-operative management of blunt splenic trauma has started to overtake traditional splenectomy in recent years across the grades of splenic injury. The two main non-operative methods are observation and angioembolisation. However, post-management convalescence in these groups is still being investigated. The study attempts to quantify the clinical indicators of the two, in particular complications, mortality, conversion to operative management, and duration of inpatient stay. Methodology: A systematic search was done via PUBMED, MEDLINE, and EMBASE. A total of 639 articles were identified, and 68 articles remained after removal of duplicates, full-text review, and application of inclusion and exclusion criteria. Main exclusions were non-English articles without English translation, articles on pure observation or pure angioembolisation from which no comparison data could be extracted, and articles examining purely haemodynamically unstable patients. Results: 24 non-randomized controlled trials, 5 clinical controlled trials, and 39 retrospective studies were analysed, covering a total of 23,700 patients with blunt splenic trauma. Discrepancies in the data were noted between the observational and angioembolisation groups, in particular when data were compared across the grades of splenic rupture, the management protocols of different centres, the availability of an angiography suite, and the study designs. Further variability was noted within the angioembolisation arm, as treatment preference differs between distal and proximal splenic artery involvement. Overall, cumulative mortality in the observational and angioembolisation groups was similar, 2.78% and 5.97% respectively. The cause of death, however, was not directly attributed to the management itself but rather to patient comorbidities, other associated injuries, and conversion to splenectomy leading to post-splenectomy complications. Cumulative morbidity in the two groups was also similar, approximately 12% in the observational group versus 15% in the angioembolisation group. However, the type of complication varied, with the observational group having longer inpatient stays and higher rates of intra-abdominal haematoma infection, and the angioembolisation group developing more splenic infarcts and bleeds. There was significant disparity in the reporting of the actual data on duration of inpatient stay and complications, preventing a statistically sound quantitative analysis; 15 articles, however, are currently being considered. Conclusions: Observational management appears to be more effective in managing lower-grade splenic trauma (grades 1 and 2), whereas angioembolisation appears to play a bigger role in intermediate grades (grades 3-4) in ensuring preservation of splenic function. Care has to be taken, however, in the angioembolisation group in view of distal splenic infarcts compromising splenic function. The cumulated data of 15 articles are now being considered for a meta-analysis.

Keywords: blunt splenic trauma, conservative, non-operative, angioembolisation

Procedia PDF Downloads 266
405 RA-Apriori: An Efficient and Faster MapReduce-Based Algorithm for Frequent Itemset Mining on Apache Flink

Authors: Sanjay Rathee, Arti Kashyap

Abstract:

Extraction of useful information from large datasets is one of the most important research problems, and association rule mining is one of the best methods for this purpose. Finding possible associations between items in large transaction-based datasets (finding frequent patterns) is the most important part of association rule mining. Many algorithms exist to find frequent patterns, but the Apriori algorithm remains a preferred choice due to its ease of implementation and its natural tendency to be parallelized. Many single-machine Apriori variants exist, but the massive amount of data available these days exceeds the capacity of a single machine. Therefore, to meet the demands of this ever-growing volume of data, there is a need for an Apriori algorithm that runs on multiple machines. For these types of distributed applications, MapReduce is a popular fault-tolerant framework. Hadoop is one of the best open-source software frameworks using the MapReduce approach for distributed storage and distributed processing of huge datasets on clusters built from commodity hardware. However, the heavy disk I/O at each iteration of a highly iterative algorithm like Apriori makes Hadoop inefficient. A number of MapReduce-based platforms have been developed for parallel computing in recent years. Among them, two platforms, namely Spark and Flink, have attracted a lot of attention because of their built-in support for distributed computation. Earlier we proposed a Reduced-Apriori algorithm on the Spark platform which outperforms parallel Apriori, firstly because of the use of Spark and secondly because of the improvement we proposed to standard Apriori. This work is therefore a natural sequel and targets implementing, testing, and benchmarking Apriori, Reduced-Apriori, and our new algorithm, ReducedAll-Apriori, on Apache Flink, and comparing them with the Spark implementation. Flink, a streaming dataflow engine, overcomes the disk I/O bottlenecks of MapReduce, providing an ideal platform for distributed Apriori. Flink's pipelined structure allows the next iteration to start as soon as partial results of the previous iteration are available, so there is no need to wait for all reducers' results before starting the next iteration. We conduct in-depth experiments to gain insight into the effectiveness, efficiency, and scalability of the Apriori and RA-Apriori algorithms on Flink.
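For reference, the level-wise structure that makes Apriori both iterative and easy to parallelize can be sketched in a few lines of Python. This is a minimal single-machine illustration on a hypothetical toy dataset, not the RA-Apriori or Flink implementation described above; in a MapReduce setting, it is the per-pass counting step that gets distributed across workers.

```python
from collections import Counter

def apriori(transactions, min_support):
    """Minimal level-wise Apriori: pass k counts candidate k-itemsets."""
    transactions = [frozenset(t) for t in transactions]
    n = len(transactions)
    # Pass 1: frequent single items
    counts = Counter(item for t in transactions for item in t)
    frequent = {frozenset([i]) for i, c in counts.items() if c / n >= min_support}
    all_frequent = set(frequent)
    k = 2
    while frequent:
        # Candidate generation: join frequent (k-1)-itemsets
        candidates = {a | b for a in frequent for b in frequent if len(a | b) == k}
        # Counting step (the expensive, parallelizable part)
        counts = Counter(c for t in transactions for c in candidates if c <= t)
        frequent = {c for c, cnt in counts.items() if cnt / n >= min_support}
        all_frequent |= frequent
        k += 1
    return all_frequent

# Hypothetical toy transactions
baskets = [{"bread", "milk"}, {"bread", "butter"}, {"milk", "butter", "bread"}, {"milk"}]
print(apriori(baskets, min_support=0.5))
```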

Keywords: Apriori, Apache Flink, MapReduce, Spark, Hadoop, R-Apriori, frequent itemset mining

Procedia PDF Downloads 294
404 Gluten Intolerance, Celiac Disease, and Neuropsychiatric Disorders: A Translational Perspective

Authors: Jessica A. Hellings, Piyushkumar Jani

Abstract:

Background: Systemic autoimmune disorders are increasingly implicated in neuropsychiatric illness, especially in the setting of treatment resistance in individuals of all ages. Gluten allergy in its fullest extent results in celiac disease, affecting multiple organs including the central nervous system (CNS). Clinicians often lack awareness of the association between neuropsychiatric illness and gluten allergy, partly because many such research studies are published in immunology and gastroenterology journals. Methods: Following a Pubmed literature search and online searches of celiac disease websites, 40 articles were critically reviewed in detail. This work reviews celiac disease, gluten intolerance, and the current evidence of their relationship to neuropsychiatric and systemic illnesses. The review also covers current work-up and diagnosis, as well as dietary interventions, gluten-restriction outcomes, and future research directions. Results: Gluten allergy in susceptible individuals damages the small intestine, producing a leaky gut and a malabsorption state, and allowing antibodies into the bloodstream, which attack major organs. Lack of amino acid precursors for neurotransmitter synthesis, together with antibody-associated brain changes and hypoperfusion, may result in neuropsychiatric illness. This is well documented; however, studies in neuropsychiatry are often small. In the large CATIE trial, subjects with schizophrenia had significantly increased antibodies to tissue transglutaminase (TTG) and antigliadin antibodies, both significantly greater than in control subjects. On later follow-up, TTG-6 antibodies were identified in these subjects' brains but not in their intestines. Significant evidence, mostly from small studies, also exists for gluten allergy and celiac-related depression, anxiety disorders, attention-deficit/hyperactivity disorder, autism spectrum disorders, ataxia, and epilepsy. Dietary restriction of gluten resulted in remission in several published cases, including cases of treatment-resistant schizophrenia. Conclusions: Ongoing and larger studies of the diagnosis and treatment efficacy of the gluten-free diet in neuropsychiatric illness are needed. Clinicians should ask about a patient history of anemia, hypothyroidism, and irritable bowel syndrome and a family history of benefit from the gluten-free diet, especially, though not only, in cases of treatment resistance. Obtaining gluten antibodies by a simple blood test, and referral for gastrointestinal work-up in positive cases, should be considered.

Keywords: celiac, gluten, neuropsychiatric, translational

Procedia PDF Downloads 161
403 An Audit on the Role of Sentinel Node Biopsy in High-Risk Ductal Carcinoma in Situ and Intracystic Papillary Carcinoma

Authors: M. Sulieman, H. Arabiyat, H. Ali, K. Potiszil, I. Abbas, R. English, P. King, I. Brown, P. Drew

Abstract:

Introduction: The incidence of breast ductal carcinoma in situ (DCIS) has been increasing; it currently represents up to 20-25% of all breast carcinomas. Some aspects of DCIS management are still controversial, mainly due to the heterogeneity of its clinical presentation and of its biological and pathological characteristics. In DCIS, a histological diagnosis obtained preoperatively carries the risk of sampling error if invasive cancer is subsequently diagnosed. A mammographic extent of more than 4–5 cm and the presence of architectural distortion, focal asymmetric density, or a mass on mammography are proven risk factors for preoperative histological under-staging. Intracystic papillary cancer (IPC) is a rare form of breast carcinoma. Despite previously being compared to DCIS, it has been shown to present histologically with invasion of the basement membrane and even metastasis. SLNB carries a risk of associated comorbidity that should be considered when planning surgery for DCIS and IPC. Objectives: The aim of this audit was to better define a 'high-risk' group of patients with a pre-operative diagnosis of non-invasive cancer undergoing breast-conserving surgery (BCS) who would benefit from sentinel node biopsy. Method: Retrospective data collection of all patients with ductal carcinoma in situ over 5 years. 636 patients were identified, and after exclusion criteria were applied, 394 patients were included. High risk was defined as extensive micro-calcification >40 mm or any mass-forming DCIS. IPC: a Winpath search for the term 'papillary carcinoma' in any breast specimen over a 5-year period; 29 patients were included in this group. Results: DCIS: 188 patients were deemed high risk due to >40 mm calcification or a mass (radiological or palpable); 61% of those had a mastectomy and 32% BCS. Overall, the proportion of this high-risk group with invasive disease was 38%. Of these high-risk DCIS patients, 85% had SLNB, 80% at the time of surgery and 5% at a second operation. For the BCS patients, 42% had SLNB at the time of surgery and 13% (8 patients) at a second operation. 15 (7.9%) patients in the high-risk group had a positive SLNB, 11 having a mastectomy and 4 having BCS. IPC: the provisional diagnosis of encysted papillary carcinoma was upgraded to invasive carcinoma on final histology in around a third of cases. This may have implications when deciding whether to offer sentinel node removal at the time of therapeutic surgery. Conclusions: We have defined a 'high-risk' group of patients with a pre-operative diagnosis of non-invasive cancer undergoing BCS who would benefit from SLNB at the time of surgery. In patients with high-risk features, the risk of invasive disease is up to 40%, but the risk of nodal involvement is approximately 8%. The risk of morbidity from SLNB, especially lymphoedema, is up to about 5%.

Keywords: breast ductal carcinoma in situ (DCIS), intracystic papillary carcinoma (IPC), sentinel node biopsy (SLNB), high-risk, non-invasive cancer

Procedia PDF Downloads 111
402 The Hypoglycaemic and Antioxidant Effects of Ethanolic Extract of Curcuma Longa Rhizomes Alone and with Two Pepper Adjuvants in Alloxan-Induced Diabetic Rats

Authors: J. O. Ezekwesili-Ofili, L. I. Okorafor, S. C. Nsofor

Abstract:

Diabetes mellitus is a carbohydrate metabolism disorder due to an absolute or relative deficiency of insulin secretion, insulin action, or both. Many hypoglycaemic drugs are known to produce serious side effects, so the search for safer and more effective agents has shifted to plant products, including foods and spices. One such product is the rhizome of Curcuma longa, or turmeric, a spice with high medicinal value. A drawback in the use of C. longa is the poor bioavailability of curcumin, the active ingredient. It has been reported that piperine, an alkaloid present in peppers, increases the bioavailability of curcumin. This work therefore investigated the hypoglycaemic and antioxidant effects of an ethanolic extract of C. longa rhizomes, alone and with two pepper adjuvants, in alloxan-induced diabetic rats. A total of 48 rats were divided into 6 groups of 8 rats each. Groups A–E were induced with diabetes using 150 mg/kg body weight of alloxan monohydrate, while group F was normoglycaemic. Group A: diabetic, fed 400 mg/kg body weight of turmeric extract; group B: diabetic, fed 400 mg/kg b.w. of extract and 200 mg/kg b.w. of ethanolic extract of seeds of Piper guineense; group C: diabetic, fed 400 mg/kg b.w. of extract and 200 mg/kg b.w. of ethanolic extract of seeds of Capsicum annuum var. cameroun; group D: diabetic, treated with the standard drug glibenclamide (0.3 mg/kg body weight); group E: diabetic, no treatment (positive control); group F: non-diabetic, no treatment (negative control). Blood glucose levels were monitored for 14 days using a glucometer. The levels of the antioxidant enzymes glutathione peroxidase, catalase, and superoxide dismutase were also assayed in serum. The ethanolic extract of C. longa rhizomes at the dose given (400 mg/kg b.w.) significantly reduced the blood glucose levels of the diabetic rats (p<0.05), comparable to the standard drug. Co-administration of the pepper extracts did not significantly increase the efficacy of the extract, although C. annuum var. cameroun showed a greater, though non-significant, effect. The antioxidant effect of the extract was significant in diabetic rats, and the use of piperine-containing peppers enhanced this antioxidant effect. Phytochemical analyses of the ethanolic extract of C. longa showed the presence of alkaloids, flavonoids, steroids, saponins, tannins, glycosides, and terpenoids. These results suggest that the ethanolic extract of C. longa has antidiabetic and antioxidant effects and could thus be of benefit in the treatment and management of diabetes, as well as ameliorate pro-oxidant effects that may lead to diabetic complications. However, while the addition of piperine did not affect the antidiabetic effect of C. longa, the antioxidant effect was greatly enhanced.

Keywords: antioxidant, Curcuma longa rhizome, hypoglycaemic, pepper adjuvants, piperine

Procedia PDF Downloads 236
401 Valorization of Lignocellulosic Wastes– Evaluation of Its Toxicity When Used in Adsorption Systems

Authors: Isabel Brás, Artur Figueirinha, Bruno Esteves, Luísa P. Cruz-Lopes

Abstract:

Agricultural lignocellulosic by-products are receiving increased attention, namely in the search for filter materials that retain contaminants from water. These by-products, specifically almond and hazelnut shells, are abundant in Portugal, since almond and hazelnut production is an important local activity. Hazelnut and almond shells have as main constituents lignin, cellulose, hemicelluloses, water-soluble extractives, and tannins. During the adsorption of heavy metals from contaminated waters, water-soluble compounds can leach from the shells and have a negative impact on the environment. Usually, the chemical characterization of treated water by itself may not reveal the environmental impact of discharges when parameters comply with legal quality standards for water; only biological systems can detect the toxic effects of the water constituents. Therefore, the evaluation of toxicity by biological tests is very important when deciding the suitability of the water for safe discharge or for irrigation. The main purpose of the present work was to assess, with short-term acute toxicity tests, the potential impacts of waters after treatment for heavy metal removal by hazelnut and almond shell adsorption systems. To conduct the study, water at pH 6 with 25 mg.L-1 of lead was treated with 10 g of shell per litre of wastewater for 24 hours. This procedure was followed for each shell type. Afterwards, the water was collected for toxicological assays, namely bacterial resistance, seed germination, the Lemna minor L. test, and plant growth. The effect on isolated bacterial strains was determined by the disc diffusion method, and the germination index was evaluated using lettuce seeds, with temperature and humidity controlled for 7 days. For the aquatic higher organism, Lemna was exposed to the shell solutions for 4 days under controlled light and temperature. For terrestrial higher plants, biomass production was evaluated 14 days after tomato germination in soil, with controlled humidity, light, and temperature. Toxicity tests of the water treated with shells revealed effects in the tested organisms only to some extent, with the assays showing behaviour close to that of the control, leading to the conclusion that further utilization of these shells may not be considered to create a serious risk to the environment.

Keywords: lignocellulosic wastes, adsorption, acute toxicity tests, risk assessment

Procedia PDF Downloads 366
400 An Improvement of ComiR Algorithm for MicroRNA Target Prediction by Exploiting Coding Region Sequences of mRNAs

Authors: Giorgio Bertolazzi, Panayiotis Benos, Michele Tumminello, Claudia Coronnello

Abstract:

MicroRNAs are small non-coding RNAs that post-transcriptionally regulate the expression levels of messenger RNAs. MicroRNA regulatory activity depends on the recognition of binding sites located on mRNA molecules. ComiR (Combinatorial miRNA targeting) is a user-friendly web tool designed to predict the targets of a set of microRNAs, starting from their expression profile. ComiR incorporates miRNA expression in a thermodynamic binding model, and it associates each gene with the probability of being a target of a set of miRNAs. The ComiR algorithms were trained with information on binding sites in the 3'UTR region, using a reliable dataset containing the targets of endogenously expressed microRNAs in D. melanogaster S2 cells. This dataset was obtained by comparing the results from two different experimental approaches, i.e., inhibition and immunoprecipitation of the AGO1 protein, a component of the microRNA-induced silencing complex. In this work, we tested whether including coding-region binding sites in the ComiR algorithm improves the performance of the tool in predicting microRNA targets. We focused the analysis on D. melanogaster and updated the underlying ComiR database with the currently available releases of mRNA and microRNA sequences. As a result, we find that the ComiR algorithm trained with information from the coding regions is more efficient in predicting microRNA targets than the algorithm trained with 3'UTR information. On the other hand, we show that 3'UTR-based predictions can be seen as complementary to the coding-region-based predictions, which suggests that both predictions, from the 3'UTR and from the coding regions, should be considered in a comprehensive analysis. Furthermore, we observed that the lists of targets obtained by analyzing data from only one experimental approach, that is, inhibition or immunoprecipitation of AGO1, are not reliable enough to test the performance of our microRNA target prediction algorithm. Further analysis will be conducted to investigate the effectiveness of the tool with data from other species, provided that validated datasets, obtained from the comparison of RISC protein inhibition and immunoprecipitation experiments, become available for the same samples. Finally, we propose to upgrade the existing ComiR web tool by including the coding-region-trained model, available together with the 3'UTR-based one.
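To illustrate the idea that 3'UTR-based and coding-region-based predictions can be treated as complementary evidence, the sketch below combines two per-gene target probabilities with a noisy-OR rule. Both the combination rule and the example scores are illustrative assumptions, not the integration scheme actually used by ComiR.

```python
def combine_target_probabilities(p_utr, p_cds):
    """Noisy-OR combination of two per-gene target probabilities.
    Illustrative assumption, not ComiR's model: a gene is scored as a
    likely target if either region provides evidence."""
    return 1.0 - (1.0 - p_utr) * (1.0 - p_cds)

# Hypothetical per-gene probabilities from a 3'UTR-trained and a CDS-trained model
predictions = {
    "gene_A": (0.85, 0.40),
    "gene_B": (0.20, 0.75),
    "gene_C": (0.10, 0.15),
}

for gene, (p_utr, p_cds) in predictions.items():
    p = combine_target_probabilities(p_utr, p_cds)
    print(f"{gene}: combined target probability = {p:.2f}")
```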

Keywords: AGO1, coding region, Drosophila melanogaster, microRNA target prediction

Procedia PDF Downloads 451
399 DenseNet and Autoencoder Architecture for COVID-19 Chest X-Ray Image Classification and Improved U-Net Lung X-Ray Segmentation

Authors: Jonathan Gong

Abstract:

Purpose: AI-driven solutions are at the forefront of many pathology and medical imaging methods. Using algorithms designed to improve the experience of medical professionals within their respective fields, the efficiency and accuracy of diagnosis can improve. In particular, X-rays are a fast and relatively inexpensive test that can diagnose diseases. In recent years, however, X-rays have not been widely used to detect and diagnose COVID-19, mainly due to low diagnostic accuracy and confounding with pneumonia, another respiratory disease. Research in this field has nevertheless suggested that artificial neural networks can successfully diagnose COVID-19 with high accuracy. Models and Data: The dataset used is the COVID-19 Radiography Database. This dataset includes images and masks of chest X-rays under the labels COVID-19, normal, and pneumonia. The classification model developed uses an autoencoder and a pre-trained convolutional neural network (DenseNet201) to provide transfer learning to the model. The model then uses a deep neural network to finalize the feature extraction and predict the diagnosis for the input image. This model was trained on 4035 images and validated on 807 images separate from those used for training. The images used to train the classification model have an important feature: the pictures are cropped beforehand to eliminate distractions during training. The image segmentation model uses an improved U-Net architecture and is used to extract the lung mask from the chest X-ray image. It is trained on 8577 images and validated on a validation split of 20%. The models are evaluated on an external dataset, and their accuracy, precision, recall, F1-score, IoU, and loss are calculated. Results: The classification model achieved an accuracy of 97.65% and a loss of 0.1234 when differentiating COVID-19-infected, pneumonia-infected, and normal lung X-rays. The segmentation model achieved an accuracy of 97.31% and an IoU of 0.928. Conclusion: The proposed models can detect COVID-19, pneumonia, and normal lungs with high accuracy and derive the lung mask from a chest X-ray with similarly high accuracy. The hope is for these models to elevate the experience of medical professionals and provide insight into the future of the methods used.
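A minimal sketch of the transfer-learning part of such a classifier is shown below using Keras, with DenseNet201 as a frozen ImageNet-pretrained backbone. The input size, head layers, and training settings are assumptions for illustration; the study's full model additionally uses an autoencoder, which is not reproduced here.

```python
import tensorflow as tf
from tensorflow.keras import layers, models

NUM_CLASSES = 3              # COVID-19, pneumonia, normal
INPUT_SHAPE = (224, 224, 3)  # assumed input size

# Pre-trained DenseNet201 backbone used as a frozen feature extractor
backbone = tf.keras.applications.DenseNet201(
    include_top=False, weights="imagenet", input_shape=INPUT_SHAPE)
backbone.trainable = False

model = models.Sequential([
    backbone,
    layers.GlobalAveragePooling2D(),
    layers.Dense(256, activation="relu"),   # assumed head size
    layers.Dropout(0.3),
    layers.Dense(NUM_CLASSES, activation="softmax"),
])

model.compile(optimizer="adam",
              loss="categorical_crossentropy",
              metrics=["accuracy"])
model.summary()
```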

Keywords: artificial intelligence, convolutional neural networks, deep learning, image processing, machine learning

Procedia PDF Downloads 130
398 Modeling Floodplain Vegetation Response to Groundwater Variability Using ArcSWAT Hydrological Model, Moderate Resolution Imaging Spectroradiometer - Normalised Difference Vegetation Index Data, and Machine Learning

Authors: Newton Muhury, Armando A. Apan, Tek Maraseni

Abstract:

This study modelled the relationships between vegetation response and the water available below the soil surface, using Normalised Difference Vegetation Index (NDVI) data generated by Terra's Moderate Resolution Imaging Spectroradiometer (MODIS) and soil water content (SWC) data. The Soil & Water Assessment Tool (SWAT) interface known as ArcSWAT was used in ArcGIS for the groundwater analysis. The SWAT model was calibrated and validated in SWAT-CUP software using 10 years (2001-2010) of monthly streamflow data. The average Nash-Sutcliffe Efficiency during calibration and validation was 0.54 and 0.51, respectively, indicating good model performance. Twenty years (2001-2020) of monthly MODIS NDVI data for three vegetation types (forest, shrub, and grass) and soil water content for 43 sub-basins were analysed using the WEKA machine learning tool with two supervised machine learning algorithms, i.e., support vector machine (SVM) and random forest (RF). The modelling results show that the responses of different vegetation types to soil water content vary between the dry and wet seasons. For example, the model generated high positive relationships (r=0.76, 0.73, and 0.81) between the measured and predicted NDVI values of all vegetation in the study area against groundwater flow (GW), soil water content (SWC), and the combination of these two variables, respectively, during the dry season. However, these relationships were reduced by 36.8% (r=0.48) and 13.6% (r=0.63) against GW and SWC, respectively, in the wet season. On the other hand, the model predicted a moderate positive relationship (r=0.63) between shrub vegetation and soil water content during the dry season, which was reduced by 31.7% (r=0.43) during the wet season. Our models also predicted that vegetation in the upper part of the sub-basin is highly responsive to GW and SWC (r=0.78 and 0.70) during the dry season. The results of this study indicate that the study region is suitable for seasonal crop production in the dry season. Moreover, the results predict that the growth of vegetation at the top-point location is highly dependent on groundwater flow in both dry and wet seasons, and any instability or long-term drought can negatively affect these floodplain vegetation communities. This study has enriched our knowledge of vegetation responses to groundwater in each season, which will facilitate better floodplain vegetation management.
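The supervised models in this study were built in WEKA; as a rough illustration of the same workflow, the sketch below fits SVM and random forest regressors to predict NDVI from groundwater flow and soil water content using scikit-learn instead, on synthetic placeholder data, and reports the Pearson r between measured and predicted NDVI.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.svm import SVR
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Synthetic placeholder data: monthly groundwater flow (GW) and soil
# water content (SWC) as predictors, NDVI as the response.
n = 240  # e.g. 20 years of monthly records
gw = rng.uniform(0, 50, n)
swc = rng.uniform(50, 300, n)
ndvi = 0.3 + 0.004 * gw + 0.001 * swc + rng.normal(0, 0.05, n)

X = np.column_stack([gw, swc])
X_train, X_test, y_train, y_test = train_test_split(X, ndvi, test_size=0.3, random_state=0)

for name, model in [("SVM", SVR(kernel="rbf", C=10.0)),
                    ("Random forest", RandomForestRegressor(n_estimators=200, random_state=0))]:
    model.fit(X_train, y_train)
    y_pred = model.predict(X_test)
    r = np.corrcoef(y_test, y_pred)[0, 1]  # Pearson r between measured and predicted NDVI
    print(f"{name}: r = {r:.2f}")
```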

Keywords: ArcSWAT, machine learning, floodplain vegetation, MODIS NDVI, groundwater

Procedia PDF Downloads 119
397 Sustainability from Ecocity to Ecocampus: An Exploratory Study on Spanish Universities' Water Management

Authors: Leyla A. Sandoval Hamón, Fernando Casani

Abstract:

Sustainability has been integrated into the cities' agenda due to the impact that cities generate. The dimensions of sustainability most widely taken as a reference are the economic, social, and environmental ones. Thus, the management decisions of sustainable cities seek a balance between these dimensions in order to provide environment-friendly alternatives. In this context, urban models that manage water consumption, energy consumption, and waste production, among others, in harmony with the environment are known as Ecocities. A similar model, but on a smaller scale, is the 'Ecocampus', developed in universities (considered 'small cities' due to their complex structure). Sustainable practices are thus being implemented in the management of university campus activities, following different relevant lines of work. Universities have a strategic role in society, and their activities can strengthen policies, strategies, and measures of sustainability, both internal and external to the organization. Because of their mission of knowledge creation and transfer, these institutions can promote and disseminate more advanced activities in sustainability. Replicating this model also implies challenges in the sustainable management of water, energy, waste, and transportation, among others, inside the campus. The challenge that this paper focuses on is water management, taking into account that universities consume large amounts of this resource. The purpose of this paper is to analyze the sustainability experience, with emphasis on water management, of two campuses belonging to two different Spanish universities: one urban campus in a historic city and one suburban campus on the outskirts of a large city. Both universities are in the top hundred of international rankings of sustainable universities. The methodology adopts a qualitative approach based on in-depth interviews and focus-group discussions with administrative and academic staff of the 'Ecocampus' offices, the organizational units for sustainability management, of the two Spanish universities. The hypotheses indicate that sustainability policies in terms of water management work best on campuses without large green spaces and where the buildings have been built or rebuilt in a modern style. The sustainability efforts of the universities are independent of the kind of campus (urban or suburban), but an important aspect to improve is the degree of awareness of the university community about water scarcity. In general, the paper suggests that higher education institutions adapt their sustainability policies depending on the location and features of the campus and their engagement with water conservation. Many Spanish universities have proposed policies, good practices, and measures of sustainability; in fact, several Ecocampus offices or centers have been founded. The originality of this study lies in learning from the different experiences of sustainability policies at universities.

Keywords: ecocampus, ecocity, sustainability, water management

Procedia PDF Downloads 221
396 Detection and Identification of Antibiotic Resistant Bacteria Using Infra-Red-Microscopy and Advanced Multivariate Analysis

Authors: Uraib Sharaha, Ahmad Salman, Eladio Rodriguez-Diaz, Elad Shufan, Klaris Riesenberg, Irving J. Bigio, Mahmoud Huleihel

Abstract:

Antimicrobial drugs have an important role in controlling illness associated with infectious diseases in animals and humans. However, the increasing resistance of bacteria to a broad spectrum of commonly used antibiotics has become a global health-care problem. Rapid determination of the antimicrobial susceptibility of a clinical isolate is often crucial for optimal antimicrobial therapy of infected patients and in many cases can save lives. Conventional methods for susceptibility testing, like disk diffusion, are time-consuming, and other methods, including the E-test and genotyping, are relatively expensive. Fourier transform infrared (FTIR) microscopy is a rapid, safe, and low-cost method that has been widely and successfully used in different studies for the identification of various biological samples, including bacteria. Modern infrared (IR) spectrometers with high spectral resolution enable the measurement of unprecedented biochemical information from cells at the molecular level. Moreover, the development of new bioinformatics analyses combined with IR spectroscopy yields a powerful technique that enables the detection of structural changes associated with resistance. The main goal of this study is to evaluate the potential of FTIR microscopy, in tandem with machine learning algorithms, for rapid and reliable identification of bacterial susceptibility to antibiotics within a time span of a few minutes. The bacterial samples, which were identified at the species level by MALDI-TOF and examined for their susceptibility by the routine assay (micro-diffusion discs), were obtained from the bacteriology laboratories of Soroka University Medical Center (SUMC). These samples were examined by FTIR microscopy and analyzed by advanced statistical methods. Our results, based on 550 E. coli samples, are promising and show that by using infrared spectroscopy together with multivariate analysis it is possible to classify the tested bacteria into sensitive and resistant with a success rate higher than 85% for eight different antibiotics. Based on these preliminary results, it is worthwhile to continue developing FTIR microscopy as a rapid and reliable method for identifying antibiotic susceptibility.
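A plausible multivariate pipeline for this kind of task is sketched below: dimensionality reduction of the spectra followed by a classifier, evaluated by cross-validation. The abstract does not specify the exact multivariate method used, so the PCA-plus-SVM choice and the synthetic spectra are assumptions.

```python
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.decomposition import PCA
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(1)

# Synthetic stand-in for FTIR absorbance spectra: 550 isolates x 900 wavenumbers,
# labelled 0 = sensitive, 1 = resistant for one antibiotic.
n_samples, n_wavenumbers = 550, 900
labels = rng.integers(0, 2, n_samples)
spectra = rng.normal(0, 1, (n_samples, n_wavenumbers))
spectra[labels == 1, 400:420] += 0.5  # pretend resistant isolates differ in one band

# PCA for dimensionality reduction followed by an SVM classifier
# (one plausible multivariate pipeline; the study's exact method is not stated).
clf = make_pipeline(StandardScaler(), PCA(n_components=20), SVC(kernel="linear"))
scores = cross_val_score(clf, spectra, labels, cv=5)
print(f"cross-validated success rate: {scores.mean():.2%}")
```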

Keywords: antibiotics, E. coli, FTIR, multivariate analysis, susceptibility

Procedia PDF Downloads 265
395 Unlocking Health Insights: Studying Data for Better Care

Authors: Valentina Marutyan

Abstract:

Healthcare data mining is a rapidly developing field at the intersection of technology and medicine that has the potential to change our understanding of, and approach to, providing healthcare. Healthcare data mining is the process of examining huge amounts of data to extract useful information that can be applied to improve patient care, treatment effectiveness, and overall healthcare delivery. This field looks for patterns, trends, and correlations in a variety of healthcare datasets, such as electronic health records (EHRs), medical imaging, patient demographics, and treatment histories, using advanced analytical approaches. Predictive analysis using historical patient data is a major area of interest in healthcare data mining. It enables doctors to intervene early to prevent problems or improve outcomes for patients, and it assists in early disease detection and customized treatment planning for each person. Doctors can customize a patient's care by looking at their medical history, genetic profile, and current and previous therapies; in this way, treatments can be more effective and have fewer negative consequences. Beyond helping patients, it also improves the efficiency of hospitals, for example by helping them determine the number of beds or doctors they require relative to the number of patients they expect. In this project, models such as logistic regression, random forests, and neural networks are used for predicting diseases and analyzing medical images. Patients were grouped using algorithms such as k-means, and connections between treatments and patient responses were identified by association rule mining. Time series techniques helped in resource management by predicting patient admissions. These methods improved healthcare decision-making and personalized treatment. Healthcare data mining must also deal with difficulties such as poor data quality, privacy challenges, managing large and complicated datasets, ensuring the reliability of models, managing biases, limited data sharing, and regulatory compliance. In summary, data mining in healthcare helps medical professionals and hospitals make better decisions, treat patients more effectively, and work more efficiently. It ultimately comes down to using data to improve treatment, make better choices, and simplify hospital operations for all patients.
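As a minimal illustration of one of the predictive models named above (logistic regression on historical patient data), the sketch below uses scikit-learn on synthetic placeholder records; the features and outcome are invented for the example and not drawn from any real dataset.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(42)

# Synthetic placeholder patient records: age, BMI, blood pressure, glucose
n = 1000
X = np.column_stack([
    rng.normal(55, 15, n),   # age
    rng.normal(27, 5, n),    # BMI
    rng.normal(130, 20, n),  # systolic blood pressure
    rng.normal(100, 25, n),  # fasting glucose
])
# Synthetic outcome: risk increases with glucose and age
risk = 0.03 * (X[:, 3] - 100) + 0.02 * (X[:, 0] - 55)
y = (risk + rng.normal(0, 1, n) > 0.5).astype(int)

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=0)
model = LogisticRegression(max_iter=1000).fit(X_train, y_train)
print("AUC:", round(roc_auc_score(y_test, model.predict_proba(X_test)[:, 1]), 3))
```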

Keywords: data mining, healthcare, big data, large amounts of data

Procedia PDF Downloads 76
394 Torn Between the Lines of Border: The Pakhtuns of Pakistan and Afghanistan in Search of Identity

Authors: Priyanka Dutta Chowdhury

Abstract:

A globalized, connected world, calling loudly for a composite culture, has still not been able to erase the longing for a nationalism based on cultural identity. In the South Asian region, the arbitrary drawing of boundaries without taking ethnicity into consideration has always challenged the very basis of the existence of certain groups. The urge to reunify with fellow kin on both sides of a border has often produced chaos and schism in the countries of this region. Sometimes this became a tool to bargain with the state and find a favorable position in the power structure on the basis of cultural identity. In Pakistan and Afghanistan, the Pakhtuns, who are divided across the border of the two countries, have, since the creation of Pakistan, posed various challenges and hampered the growth of a consolidated nation. The Pakhtuns (or Pashtuns) of both Pakistan and Afghanistan have a strong cultural affinity which blurs their physical separation and calls for a nationalism based on this ethnic affiliation. Both sides wanted to create Pakhtunistan, unifying all the Pakhtuns of the region, and for a long time this group refused to accept the Durand Line separating the two countries. This was an area of concern especially for the Pakhtuns of Pakistan, torn between the decision to join Afghanistan, to create a nation of their own, or to be a part of Pakistan. This ethnic issue became a bone of contention between the two countries. Later, though well absorbed and recognized in their respective countries, they fought for their identity and claimed a dominant position in the politics of the nations. Because of the porous border, influxes of refugees were frequent, especially during the Afghan wars, and many extremist groups, notably the Taliban, later emerged from among them. In the recent string of events, when the Taliban, who are mostly ethnic Pakhtuns, came to power in Afghanistan, a wave of sympathy arose in Pakistan, strengthening the position of religious Pakhtuns across the border. It should be noted that a fragmentation of Pakhtun identity between religious and secular groups is clearly visible, each voicing its claim to a place in the political hierarchy of the country with a vision distinct from the other, especially in Pakistan. In this context, the paper tries to evaluate the reasons for this cultural turmoil between the countries and this ethnic group. It also aims to analyze how identity politics still holds its relevance in the contemporary world. Additionally, the recent trend of fragmented identity points towards the instrumentalization of this ethnic group, which is engaged in a bargaining process with the state for a robust position in the power structure. In the end, the paper aims to deduce from the theoretical framework of identity politics whether this is a primordial or a situational tool for gaining visibility in the power structure of the contemporary world.

Keywords: cultural identity, identity politics, instrumentalization of identity, Pakhtuns, power structure

Procedia PDF Downloads 82
393 Compressed Natural Gas (CNG) Injector Research for Dual Fuel Engine

Authors: Adam Majczak, Grzegorz Barański, Marcin Szlachetka

Abstract:

Environmental considerations necessitate the search for new energy sources. One of the available solutions is a partial replacement of diesel fuel by compressed natural gas (CNG) in compression ignition engines. This type of engine is used mainly in vans and trucks, and such units are also gaining more and more popularity in the passenger car market, where in Europe their share reaches 50%. Diesel engines are also used in industry, in vehicles such as ships or locomotives. Diesel engines have higher emissions of nitrogen oxides in comparison to spark ignition engines. This can currently be limited by optimizing the combustion process and by using additional systems such as exhaust gas recirculation or AdBlue technology. The combustion of diesel fuel also emits particulate matter (PM), which is harmful to human health; its emission is limited by the use of a particulate filter. One method of reducing toxic emissions may be the use of gaseous fuels such as liquefied petroleum gas (LPG, propane and butane) or compressed natural gas (CNG). In addition to the environmental aspects, there are also economic reasons for using gaseous fuels to power diesel engines. A total or partial replacement of diesel fuel by gas is possible. Depending on the technology used and the percentage of diesel fuel replaced, it is possible to reduce the content of nitrogen oxides in the exhaust gas by up to 30%, particulate matter (PM) by 95%, and carbon monoxide by 20%, in relation to the original diesel fuel. The research object is a prototype gas injector designed for direct injection of compressed natural gas (CNG) in compression ignition engines. The construction of the injector allows it to be positioned in the glow plug socket, so that the gas is injected directly into the combustion chamber. The cycle analysis of the four-cylinder Andoria ADCR engine with a capacity of 2.6 dm3, for different crankshaft rotational speeds, allowed the necessary fuel injection time to be determined. From this, it was possible to determine the required mass flow rate of the injector needed to replace as much of the original fuel as possible with gaseous fuel. To ensure a high flow rate inside the injector, a supply pressure of 1 MPa was applied. A high gas supply pressure requires large valve opening forces. For this purpose, an injector with a hydraulic control system, using a liquid under pressure for the opening process, was designed. On the basis of air pressure measurements in the flow line downstream of the injector, the opening and closing of the valve were analysed. Measurements of the injector outflow mass were also carried out. The results showed that the designed injector meets the requirements necessary for supplying the ADCR engine with CNG fuel.
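To make the sizing logic concrete (cycle analysis yields the available injection time, which then fixes the required injector mass flow rate), here is a back-of-the-envelope sketch; every numeric value in it is an assumed placeholder rather than data from the ADCR engine tests.

```python
# Back-of-the-envelope injector sizing; all values below are assumed placeholders.
RPM = 3000                   # crankshaft speed
INJECTION_WINDOW_DEG = 40.0  # crank-angle window available for gas injection
DIESEL_PER_CYCLE_MG = 30.0   # diesel dose per cylinder per cycle
REPLACEMENT_FRACTION = 0.6   # share of diesel energy replaced by CNG
LHV_DIESEL = 42.5            # MJ/kg, lower heating value
LHV_CNG = 48.0               # MJ/kg

# Time available for one injection: the crank-angle window at the given speed.
seconds_per_rev = 60.0 / RPM
injection_time_s = (INJECTION_WINDOW_DEG / 360.0) * seconds_per_rev

# CNG mass carrying the replaced share of the diesel energy.
cng_per_cycle_mg = REPLACEMENT_FRACTION * DIESEL_PER_CYCLE_MG * LHV_DIESEL / LHV_CNG

# Required steady mass flow rate during the injection event.
mass_flow_g_per_s = (cng_per_cycle_mg / 1000.0) / injection_time_s
print(f"injection time: {injection_time_s * 1e3:.2f} ms")
print(f"required injector mass flow: {mass_flow_g_per_s:.2f} g/s")
```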

Keywords: CNG, diesel engine, gas flow, gas injector

Procedia PDF Downloads 493
392 Screening for Non-hallucinogenic Neuroplastogens as Drug Candidates for the Treatment of Anxiety, Depression, and Posttraumatic Stress Disorder

Authors: Jillian M. Hagel, Joseph E. Tucker, Peter J. Facchini

Abstract:

With the aim of establishing a holistic approach to the treatment of central nervous system (CNS) disorders, we are pursuing a drug development program that is progressing rapidly through the discovery and characterization phases. The drug candidates identified in this program are referred to as neuroplastogens owing to their ability to mediate neuroplasticity, which can be beneficial to patients suffering from anxiety, depression, or posttraumatic stress disorder. These and other related neuropsychiatric conditions are associated with the onset of neuronal atrophy, which is defined as a reduction in the number and/or productivity of neurons. The stimulation of neuroplasticity results in an increase in the connectivity between neurons and promotes the restoration of healthy brain function. We have synthesized a substantial catalogue of proprietary indolethylamine derivatives based on the general structures of serotonin (5-hydroxytryptamine) and psychedelic molecules such as N,N-dimethyltryptamine (DMT) and psilocin (4-hydroxy-DMT) that function as neuroplastogens. A primary objective in our screening protocol is the identification of derivatives associated with a significant reduction in hallucination, which will allow administration of the drug at a dose that induces neuroplasticity and triggers other efficacious outcomes in the treatment of targeted CNS disorders but does not cause a psychedelic response in the patient. Both neuroplasticity and hallucination are associated with engagement of the 5-HT2A receptor, requiring drug candidates in which these two outcomes are differentially coupled at the molecular level. We use novel and proprietary artificial intelligence algorithms to predict the mode of binding to the 5-HT2A receptor, which has been shown to correlate with the hallucinogenic response. Hallucinogenic potential is tested using the mouse head-twitch response model, whereas mouse marble-burying and sucrose preference assays are used to evaluate anxiolytic and antidepressive potential. Neuroplasticity is assayed using dendritic outgrowth assays and cell-based ELISA analysis. Pharmacokinetics and additional receptor-binding analyses also contribute to the selection of lead candidates. A summary of the program is presented.

Keywords: neuroplastogen, non-hallucinogenic, drug development, anxiety, depression, PTSD, indolethylamine derivatives, psychedelic-inspired, 5-HT2A receptor, computational chemistry, head-twitch response behavioural model, neurite outgrowth assay

Procedia PDF Downloads 138
391 Prenatal Genetic Screening and Counselling Competency Challenges of Nurse-Midwife

Authors: Girija Madhavanprabhakaran, Frincy Franacis, Sheeba Elizabeth John

Abstract:

Introduction: A wide range of prenatal genetic screening has been introduced, with increasing incidences of congenital anomalies even in low-risk pregnancies, and is an emerging standard of care. As frontline caretakers, the roles and responsibilities of nurses and midwives are critical, as they work alongside couples to provide evidence-based, supportive, educative care. The increase in genetic disorders and the advances in prenatal genetic screening, together with limited genetic counselling facilities, require nurses and midwives to have the essential competencies to help couples make informed decisions. Objective: This integrative literature review aimed to explore nurse-midwives' knowledge and role in prenatal screening and genetic counselling competency, and the challenges they face in catering to all pregnant women, empowering their autonomy in decision making, and ensuring psychological comfort. Method: An electronic search using the keywords prenatal screening, genetic counselling, prenatal counselling, nurse midwife, nursing education, genetics, and genomics was done in PUBMED, SCOPUS, Medline, and Google Scholar. Finally, based on inclusion criteria, 8 relevant articles were included. Results: The main review results suggest that nurses and midwives lack the essential support, knowledge, or confidence to be able to provide genetic counselling and to help couples ethically while ensuring client autonomy and decision making. The majority of nurses and midwives reported inadequate levels of knowledge of genetic screening and of their roles in obtaining family histories and pedigrees and providing genetic information for an affected client or high-risk families. A deficiency of well-recognized and influential clinical academic midwives in midwifery practice was also reported. The evidence recommends updating and providing sound educational training to improve nurse-midwife competence and confidence. Conclusion: Overcoming the challenges to achieving informed choices about fetal anomaly screening globally is a major concern. Lack of adequate knowledge and counselling competency, insufficient communication, and the need for education and policy are major areas to address. Prenatal nurses' and midwives' knowledge of prenatal genetic screening and essential counselling competencies can ensure that the majority of pregnant women around the globe are better-informed decision-makers, enhance their autonomy, and reduce ethical dilemmas.

Keywords: challenges, genetic counselling, prenatal screening, prenatal counselling

Procedia PDF Downloads 199
390 Virtual Metrology for Copper Clad Laminate Manufacturing

Authors: Misuk Kim, Seokho Kang, Jehyuk Lee, Hyunchang Cho, Sungzoon Cho

Abstract:

In semiconductor manufacturing, virtual metrology (VM) refers to methods of predicting the properties of a wafer based on machine parameters and sensor data of the production equipment, without performing the (costly) physical measurement of the wafer properties (Wikipedia). Additional benefits include the avoidance of human bias and the identification of important factors affecting process quality, which allows improving the process in the future. It is, however, rare to find VM applied to other areas of manufacturing. In this work, we propose to apply VM to copper clad laminate (CCL) manufacturing. CCL is a core element of a printed circuit board (PCB), which is used in smartphones, tablets, digital cameras, and laptop computers. The manufacturing of CCL consists of three processes: treating, lay-up, and pressing. Treating, the most important process among the three, puts resin on glass cloth, heats it up in a drying oven, and then produces prepreg for the lay-up process. In this process, three important quality factors are inspected: treated weight (T/W), minimum viscosity (M/V), and gel time (G/T). They are inspected manually, incurring heavy costs in terms of time and money, which makes this a good candidate for VM application. We developed prediction models of the three quality factors T/W, M/V, and G/T, respectively, from process variables, raw material variables, and environment variables. The actual process data were obtained from a CCL manufacturer. A variety of variable selection methods and learning algorithms were employed to find the best prediction model. We obtained prediction models of M/V and G/T with sufficiently high accuracy. They also provided us with information on "important" predictor variables, some of which the process engineers had already been aware of and the rest of which they had not. The engineers were quite excited to find the new insights that the models revealed and set out to do further analysis on them to derive process control implications. T/W turned out not to be predictable with reasonable accuracy from the given factors. This very fact indicates that the factors currently monitored may not affect T/W; thus, an effort has to be made to find other factors that are not currently monitored, in order to understand the process better and improve its quality. In conclusion, the VM application to CCL's treating process was quite successful. The newly built quality prediction models allowed us to reduce the cost associated with actual metrology, as well as to reveal some insights into the factors affecting the important quality factors and into the level of our less-than-perfect understanding of the treating process.
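The abstract does not name the specific learning algorithms used; the sketch below shows one plausible virtual-metrology setup, a random forest regressor predicting minimum viscosity (M/V) from assumed process and environment variables on synthetic data, together with the kind of variable-importance output that points engineers to influential predictors.

```python
import numpy as np
import pandas as pd
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(7)

# Synthetic stand-in for treating-process data: assumed process, raw-material
# and environment variables as predictors, minimum viscosity (M/V) as target.
n = 500
data = pd.DataFrame({
    "oven_temp": rng.normal(170, 5, n),
    "line_speed": rng.normal(12, 1, n),
    "resin_batch_viscosity": rng.normal(300, 30, n),
    "ambient_humidity": rng.uniform(30, 70, n),
})
target_mv = (0.5 * data["resin_batch_viscosity"]
             - 2.0 * data["oven_temp"]
             + rng.normal(0, 10, n))

model = RandomForestRegressor(n_estimators=300, random_state=0)
scores = cross_val_score(model, data, target_mv, cv=5, scoring="r2")
print("cross-validated R^2:", scores.mean().round(2))

# Variable importance: the kind of output that highlights "important" predictors
# worth further analysis by process engineers.
model.fit(data, target_mv)
for name, imp in sorted(zip(data.columns, model.feature_importances_),
                        key=lambda x: -x[1]):
    print(f"{name}: {imp:.2f}")
```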

Keywords: copper clad laminate, predictive modeling, quality control, virtual metrology

Procedia PDF Downloads 350
389 Short Association Bundle Atlas for Lateralization Studies from dMRI Data

Authors: C. Román, M. Guevara, P. Salas, D. Duclap, J. Houenou, C. Poupon, J. F. Mangin, P. Guevara

Abstract:

Diffusion Magnetic Resonance Imaging (dMRI) allows the non-invasive study of human brain white matter. From diffusion data, it is possible to reconstruct fiber trajectories using tractography algorithms. Our previous work consists of an automatic method for the identification of short association bundles of the superficial white matter (SWM), based on a whole-brain inter-subject hierarchical clustering applied to a HARDI database. The method finds representative clusters of similar fibers belonging to a group of subjects, according to a distance measure between fibers, using a non-linear registration (DTI-TK). The algorithm performs an automatic labeling based on the anatomy, defined by a cortex mesh parcellated with the FreeSurfer software. The clustering was applied to two independent groups of 37 subjects. The clusters resulting from both groups were compared using a restrictive threshold on the mean distance between each pair of bundles from different groups, in order to keep reproducible connections. In the left hemisphere, 48 reproducible bundles were found, while 43 bundles were found in the right hemisphere. An inter-hemispheric bundle correspondence was then applied: the symmetric horizontal reflection of the right bundles was calculated in order to obtain their position in the left hemisphere, and the intersection between similar bundles was then computed. Pairs of bundles with a fiber intersection percentage higher than 50% were considered similar. The similar bundles between the two hemispheres were fused and symmetrized, yielding 30 common bundles between hemispheres. An atlas was created with the resulting bundles and used to segment 78 new subjects from another HARDI database, using a distance threshold between 6-8 mm according to the bundle length. Finally, a laterality index was calculated based on the bundle volume. Seven bundles of the atlas presented right laterality (IP_SP_1i, LO_LO_1i, Op_Tr_0i, PoC_PoC_0i, PoC_PreC_2i, PreC_SM_0i, and RoMF_RoMF_0i) and one presented left laterality (IP_SP_2i); there is no tendency of lateralization according to brain region. Many factors can affect the results, such as tractography artifacts, subject registration, and bundle segmentation. Further studies are necessary in order to establish the influence of these factors and evaluate SWM laterality.
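The abstract does not give the formula for its volume-based laterality index; a common convention, shown below as an assumption, is LI = (L - R) / (L + R), with positive values indicating left lateralization. The bundle volumes in the sketch are hypothetical.

```python
def laterality_index(volume_left, volume_right):
    """Volume-based laterality index, LI = (L - R) / (L + R).
    This convention is an assumption; the study does not state its exact formula.
    LI > 0 suggests left lateralization, LI < 0 right lateralization."""
    return (volume_left - volume_right) / (volume_left + volume_right)

# Hypothetical bundle volumes in mm^3 for one subject (left, right)
bundle_volumes = {
    "IP_SP_1i": (1200.0, 1450.0),
    "IP_SP_2i": (980.0, 850.0),
}

for bundle, (vol_l, vol_r) in bundle_volumes.items():
    print(f"{bundle}: LI = {laterality_index(vol_l, vol_r):+.2f}")
```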

Keywords: dMRI, hierarchical clustering, lateralization index, tractography

Procedia PDF Downloads 331
388 Bioinformatics High Performance Computation and Big Data

Authors: Javed Mohammed

Abstract:

Right now, biomedical infrastructure lags well behind the curve. Our healthcare system is dispersed and disjointed; medical records are a bit of a mess; and we do not yet have the capacity to store and process the enormous amounts of data coming our way from widespread whole-genome sequencing. And then there are privacy issues. Despite these infrastructure challenges, some researchers are plunging into biomedical Big Data now, in hopes of extracting new and actionable knowledge. They are delving into molecular-level data to discover biomarkers that help classify patients based on their response to existing treatments, and pushing their results out to physicians in novel and creative ways. Computer scientists and biomedical researchers are able to transform data into models and simulations that will enable scientists, for the first time, to gain a profound understanding of the deepest biological functions. Solving biological problems may require high-performance computing (HPC), due either to the massive parallel computation required to solve a particular problem or to algorithmic complexity that may range from difficult to intractable. Many problems involve seemingly well-behaved polynomial-time algorithms (such as all-to-all comparisons) but have massive computational requirements due to the large datasets that must be analyzed. High-throughput techniques for DNA sequencing and analysis of gene expression have led to exponential growth in the amount of publicly available genomic data. With the increased availability of genomic data, traditional database approaches are no longer sufficient for rapidly performing life-science queries involving the fusion of data types. Computing systems are now so powerful that researchers can consider modeling the folding of a protein or even simulating an entire human body. This paper emphasizes computational biology's growing need for high-performance computing and Big Data, and argues that they are indispensable in meeting the scientific and engineering challenges of the twenty-first century. It shows how protein folding (the structure and function of proteins) and phylogeny reconstruction (the evolutionary history of a group of genes) can use HPC that provides sufficient capability for evaluating or solving more limited but meaningful instances. The paper also indicates solutions to optimization problems and the benefits of Big Data for computational biology, and illustrates the current state of the art and future generations of HPC computing with Big Data in biology.
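As a small illustration of why an apparently benign polynomial-time task such as an all-to-all comparison becomes an HPC problem at scale, the sketch below computes pairwise sequence identity over a toy set and distributes the pairs across worker processes; the sequences are hypothetical.

```python
from itertools import combinations
from multiprocessing import Pool

def identity(pair):
    """Fraction of matching positions between two equal-length sequences."""
    a, b = pair
    return sum(x == y for x, y in zip(a, b)) / len(a)

if __name__ == "__main__":
    # Hypothetical toy sequences; real datasets make the O(n^2) pair count explode,
    # which is why all-to-all comparisons are routinely distributed on HPC clusters.
    seqs = ["ACGTACGT", "ACGTTCGT", "TTGTACGA", "ACGAACGT"]
    pairs = list(combinations(seqs, 2))
    with Pool() as pool:
        scores = pool.map(identity, pairs)
    for (a, b), s in zip(pairs, scores):
        print(a, b, round(s, 2))
```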

Keywords: high performance, big data, parallel computation, molecular data, computational biology

Procedia PDF Downloads 363
387 Selection of Suitable Reference Genes for Assessing Endurance Related Traits in a Native Pony Breed of Zanskar at High Altitude

Authors: Prince Vivek, Vijay K. Bharti, Manishi Mukesh, Ankita Sharma, Om Prakash Chaurasia, Bhuvnesh Kumar

Abstract:

High endurance performance in equids requires adaptive changes involving physio-biochemical and molecular responses in an attempt to regain homeostasis. We hypothesized that the identification of suitable reference genes might be considered for assessing endurance-related traits in ponies at high altitude and may help in selecting individuals with potent endurance traits. A total of 12 pony mares of the Zanskar breed were divided into three groups, group A (no load), group B (60 kg backpack load), and group C (80 kg backpack load), and subjected to a load-carrying protocol on a steep 4 km uphill climb over a gravelly, uneven, rocky track from an altitude of 3292 m to 3500 m (endpoint). Blood was collected before and immediately after the load carry into sodium heparin anticoagulant, and peripheral blood mononuclear cells were separated for total RNA isolation and subsequent cDNA synthesis. Real-time PCR reactions were carried out to evaluate the mRNA expression profiles of a panel of putative internal control genes (ICGs) belonging to different functional classes, namely glyceraldehyde 3-phosphate dehydrogenase (GAPDH), β₂ microglobulin (β₂M), β-actin (ACTB), ribosomal protein S18 (RS18), hypoxanthine-guanine phosphoribosyltransferase (HPRT), ubiquitin B (UBB), ribosomal protein L32 (RPL32), transferrin receptor protein (TFRC), and succinate dehydrogenase complex subunit A (SDHA), for normalizing the real-time quantitative polymerase chain reaction (qPCR) data of the native ponies. Three different algorithms, geNorm, NormFinder, and BestKeeper, were used to evaluate the stability of the reference genes. The results showed that GAPDH was the most stable gene, and the best combination of two genes was TFRC and β₂M. In conclusion, the geometric mean of GAPDH, TFRC, and β₂M might be used for accurate normalization of transcriptional data for assessing endurance-related traits in Zanskar ponies during load carrying.
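As a small illustration of normalizing a target gene against the geometric mean of the recommended reference genes, the sketch below works from hypothetical Cq values; the 2^-Cq relative-quantity convention (which assumes near-100% PCR efficiency) and all numbers are assumptions for the example.

```python
from statistics import geometric_mean  # Python 3.8+

def normalized_expression(target_cq, reference_cqs):
    """Express a target gene relative to the geometric mean of reference genes.
    Uses relative quantities 2^-Cq (assumed ~100% PCR efficiency), then divides
    by the geometric mean of the reference-gene quantities (geNorm-style factor)."""
    target_q = 2.0 ** (-target_cq)
    norm_factor = geometric_mean([2.0 ** (-cq) for cq in reference_cqs])
    return target_q / norm_factor

# Hypothetical Cq values for one sample
references = {"GAPDH": 18.2, "TFRC": 22.5, "B2M": 20.1}
target_cq = 25.4  # hypothetical endurance-related target gene

rel = normalized_expression(target_cq, references.values())
print(f"normalized relative expression: {rel:.3e}")
```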

Keywords: endurance exercise, ubiquitin B (UBB), β₂ microglobulin (β₂M), high altitude, Zanskar ponies, reference gene

Procedia PDF Downloads 131
386 Exploring the Impact of Input Sequence Lengths on Long Short-Term Memory-Based Streamflow Prediction in Flashy Catchments

Authors: Farzad Hosseini Hossein Abadi, Cristina Prieto Sierra, Cesar Álvarez Díaz

Abstract:

Predicting streamflow accurately in flashy catchments prone to floods is a major research and operational challenge in hydrological modeling. Recent advancements in deep learning, particularly Long Short-Term Memory (LSTM) networks, have shown promise in achieving accurate hydrological predictions at daily and hourly time scales. In this work, a multi-timescale LSTM (MTS-LSTM) network was applied to regional hydrological prediction at an hourly time scale in flashy catchments. The case study includes 40 catchments located in the Basque Country, in the north of Spain. We explore the impact of hyperparameters on the performance of streamflow predictions produced by regional deep learning models through systematic hyperparameter tuning, identifying optimal regional values for different catchments. The results show that predictions are highly accurate, with Nash-Sutcliffe Efficiency (NSE) and Kling-Gupta Efficiency (KGE) values as high as 0.98 and 0.97, respectively. A principal component analysis reveals that the hyperparameter controlling the length of the input sequence contributes most significantly to prediction performance. The findings suggest that the input sequence length has a crucial impact on model performance. Moreover, catchment-scale analysis reveals distinct optimal sequence lengths for individual basins, highlighting the need to customize this hyperparameter to each catchment's characteristics, in line with the well-known "uniqueness of place" paradigm. In prior research, tuning the length of the LSTM input sequence has received limited attention in streamflow prediction: it was initially set to 365 days to capture a full annual water cycle, and later limited systematic tuning using grid search suggested 270 days. Despite the significance of this hyperparameter for hydrological prediction, most studies have overlooked its tuning and fixed it at 365 days. This study, employing a simultaneous systematic hyperparameter tuning approach, emphasizes the critical role of the input sequence length as an influential hyperparameter when configuring LSTMs for regional streamflow prediction. Proper tuning of this hyperparameter is essential for achieving accurate hourly predictions with deep learning models.
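
The sketch below (a minimal illustration under stated assumptions, not the MTS-LSTM used in the study) shows where the input sequence length enters an hourly LSTM setup: it fixes how many past time steps are packed into each training window, and therefore both the shape of the training data and what the network can learn. The synthetic series, layer sizes and candidate window lengths are assumptions made for the example.

```python
# Minimal sketch of how the input sequence length shapes LSTM training windows.
# The synthetic series, layer sizes and window lengths are illustrative only.
import numpy as np
import torch
import torch.nn as nn

def make_windows(series, seq_len):
    """Slice a 1-D series into (samples, seq_len, 1) windows and next-step targets."""
    X = np.stack([series[i:i + seq_len] for i in range(len(series) - seq_len)])
    y = series[seq_len:]
    return (torch.tensor(X, dtype=torch.float32).unsqueeze(-1),
            torch.tensor(y, dtype=torch.float32))

class StreamflowLSTM(nn.Module):
    def __init__(self, hidden=32):
        super().__init__()
        self.lstm = nn.LSTM(input_size=1, hidden_size=hidden, batch_first=True)
        self.head = nn.Linear(hidden, 1)

    def forward(self, x):
        out, _ = self.lstm(x)
        return self.head(out[:, -1]).squeeze(-1)   # predict from the last time step

series = np.sin(np.linspace(0, 50, 2000))           # stand-in for an hourly streamflow record
for seq_len in (24, 168, 720):                       # candidate sequence lengths, in hours
    X, y = make_windows(series, seq_len)
    model = StreamflowLSTM()
    loss = nn.MSELoss()(model(X[:64]), y[:64])       # one untrained forward pass
    print(f"seq_len={seq_len}: batch shape {tuple(X.shape)}, loss {loss.item():.3f}")
```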

Keywords: LSTMs, streamflow, hyperparameters, hydrology

Procedia PDF Downloads 69
385 Randomized Controlled Trial for the Management of Pain and Anxiety Using Virtual Reality During the Care of Older Hospitalized Patients

Authors: Corbel Camille, Le Cerf Flora, Capriz Françoise, Vaillant-Ciszewicz Anne-Julie, Breaud Jean, Guerin Olivier, Corveleyn Xavier

Abstract:

Background: The medical environment can generate stressful and anxiety-provoking situations for patients, particularly during painful care procedures in the older population. These stressful environments have deleterious effects on the quality of care, can put the patient at risk, and set the care team up for failure. The search for a solution is therefore imperative, and the development of new technologies, such as virtual reality (VR), appears to offer an answer to this problem. Objectives: The objective of this study is to compare the effects of virtual reality on pain and anxiety during the care of older hospitalized people with the effects of usual care. More precisely, different individual factors (age, cognitive level, individual preferences, etc.) and different virtual reality universes (personalized or non-personalized) are studied to understand their role in reducing pain and anxiety during care procedures. The aim is to improve the quality of life of patients and of caregivers in their work environment. Method: This single-center, randomized, controlled study was conducted from September 2023 to September 2024 on 120 participants recruited from the geriatric departments of the Cimiez Hospital, Nice, France. Participants were randomized into three groups: a control group, a personalized VR group and a non-personalized VR group. Each participant was followed during a painful care session. Data were collected before, during and after the care, using measures of pain (Algoplus and a numerical scale) and anxiety (Hospital Anxiety Scale and a numerical scale). Physiological assessments with an oximeter were also performed to collect heart rate and respiratory rate measurements. The care sessions were also assessed among healthcare providers to evaluate their effects on the difficulty of, and fatigue associated with, the care. Additionally, a questionnaire (System Usability Scale) will be administered at the conclusion of the study to determine the willingness of healthcare providers to integrate VR into their daily care practices. Results: The preliminary results indicate significant effects on anxiety (p = .001) and pain (p < .001) following the VR intervention during care, compared with the control group. Conclusion: The preliminary results suggest that the VR intervention is a suitable and effective method for reducing anxiety and pain among older hospitalized individuals compared with standard care. Finally, the experiences of the healthcare professionals involved will also be considered to assess the impact of these interventions on working conditions and patient support.

Keywords: anxiety, care, pain, older adults, virtual reality

Procedia PDF Downloads 73
384 Barrier Membrane Influence on the Histology of Guided Bone Regeneration: A Systematic Review and Meta-Analysis

Authors: Laura Canagueral-Pellice, Antonio Munar-Frau, Adaia Valls-Ontanon, Joao Carames, Federico Hernandez-Alfaro, Jordi Caballe-Serrano

Abstract:

Objective: Guided bone regeneration (GBR) aims to replace missing bone with a new structure to achieve long-term stability of rehabilitations. The aim of the present systematic review and meta-analysis was to determine the effect of barrier membranes on histological outcomes after GBR procedures. The effects of the grafting material and of tissue gain were also analyzed. Materials and methods: Two independent reviewers performed an electronic search in PubMed and Scopus, identifying all eligible publications up to March 2020. Only randomized controlled trials (RCTs) providing a histological analysis of the augmented areas were included. Results: A total of 6 publications were included in the present systematic review. A total of 110 biopsied sites were analyzed; 10 corresponded to vertical bone augmentation procedures, whereas 100 corresponded to horizontal regeneration procedures. A mean tissue gain of 3 ± 1.48 mm was obtained for horizontal defects. Histological assessment of new bone formation, residual particles and sub-epithelial connective tissue (SCT) was reported. The four main barrier membranes used were natural collagen membranes, e-PTFE membranes, polylactic resorbable membranes and acellular dermal matrix membranes (AMDG). The analysis demonstrated that resorbable membranes result in higher values of new bone formation and lower values of residual particles and SCT. Xenografts resulted in lower new bone formation compared with allografts; however, no statistically significant differences were observed regarding residual particles and SCT. Overall, regeneration procedures adding autogenous bone, plasma derivatives or growth factors generally achieved greater new bone formation and tissue gain. Conclusions: There is limited evidence favoring the effect of a particular type of barrier membrane in GBR. The data need to be evaluated carefully; however, resorbable membranes are associated with greater new bone formation, especially when combined with allograft materials and/or the addition of autogenous bone, platelet-rich plasma (PRP) or growth factors in the regeneration area. More studies assessing the histological outcomes of different GBR protocols and procedures testing different biomaterials are needed to maximize the clinical and histological outcomes of bone regeneration.

Keywords: barrier membrane, graft material, guided bone regeneration, implant surgery, histology

Procedia PDF Downloads 152
383 Occupational Safety and Health in the Wake of Drones

Authors: Hoda Rahmani, Gary Weckman

Abstract:

The body of research examining the integration of drones into various industries is expanding rapidly. Despite progress in addressing the cybersecurity concerns of commercial drones, knowledge deficits remain in determining the potential occupational hazards and risks that drone use poses to employees' well-being and health in the workplace. This creates difficulty in identifying key approaches to risk mitigation and thus reflects the need to raise awareness among employers, safety professionals and policymakers about workplace drone-related accidents. The purpose of this study is to investigate the prevalence of, and possible risk factors for, drone-related mishaps by comparing the application of drones in the construction and manufacturing industries. The chief reason for considering these specific sectors is to ascertain whether there is any significant difference between indoor and outdoor flights, since most construction sites use drones outdoors whereas manufacturing facilities typically use them indoors. The current research therefore seeks to examine the causes and patterns of workplace drone-related mishaps and to suggest possible ergonomic interventions through data collection. Potential ergonomic practices to mitigate hazards associated with flying drones include providing operators with professional training, conducting risk analyses, and promoting the use of personal protective equipment. For data analysis, two data mining techniques, the random forest and association rule mining algorithms, will be applied to find meaningful associations and trends in the data, as well as the influential features that affect the occurrence of drone-related accidents in the construction and manufacturing sectors. In addition, Spearman's correlation and chi-square tests will be used to measure possible correlations between different variables. Indeed, by recognizing risks and hazards, occupational safety stakeholders will be able to pursue data-driven and evidence-based policy change with the aim of reducing drone mishaps, increasing productivity, creating a safer work environment, and extending human performance in safe and fulfilling ways. This research study was supported by the National Institute for Occupational Safety and Health through the Pilot Research Project Training Program of the University of Cincinnati Education and Research Center Grant #T42OH008432.
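
As a hedged sketch of two of the analyses named above (not the study's actual pipeline or data), the example below fits a random forest to rank candidate risk factors for drone mishaps and runs a chi-square test of independence between industry sector and mishap occurrence. The incident records, variable names and coding are fabricated placeholders.

```python
# Illustrative sketch of a random-forest feature ranking and a chi-square test,
# using fabricated incident records; not the study's data or implementation.
import pandas as pd
from scipy.stats import chi2_contingency
from sklearn.ensemble import RandomForestClassifier

# Hypothetical incident records: sector (0=construction, 1=manufacturing),
# indoor flight flag, operator training flag, and whether a mishap occurred.
data = pd.DataFrame({
    "sector":  [0, 0, 0, 1, 1, 1, 0, 1, 0, 1, 0, 1],
    "indoor":  [0, 0, 1, 1, 1, 0, 0, 1, 0, 1, 1, 1],
    "trained": [1, 0, 1, 1, 0, 1, 0, 1, 1, 0, 0, 1],
    "mishap":  [0, 1, 0, 0, 1, 0, 1, 0, 0, 1, 1, 0],
})

X, y = data[["sector", "indoor", "trained"]], data["mishap"]
rf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X, y)
print("Feature importances:", dict(zip(X.columns, rf.feature_importances_.round(2))))

# Chi-square test: is mishap occurrence independent of industry sector?
table = pd.crosstab(data["sector"], data["mishap"])
chi2, p, dof, _ = chi2_contingency(table)
print(f"chi2={chi2:.2f}, p={p:.3f}")
```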

Keywords: commercial drones, ergonomic interventions, occupational safety, pattern recognition

Procedia PDF Downloads 209
382 Developing Communicative Skills in Foreign Languages by Video Tasks

Authors: Ekaterina G. Lipatova

Abstract:

The developmental potential of video tasks in teaching foreign languages lies in the opportunity to improve four aspects of language use: listening, reading, speaking and writing. A video presents a sequence of actions realized in logically connected pictures and a verbalized speech flow, which simplifies and stimulates the process of perception. In this way, students' listening skills are developed effectively, along with intellectual abilities such as synthesizing, analyzing and generalizing information. In terms of teaching capacity, a video task is, in our opinion, more stimulating than a traditional listening exercise, since it draws the student into the plot of the communicative situation and its emotional background and potentially makes them react to the gist in cognitive and communicative ways. To be an effective teaching method, a video task should be structured according to the psycholinguistic characteristics of the speech production process; in other words, it should include three phases: before watching, while watching and after watching. The tasks provided for each phase might involve situations for reflecting on the video content in the form of gap-filling tasks, multiple choice and true-or-false tasks (reading skills), and exercises on expressing an opinion or completing a project (writing and speaking skills). In the before-watching phase, we ask students to adjust their perception mechanism to the topic and the problem of the chosen video with questions such as "What do you know about this problem?", "Is it new for you?" and "Have you ever faced the situation of…?". We then proceed with the lexical and grammatical analysis of the language units that form the body of the speech sample, to ease perception and develop the students' lexicon. The goal of the while-watching phase is to build the students' awareness of the problem presented in the video and to challenge their inner attitude towards what they have seen, by identifying mistakes in statements about the video content or by making a summary and justifying their understanding. Finally, we move on to the development of speech skills within the communicative situation they have observed and learnt, by stimulating them to search for similar ideas in their own backgrounds and present them orally or in written form, or to express their own opinion on the problem. It is important to highlight that a video task should contain a current, valid and interesting event related to the students' future profession, since this helps to activate their cognitive, emotional, verbal and ethical capacities. Also, logically structured video tasks are easily integrated into e-learning systems and give students the opportunity to work with the foreign language on their own.

Keywords: communicative situation, perception mechanism, speech production process, speech skills

Procedia PDF Downloads 245
381 Institutional Cooperation to Foster Economic Development: Universities and Social Enterprises

Authors: Khrystyna Pavlyk

Abstract:

In the OECD countries, the percentage of adults with higher education degrees increased by 10% between 2000 and 2010. Continuously increasing demand for higher education gives universities a chance to become key players in the socio-economic development of a territory (region or city) via knowledge creation, knowledge transfer and knowledge spillovers. During the previous decade, universities have tried to support spin-offs and start-ups and have introduced courses on sustainability and corporate social responsibility. While much has been done, new trends are starting to emerge in the search for better approaches. Recently, a number of universities have created centers that conduct research in the field of social entrepreneurship, which in turn underpins educational programs run at these universities. The list includes, but is not limited to, the Centre for Social Economy at the University of Liège, the Institute for Social Innovation at ESADE, the Skoll Centre for Social Entrepreneurship at Oxford, the Centre for Social Entrepreneurship at Roskilde, and the Social Entrepreneurship Initiative at INSEAD. The existing literature has already examined social entrepreneurship centers in terms of their position in the institutional structure, initial and additional funding, teaching initiatives, research achievements and outreach activities. At the same time, universities can become social enterprises themselves. Previous research has revealed that universities use both business and social entrepreneurship models, and that universities mainly driven by a social mission are more likely to transform into social entrepreneurial institutions. However, there is currently no clear understanding of what social entrepreneurship in higher education is about, and thus it needs to be studied and promoted at the same time. The main roles that a socially oriented university can play in city development include: buyer (socially focused local procurement programs create partnerships focused on local sustainable growth); seller (centers created by universities can sell socially oriented goods and services, e.g. consultancy); employer (universities can employ socially vulnerable groups); and business incubator (helping current students to start their own social enterprises). In the paper, we analyze these roles in more detail. We also examine a number of indicators that can be used to assess the impact, both direct and indirect, that universities can have on a city's economy. The originality of this paper lies not so much in the methodological approaches used as in the countries evaluated. Social entrepreneurship is still treated as a relatively new phenomenon in post-transitional countries, where social services were provided only by the state for many decades. The paper provides data and examples both from developed countries (the US and the EU) and from the CIS and CEE region.

Keywords: social enterprise, university, regional economic development, comparative study

Procedia PDF Downloads 254
380 Designing Form, Meanings, and Relationships for Future Industrial Products: A Case Study Observation of PAD

Authors: Elisabetta Cianfanelli, Margherita Tufarelli, Paolo Pupparo

Abstract:

The dialectical mediation between desires and objects, or between mass production and consumption, continues to evolve over time. This relationship is influenced both by the variable geometries of contexts far removed from the mere design of product form and by aspects rooted in the very definition of industrial design. In particular, the overcoming of macro-areas of innovation in the technological, social, cultural, formal and morphological spheres, supported by recent theories in critical and speculative design, seems to be moving further and further away from the design of the formal dimension of advanced products. The articulated fabric of theories and practices that feeds the definition of "hyperobjects", rather than objects, describes a tension common to all areas of design and production of industrial products. The latter are increasingly detached from the design of their form and meaning in mass production, and thus lose the quality of products capable of social transformation. For years we have been living through a transformative moment in the design process that defines the industrial product. We face a dichotomy between, on the one hand, a reactionary aversion to new techniques of industrial production and, on the other, a sterile adoption of mass-production techniques that can now be considered traditional. This ambiguity becomes even more evident when we talk about industrial products and realize that we are moving further and further away from the concept of "form" as the synthesis of design thinking aimed at the aesthetic-emotional component as well as the functional one. The design of forms and their contents, as statutes of social acts, allows us to investigate the tension around mass production that crosses seasons, trends, technicalities and sterile determinisms. Design culture has always determined the formal qualities of objects as a sum of aesthetic characteristics and of functional and structural relationships that define a product as a coherent unit. This contribution proposes a reflection and a series of practical research experiences on the form of advanced products, understood as a kaleidoscope of relationships: the search for an identity, the desire for democratization and, between these two, the exploration of the aesthetic factor. The study of form also corresponds to the study of production processes, technological innovations, the definition of standards, distribution, advertising, and the vicissitudes of taste and lifestyles. Specifically, we investigate how the genesis of new forms for new meanings introduces a change in the related innovative production techniques. It therefore becomes fundamental to investigate, through the reflections and case studies presented in this contribution, the new techniques for producing and elaborating product forms, as a new immanent and determining element within the design process.

Keywords: industrial design, product advanced design, mass productions, new meanings

Procedia PDF Downloads 121
379 The Prediction of Reflection Noise and Its Reduction by Shaped Noise Barriers

Authors: I. L. Kim, J. Y. Lee, A. K. Tekile

Abstract:

As a consequence of Korea's very high urbanization rate, the number of traffic noise damage cases in areas congested with population and facilities is steadily increasing. Current environmental noise data for the country's major cities show that noise levels exceed the standards set for both daytime and nighttime. This research is a comparative analysis in search of the optimal soundproof panel shape and design factors that can minimize reflection noise. In addition to the normal flat-type panel shape, the reflection noise reduction of swelling-type, combined swelling-and-curved-type, and screen-type panels was evaluated. The noise source model Nord 2000, which often provides more detailed information than models developed for similar purposes, was used to determine the overall noise level. Based on the vehicle categorization used in Korea, noise levels for varying frequencies and for different sound source heights (the directivity heights of the Harmonize model) were calculated for the simulation. Each simulation was made using the ray-tracing method. The noise level was also calculated with the noise prediction program SoundPlan 7.2 for comparison. Noise levels were predicted at receiver points 15 m (R1) and 30 m (R2) from the road and at the middle of the road at a height of 2 m (R3). When the noise barriers were designed by shape and the prediction was run with the noise source placed on the 2nd of the 6 lanes considered, on the noise barrier side, the reflection noise slightly decreased or increased for all noise barriers. At R1, the screen-type noise barriers in particular showed no reduction effect under any condition. The swelling-type, however, showed a decrease of 0.7-1.2 dB at R1, the best reduction among the tested noise barriers. Compared with the other forms of noise barrier, the swelling-type was therefore considered the most suitable for reducing reflection noise; however, since a slight increase was predicted at R2, further research based on a more refined categorization of the related design factors is necessary. Moreover, because swellings are difficult to manufacture and the modules are smaller than other panels, swelling-type noise barriers are challenging to install. If these problems are solved, their range of application will be no more limited than that of other types of noise barriers. Hence, when a swelling-type noise barrier is installed in a downtown area where traffic is increasing every day, it will both preserve visibility through the transparent walls and diminish the noise pollution caused by reflection. Moreover, when decorated with shapes and designs, noise barriers achieve greater visual attraction than a flat-type barrier and thus alleviate the psychological burden of noise, in addition to the physical soundproofing function of the panels.
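
To make the notion of an "overall noise level" at a receiver concrete, here is a minimal sketch (illustrative only, not the Nord 2000 or SoundPlan implementation): sound pressure levels from the direct path and from a barrier reflection are combined on an energy basis, so a modest reduction of the reflected contribution changes the total by only a fraction of a decibel. All level values are hypothetical.

```python
# Minimal sketch of energy-based summation of sound pressure levels at a receiver;
# the level values are hypothetical and not taken from the study.
import math

def combine_levels(levels_db):
    """Energy sum of sound pressure levels: L = 10*log10(sum 10^(Li/10))."""
    return 10.0 * math.log10(sum(10.0 ** (l / 10.0) for l in levels_db))

direct = 62.0           # hypothetical direct traffic noise at receiver R1, dB(A)
reflected_flat = 55.0   # hypothetical reflection from a flat-type barrier
reflected_swell = 53.8  # hypothetical reflection from a swelling-type barrier

print("Flat-type:     %.1f dB(A)" % combine_levels([direct, reflected_flat]))
print("Swelling-type: %.1f dB(A)" % combine_levels([direct, reflected_swell]))
```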

Keywords: reflection noise, shaped noise barriers, sound proof panel, traffic noise

Procedia PDF Downloads 509
378 The Association between Prior Antibiotic Use and Subsequent Risk of Infectious Disease: A Systematic Review

Authors: Umer Malik, David Armstrong, Mark Ashworth, Alex Dregan, Veline L'Esperance, Lucy McDonnell, Mariam Molokhia, Patrick White

Abstract:

Introduction: The microbiota lining epithelial surfaces is thought to play an important role in many human physiological functions including defense against pathogens and modulation of immune response. The microbiota is susceptible to disruption from external influences such as exposure to antibiotic medication. It is thought that antibiotic-induced disruption of the microbiota could predispose to pathogen overgrowth and invasion. We hypothesized that antibiotic use would be associated with increased risk of future infections. We carried out a systematic review of evidence of associations between antibiotic use and subsequent risk of community-acquired infections. Methods: We conducted a review of the literature for observational studies assessing the association between antibiotic use and subsequent community-acquired infection. Eligible studies were published before April 29th, 2016. We searched MEDLINE, EMBASE, and Web of Science and screened titles and abstracts using a predefined search strategy. Infections caused by Clostridium difficile, drug-resistant organisms and fungal organisms were excluded as their association with prior antibiotic use has been examined in previous systematic reviews. Results: Eighteen out of 21,518 retrieved studies met the inclusion criteria. The association between past antibiotic exposure and subsequent increased risk of infection was reported in 16 studies, including one study on Campylobacter jejuni infection (Odds Ratio [OR] 3.3), two on typhoid fever (ORs 5.7 and 12.2), one on Staphylococcus aureus skin infection (OR 2.9), one on invasive pneumococcal disease (OR 1.57), one on recurrent furunculosis (OR 16.6), one on recurrent boils and abscesses (Risk ratio 1.4), one on upper respiratory tract infection (OR 2.3) and urinary tract infection (OR 1.1), one on invasive Haemophilus influenzae type b (Hib) infection (OR 1.51), one on infectious mastitis (OR 5.38), one on meningitis (OR 2.04) and five on Salmonella enteric infection (ORs 1.4, 1.59, 1.9, 2.3 and 3.8). The effect size in three studies on Salmonella enteric infection was of marginal statistical significance. A further two studies on Salmonella infection did not demonstrate a statistically significant association between prior antibiotic exposure and subsequent infection. Conclusion: We have found an association between past antibiotic exposure and subsequent risk of a diverse range of infections in the community setting. Our findings provide evidence to support the hypothesis that prior antibiotic usage may predispose to future infection risk, possibly through antibiotic-induced alteration of the microbiota. The findings add further weight to calls to minimize inappropriate antibiotic prescriptions.
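
For readers less familiar with the effect measure quoted throughout, here is a small illustrative calculation (with invented counts, not data from any included study) of how an odds ratio and its 95% confidence interval are obtained from a 2×2 table of prior antibiotic exposure versus subsequent infection.

```python
# Illustrative odds ratio from a 2x2 exposure-by-outcome table, with a Wald 95% CI;
# the counts are invented for the example.
import math

a, b = 40, 60    # infected cases: exposed / not exposed to prior antibiotics
c, d = 20, 80    # controls:       exposed / not exposed

odds_ratio = (a * d) / (b * c)
se_log_or = math.sqrt(1/a + 1/b + 1/c + 1/d)             # standard error of ln(OR)
lo = math.exp(math.log(odds_ratio) - 1.96 * se_log_or)    # lower 95% CI bound
hi = math.exp(math.log(odds_ratio) + 1.96 * se_log_or)    # upper 95% CI bound
print(f"OR = {odds_ratio:.2f} (95% CI {lo:.2f}-{hi:.2f})")
```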

Keywords: antibiotic, infection, risk factor, side effect

Procedia PDF Downloads 224
377 Telemedicine Services in Ophthalmology: A Review of Studies

Authors: Nasim Hashemi, Abbas Sheikhtaheri

Abstract:

Telemedicine is the use of telecommunication and information technologies to provide health care services, which would often not be consistently available otherwise, to people in distant rural and remote areas. Teleophthalmology is a branch of telemedicine that delivers eye care through digital medical equipment and telecommunications technology. Teleophthalmology can thus overcome geographical barriers and improve the quality, accessibility and affordability of eye health care services. Since teleophthalmology has been widely applied in recent years, the aim of this study was to determine its different applications around the world. To this end, three bibliographic databases (Medline, ScienceDirect, Scopus) were comprehensively searched with these keywords: eye care, eye health care, primary eye care, diagnosis, detection, and screening of different eye diseases, in conjunction with telemedicine, telehealth, teleophthalmology, e-services, and information technology. All types of papers were included in the study with no time restriction, and the search covered publications up to 2015. Finally, 70 articles were reviewed. We classified the results based on the type of eye problems covered and the type of telemedicine services. Based on the review, from the perspective of health care levels, there are three levels of eye health care: primary, secondary and tertiary eye care. From the perspective of eye care services, the main application of teleophthalmology in primary eye care was the diagnosis of different eye diseases such as diabetic retinopathy, macular edema, strabismus and age-related macular degeneration. The main application of teleophthalmology in secondary and tertiary eye care was the screening of eye problems, i.e. diabetic retinopathy, astigmatism and glaucoma screening. Teleconsultation between health care providers and ophthalmologists, as well as education and training sessions for patients, were other types of teleophthalmology services worldwide. Real-time, store-and-forward and hybrid methods were the main forms of communication from the perspective of the teleophthalmology mode, chosen according to the IT infrastructure between the sending and receiving centers. From the specialists' perspective, early detection of serious age-related ophthalmic disease in the population, screening of eye disease processes, consultation in emergency cases and comprehensive eye examination were the most important benefits of teleophthalmology. The cost-effectiveness of teleophthalmology projects, resulting from reduced transportation and accommodation costs, access to affordable eye care services and access to specialist opinions, was also a main advantage for patients. Teleophthalmology brings valuable secondary and tertiary care to remote areas, so applying it for detection, treatment and screening purposes, and expanding its use in new applications such as eye surgery, will be a key tool for promoting public health and integrating eye care into primary health care.

Keywords: applications, telehealth, telemedicine, teleophthalmology

Procedia PDF Downloads 374