Search results for: performing art
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 1118

368 Building Atmospheric Moisture Diagnostics: Environmental Monitoring and Data Collection

Authors: Paula Lopez-Arce, Hector Altamirano, Dimitrios Rovas, James Berry, Bryan Hindle, Steven Hodgson

Abstract:

Accurate moisture diagnostics of the conditions leading to condensation and mould growth in dwellings, and efficient mould remediation, remain largely untapped. A number of factors contribute to the rising trend of excessive moisture in homes, mainly linked to modern living, increased levels of occupation, and rising fuel costs, as well as efforts to make homes more energy efficient. Environmental monitoring, by means of data collection through logger sensors and survey forms, has been performed in a range of buildings from different UK regions. Air and surface temperature and relative humidity values of residential areas affected by condensation and/or mould issues were recorded. Additional measurements were taken through different trials changing the type, location, and position of the loggers. In some instances, IR thermal images and ventilation rates were also acquired. Results have been interpreted together with key environmental parameters by processing and connecting data from loggers and survey questionnaires, in buildings both with and without moisture issues. Monitoring exercises carried out during winter and spring show the importance of developing and following accurate protocols to obtain consistent, repeatable, and comparable results and to improve the performance of environmental monitoring. A model and a protocol are being developed to build a diagnostic tool with the goal of performing simple but precise residential atmospheric moisture diagnostics that distinguish the cause of condensation and mould generation, i.e., a ventilation, insulation, or heating-system issue. This research shows the relevance of monitoring and processing environmental data to assign moisture risk levels and determine the origin of condensation or mould when dealing with excess atmospheric moisture in a building.
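The abstract does not give the diagnostic model itself; as an illustrative sketch only, condensation risk from paired temperature/humidity logger readings is commonly screened by comparing the surface temperature against the dew point, here approximated with the Magnus formula (the coefficients a = 17.62, b = 243.12 °C are standard values, assumed rather than taken from the study):

```python
import math

def dew_point(temp_c, rh_percent):
    """Dew-point temperature (deg C) from air temperature and relative
    humidity, using the Magnus approximation (a=17.62, b=243.12 degC)."""
    a, b = 17.62, 243.12
    gamma = math.log(rh_percent / 100.0) + a * temp_c / (b + temp_c)
    return b * gamma / (a - gamma)

def condensation_risk(surface_temp_c, air_temp_c, rh_percent):
    """Flag condensation risk when the surface is colder than the dew point."""
    return surface_temp_c < dew_point(air_temp_c, rh_percent)

# Example logger reading: 20 degC air, 65% RH, 12 degC wall surface
print(round(dew_point(20.0, 65.0), 1))      # dew point around 13.2 degC
print(condensation_risk(12.0, 20.0, 65.0))  # True: surface below dew point
```

A protocol like the one described would apply such a check per room and per logging interval before assigning a moisture risk level.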

Keywords: environmental monitoring, atmospheric moisture, protocols, mould

Procedia PDF Downloads 116
367 Simplified Measurement of Occupational Energy Expenditure

Authors: J. Wicks

Abstract:

Aim: To develop a simple methodology that allows heart rate (HR) data collected from inexpensive wearable devices to be expressed in a suitable format (METs) to quantify occupational (and recreational) activity. Introduction: Assessment of occupational activity is commonly done using questionnaires in combination with prescribed MET levels for a vast range of previously measured activities. However, for any individual, the intensity of performing a specific activity can vary significantly, so objective measurement of individual activity is preferred. Although there is a wide range of HR recording devices, there is a distinct lack of methodology for processing the collected data to quantify energy expenditure (EE). The HR index equation expresses METs in relation to relative HR, i.e., the ratio of activity HR to resting HR, and thus provides a simple utility for objective measurement of EE. Methods: During a typical occupational work period of approximately 8 hours, HR data were recorded using a Polar RS 400 wrist monitor. The recorded data were downloaded to a Windows PC, and non-HR data were stripped from the ASCII file using Notepad. The HR data were then exported to a spreadsheet program and sorted by HR range into a histogram format. Three HRs were determined: a resting HR (the HR delimiting the lowest 30 minutes of recorded data), a mean HR, and a peak HR (the HR delimiting the highest 30 minutes of recorded data). HR indices were calculated (mean index equals mean HR/rest HR; peak index equals peak HR/rest HR), and the mean and peak indices were converted to METs using the HR index equation. Conclusion: Inexpensive HR recording devices can be used to make reasonable estimates of occupational (or recreational) EE suitable for large-scale demographic screening by means of the HR index equation. The intrinsic value of the HR index equation is that it is independent of factors that influence absolute HR, namely fitness, smoking, and beta-blockade.
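The HR index equation is not reproduced in the abstract; a minimal sketch of the described pipeline, assuming the published form METs = 6 × HRindex − 5 (Wicks et al.) and per-minute HR samples, could look like this:

```python
def hr_summary(minute_hrs, window=30):
    """Summarise per-minute HR samples as described: rest HR delimits the
    lowest `window` minutes, peak HR delimits the highest `window` minutes."""
    s = sorted(minute_hrs)
    rest = s[window - 1]   # upper bound of the lowest 30 minutes
    peak = s[-window]      # lower bound of the highest 30 minutes
    mean = sum(s) / len(s)
    return rest, mean, peak

def mets_from_index(hr_index):
    """HR index equation (Wicks et al.): METs = 6 * HRindex - 5."""
    return 6.0 * hr_index - 5.0

# Simulated 8-hour shift: 30 min rest, 7 h light work, 30 min heavy work
hrs = [60] * 30 + [90] * 420 + [120] * 30
rest, mean, peak = hr_summary(hrs)
print(mets_from_index(mean / rest))  # mean-index METs: 4.0
print(mets_from_index(peak / rest))  # peak-index METs: 7.0
```

The simulated HR series and the 30-minute delimiting rule are illustrative assumptions consistent with the abstract's description, not the study's data.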

Keywords: energy expenditure, heart rate histograms, heart rate index, occupational activity

Procedia PDF Downloads 273
366 Which Mechanisms Are Involved in Legume-Rhizobia Symbiosis to Increase Its Phosphorus Use Efficiency under Low Phosphorus Levels?

Authors: B. Makoudi, R. Ghanimi, A. Bargaz, M. Mouradi, M. Farissi, A. Kabbaj, J. J. Drevon, C. Ghoulam

Abstract:

Legume species are able to establish a nitrogen-fixing symbiosis with soil rhizobia that, when it operates normally, allows them to meet their nitrogen requirements. This biological process needs a high phosphorus (P) supply and is consequently limited under low P availability. To overcome this constraint, the legume-rhizobia symbiosis develops several mechanisms to increase P availability in the rhizosphere and the efficiency of P fertilizers. The objectives of our research are to understand the physiological and biochemical mechanisms implemented by the legume-rhizobia symbiosis to increase its P use efficiency (PUE), in order to select the legume genotype-rhizobia strain combinations that perform best for biological nitrogen fixation (BNF) under P deficiency. Our studies were carried out on two grain legume species, common bean (Phaseolus vulgaris) and faba bean (Vicia faba), tested in farmers' fields and at an experimental station under two soil phosphorus levels. Under field conditions, P deficiency caused a significant decrease in plant and nodule biomass in all of the tested varieties, with differences between them. This P limitation increased the content of available P in the rhizospheric soils, which was positively correlated with the increase in phosphatase activities in the nodules and the rhizospheric soil. Some legume genotypes showed a significant increase in their P use efficiency under P deficiency. The P solubilization test showed that some rhizobia strains isolated from the Haouz region presented an important capacity to grow on solid and liquid media with tricalcium phosphate as the only P source, and their P-solubilizing activity was confirmed by assaying the P released into the liquid medium. This P-solubilizing activity was also correlated with medium acidification and the excretion of acid phosphatases and phytases into the medium. Thus, we conclude that medium acidification and the excretion of phosphatases into the rhizosphere are the prominent reactions by which the legume-rhizobia symbiosis improves its P nutrition.

Keywords: legume, phosphorus deficiency, rhizobia, rhizospheric soil

Procedia PDF Downloads 285
365 Analysis and Identification of Trends in Electric Vehicle Crash Data

Authors: Cody Stolle, Mojdeh Asadollahipajouh, Khaleb Pafford, Jada Iwuoha, Samantha White, Becky Mueller

Abstract:

Battery-electric vehicles (BEVs) are growing in sales and popularity in the United States as an alternative to traditional internal combustion engine vehicles (ICEVs). BEVs are generally heavier than corresponding ICEV models, with large battery packs located beneath the vehicle floorpan in a "skateboard" chassis, and have front and rear crush space available in the trunk and "frunk", or front trunk. The geometric and frame differences between the vehicles may lead to incompatibilities with gasoline vehicles in vehicle-to-vehicle crashes, as well as in run-off-road crashes with roadside barriers, which were designed for lighter ICEVs with higher centers of mass and dedicated structural chasses. Crash data were collected from 10 states spanning the five-year period between 2017 and 2021. Vehicle Identification Number (VIN) codes were processed with the National Highway Traffic Safety Administration (NHTSA) VIN decoder to distinguish BEV models from ICEV models. Crashes were filtered to isolate only vehicles produced between 2010 and 2021, and the crash circumstances (weather, time of day, maximum injury) were compared between BEVs and ICEVs. In Washington, 436,613 crashes satisfying the selection criteria were identified, and 3,371 of these crashes (0.77%) involved a BEV. The proportions of crashes noting a fire were comparable between BEVs and ICEVs of similar model years (0.3% and 0.33%, respectively), and no differences were discernible for time of day, weather conditions, road geometry, or other prevailing factors (e.g., run-off-road). However, crashes involving BEVs rose rapidly; 31% of all BEV crashes occurred in 2021 alone. Results indicate that BEVs are performing comparably to ICEVs, and that the events surrounding BEV crashes are statistically indistinguishable from those surrounding ICEV crashes.
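The filter-and-compare step can be sketched as follows; the records and field names below are hypothetical stand-ins, not the actual state crash schemas or NHTSA decoder output:

```python
# Hypothetical records as they might look after VIN decoding;
# the schema is invented for illustration.
crashes = [
    {"year": 2019, "fuel": "BEV", "fire": False},
    {"year": 2021, "fuel": "BEV", "fire": True},
    {"year": 2015, "fuel": "ICEV", "fire": False},
    {"year": 2009, "fuel": "ICEV", "fire": True},   # excluded: pre-2010
    {"year": 2020, "fuel": "ICEV", "fire": False},
]

def fire_rate(records, fuel, lo=2010, hi=2021):
    """Share of crashes noting a fire for one fuel type, restricted to the
    2010-2021 model years used in the study."""
    subset = [r for r in records if r["fuel"] == fuel and lo <= r["year"] <= hi]
    if not subset:
        return 0.0
    return sum(r["fire"] for r in subset) / len(subset)

print(fire_rate(crashes, "BEV"))   # 0.5 on this toy data
print(fire_rate(crashes, "ICEV"))  # 0.0 (the 2009 fire is filtered out)
```

The real analysis would apply the same year/fuel filter per state before tabulating circumstances such as weather and time of day.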

Keywords: battery-electric vehicles, transportation safety, infrastructure crashworthiness, run-off-road crashes, EV crash data analysis

Procedia PDF Downloads 58
364 Investigating the Effectiveness of Multilingual NLP Models for Sentiment Analysis

Authors: Othmane Touri, Sanaa El Filali, El Habib Benlahmar

Abstract:

Natural Language Processing (NLP) has gained significant attention lately, having proved its ability to analyze and extract insights from unstructured text data in various languages. One of the most popular NLP applications is sentiment analysis, which aims to identify the sentiment expressed in a piece of text, such as positive, negative, or neutral, in multiple languages. While several multilingual NLP models are available for sentiment analysis, their effectiveness in different contexts and applications needs to be investigated. In this study, we investigate the effectiveness of different multilingual NLP models for sentiment analysis on a dataset of online product reviews in multiple languages. The performance of several NLP models, including Google Cloud Natural Language API, Microsoft Azure Cognitive Services, Amazon Comprehend, Stanford CoreNLP, spaCy, and Hugging Face Transformers, is compared. The models are evaluated on several metrics, including accuracy, precision, recall, and F1 score, and their performance is compared across different categories of product reviews. To run the study, the dataset was preprocessed by cleaning and tokenizing the text data in multiple languages. Each model was then trained and tested using a cross-validation approach, in which the dataset was randomly divided into training and testing sets and the process was repeated multiple times. A grid search was applied to optimize the hyperparameters of each model and to select the best-performing model for each category of product reviews and each language. The findings of this study provide insights into the effectiveness of different multilingual NLP models for multilingual sentiment analysis and their suitability for different languages and applications. The strengths and limitations of each model were identified, and recommendations were provided for selecting the best-performing model based on the specific requirements of a project. This study contributes to the advancement of research methods in multilingual NLP and provides a practical guide for researchers and practitioners in the field.
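The commercial services themselves cannot be reproduced here; the repeated random-split evaluation the abstract describes can be sketched with a toy lexicon classifier standing in for a real model (the lexicon and the review set are invented for illustration):

```python
import random

def sentiment(text, pos={"good", "great"}, neg={"bad", "poor"}):
    """Toy lexicon classifier standing in for the commercial NLP services."""
    words = set(text.lower().split())
    score = len(words & pos) - len(words & neg)
    return "positive" if score > 0 else "negative" if score < 0 else "neutral"

def repeated_holdout_accuracy(data, test_frac=0.3, repeats=10, seed=42):
    """Repeated random train/test splitting as described in the abstract;
    the lexicon model has no training step, so only the test half is scored."""
    rng = random.Random(seed)
    accs = []
    for _ in range(repeats):
        shuffled = data[:]
        rng.shuffle(shuffled)
        n_test = max(1, int(len(shuffled) * test_frac))
        test = shuffled[:n_test]
        correct = sum(sentiment(t) == y for t, y in test)
        accs.append(correct / n_test)
    return sum(accs) / repeats

reviews = [("great product", "positive"), ("bad quality", "negative"),
           ("good value", "positive"), ("poor battery", "negative"),
           ("it arrived", "neutral"), ("great but poor box", "neutral")]
print(repeated_holdout_accuracy(reviews))  # 1.0: the toy lexicon fits perfectly
```

For the real models, each repeat would also include a training call and a grid-search loop over hyperparameters before scoring the held-out set.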

Keywords: NLP, multilingual, sentiment analysis, texts

Procedia PDF Downloads 61
363 Benefits of Whole-Body Vibration Training on Lower-Extremity Muscle Strength and Balance Control in Middle-Aged and Older Adults

Authors: Long-Shan Wu, Ming-Chen Ko, Chien-Chang Ho, Po-Fu Lee, Jenn-Woei Hsieh, Ching-Yu Tseng

Abstract:

This study aimed to determine the effects of whole-body vibration (WBV) training on lower-extremity muscle strength and balance control performance among community-dwelling middle-aged and older adults in the United States. Twenty-nine participants without any contraindication to performing WBV exercise completed all the study procedures. Participants were randomly assigned to do body-weight exercise with either an individualized vibration frequency and amplitude, a fixed vibration frequency and amplitude, or no vibration. Isokinetic knee extensor power, limits of stability, and sit-to-stand tests were performed at baseline and after 8 weeks of training. Neither the individualized nor the fixed frequency-amplitude WBV training protocol improved isokinetic knee extensor power. The limits-of-stability endpoint excursion score for the individualized frequency-amplitude group increased by 8.8 (12.9%; p = 0.025) after training; no significant differences were observed in the fixed and control groups. The maximum excursion score for the individualized frequency-amplitude group increased by 9.2 (11.5%; p = 0.006) from baseline after training. The average weight transfer time score decreased significantly, by 0.21 s, in the fixed group. Participants in the individualized group showed a significant increase (3.2%) in the weight rising index score after 8 weeks of WBV training. These results suggest that 8 weeks of WBV training improved limits-of-stability and sit-to-stand performance. Future studies need to determine whether WBV training improves other factors that can influence posture control.

Keywords: whole-body vibration training, muscle strength, balance control, middle-aged and older adults

Procedia PDF Downloads 199
362 Improvement of Environment and Climate Change Canada’s GEM-Hydro Streamflow Forecasting System

Authors: Etienne Gaborit, Dorothy Durnford, Daniel Deacu, Marco Carrera, Nathalie Gauthier, Camille Garnaud, Vincent Fortin

Abstract:

A new experimental streamflow forecasting system was recently implemented at Environment and Climate Change Canada’s (ECCC) Canadian Centre for Meteorological and Environmental Prediction (CCMEP). It relies on CaLDAS (Canadian Land Data Assimilation System) for the assimilation of surface variables, and on a surface prediction system that feeds a routing component. The surface energy and water budgets are simulated with the SVS (Soil, Vegetation, and Snow) land-surface scheme (LSS) at 2.5-km grid spacing over Canada. The routing component is based on the Watroute routing scheme at 1-km grid spacing for the Great Lakes and Nelson River watersheds. The system is run in two distinct phases: an analysis phase and a forecast phase. During the analysis phase, CaLDAS outputs are used to force the routing system, which performs streamflow assimilation. In forecast mode, the surface component is forced with the Canadian GEM atmospheric forecasts and is initialized with a CaLDAS analysis. The streamflow performance of this new system is presented over 2019 and compared to that of ECCC’s current operational streamflow forecasting system, which differs from the new experimental system in many respects. The new streamflow forecasts are also compared to persistence. Overall, the new streamflow forecasting system presents promising results, highlighting the need for an elaborate assimilation phase before performing the forecasts. However, the system is still experimental and is continuously being improved. Some major recent improvements are presented here, including, for example, the assimilation of snow cover data from remote sensing, a backward propagation of assimilated flow observations, a new numerical scheme for the routing component, and a new reservoir model.
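The persistence comparison mentioned above can be sketched as follows; the discharge values and the mean-absolute-error skill score are illustrative assumptions, not the verification metrics actually used by ECCC:

```python
def mae(pred, obs):
    """Mean absolute error between paired forecast and observed values."""
    return sum(abs(p - o) for p, o in zip(pred, obs)) / len(obs)

def persistence_forecast(obs, lead=1):
    """Persistence baseline: each forecast repeats the last observed flow."""
    return obs[:-lead]

# Hypothetical daily discharge series (m^3/s); values are illustrative only.
observed = [100, 110, 130, 120, 115, 140, 160]
model    = [105, 112, 125, 122, 118, 138, 155]  # stand-in model forecasts

obs_eval   = observed[1:]               # targets at a one-day lead time
pers       = persistence_forecast(observed)  # yesterday's flow as forecast
model_eval = model[1:]

skill = 1 - mae(model_eval, obs_eval) / mae(pers, obs_eval)
print(round(skill, 2))  # positive skill means the model beats persistence
```

Any forecast system is expected to show positive skill against this baseline; on the toy numbers above the stand-in model does.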

Keywords: assimilation system, distributed physical model, offline hydro-meteorological chain, short-term streamflow forecasts

Procedia PDF Downloads 112
361 Preparing Curved Canals Using Mtwo and RaCe Rotary Instruments: A Comparison Study

Authors: Mimoza Canga, Vito Malagnino, Giulia Malagnino, Irene Malagnino

Abstract:

Objective: The objective of this study was to compare the effectiveness of Mtwo and RaCe rotary instruments in cleaning and shaping curved root canals. Materials and Methods: The study was conducted on 160 simulated canals in resin blocks with a curvature angle of 15°-30°. The 160 simulated canals were divided into two groups of 80 blocks each, and each group was further divided into two subgroups (n = 40 canals each). The simulated canal subgroups were prepared with Mtwo and RaCe rotary nickel-titanium instruments. The root canals were measured at four different reference points, starting at 13 mm from the orifice. In the first group, the canals were prepared using the Mtwo rotary system (VDW, Munich, Germany). The Mtwo files used were 10/0.04, 15/0.05, 20/0.06, and 25/0.06. These instruments entered the full length of the canal, and each file was rotated in the canal until it reached the apical point. In the second group, the canals were prepared using RaCe instruments (La Chaux-de-Fonds, Switzerland) with the crown-down technique, using a torque-controlled electric motor (VDW, Munich, Germany) at 600 rpm and 2 N·cm, as follows: #40/0.10, #35/0.08, #30/0.06, #25/0.04, #25/0.02. The data were recorded using SPSS version 23 software (IBM, Armonk, NY, USA) and analyzed using the ANOVA test. Results: The Mtwo rotary instruments were able to clean and shape the curved canals in a right-to-left motion at different levels, without any deviation and in perfect symmetry (p < 0.001). The data showed that the greater the depth of the root canal, the greater the deviations of the RaCe rotary instruments; these deviations occurred at three levels: S2 (p = 0.004), S3 (p = 0.007), and S4 (p = 0.009). The Mtwo files can go deeper and create a greater angle at level S4 (21°-28°) compared to the RaCe instruments (19°-24°). Conclusion: The present study noted a clinically significant difference between Mtwo rotary instruments and RaCe rotary files used for canal preparation and indicated that Mtwo instruments are a better choice for curved canals.

Keywords: canal curvature, canal preparation, Mtwo, RaCe, resin blocks

Procedia PDF Downloads 97
360 Structural Design of a Relief Valve Considering Strength

Authors: Nam-Hee Kim, Jang-Hoon Ko, Kwon-Hee Lee

Abstract:

A relief valve is a mechanical element that maintains safety by controlling high pressure. Usually, the high pressure is relieved by using the spring force and letting the fluid flow out of the system through another path. When normal pressure is restored, the relief valve returns to its initial state. The relief valve in this study has been applied to pressure vessels, evaporators, piping lines, etc. The relief valve should be designed for smooth operation and should satisfy the structural safety requirement under operating conditions. In general, the structural analysis follows a fluid flow analysis; in this process, FSI (fluid-structure interaction) is required to input the force obtained from the output of the flow analysis. Firstly, this study predicts the velocity profile and the pressure distribution in the given system. The assumptions for the flow analysis are as follows: the flow is steady-state and three-dimensional; the fluid is Newtonian and incompressible; and the walls of the pipe and valve are smooth. The flow characteristics in this relief valve do not induce any problem. The commercial software ANSYS/CFX is utilized for the flow analysis. On the other hand, very high pressure may cause structural problems due to severe stress. The relief valve is made of a body, bonnet, guide, piston, and nozzle, and its material is stainless steel. To investigate its structural safety, the worst-case loading is considered to be a pressure of 700 bar. The load is applied to the inside of the valve and is greater than the load obtained from the FSI. The maximum stress calculated by finite element analysis is 378 MPa; however, this value is greater than the allowable value. Thus, an alternative design is suggested to improve the structural performance through a case study. We found that the design variable most sensitive to the strength is the shape of the nozzle, so the case study varies the size of the nozzle. Finally, it can be seen that the suggested design satisfies the structural design requirement. The FE analysis is performed using the commercial software ANSYS/Workbench.

Keywords: relief valve, structural analysis, structural design, strength, safety factor

Procedia PDF Downloads 272
359 Development of Excellent Water-Repellent Coatings for Metallic and Ceramic Surfaces

Authors: Aditya Kumar

Abstract:

One of the most fascinating properties of various insect and plant surfaces in nature is their water-repellent (superhydrophobic) capability. Nature offers new insights for learning and replicating the same in designing artificial superhydrophobic structures for a wide range of applications such as microfluidics, microelectronics, textiles, self-cleaning surfaces, anti-corrosion, anti-fingerprint, and oil/water separation. In general, artificial superhydrophobic surfaces are synthesized by creating roughness and then treating the surface with low-surface-energy materials. In this work, various superhydrophobic coatings on metallic surfaces (aluminum, steel, copper, steel mesh) were synthesized by a chemical etching process using different etchants and a fatty acid. In addition, superhydrophobic coatings of SiO2 nano/micro-particles embedded in polyethylene, polystyrene, and poly(methyl methacrylate) were synthesized on glass substrates. The effect of process parameters such as etching time, etchant concentration, and particle concentration on wettability was also studied. To establish the applications of the coatings, the surface morphology, contact angle, self-cleaning, corrosion-resistance, and water-repellent characteristics were investigated under various conditions. Furthermore, the durability of the coatings was studied by performing thermal, ultraviolet, and mechanical stability tests. The surface morphology confirms the creation of rough microstructures by chemical etching or by embedding particles, and the contact angle measurements reveal the superhydrophobic nature. Experimentally, it is found that the coatings have excellent self-cleaning, anti-corrosion, and water-repellent properties. The coatings also withstand mechanical disturbances such as surface bending, adhesive peeling, and abrasion; they are thermally and ultraviolet stable; and they are reproducible. Hence, the aforesaid durable superhydrophobic surfaces have many potential industrial applications.

Keywords: superhydrophobic, water-repellent, anti-corrosion, self-cleaning

Procedia PDF Downloads 271
358 Three Types of Mud-Huts with Courtyards in Composite Climate: Thermal Performance in Summer and Winter

Authors: Janmejoy Gupta, Arnab Paul, Manjari Chakraborty

Abstract:

Jharkhand is a state located in the eastern part of India; the Tropic of Cancer (the 23.5° north latitude line) passes through its Ranchi district. Mud huts with burnt-clay-tiled roofs are an integral component of Jharkhand's vernacular architecture. They come in various shapes, a number of them having a courtyard-type plan. It is generally stated that designing dwellings with courtyards is a climate-responsive strategy in a composite climate; the truth behind this hypothesis is investigated in this paper. Three types of mud huts with courtyards situated in Ranchi district are taken as a study, and their thermal performance throughout the year is observed through temperature measurements in the south-side rooms and courtyards, in addition to Autodesk Ecotect (version 2011) software simulations. Temperature measurements are taken during the peaks of summer and winter, and the average temperatures in the rooms and courtyards over seven-day periods at the peak of each season are plotted graphically. On the basis of the study and the software simulations, the hypothesis is then verified, and the dwelling types performing better thermally in summer and winter are identified among the three sub-types studied. Recommendations are also made for increasing thermal comfort in courtyard-type mud huts in general. It is found that courtyard-type dwellings do not all show better thermal performance in summer and winter in a composite climate. The U-shaped dwelling with an open courtyard on the southern side offers the most thermal comfort inside the rooms in the hotter part of the year, while the square hut with a central courtyard closed on all sides shows superior thermal performance in winter. The courtyards in all three case studies are found to heat up excessively during summer.

Keywords: courtyard, mud huts, simulations, temperature measurements, thermal performance

Procedia PDF Downloads 381
357 Collaborative Stylistic Group Project: A Drama Practical Analysis Application

Authors: Omnia F. Elkommos

Abstract:

In the course of teaching stylistics to undergraduate students of the Department of English Language and Literature, Faculty of Arts and Humanities, the linguistic toolkit of theories comes in handy for a better understanding of the different literary genres: poetry, drama, and short stories. In the present paper, a model for the teaching of stylistics is compiled and suggested: a collaborative group project technique for use in undergraduate classes of diverse specialisms (Literature, Linguistics, and Translation tracks). Students are initially introduced to the different linguistic tools and theories suitable for each literary genre. The second step is to apply these linguistic tools to texts. Students are required to watch videos of the poems or play being performed, for example, and to search the internet for interpretations of the texts by other authorities, using a template (prepared by the researcher) with guided questions leading students along in their analysis. Finally, a practical analysis is written up using the practical analysis essay template (also prepared by the researcher). In keeping with collaborative learning, all the steps include student-centered activities that address differentiation and consider the three different specialisms. In selecting the proper tools, and in the actual application and analysis discussion, students are given tasks that require their collaboration; they also work in small groups, and the groups collaborate in seminars and group discussions. At the end of the course/module, students present their work collaboratively and reflect and comment on their learning experience. The module/course uses a drama play that lends itself to the task: 'The Bond' by Amy Lowell and Robert Frost. The project results in an interpretation of its theme, characterization, and plot. The linguistic tools are drawn from pragmatics and discourse analysis, among others.

Keywords: applied linguistic theories, collaborative learning, cooperative principle, discourse analysis, drama analysis, group project, online acting performance, pragmatics, speech act theory, stylistics, technology enhanced learning

Procedia PDF Downloads 134
356 High-Frequency Acoustic Microscopy Imaging of Pellet/Cladding Interface in Nuclear Fuel Rods

Authors: H. Saikouk, D. Laux, Emmanuel Le Clézio, B. Lacroix, K. Audic, R. Largenton, E. Federici, G. Despaux

Abstract:

Pressurized Water Reactor (PWR) fuel rods are made of ceramic pellets (e.g., UO2 or (U,Pu)O2) assembled in a zirconium cladding tube. By design, an initial gap exists between these two elements. During irradiation, both undergo transformations that lead progressively to the closure of this gap. A local and non-destructive examination of the pellet/cladding interface could help identify the zones where the two materials are in contact, particularly at high burnups, when strong chemical bonding occurs under nominal operating conditions in PWR fuel rods. The evolution of the pellet/cladding bonding during irradiation is also an area of interest. In this context, the Institute of Electronics and Systems (IES, UMR CNRS 5214), in collaboration with the Alternative Energies and Atomic Energy Commission (CEA), is developing a high-frequency acoustic microscope adapted to the inspection and imaging of the pellet/cladding interface at high resolution. Because the geometrical, chemical, and mechanical nature of the contact interface is neither axially nor radially homogeneous, 2D images of this interface need to be acquired with this ultrasonic system using high-performance signal processing and controlled displacement of the sample rod along both its axis and its circumference. Modeling the multi-layer system (water, cladding, fuel, etc.) is necessary in the present study to take into account all the parameters that influence the resolution of the acquired images. The first prototype of this microscope and the first results of the visualization of the inner face of the cladding will be presented in a poster to highlight the potential of the system, whose final objective is to be integrated into the existing MEGAFOX bench dedicated to the non-destructive examination of irradiated fuel rods at the LECA-STAR facility in CEA Cadarache.
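One of the quantities any multi-layer acoustic model rests on is the normal-incidence reflection coefficient between adjacent layers; the sketch below only illustrates that relation, and the impedance values are rough illustrative figures, not measurements or parameters from the study:

```python
def reflection_coefficient(z1, z2):
    """Normal-incidence amplitude reflection coefficient between two media
    of acoustic impedances z1 and z2 (Z = density * sound speed)."""
    return (z2 - z1) / (z2 + z1)

# Illustrative impedances in MRayl (assumed round numbers, not measured):
water, cladding, pellet = 1.5, 30.0, 55.0

print(round(reflection_coefficient(water, cladding), 2))   # strong echo at the cladding
print(round(reflection_coefficient(cladding, pellet), 2))  # weaker echo at a bonded interface
```

The contrast between a gap (cladding/water, near-total reflection) and a bonded interface (cladding/pellet, partial reflection) is what makes the interface imageable.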

Keywords: high-frequency acoustic microscopy, multi-layer model, non-destructive testing, nuclear fuel rod, pellet/cladding interface, signal processing

Procedia PDF Downloads 163
355 Investigation of Shear Strength, and Dilative Behavior of Coarse-grained Samples Using Laboratory Test and Machine Learning Technique

Authors: Ehsan Mehryaar, Seyed Armin Motahari Tabari

Abstract:

Coarse-grained soils are well known and commonly used in a wide range of geotechnical projects, including high earth dams and embankments, for their high shear strength. The most important engineering property of these soils is the friction angle, which represents the interlocking between soil particles and is widely applied in designing and constructing these earth structures. The friction angle and dilative behavior of coarse-grained soils can be estimated from empirical correlations with in-situ testing and physical properties of the soil, or measured directly in the laboratory by performing direct shear or triaxial tests. Unfortunately, large-scale testing is difficult, challenging, and expensive, and is not possible in most soil mechanics laboratories. It is therefore common to remove the large particles before testing, which cannot be counted as an exact estimation of the parameters and behavior of the original soil. This paper describes a new methodology that scales the particle grading distribution of a well-graded gravel sample down to a smaller sample that can be tested in an ordinary direct shear apparatus, in order to estimate the stress-strain behavior, friction angle, and dilative behavior of the original coarse-grained soil, considering its confining pressure and relative density, using a machine learning method. A total of 72 direct shear tests were performed on 6 different sizes, at 3 different confining pressures and 4 different relative densities. The Multivariate Adaptive Regression Splines (MARS) technique was used to develop an equation to predict shear strength and dilative behavior based on the size distribution of the coarse-grained soil particles. An uncertainty analysis was also performed to examine the reliability of the proposed equation.
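The fitted MARS equation is not given in the abstract; the sketch below only illustrates the hinge-function form that MARS models take, with invented knots, coefficients, and predictors (a characteristic particle size and confining pressure are assumed here, not taken from the paper):

```python
def hinge(x, knot):
    """MARS basis function: max(0, x - knot); mirrored terms use max(0, knot - x)."""
    return max(0.0, x - knot)

def friction_angle(d50_mm, confining_kpa):
    """Hypothetical MARS-style model for friction angle (degrees); the
    coefficients and knots are invented for illustration, not fitted ones."""
    return (34.0
            + 2.0 * hinge(d50_mm, 2.0)              # coarser particles interlock more
            - 0.01 * hinge(confining_kpa, 100.0))   # dilation suppressed at high confinement

print(friction_angle(5.0, 50.0))   # 40.0 on this illustrative model
print(friction_angle(1.0, 300.0))  # 32.0
```

A real MARS fit selects the knots and coefficients automatically from the 72 test results; the piecewise-linear structure is what lets it capture effects that switch on only above a threshold, such as confinement suppressing dilation.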

Keywords: MARS, coarse-grained soil, shear strength, uncertainty analysis

Procedia PDF Downloads 139
354 Flow Reproduction Using Vortex Particle Methods for Wake Buffeting Analysis of Bluff Structures

Authors: Samir Chawdhury, Guido Morgenthal

Abstract:

The paper presents a novel extension of Vortex Particle Methods (VPM) aimed at reproducing a template simulation of the complex flow field generated by impulsively started flow past an upstream bluff body at a given Reynolds number Re. Vibration of a structural system under upstream wake flow is often its governing design criterion; therefore, this study focuses particularly on the reproduction of wake flow simulations. The basic methodology requires sampling the downstream velocity of the template flow simulation: at particular distances from the upstream section, the instantaneous velocity components are sampled using a series of square sampling cells arranged vertically, each containing four velocity sampling points at its corners. Since the grid-free Lagrangian VPM algorithm discretises vorticity on particle elements, the method requires transforming the velocity components into vortex circulations and, finally, reproducing the template flow field by seeding these vortex circulations, or particles, into a free-stream flow. Notably, the vortex particles have to be released into the free stream at exactly the same rate as the velocity sampling. Studies have been performed, specifically, on different sampling rates and velocity sampling positions to determine their effects on flow reproduction quality. The quality assessments are mainly made, using a downstream flow monitoring profile, by comparing the characteristic wind flow profiles with several statistical turbulence measures. Additionally, the comparisons are performed using velocity time histories, snapshots of the flow fields, and the vibration of a downstream bluff section obtained by performing wake buffeting analyses of the section under the original and reproduced wake flows. A convergence study is performed to validate the method. The study also describes how flow reproductions can be achieved with less computational effort.
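The velocity-to-circulation transformation described above can be sketched as a discrete line integral around one square sampling cell. This is a hedged illustration of the general idea (trapezoidal quadrature of the circulation integral over the four edges), not the authors' implementation:

```python
# Sketch: convert four corner-velocity samples of one square sampling cell
# into a vortex circulation via Gamma = contour integral of v . dl around
# the cell boundary, traversed counter-clockwise (trapezoidal rule per edge).
def cell_circulation(corner_velocities, h):
    """corner_velocities: [(u, v)] at corners ordered CCW from bottom-left;
    h: cell edge length."""
    (u1, v1), (u2, v2), (u3, v3), (u4, v4) = corner_velocities
    bottom = 0.5 * (u1 + u2) * h    # dl = (+h, 0)
    right = 0.5 * (v2 + v3) * h     # dl = (0, +h)
    top = -0.5 * (u3 + u4) * h      # dl = (-h, 0)
    left = -0.5 * (v4 + v1) * h     # dl = (0, -h)
    return bottom + right + top + left

# Solid-body-like counter-clockwise corner samples give positive circulation:
gamma = cell_circulation([(1.0, -1.0), (1.0, 1.0), (-1.0, 1.0), (-1.0, -1.0)], h=0.1)
```

By Stokes' theorem this circulation equals the vorticity integrated over the cell, which is exactly the strength a seeded vortex particle would carry.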

Keywords: vortex particle method, wake flow, flow reproduction, wake buffeting analysis

Procedia PDF Downloads 287
353 Short Teaching Sessions for Emergency Front of Neck Access

Authors: S. M. C. Kelly, A. Hargreaves, S. Hargreaves

Abstract:

Introduction: The 'can't intubate, can't ventilate' emergency scenario is one which has historically been managed badly. Reasons identified included gaps in knowledge of the procedure and of the emergency equipment used. We aimed to show an increase in confidence amongst anaesthetists and operating department practitioners in the technique following a short, tea-trolley-style teaching intervention. Methods: We carried out the teaching on a one-to-one basis. Two anaesthetists visited each operating theatre during normal working days: one carried out the teaching session, and one took over the intra-operative care of the patient, releasing the listed anaesthetist for a short teaching session. The teaching was delivered to a mixture of students and healthcare professionals, both anaesthetists and anaesthetic practitioners. The equipment included a trolley, an airway manikin, a size 10 scalpel, a bougie, and a size 6.0 tracheal tube. The educator discussed the equipment, performed a demonstration, and observed the participants performing the procedure. We asked each person to fill out pre- and post-teaching questionnaires stating their confidence with the procedure. Results: The teaching was delivered to 63 participants in total: 21 consultant anaesthetists, 23 trainee doctors, and 19 anaesthetic practitioners. The teaching sessions lasted on average 9 minutes (range 5-15 minutes). All participants reported an increase in confidence in both the equipment and the technique of front-of-neck access. Anaesthetic practitioners reported the greatest increase in confidence (53%), with trainee anaesthetists reporting a 27% increase and consultant anaesthetists 22%. Overall, confidence in the performance of emergency front-of-neck access increased by 31% after the teaching session. Discussion: Short, trolley-style teaching improves confidence in the equipment and technique used for emergency front-of-neck access. This is true for students as well as for consultant anaesthetists. This teaching style is quick, has minimal running costs, and is relevant to all anaesthetic departments.

Keywords: airway teaching, can't intubate can't ventilate, cricothyroidotomy, front-of-neck

Procedia PDF Downloads 124
352 Indian Premier League (IPL) Score Prediction: Comparative Analysis of Machine Learning Models

Authors: Rohini Hariharan, Yazhini R, Bhamidipati Naga Shrikarti

Abstract:

In the realm of cricket, particularly within the context of the Indian Premier League (IPL), the ability to predict team scores accurately holds significant importance for cricket enthusiasts and stakeholders alike. This paper presents a comprehensive study on IPL score prediction utilizing various machine learning algorithms, including Support Vector Machines (SVM), XGBoost, Multiple Regression, Linear Regression, K-nearest neighbors (KNN), and Random Forest. Through meticulous data preprocessing, feature engineering, and model selection, we aimed to develop a robust predictive framework capable of forecasting team scores with high precision. Our experimentation involved the analysis of historical IPL match data encompassing diverse match and player statistics. Leveraging this data, we employed state-of-the-art machine learning techniques to train and evaluate the performance of each model. Notably, Multiple Regression emerged as the top-performing algorithm, achieving an accuracy of 77.19% and a precision of 54.05% (within a threshold of +/- 10 runs). This research contributes to the advancement of sports analytics by demonstrating the efficacy of machine learning in predicting IPL team scores. The findings underscore the potential of advanced predictive modeling techniques to provide valuable insights for cricket enthusiasts, team management, and betting agencies. Additionally, this study serves as a benchmark for future research aimed at enhancing the accuracy and interpretability of IPL score prediction models.
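The threshold-based accuracy quoted above (predictions within +/- 10 runs counted as correct) can be computed as follows; the score values here are hypothetical, not the study's data:

```python
# Sketch of a "+/- threshold runs" accuracy metric for score regression:
# a prediction counts as a hit if it lands within the threshold of the truth.
def within_threshold_accuracy(actual, predicted, threshold=10):
    """Fraction of predictions within +/- threshold runs of the actual score."""
    hits = sum(1 for a, p in zip(actual, predicted) if abs(a - p) <= threshold)
    return hits / len(actual)

actual = [160, 185, 142, 201, 175]       # hypothetical first-innings scores
predicted = [165, 170, 150, 195, 188]    # hypothetical model output
acc = within_threshold_accuracy(actual, predicted)
```

Unlike plain regression error, this metric maps a continuous prediction problem onto an interpretable hit rate, which is presumably why the paper reports it alongside precision.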

Keywords: indian premier league (IPL), cricket, score prediction, machine learning, support vector machines (SVM), xgboost, multiple regression, linear regression, k-nearest neighbors (KNN), random forest, sports analytics

Procedia PDF Downloads 24
351 Radar Track-based Classification of Birds and UAVs

Authors: Altilio Rosa, Chirico Francesco, Foglia Goffredo

Abstract:

In recent years, the number of Unmanned Aerial Vehicles (UAVs) has increased significantly. The rapid development of commercial and recreational drones makes them an important part of our society. Despite the growing list of their applications, these vehicles pose a serious threat to civil and military installations: the detection, classification, and neutralization of such flying objects have become an urgent need. Radar is an effective remote sensing tool for detecting and tracking flying objects, but scenarios characterized by a high number of tracks from flying birds make the drone detection task especially challenging: the operator's PPI is cluttered with a huge number of potential threats, and reaction time can be severely affected. Flying birds show velocities, radar cross-sections, and, in general, characteristics similar to those of UAVs. Since no single feature can distinguish UAVs from birds, this paper uses a multiple-feature approach in which an original feature selection technique is developed to feed binary classifiers trained to distinguish birds from UAVs. Radar tracks acquired in the field from different UAVs and birds performing various trajectories were used to extract specifically designed target-movement features based on velocity, trajectory, and signal strength. An optimization strategy based on a genetic algorithm is also introduced to select the optimal subset of features and to estimate the performance of several classification algorithms (neural network, SVM, logistic regression, etc.), both in terms of the number of selected features and the misclassification error. Results show that the proposed methods are able to reduce the dimension of the data space and to remove almost all non-drone false targets with suitable classification accuracy (higher than 95%).
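A feature-subset search of the kind described can be sketched with a fitness function that trades misclassification error against subset size; the feature names, toy error model, and penalty weight below are assumptions for illustration (and a full genetic algorithm would add crossover and mutation over generations), not the authors' classifier or fitness:

```python
import random

# Sketch of GA-style feature-subset selection for track classification.
# Hypothetical track features; the error model is a stand-in for training a
# real bird/UAV classifier on each candidate subset.
FEATURES = ["mean_speed", "speed_var", "turn_rate", "rcs_mean", "rcs_var", "altitude"]

def misclassification_error(subset):
    # Toy stand-in: assume 'speed_var' and 'turn_rate' are most discriminative.
    informative = {"speed_var", "turn_rate"}
    return 0.5 - 0.2 * len(informative & set(subset))

def fitness(subset, size_penalty=0.02):
    # Lower is better: error plus a penalty per selected feature.
    return misclassification_error(subset) + size_penalty * len(subset)

random.seed(0)
population = [random.sample(FEATURES, k=random.randint(1, len(FEATURES)))
              for _ in range(50)]
best = min(population, key=fitness)  # selection step of one GA generation
```

The size penalty is what drives the dimensionality reduction the abstract reports: a subset that matches the full set's error but uses fewer features wins.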

Keywords: birds, classification, machine learning, UAVs

Procedia PDF Downloads 188
350 Performing Marginality and Contestation of Ethnic Identity: Dynamics of Identity Politics in Assam, India

Authors: Hare Krishna Doley

Abstract:

Drawing upon empirical data, this paper examines how ethnic groups like the Ahom, Moran, Motok, and Chutia create and recreate ethnic boundaries while making claims for recognition as Scheduled Tribes (STs) under the Sixth Schedule of the Constitution of India, in the state of Assam. Underlying such claims is a distinct identity consciousness amongst these groups as they assert themselves as originally tribal, drawing upon primordial elements. For them, tribal identity promises social justice and gives credence to their claims of indigeneity while preserving their exclusivity within the multifarious society of Assam. Having complex inter-group relationships, the groups under study display distinct as well as overlapping identities, demonstrating the fluidity of identities across groups making claims for recognition. In this process, a binary of 'us' and 'them' is often constructed amongst these groups, which is in turn difficult to grasp as they share common historical linkages. This paper attempts to grapple with the complex relationships among the studied groups and their assertion as distinct cultural entities while drawing ethnic boundaries on the basis of socio-cultural identities. Such claims also involve frequent negotiation with the State as well as with other ethnic groups, which further creates strife among indigenous groups over tribal identity. The paper argues that identity consciousness amongst these groups has persisted since the introduction of resource distribution on ethnic lines; therefore, issues of exclusive ethnic identity in the state of Assam can be contextualised within the colonial and post-colonial politics of redrawing ethnic and spatial boundaries. Narratives of the ethnic leaders at the forefront of the struggle for ST status revealed that it is not merely about securing preferential treatment; it also encompasses entitlement to land and their socio-cultural identity as aboriginal. While noting the genesis of the struggle by the ethnic associations for ST status, this paper also delineates the interactions among ethnic groups and how the identity of tribe is performed by them in order to be included in the official category of ST.

Keywords: ethnic, identity, sixth schedule, tribe

Procedia PDF Downloads 178
349 2D and 3D Breast Cancer Cells Behave Differently to the Applied Free Palbociclib or the Palbociclib-Loaded Nanoparticles

Authors: Maryam Parsian, Pelin Mutlu, Ufuk Gunduz

Abstract:

Two-dimensional cell culture affords simplicity and low cost, but it has serious limitations: it lacks the cell-cell and cell-matrix interactions that are present in tissues. Cancer cells grown in 3D culture systems have distinct phenotypes of adhesion, growth, migration, and invasion, as well as distinct profiles of gene and protein expression. These interactions cause 3D-cultured cells to acquire morphological and cellular characteristics relevant to in vivo tumors. Palbociclib is a chemotherapeutic agent for the treatment of ER-positive and HER2-negative metastatic breast cancer. Poly-amidoamine (PAMAM) dendrimer has a well-defined, three-dimensional structure with a multivalent surface and internal cavities that can play an essential role in drug delivery systems. In this study, palbociclib was loaded onto magnetic PAMAM dendrimers. The hanging droplet method was used to form 3D spheroids. The possible toxic effects of both the free drug and the drug-loaded nanoparticles were evaluated in 2D and 3D MCF-7, MDA-MB-231, and SKBR-3 breast cancer cell culture models by performing MTT cell viability and Alamar Blue assays. MTT analysis was performed with six different doses from 1000 µg/ml to 25 µg/ml. The unloaded PAMAM dendrimer did not demonstrate significant toxicity in any of the breast cancer cell lines. The results showed that 3D spheroids are clearly less sensitive than 2D cell cultures to free palbociclib. Moreover, palbociclib-loaded PAMAM dendrimers showed a greater toxic effect than free palbociclib in all cell lines in both 2D and 3D cultures. The results suggest that the traditional (2D) cell culture method is insufficient for mimicking actual tumor tissue, as the response of cancer cells to anticancer drugs differs between 2D and 3D culture conditions. This study showed that breast cancer cells are more resistant to free palbociclib in 3D cultures than in 2D cultures; however, the nanoparticle-loaded drug can be more cytotoxic than the free drug.

Keywords: 2D and 3D cell culture, breast cancer, palbociclib, PAMAM magnetic nanoparticles

Procedia PDF Downloads 128
348 An Artificial Intelligence Framework to Forecast Air Quality

Authors: Richard Ren

Abstract:

Air pollution is a serious danger to international well-being and economies: it kills an estimated 7 million people every year and will cost world economies $2.6 trillion by 2060 due to sick days, healthcare costs, and reduced productivity. In the United States alone, 60,000 premature deaths are caused by poor air quality. For this reason, there is a crucial need to develop effective methods to forecast air quality, which can mitigate air pollution's detrimental public health effects and associated costs by helping people plan ahead and avoid exposure. The goal of this study is to propose an artificial intelligence framework for predicting future air quality based on timing variables (e.g., season, weekday/weekend), future weather forecasts, and past pollutant and air quality measurements. The proposed framework utilizes multiple machine learning algorithms (logistic regression, random forest, neural network) with different specifications and averages the results of the three top-performing models to eliminate the inaccuracies, weaknesses, and biases of any one individual model. Over time, the framework uses new data to self-adjust model parameters and increase prediction accuracy. To demonstrate its applicability, a prototype of this framework was created to forecast air quality in Los Angeles, California, using datasets from the RP4 weather data repository and EPA pollutant measurement data. The results showed good agreement between the framework's predictions and real-life observations, with an overall 92% model accuracy. The combined model predicts more accurately than any of the individual models, and it reliably forecasts season-based variations in air quality levels. Top air quality predictor variables were identified through the measurement of mean decrease in accuracy. This study proposed and demonstrated the efficacy of a comprehensive air quality prediction framework leveraging multiple machine learning algorithms to overcome individual algorithm shortcomings. Future enhancements should focus on expanding and testing a greater variety of modeling techniques within the framework, testing it in different locations, and developing a platform to automatically publish future predictions as a web or mobile application. Accurate predictions from this framework can in turn save and improve lives by allowing individuals to protect their health and governments to implement effective pollution control measures.
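The framework's averaging step can be sketched as follows; the three models' class probabilities below are hypothetical values, not outputs from the study's prototype:

```python
# Sketch: average the class probabilities of the three top-performing models
# so that no single model's bias dominates the air-quality forecast.
def ensemble_average(model_probs):
    """model_probs: list of per-model probability dicts over AQI categories."""
    categories = model_probs[0].keys()
    return {c: sum(p[c] for p in model_probs) / len(model_probs) for c in categories}

# Hypothetical per-model outputs (logistic regression, random forest, neural net):
logistic = {"good": 0.7, "moderate": 0.2, "unhealthy": 0.1}
forest = {"good": 0.6, "moderate": 0.3, "unhealthy": 0.1}
neural = {"good": 0.5, "moderate": 0.3, "unhealthy": 0.2}

combined = ensemble_average([logistic, forest, neural])
forecast = max(combined, key=combined.get)  # highest-probability category wins
```

Averaging probabilities rather than hard labels keeps the ensemble's confidence information, which matters when the category boundary is close.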

Keywords: air quality prediction, air pollution, artificial intelligence, machine learning algorithms

Procedia PDF Downloads 95
347 Atomic Decomposition Audio Data Compression and Denoising Using Sparse Dictionary Feature Learning

Authors: T. Bryan, V. Kepuska, I. Kostnaic

Abstract:

A method of data compression and denoising is introduced that is based on atomic decomposition of audio data using "basis vectors" that are learned from the audio data itself. The basis vectors are shown to achieve higher data compression and better signal-to-noise enhancement than the Gabor and gammatone "seed atoms" that were used to generate them. The basis vectors are the input weights of a Sparse AutoEncoder (SAE) that is trained using "envelope samples" of windowed segments of the audio data. The envelope samples are extracted by identifying, via matching pursuit with Gabor or gammatone seed atoms, segments of audio data that are locally coherent with the seed atoms; each envelope sample is then formed by taking the Kronecker product of the atomic envelope with the locally coherent data segment. Oracle signal-to-noise ratio (SNR) versus data compression curves are generated for the seed atoms as well as for the basis vectors learned from Gabor and gammatone seed atoms. SNR-data compression curves are generated for speech signals as well as for early American music recordings. The basis vectors are shown to have higher denoising capability at data compression rates ranging from 90% to 99.84% for speech as well as music. Envelope samples are displayed as images by folding the time series into column vectors; this display method is used to compare the output of the SAE with the envelope samples that produced it. The basis vectors are also displayed as images. Sparsity is shown to play an important role in producing the highest-denoising basis vectors.
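The matching-pursuit step that identifies locally coherent segments can be illustrated with a toy orthonormal dictionary (not the Gabor/gammatone atoms of the study): greedily pick the atom most correlated with the residual, subtract its projection, and repeat.

```python
# Minimal matching-pursuit sketch over a toy dictionary of unit-norm atoms.
def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def matching_pursuit(signal, atoms, n_iter=2):
    """Greedy sparse decomposition: returns ([(atom_index, coefficient)], residual)."""
    residual = list(signal)
    decomposition = []
    for _ in range(n_iter):
        # Pick the atom most coherent with the current residual.
        idx = max(range(len(atoms)), key=lambda i: abs(dot(residual, atoms[i])))
        coeff = dot(residual, atoms[idx])
        # Subtract that atom's projection from the residual.
        residual = [r - coeff * a for r, a in zip(residual, atoms[idx])]
        decomposition.append((idx, coeff))
    return decomposition, residual

atoms = [[1.0, 0.0, 0.0, 0.0], [0.0, 1.0, 0.0, 0.0], [0.0, 0.0, 1.0, 0.0]]
signal = [3.0, 0.5, 2.0, 0.0]
decomp, residual = matching_pursuit(signal, atoms)
```

With real Gabor or gammatone atoms the same loop flags which windowed audio segments are "locally coherent" with a seed atom, which is what feeds the envelope sampling described above.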

Keywords: sparse dictionary learning, autoencoder, sparse autoencoder, basis vectors, atomic decomposition, envelope sampling, envelope samples, Gabor, gammatone, matching pursuit

Procedia PDF Downloads 229
346 Comparison of Trunk and Hip Muscle Activities and Anterior Pelvic Tilt Angle during Three Different Bridging Exercises in Subjects with Chronic Low Back Pain

Authors: Da-Eun Kim, Heon-Seock Cynn, Sil-Ah Choi, A-Reum Shin

Abstract:

Bridging exercise in the supine position with the hips and knees flexed has commonly been performed as a therapeutic exercise and is a comfortable, pain-free position for most individuals with chronic low back pain (CLBP). Many previous studies have investigated beneficial ways of performing bridging exercises to improve the activation of the abdominal and gluteal muscles and to reduce the activity of the hamstrings (HAM) and erector spinae (ES) and compensatory lumbopelvic motion. The purpose of this study was to compare the effects of three different bridging exercises on HAM, ES, gluteus maximus (Gmax), gluteus medius (Gmed), and transverse abdominis/internal oblique (TrA/IO) activities and on the anterior pelvic tilt angle in subjects with CLBP. Seventeen subjects with CLBP participated in this study. They performed bridging under three different conditions (with 30° hip abduction, isometric hip abduction, and isometric hip adduction). Surface electromyography was used to measure muscle activity, and the ImageJ software was used to calculate the anterior pelvic tilt angle. One-way repeated-measures analysis of variance was used to assess the statistical significance of the measured variables. HAM activity was significantly lower in bridging with 30° hip abduction and with isometric hip abduction than in bridging with isometric hip adduction. Gmax and Gmed activities were significantly greater in bridging with isometric hip abduction than in bridging with 30° hip abduction or isometric hip adduction. TrA/IO activity was significantly greater, and the anterior pelvic tilt angle significantly lower, in bridging with isometric hip adduction than in bridging with 30° hip abduction or isometric hip abduction. Bridging with isometric hip abduction using a Thera-Band can effectively reduce HAM activity and increase Gmax and Gmed activities in subjects with CLBP. Bridging with isometric hip adduction using a pressure biofeedback unit can be a beneficial exercise to improve TrA/IO activity and minimize anterior pelvic tilt in subjects with CLBP.

Keywords: bridging exercise, electromyography, low back pain, lower limb exercise

Procedia PDF Downloads 190
345 Knowledge Sharing Practices in the Healthcare Sector: Evidence from Primary Health Care Organizations in Indonesia

Authors: Galih Imaduddin

Abstract:

Knowledge has been viewed as one of the most important resources in organizations, including those that operate in the healthcare sector. On that basis, knowledge management (KM) is crucial for healthcare organizations seeking to improve their productivity and ensure effective utilization of their resources. Despite growing interest in understanding how KM might work for healthcare organizations, only a modest number of empirical inquiries have specifically focused on the tools and initiatives used to share knowledge. Hence, the main purpose of this paper is to investigate the way healthcare organizations, particularly public-sector ones, utilize knowledge sharing tools and initiatives for the benefit of patient care. Employing a qualitative method, thirteen Community Health Centers (CHCs) from a high-performing district health setting in Indonesia were observed. Data collection and analysis involved iterative document retrieval and interviews (n=41) with the multidisciplinary health professionals who work in these CHCs. A single case study was developed reflecting on the means used to share knowledge, along with the factors that inhibited the exchange of knowledge among these health professionals. The study finds that all thirteen CHCs exhibited and applied knowledge sharing means, including knowledge documents, virtual communication channels (i.e., emails and chat applications), and social learning forums such as staff meetings, morning briefings, and communities of practice. However, the intensity of utilization differed among the CHCs; organizational culture, leadership, professional boundaries, and employees' technological aptitude were presumed to be the factors inhibiting knowledge sharing processes. Departing from the KM literature of other sectors, this study challenges the primacy of technology-based tools, suggesting that socially based initiatives can be more reliable for sharing knowledge. This is largely due to the nature of healthcare work, which is still predominantly based on tacit knowledge.

Keywords: knowledge management, knowledge sharing, knowledge sharing tools and initiatives, knowledge sharing inhibitors, primary health care organizations

Procedia PDF Downloads 224
344 DNA Methylation Score Development for In utero Exposure to Paternal Smoking Using a Supervised Machine Learning Approach

Authors: Cristy Stagnar, Nina Hubig, Diana Ivankovic

Abstract:

The epigenome is a compelling candidate for mediating long-term responses to environmental exposures that modify disease risk. The main goal of this research is to develop a machine-learning-based DNA methylation score, which will be valuable in delineating the unique contribution of paternal epigenetic modifications to the germline impacting childhood health outcomes. It will also be a useful tool for validating self-reports of non-smoking and for adjusting epigenome-wide DNA methylation association studies for this early-life exposure. Using secondary data from two population-based methylation profiling studies, our DNA methylation score is based on CpG DNA methylation measurements from cord blood gathered from children whose fathers smoked pre- and peri-conceptionally. Each child's mother and father fell into one of three class labels in the accompanying questionnaires: never smoker, former smoker, or current smoker. By applying different machine learning algorithms to the Accessible Resource for Integrated Epigenomic Studies (ARIES) sub-study of the Avon Longitudinal Study of Parents and Children (ALSPAC) data set, which we used for training and testing of our model, the best-performing algorithm for classifying the father-smoker, mother-never-smoker group was selected based on Cohen's κ. Error in the model was identified and optimized. The final DNA methylation score was further tested and validated in an independent data set. This resulted in a linear combination, via a logistic link function, of the methylation values of the selected probes that contributed most towards classification and accurately classified each group. The result is a unique, robust DNA methylation score which combines information on DNA methylation and early-life exposure of offspring to paternal smoking, and which may be used to examine the paternal contribution to offspring health outcomes.
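The Cohen's κ criterion used above for model selection corrects raw agreement for chance agreement; a minimal computation on hypothetical exposure labels (not the study's data) looks like this:

```python
# Sketch: Cohen's kappa between true and predicted exposure classes,
# kappa = (p_observed - p_expected) / (1 - p_expected), where p_expected is
# the agreement expected by chance from the marginal label frequencies.
def cohens_kappa(y_true, y_pred):
    n = len(y_true)
    labels = sorted(set(y_true) | set(y_pred))
    p_obs = sum(t == p for t, p in zip(y_true, y_pred)) / n
    p_exp = sum((y_true.count(l) / n) * (y_pred.count(l) / n) for l in labels)
    return (p_obs - p_exp) / (1 - p_exp)

# Hypothetical paternal-smoking classifications for six children:
y_true = ["smoker", "smoker", "never", "never", "never", "smoker"]
y_pred = ["smoker", "never", "never", "never", "smoker", "smoker"]
kappa = cohens_kappa(y_true, y_pred)
```

Because κ discounts chance agreement, it is a more honest model-selection metric than raw accuracy when the exposure classes are imbalanced.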

Keywords: epigenome, health outcomes, paternal preconception environmental exposures, supervised machine learning

Procedia PDF Downloads 168
343 Implementation of Enhanced Recovery after Surgery (ERAS) Protocols in Laparoscopic Sleeve Gastrectomy (LSG); A Systematic Review and Meta-analysis

Authors: Misbah Nizamani, Saira Malik

Abstract:

Introduction: Bariatric surgery is the most effective treatment for patients suffering from morbid obesity, and laparoscopic sleeve gastrectomy (LSG) accounts for over 50% of all bariatric procedures. The aim of our meta-analysis is to investigate the effectiveness and safety of Enhanced Recovery After Surgery (ERAS) protocols for patients undergoing LSG. Method: We searched PubMed, Google Scholar, ScienceDirect, and Cochrane Central. Eligible studies were randomized controlled trials and cohort studies involving adult patients (≥18 years) undergoing bariatric surgery, i.e., laparoscopic sleeve gastrectomy. Outcome measures included length of stay (LOS), postoperative narcotic usage, postoperative pain score, postoperative nausea and vomiting, postoperative complications and mortality, emergency department visits, and readmission rates. RevMan version 5.4 was used to analyze outcomes. Results: Three RCTs and three cohort studies with 1522 patients were included. The ERAS and control groups were compared on eight outcomes. LOS was significantly reduced in the intervention group (p=0.00001); readmission rates did not differ significantly (p=0.35); postoperative complications were higher in the control group, but the difference was non-significant (p=0.68); and the postoperative pain score was significantly reduced (p=0.005). The difference in total morphine milligram equivalent (MME) requirements became significant after sensitivity analysis (p=0.0004). Postoperative mortality could not be analyzed because two cohort studies reported 0% mortality, leaving no analyzable data. Conclusion: This systematic review indicates that applying ERAS protocols in LSG reduces the length of stay, postoperative pain, and total postoperative MME requirements, supporting the feasibility of their application.
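The pooling behind a RevMan-style fixed-effect analysis can be sketched with the inverse-variance method; the per-study mean differences and standard errors below are illustrative stand-ins, not the review's data:

```python
import math

# Sketch: fixed-effect inverse-variance pooling of per-study mean differences
# (e.g., length of stay in days, ERAS minus control). Each study is weighted
# by 1/SE^2, so precise studies dominate the pooled estimate.
def pooled_mean_difference(mean_diffs, std_errors):
    weights = [1.0 / se ** 2 for se in std_errors]
    pooled = sum(w * md for w, md in zip(weights, mean_diffs)) / sum(weights)
    pooled_se = math.sqrt(1.0 / sum(weights))
    return pooled, pooled_se

# Hypothetical: three studies, each reporting a LOS reduction (negative MD):
mds = [-0.8, -1.2, -0.5]
ses = [0.2, 0.4, 0.25]
md, se = pooled_mean_difference(mds, ses)
ci = (md - 1.96 * se, md + 1.96 * se)  # approximate 95% confidence interval
```

A random-effects model would additionally widen the weights by a between-study variance term when heterogeneity is present.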

Keywords: eras protocol, sleeve gastrectomy, bariatric surgery, enhanced recovery after surgery

Procedia PDF Downloads 23
342 Heat-Induced Uncertainty of Industrial Computed Tomography Measuring a Stainless Steel Cylinder

Authors: Verena M. Moock, Darien E. Arce Chávez, Mariana M. Espejel González, Leopoldo Ruíz-Huerta, Crescencio García-Segundo

Abstract:

Uncertainty analysis in industrial computed tomography is commonly tied to metrological reference tools, which offer precision measurements of external part features. Unfortunately, no such reference tool exists for internal measurements, which would exploit the unique imaging potential of X-rays. Uncertainty approximations for computed tomography are still based on general characteristics of the industrial machine and do not adapt to acquisition parameters or part characteristics. The present study investigates the impact of acquisition time on the dimensional uncertainty when measuring a stainless steel cylinder with a circular tomography scan. The authors develop the figure difference method for X-ray radiography to evaluate the volumetric differences introduced within the projected absorption maps of the metal workpiece. The dimensional uncertainty is dominated by photon energy dissipated as heat, which causes thermal expansion of the metal, as monitored by an infrared camera inside the industrial tomograph. With the proposed methodology, we are able to show evolving temperature differences throughout the tomography acquisition. This early study shows that the number of projections in computed tomography induces dimensional error through energy absorption; the error magnitude depends on the thermal properties of the sample and the acquisition parameters, introducing apparent, non-uniform, unwanted volumetric expansion. We introduce infrared imaging for the experimental display of metrological uncertainty in a particular metal part of symmetric geometry. We consider the current results of fundamental value for balancing the number of projections against the uncertainty tolerance when performing X-ray dimensional exploration in precision measurements with industrial tomography.
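
The heat-driven error the abstract describes follows the linear thermal expansion relation ΔL = α·L·ΔT. A minimal sketch of its magnitude, assuming a typical expansion coefficient for austenitic stainless steel and illustrative (not measured) scan conditions:

```python
# Linear thermal expansion of a stainless steel cylinder: dL = alpha * L * dT.
# alpha for austenitic stainless steel is roughly 16e-6 /K (assumed value);
# the diameter and temperature rise below are illustrative, not measured data.
alpha = 16e-6          # coefficient of linear thermal expansion, 1/K
diameter_mm = 20.0     # nominal cylinder diameter, mm
delta_t = 5.0          # temperature rise over the scan, K

expansion_um = alpha * diameter_mm * delta_t * 1000.0  # growth in micrometres
print(f"Apparent diameter growth: {expansion_um:.2f} um")
```

Even a few kelvin of heating shifts the measurand by a micrometre-scale amount, which is already significant at the uncertainty levels targeted in precision industrial CT.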

Keywords: computed tomography, digital metrology, infrared imaging, thermal expansion

Procedia PDF Downloads 97
341 AMSAN Syndrome in the Emergency Department

Authors: Okan Cakir, Okan Tatli

Abstract:

Acute motor and sensory axonal neuropathy (AMSAN) syndrome is a polyneuropathy characterized by axonal and sensory degeneration, a rare variant of Guillain-Barré syndrome that usually occurs two to four weeks after a viral infection. Here we report a rare case of AMSAN syndrome following prior surgery. A 61-year-old male was admitted to the emergency department with complaints of weakness in the feet, numbness, and inability to walk. From his history, it was learned that endovascular aneurysm repair (EVAR) had been performed for an abdominal aortic aneurysm two weeks before admission, that his complaints had been increasing bilaterally over a couple of days, and that there had been no history of infectious disease in the preceding four weeks. On physical examination, general status was good, vital signs were stable, and there was mild paresis of dorsiflexion of the feet in both lower extremities. No nuchal rigidity was determined. Other system examinations were normal. Biochemical tests showed urea: 52 mg/dL (normal range: 15-44 mg/dL), creatinine: 1.05 mg/dL (normal range: 0.81-1.4 mg/dL), potassium: 3.68 mmol/L (normal range: 3.5-5.5 mmol/L), glycaemia: 142 mg/dL, calcium: 9.71 mg/dL (normal range: 8.5-10.5 mg/dL), and erythrocyte sedimentation rate (ESR): 74 mm/h (normal range: 0-15 mm/h). The case was referred to the neurology department and hospitalized. Electromyography was reported as significant bilateral axonal degeneration with sensory-motor polyneuropathy. Lumbar puncture showed glucose and protein levels within normal ranges. Viral markers and Brucella, Toxoplasma, and rubella markers were within normal ranges. Intravenous immunoglobulin (IVIG) was applied as treatment, a physical therapy programme was planned, and the case was discharged from the neurology department.
With this case, we emphasize that polyneuropathy should be considered as an alternative diagnosis in patients who present with symptoms such as weakness and numbness and have a history of recent surgery.

Keywords: AMSAN Syndrome, emergency department, prior surgery, weakness

Procedia PDF Downloads 317
340 Forensic Entomology in Algeria

Authors: Meriem Taleb, Ghania Tail, Fatma Zohra Kara, Brahim Djedouani, T. Moussa

Abstract:

Forensic entomology is the use of insects and their arthropod relatives as silent witnesses to aid legal investigations by interpreting information concerning a death. Its main purpose is to establish the postmortem interval (PMI). The PMI is of crucial importance in investigations of homicide and other untimely deaths when the body is found three or more days after death. Forensic entomology has grown immensely as a discipline over the past thirty years. In Algeria, it was introduced in 2010 by the National Institute for Criminalistics and Criminology of the National Gendarmerie (NICC). However, the work done so far in this growing field in Algeria has remained unknown at both the national and international levels. In this context, the aim of this paper is to describe the state of forensic entomology in Algeria. The Laboratory of Entomology of the NICC is the only one of its kind in Algeria. It started its activities in 2010 with two specialists. The main missions of the laboratory are estimation of the PMI through analysis of entomological evidence and determination of whether the body was moved. Currently, the laboratory performs tasks such as the expert work investigators require to estimate the PMI from insects. The estimation is performed by the accumulated degree days (ADD) method in most cases, except those in which the cadaver is in dry decay. To assure the quality of the entomological evidence, crime scene personnel are trained by the Laboratory of Entomology of the NICC. Recently, undergraduate and graduate students have been studying carrion ecology and insect activity in different geographic locations of Algeria using rabbit and wild boar cadavers as animal models. The Laboratory of Entomology of the NICC has also been involved in some of these research projects.
Entomotoxicology experiments are also conducted in collaboration with the Toxicology Department of the NICC. Owing to the hard work of the Laboratory of Entomology of the NICC, official bodies have increasingly adopted entomological evidence in criminal investigations in Algeria, which is commendable. It is important, therefore, that steps are taken to fill the gaps in the knowledge necessary for entomological evidence to have a useful future in criminal investigations in Algeria.
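
The accumulated degree days (ADD) method mentioned above can be sketched as follows. The base temperature and required ADD are species-specific constants taken from rearing studies; the figures below are illustrative assumptions, not published values:

```python
# Accumulated degree days (ADD) method for minimum PMI estimation (sketch).
# Insect development is modelled as thermal energy accumulated above a
# species-specific base temperature until the observed stage is reached.
def days_to_reach_add(daily_mean_temps, base_temp, required_add):
    """Return how many days of accumulated degree days are needed to reach
    the development stage observed on the insect evidence, or None."""
    accumulated = 0.0
    for day, temp in enumerate(daily_mean_temps, start=1):
        accumulated += max(temp - base_temp, 0.0)  # no development below base
        if accumulated >= required_add:
            return day
    return None  # temperature record too short / too cold

# Daily mean scene temperatures (deg C), counted back from discovery;
# base_temp and required_add are hypothetical species constants.
temps = [24.0, 23.0, 25.0, 22.0, 24.0, 26.0, 23.0]
pmi_days = days_to_reach_add(temps, base_temp=10.0, required_add=60.0)
print(f"Estimated minimum PMI: {pmi_days} days")
```

In casework, the daily means come from the nearest weather station corrected to scene conditions, and the result is a minimum PMI, since colonization may begin some time after death.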

Keywords: forensic entomology, corpse, insects, postmortem interval, expertise, Algeria

Procedia PDF Downloads 378
339 Utilization of Oat in Rabbit Feed for the Development of Healthier Rabbit Meat and Its Impact on Human Blood Lipid Profile

Authors: Muhammad Rizwan Tariq, Muhammad Issa Khan, Zulfiqar Ahmad, Muhammad Adnan Nasir, Muhammad Sameem Javed, Sheraz Ahmed

Abstract:

Functional foods can be a simple tool for reducing community health expenses. Regular consumption of rabbit meat can provide consumers with bioactive components, because manipulation of rabbit feed is highly effective at raising the levels of conjugated linoleic acid, eicosapentaenoic acid, docosahexaenoic acid, polyunsaturated fatty acids, selenium, tocopherol, etc., and at reducing the ω-6/ω-3 ratio, which plays a major role in the prevention of cardiovascular and several other diseases. Compared with the meat of other species, rabbit meat has a higher protein content with essential amino acids, especially in the muscles, a low cholesterol content, and elevated digestibility. The present study was carried out to develop functional rabbit meat by modifying the ingredients of the rabbit diet. Thirty-day-old rabbits were fed feeds containing 2% and 4% oat, and the feeding trial was carried out for eight weeks. Rabbits were divided into three groups and reared for two months: T0 rabbits served as the control group, while T1 rabbits were reared on 4% oat and T2 on 2% oat in the feed. At the end of the eight weeks, the rabbits were slaughtered. The results showed that 4% oat supplementation enhanced n-3 PUFA in the meat and also reduced its fat percentage. Utilization of oat in the rabbit feed significantly affected the pH, protein, fat, texture, and polyunsaturated fatty acid concentration of the meat. A trial was then conducted to examine the impact of the functional meat on the blood lipid profile of human subjects, who were given rabbit meat, in comparison with chicken meat, for a period of one month. Cholesterol, triglycerides, and low-density lipoprotein were found to be lower in the blood serum of the human subject group treated with the 4% oat meat.

Keywords: functional food, functional rabbit meat, meat quality, rabbit

Procedia PDF Downloads 333