Search results for: evidence based practice
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 32315

20585 Human Rights Violation in Modern Society

Authors: Shenouda Salib Hosni Rofail

Abstract:

The interface between development and human rights has long been the subject of scholarly debate. As a result, a set of principles ranging from the right to development to a human rights-based approach to development has been adopted to understand the dynamics between the two concepts. Despite these attempts, the exact link between development and human rights is not yet fully understood. However, the inevitable interdependence between the two concepts and the idea that development efforts must be made while respecting human rights have gained prominence in recent years. On the other hand, the emergence of sustainable development as a widely accepted approach to development goals and policies further complicates this unresolved convergence. The place of sustainable development in the human rights discourse and its role in ensuring the sustainability of development programs require systematic research. The aim of this article is, therefore, to examine the relationship between development and human rights, with a particular focus on the place of the principles of sustainable development in international human rights law, and to consider whether that body of law recognizes a right to sustainable development. The article argues that the principles of sustainable development are recognized, directly or implicitly, in various human rights instruments, an affirmative answer to the question posed above. Accordingly, it scrutinizes international and regional human rights instruments, as well as the case law and interpretations of human rights bodies, to support this hypothesis.

Keywords: sustainable development, human rights, the right to development, the human rights-based approach to development, environmental rights, economic development, social sustainability, human rights protection, human rights violations, workers’ rights, justice, security

Procedia PDF Downloads 20
20584 Generalized Correlation Coefficient in Genome-Wide Association Analysis of Cognitive Ability in Twins

Authors: Afsaneh Mohammadnejad, Marianne Nygaard, Jan Baumbach, Shuxia Li, Weilong Li, Jesper Lund, Jacob v. B. Hjelmborg, Lene Christensen, Qihua Tan

Abstract:

Cognitive impairment in the elderly is a key issue affecting quality of life. Despite a strong genetic background in cognition, only a limited number of single nucleotide polymorphisms (SNPs) have been found. These explain a small proportion of the genetic component of cognitive function, thus leaving a large proportion unaccounted for. We hypothesize that one reason for this missing heritability is misspecified modeling in data analysis concerning the phenotype distribution as well as the relationship between SNP dosage and the phenotype of interest. In an attempt to overcome these issues, we introduced a model-free method based on the generalized correlation coefficient (GCC) in a genome-wide association study (GWAS) of cognitive function in twin samples and compared its performance with two popular linear regression models: a kinship model and a linear mixed-effects (LME) model. The GCC-based GWAS identified two genome-wide significant (p-value < 5e-8) SNPs: rs2904650 near ZDHHC2 on chromosome 8 and rs111256489 near CD6 on chromosome 11. The kinship model also detected two genome-wide significant SNPs, rs112169253 on chromosome 4 and rs17417920 on chromosome 7, whereas no genome-wide significant SNPs were found by the LME model. Compared to the linear models, more meaningful biological pathways, such as GABA receptor activation, ion channel transport, neuroactive ligand-receptor interaction, and the renin-angiotensin system, were found to be enriched by SNPs from the GCC. The GCC model outperformed the linear regression models by identifying more genome-wide significant genetic variants and more meaningful biological pathways related to cognitive function. Moreover, the GCC-based GWAS was robust in handling genetically related twin samples, an important feature for controlling genetic confounding in association studies.
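The abstract does not spell out the GCC formula used. As an illustrative stand-in, a model-free dependence measure such as the distance correlation (a hypothetical substitute for the authors' statistic, not their actual method) can be computed per SNP:

```python
import math

def distance_correlation(x, y):
    """Model-free dependence measure (illustrative stand-in; the paper's
    GCC may be defined differently)."""
    n = len(x)
    def double_centered(v):
        d = [[abs(v[i] - v[j]) for j in range(n)] for i in range(n)]
        row = [sum(r) / n for r in d]
        grand = sum(row) / n
        return [[d[i][j] - row[i] - row[j] + grand for j in range(n)]
                for i in range(n)]
    A, B = double_centered(x), double_centered(y)
    dcov2 = sum(A[i][j] * B[i][j] for i in range(n) for j in range(n)) / n**2
    dvar_x = sum(a * a for r in A for a in r) / n**2
    dvar_y = sum(b * b for r in B for b in r) / n**2
    if dvar_x * dvar_y == 0:
        return 0.0
    return math.sqrt(dcov2 / math.sqrt(dvar_x * dvar_y))

# Hypothetical SNP dosages (0/1/2) and a phenotype score
dosage = [0, 1, 2, 1, 0, 2, 1, 2]
phenotype = [1.1, 2.0, 3.2, 1.9, 0.8, 3.0, 2.2, 2.9]
gcc_like = distance_correlation(dosage, phenotype)
```

In a GWAS setting such a statistic would be computed per SNP and assessed against the genome-wide threshold of 5e-8 via an asymptotic or permutation p-value.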

Keywords: cognition, generalized correlation coefficient, GWAS, twins

Procedia PDF Downloads 108
20583 Development of a Microfluidic Device for Low-Volume Sample Lysis

Authors: Abbas Ali Husseini, Ali Mohammad Yazdani, Fatemeh Ghadiri, Alper Şişman

Abstract:

We developed a microchip device that uses surface acoustic waves for the rapid lysis of low-volume cell samples. The device incorporates sharp-edged glass microparticles for improved performance. We optimized the lysis conditions for high efficiency and evaluated the device's feasibility for point-of-care applications. The microchip contains a 13-finger-pair interdigital transducer with a 30-degree focusing angle. It generates high-intensity acoustic beams that converge 6 mm away. The microchip operates at a frequency of 16 MHz, exciting Rayleigh waves with a 250 µm wavelength on the LiNbO3 substrate. Cell lysis occurs when Candida albicans cells and glass particles are placed within the focal area. The high-intensity surface acoustic waves induce centrifugal forces on the cells and glass particles, resulting in cell lysis through lateral forces from the sharp-edged glass particles. We conducted 42 pilot cell-lysis experiments to optimize the surface acoustic wave-induced streaming, varying electrical power, droplet volume, glass particle size, particle concentration, and lysis time. A regression machine-learning model determined the impact of each parameter on lysis efficiency. Based on these findings, we predicted optimal conditions: an electrical power of 2.5 W, a sample volume of 20 µl, a glass particle size below 10 µm, a particle load of 0.2 µg, and a 5-minute lysis period. Downstream analysis successfully amplified a DNA target fragment directly from the lysate. The study presents an efficient microchip-based cell-lysis method employing acoustic streaming and microparticle collisions within microdroplets. Integration of the surface acoustic wave-based lysis chip with an isothermal amplification method enables swift point-of-care applications.
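The regression model is not specified beyond "a regression machine-learning model". A minimal ordinary-least-squares sketch on made-up pilot-run data (all numbers below are hypothetical, not the authors' measurements) illustrates how per-parameter impacts can be estimated:

```python
import numpy as np

# Hypothetical pilot-run data: columns are power (W), droplet volume (µl),
# particle size (µm), particle load (µg), lysis time (min)
X = np.array([
    [1.0, 10,  5, 0.1, 2],
    [1.5, 15,  8, 0.1, 3],
    [2.0, 20, 10, 0.2, 4],
    [2.5, 20, 10, 0.2, 5],
    [3.0, 25, 15, 0.3, 5],
    [2.5, 30, 20, 0.3, 6],
])
y = np.array([0.35, 0.48, 0.66, 0.81, 0.72, 0.60])  # lysis efficiency

# Ordinary least squares with an intercept column; the fitted coefficients
# give a first-order estimate of each parameter's impact on efficiency.
A = np.hstack([np.ones((len(X), 1)), X])
coef, *_ = np.linalg.lstsq(A, y, rcond=None)
intercept, impacts = coef[0], coef[1:]
```

With 42 real pilot runs, the same fit (or a regularised or tree-based variant) would rank the five process parameters by their estimated effect.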

Keywords: cell lysis, surface acoustic wave, micro-glass particle, droplet

Procedia PDF Downloads 65
20582 The Use of Information and Communication Technology within and between Emergency Medical Teams during a Disaster: A Qualitative Study

Authors: Badryah Alshehri, Kevin Gormley, Gillian Prue, Karen McCutcheon

Abstract:

In a disaster event, sharing patient information between pre-hospital Emergency Medical Services (EMS) and hospital Emergency Departments (EDs) is a complex process during which important information may be altered or lost due to poor communication. The aim of this study was to critically discuss the current evidence base on communication between pre-hospital EMS and ED professionals through the use of Information and Communication Technology (ICT). The study followed a systematic approach: six electronic databases (CINAHL, Medline, Embase, PubMed, Web of Science, and IEEE Xplore Digital Library) were comprehensively searched in January 2018, and a second search was completed in April 2020 to capture more recent publications. The study selection process was undertaken independently by the study authors. Both qualitative and quantitative studies were chosen that focused on factors positively or negatively associated with coordinated communication between pre-hospital EMS and ED teams in a disaster event. These studies were assessed for quality, and the data were analyzed according to the key themes that emerged from the literature search. Twenty-two studies were included: eleven employed quantitative methods, seven used qualitative methods, and four used mixed methods. Four themes emerged on communication between EMTs (pre-hospital EMS and ED staff) in a disaster event using ICT. (1) Disaster preparedness plans and coordination. Disaster plans are in place in hospitals, and in some cases there are interagency agreements with pre-hospital services and relevant stakeholders; however, the plans highlighted in these studies lacked information on coordinated communication within and between pre-hospital and hospital teams. (2) Communication systems used in disasters. Although various communication systems are used within and between hospitals and pre-hospital services, technical issues have hindered communication between teams during disasters. (3) Integrated information management systems. These studies suggested the need for an integrated health information system that can help pre-hospital and hospital staff record patient data and ensure the data are shared. (4) Disaster training and drills. While some studies analyzed disaster drills and training, the majority focused on hospital departments other than EMTs; these studies suggest the need for simulation-based disaster training and drills that include EMTs. This review demonstrates that considerable gaps remain in the understanding of communication between EMS and ED hospital staff during disaster response. Although different types of ICT are used, various issues remain that affect coordinated communication among the relevant professionals.

Keywords: emergency medical teams, communication, information and communication technologies, disaster

Procedia PDF Downloads 115
20581 Optimal Design of Composite Cylindrical Shell Based on Nonlinear Finite Element Analysis

Authors: Haider M. Alsaeq

Abstract:

The present research attempts to determine the best configuration of composite cylindrical shells of the sandwich type, i.e., the lightest design of such shells required to sustain a certain load over a certain area. The optimization is based on elastic-plastic, geometrically nonlinear, incremental-iterative finite element analysis. The nine-node degenerated curved shell element is used, with five degrees of freedom specified at each nodal point and a layered model. The formulation of the geometrical nonlinearity problem is carried out using the well-known total Lagrangian principle. The structural optimization problem, treated as a constrained nonlinear optimization, is solved with the so-called Modified Hooke and Jeeves method, taking the weight of the shell as the objective function under stress and geometrical constraints. It was concluded that the optimum design of a composite sandwich cylindrical shell with a rigid polyurethane foam core and steel facings occurs when the area covered by the shell becomes almost square, with the ratio of core thickness to facing thickness lying between 45 and 49, while the optimum height-to-length ratio varies from 0.03 to 0.08 depending on the aspect ratio of the shell and its boundary conditions.
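The Modified Hooke and Jeeves method referenced above builds on the classic pattern search. A minimal unconstrained sketch on a toy objective (the paper's constrained, finite-element-coupled variant is considerably more elaborate) looks like:

```python
def hooke_jeeves(f, x0, step=0.5, shrink=0.5, tol=1e-6, max_iter=1000):
    """Basic Hooke-Jeeves pattern search (unconstrained toy version)."""
    def explore(base, h):
        # Probe each coordinate in both directions, keeping improvements
        x = list(base)
        for i in range(len(x)):
            for d in (h, -h):
                trial = list(x)
                trial[i] += d
                if f(trial) < f(x):
                    x = trial
                    break
        return x

    x, h = list(x0), step
    for _ in range(max_iter):
        y = explore(x, h)
        if f(y) < f(x):
            # Pattern move: jump along the improving direction, then re-explore
            pattern = [2 * yi - xi for xi, yi in zip(x, y)]
            z = explore(pattern, h)
            x = z if f(z) < f(y) else y
        else:
            h *= shrink          # no improvement: refine the mesh
            if h < tol:
                break
    return x

# Toy objective: minimise (x-1)^2 + (y+2)^2, with minimum at (1, -2)
sol = hooke_jeeves(lambda v: (v[0] - 1) ** 2 + (v[1] + 2) ** 2, [0.0, 0.0])
```

In the structural problem, `f` would wrap a nonlinear finite element run returning shell weight, with stress and geometric constraints enforced, e.g., by penalty terms.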

Keywords: composite structure, cylindrical shell, optimization, non-linear analysis, finite element

Procedia PDF Downloads 382
20580 Optical Flow Technique for Supersonic Jet Measurements

Authors: Haoxiang Desmond Lim, Jie Wu, Tze How Daniel New, Shengxian Shi

Abstract:

This paper outlines the development of a novel experimental technique for quantifying supersonic jet flows, in an attempt to avoid the seeding-particle problems frequently associated with particle-image velocimetry (PIV) techniques at high Mach numbers. The idea behind the technique is to use high-speed cameras to capture Schlieren images of the supersonic jet shear layers, which are then processed with an adapted optical flow algorithm based on the Horn-Schunck method to determine the associated flow fields. The proposed method is capable of offering full-field unsteady flow information with potentially higher accuracy and resolution than existing point measurements or PIV techniques. A preliminary study via numerical simulations of a circular de Laval jet nozzle successfully reveals flow and shock structures typically associated with supersonic jet flows, which serve as useful data for subsequent validation of the optical-flow-based experimental results. For the experimental technique, a Z-type Schlieren setup is proposed with the supersonic jet operated in cold mode at a stagnation pressure of 8.2 bar and an exit velocity of Mach 1.5. High-speed single-frame or double-frame cameras are used to capture successive Schlieren images. As the application of optical flow techniques to supersonic flows remains rare, the current focus revolves around methodology validation through synthetic images. The results of the validation tests offer valuable insight into how the optical flow algorithm can be refined to improve robustness and accuracy. Details of the methodology employed and the challenges faced will be elaborated in the final conference paper should the abstract be accepted. Despite these challenges, this novel supersonic flow measurement technique may offer a simpler way to identify and quantify the fine spatial structures within the shock shear layer.
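A minimal Horn-Schunck iteration (a generic sketch, not the authors' adapted algorithm) can be written as follows; the neighbour averaging uses periodic `np.roll` purely for brevity:

```python
import numpy as np

def horn_schunck(im1, im2, alpha=1.0, n_iter=100):
    """Minimal dense Horn-Schunck optical flow between two frames."""
    im1 = im1.astype(float)
    im2 = im2.astype(float)
    Ix = np.gradient(im1, axis=1)      # spatial derivatives
    Iy = np.gradient(im1, axis=0)
    It = im2 - im1                     # temporal derivative
    u = np.zeros_like(im1)
    v = np.zeros_like(im1)
    for _ in range(n_iter):
        # Neighbour average via periodic np.roll (kept simple on purpose)
        u_avg = (np.roll(u, 1, 0) + np.roll(u, -1, 0) +
                 np.roll(u, 1, 1) + np.roll(u, -1, 1)) / 4.0
        v_avg = (np.roll(v, 1, 0) + np.roll(v, -1, 0) +
                 np.roll(v, 1, 1) + np.roll(v, -1, 1)) / 4.0
        # Jacobi update from the Horn-Schunck Euler-Lagrange equations
        num = Ix * u_avg + Iy * v_avg + It
        den = alpha ** 2 + Ix ** 2 + Iy ** 2
        u = u_avg - Ix * num / den
        v = v_avg - Iy * num / den
    return u, v

# Synthetic check: a brightness ramp displaced one pixel in +x
im1 = np.tile(np.arange(16.0), (16, 1))
im2 = im1 - 1.0
u, v = horn_schunck(im1, im2)
```

On real Schlieren pairs, preprocessing, boundary handling, and the regularisation weight `alpha` all need care; this sketch recovers the uniform one-pixel shift of the synthetic ramp.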

Keywords: Schlieren, optical flow, supersonic jets, shock shear layer

Procedia PDF Downloads 303
20579 Growing Architecture, Technical Product Harvesting of Near Net Shape Building Components

Authors: Franziska Moser, Martin Trautz, Anna-Lena Beger, Manuel Löwer, Jörg Feldhusen, Jürgen Prell, Alexandra Wormit, Björn Usadel, Christoph Kämpfer, Thomas-Benjamin Seiler, Henner Hollert

Abstract:

The demand for bio-based materials and components in architecture has increased in recent years due to society’s heightened environmental awareness. Nowadays, most components are developed via a substitution approach, which aims at replacing conventional components with natural alternatives that are then processed, shaped, and manufactured to fit the desired application. This contribution introduces a novel approach to the development of bio-based products that decreases resource consumption and increases recyclability. In this approach, natural organisms like plants or trees are not used in a processed form but grow into a near net shape before being harvested and utilized as building components. By minimizing the conventional production steps, the amount of resources used in manufacturing decreases, whereas recyclability increases. This paper presents the approach of technical product harvesting, explains its theoretical basis as well as the matching process between product requirements and biological properties, and shows the first results of the growth manipulation studies.

Keywords: design with nature, eco manufacturing, sustainable construction materials, technical product harvesting

Procedia PDF Downloads 487
20578 Smart BIM Documents - the Development of the Ontology-Based Tool for Employer Information Requirements (OntEIR), and its Transformation into SmartEIR

Authors: Shadan Dwairi

Abstract:

Defining proper requirements is one of the key factors for a successful construction project. Although many attempts have been made to assist in identifying requirements, this area remains underdeveloped. In Building Information Modelling (BIM) projects, the Employer Information Requirements (EIR) is the fundamental requirements document and a necessary ingredient in achieving a successful BIM project. The provision of a full and clear EIR is essential to achieving BIM Level 2. As defined by PAS 1192-2, the EIR is a “pre-tender document that sets out the information to be delivered and the standards and processes to be adopted by the supplier as part of the project delivery process”. It also notes that the “EIR should be incorporated into tender documentation to enable suppliers to produce an initial BIM Execution Plan (BEP)”. The importance of an effective definition of the EIR lies in its contribution to better productivity during the construction process in terms of cost and time, in addition to improving the quality of the built asset. Proper and clear information is a key aspect of the EIR, in terms of the information it contains and, more importantly, the information the client receives at the end of the project that will enable the effective management and operation of the asset, where typically about 60%-80% of the cost is spent. This paper reports on the research done in developing the Ontology-based tool for Employer Information Requirements (OntEIR). OntEIR has proven able to produce a full and complete set of EIRs, ensuring that the client's information needs for the final model delivered by BIM are clearly defined from the beginning of the process. It also reports on the work being done to transform OntEIR into a smart tool for defining Employer Information Requirements (smartEIR). smartEIR extends OntEIR, enabling it to develop custom EIRs tailored to the project type, project requirements, and client capabilities. The initial idea behind smartEIR is moving away from the notion that one EIR fits all. smartEIR utilizes the links made in OntEIR and creates a 3D matrix that transforms it into a smart tool. The OntEIR tool is based on the OntEIR framework, which utilizes both ontology and the decomposition of goals to elicit and extract the complete set of requirements needed for a full and comprehensive EIR. A new categorisation system for requirements is also introduced in the framework and tool, which facilitates understanding and enhances clarification of the requirements, especially for novice clients. Findings of the evaluation of the tool, conducted with experts in the industry, showed that the OntEIR tool contributes towards the effective and efficient development of EIRs that provide a better understanding of the information requirements as requested by BIM, and supports the production of a complete BIM Execution Plan (BEP) and a Master Information Delivery Plan (MIDP).
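The 3D matrix idea can be sketched as a lookup keyed on the three tailoring axes. The categories and requirement texts below are invented placeholders, not OntEIR's actual ontology content:

```python
# Hypothetical sketch of "one EIR does not fit all": requirements selected
# along three axes (project type, requirement focus, client capability).
EIR_MATRIX = {
    ("new-build", "design-intent", "novice"): [
        "Define the level of detail for design-stage models",
        "Specify COBie deliverables in plain language",
    ],
    ("refurbishment", "asset-management", "experienced"): [
        "Align model data fields with the existing CAFM system",
        "Require as-built model validation against survey data",
    ],
}

def select_requirements(project_type, req_focus, capability):
    """Return the tailored requirement set for one cell of the 3D matrix."""
    return EIR_MATRIX.get((project_type, req_focus, capability), [])

reqs = select_requirements("new-build", "design-intent", "novice")
```

The real tool derives these links from its ontology and goal decomposition rather than a hand-written table, but the selection step reduces to the same keyed lookup.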

Keywords: building information modelling, employer information requirements, ontology, web-based, tool

Procedia PDF Downloads 112
20577 Investigating the Steam Generation Potential of Lithium Bromide Based CuO Nanofluid under Simulated Solar Flux

Authors: Tamseela Habib, Muhammad Amjad, Muhammad Edokali, Masome Moeni, Olivia Pickup, Ali Hassanpour

Abstract:

Nanofluid-assisted steam generation is rapidly attracting attention in the scientific community since it can be applied in a wide range of industrial processes. Because of its high absorption rate of solar energy, nanoparticle-based solar steam generation could be a major contributor to many applications, including water desalination, sterilization, and power generation. Lithium bromide-based iron oxide nanofluids have previously been studied for steam generation and showed promising results. However, the efficiency of the system could be improved if a more heat-conductive nanofluid system were utilised. In the current paper, we report an experimental investigation of the photothermal conversion properties of functionalised copper oxide (CuO) nanoparticles in lithium bromide salt solutions. The CuO binary nanofluid was prepared by chemical functionalization with polyethyleneimine (PEI). Long-term stability of the prepared binary nanofluid was evaluated with a high-speed centrifuge analyser, which showed an instability index of 0.06, suggesting low agglomeration and sedimentation tendencies. This stability is also supported by measurements from dynamic light scattering (DLS), transmission electron microscopy (TEM), and ultraviolet-visible (UV-Vis) spectrophotometry. The fluid rheology was also characterised, suggesting that the system exhibits Newtonian behavior. The photothermal conversion efficiency of different concentrations of CuO was experimentally investigated under a solar simulator. Experimental results reveal that the binary nanofluid in this study can remarkably increase the solar energy trapping efficiency and evaporation rate compared to conventional fluids, due to localized solar energy harvesting at the surface of the nanofluid. It was found that 0.1 wt% CuO is the optimum nanoparticle concentration for enhanced sensible and latent heat efficiencies.
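Photothermal conversion efficiency is commonly split into sensible-heat and latent-heat (evaporation) parts. A back-of-envelope sketch with entirely hypothetical numbers (not the paper's measurements) shows the accounting:

```python
# Hypothetical measurements (illustrative only, not the paper's data)
m_evap = 5e-3         # kg of water evaporated during the run
h_fg = 2.26e6         # J/kg, latent heat of vaporisation of water
m_fluid = 20e-3       # kg of fluid in the receiver
c_p = 4186.0          # J/(kg K), specific heat (pure-water value)
dT = 12.0             # K, temperature rise over the run
I = 1000.0            # W/m^2, simulated solar flux (one sun)
A = 1e-2              # m^2, illuminated area
t = 3600.0            # s, run duration

E_in = I * A * t                          # incident radiative energy, J
eta_latent = m_evap * h_fg / E_in         # evaporation (latent-heat) efficiency
eta_sensible = m_fluid * c_p * dT / E_in  # heating (sensible-heat) efficiency
eta_total = eta_latent + eta_sensible
```

For a salt solution, `h_fg` and `c_p` shift with LiBr concentration, so measured property values would replace the pure-water constants used here.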

Keywords: nanofluids, vapor absorption refrigeration system, steam generation, high salinity

Procedia PDF Downloads 68
20576 Improved Soil and Snow Treatment with the Rapid Update Cycle Land-Surface Model for Regional and Global Weather Predictions

Authors: Tatiana G. Smirnova, Stan G. Benjamin

Abstract:

The Rapid Update Cycle (RUC) land-surface model (LSM) has been the land-surface component in several generations of operational weather prediction models at the National Centers for Environmental Prediction (NCEP) of the National Oceanic and Atmospheric Administration (NOAA). It was designed for short-range weather predictions with an emphasis on severe weather and was originally kept intentionally simple to avoid uncertainties from poorly known parameters. Nevertheless, the RUC LSM, when coupled with the hourly-assimilating atmospheric model, can produce a realistic evolution of time-varying soil moisture and temperature, as well as the evolution of snow cover on the ground surface. This result is possible only if the soil/vegetation/snow component of the coupled weather prediction model has sufficient skill to avoid long-term drift. The RUC LSM was first implemented in the operational NCEP Rapid Update Cycle (RUC) weather model in 1998 and later in the Weather Research and Forecasting (WRF) model-based Rapid Refresh (RAP) and High-Resolution Rapid Refresh (HRRR). Being available to the international WRF community, it has been implemented in operational weather models in Austria, New Zealand, and Switzerland. Based on feedback from US weather service offices and the international WRF community, and also on our own validation, the RUC LSM has matured over the years. A sea-ice module was added for surface predictions over Arctic sea ice, and other modifications include refinements to the snow model and a more accurate specification of albedo, roughness length, and other surface properties. At present, the RUC LSM is being tested in the regional application of the Unified Forecast System (UFS); the next-generation UFS-based regional Rapid Refresh FV3 Standalone (RRFS) model will replace the operational RAP and HRRR at NCEP. Over time, the RUC LSM has participated in several international model intercomparison projects to verify its skill using observed atmospheric forcing. The ESM-SnowMIP was the most recent of these experiments, focused on the verification of snow models for open and forested regions. The simulations were performed for ten sites located in different climatic zones of the world, forced with observed atmospheric conditions. While most of the 26 participating models have more sophisticated snow parameterizations than RUC, the RUC LSM achieved a high ranking in simulations of both snow water equivalent and surface temperature. However, the ESM-SnowMIP experiment also revealed some issues in the RUC snow model, which will be addressed in this paper. One of them is the treatment of grid cells partially covered with snow. The RUC snow module computes the energy and moisture budgets of snow-covered and snow-free areas separately and aggregates the solutions at the end of each time step. Such treatment elevates the importance of the model's computation of snow cover fraction. Improvements to the original simplistic threshold-based approach have been implemented and tested both offline and in the coupled weather model. A detailed description of the changes to the snow cover fraction and other modifications to the RUC soil and snow parameterizations will be given in this paper.
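The separate-then-aggregate treatment described above can be sketched as an area-weighted blend of the two sub-grid solutions; the flux values and field names below are hypothetical, and the operational RUC code is considerably more involved:

```python
def aggregate_fluxes(flux_snow, flux_bare, snow_frac):
    """Area-weighted aggregation of surface fluxes for a grid cell
    partially covered by snow (illustrative sketch)."""
    if not 0.0 <= snow_frac <= 1.0:
        raise ValueError("snow fraction must lie in [0, 1]")
    return {k: snow_frac * flux_snow[k] + (1.0 - snow_frac) * flux_bare[k]
            for k in flux_snow}

# Hypothetical end-of-timestep solutions (W/m^2) for the two tiles
snow_tile = {"sensible": -15.0, "latent": 5.0, "ground": 2.0}
bare_tile = {"sensible": 40.0, "latent": 60.0, "ground": 10.0}
cell = aggregate_fluxes(snow_tile, bare_tile, snow_frac=0.25)
```

The sensitivity of `cell` to `snow_frac` is what makes the snow-cover-fraction parameterization itself so consequential for the coupled forecast.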

Keywords: land-surface models, weather prediction, hydrology, boundary-layer processes

Procedia PDF Downloads 74
20575 Investigation of Information Security Incident Management Based on International Standard ISO/IEC 27002 in Educational Hospitals in 2014

Authors: Nahid Tavakoli, Asghar Ehteshami, Akbar Hassanzadeh, Fatemeh Amini

Abstract:

Introduction: The information security incident management guidelines were developed to help hospitals meet their information security event and incident management requirements. The purpose of this study was to investigate information security incident management in Isfahan’s educational hospitals against the ISO/IEC 27002 standards. Methods: This was a cross-sectional study of the information security incident management of educational hospitals in 2014. Based on the ISO/IEC 27002 standards, two checklists were applied to check compliance with the standards on Reporting Information Security Events and Weaknesses and on Management of Information Security Incidents and Improvements. One inspector was trained to carry out the assessments in the hospitals. The data were analyzed using SPSS. Findings: Overall, the score for compliance with the information security incident management requirements across the two areas, Reporting Information Security Events and Weaknesses and Management of Information Security Incidents and Improvements, was 60%, and there was a statistically significant difference in compliance levels among the hospitals.

Keywords: information security incident management, information security management, standards, hospitals

Procedia PDF Downloads 564
20574 Fetal Movement Study Using Biomimics of the Maternal March

Authors: V. Diaz, B. Pardo, D. Villegas

Abstract:

Most premature babies have complications at birth. These complications can be reduced if an atmosphere of relaxation, similar to intrauterine life, is provided; to this end, there are programs in which mothers lull and sway their babies. However, the conditions under which they do so, and the way in which they do it, may not be the most appropriate. Here we describe an investigation based on the biomimics of the kinematics of human fetal movement, which consists of determining the movements that the fetus experiences, and the deformations of the components that surround it, during a gentle walk at week 32 of gestation. This research is based on a 3D model that includes the anatomical structure of the pelvis, fetus, muscles, uterus, and its most important supporting elements (ligaments). Normal load conditions, according to the stage of gestation, and the kinematics of a gentle walk of a pregnant mother are applied to this model, focused on the pelvic bone; this allows a response to be obtained from the other elements of the model. SolidWorks software was used for the modeling and subsequent simulation. From this analysis, the curves that describe the movement of the fetus at three different points were obtained. Additionally, we determined the deformation of the uterus and the ligaments that support it, showing the characteristics these tissues can exhibit while supporting the fetus. These data can be used for the construction of artifacts that help the normal development of premature infants.

Keywords: simulation, biomimic, uterine model, fetal movement study

Procedia PDF Downloads 152
20573 Offline Parameter Identification and State-of-Charge Estimation for Healthy and Aged Electric Vehicle Batteries Based on the Combined Model

Authors: Xiaowei Zhang, Min Xu, Saeid Habibi, Fengjun Yan, Ryan Ahmed

Abstract:

Recently, Electric Vehicles (EVs) have received extensive consideration since they offer a more sustainable and greener transportation alternative compared to fossil-fuel-propelled vehicles. Lithium-ion (Li-ion) batteries are increasingly being deployed in EVs because of their high energy density, high cell-level voltage, and low rate of self-discharge. Since Li-ion batteries represent the most expensive component in the EV powertrain, accurate monitoring and control strategies must be executed to ensure their prolonged lifespan. The Battery Management System (BMS) has to accurately estimate parameters such as the battery State-of-Charge (SOC), State-of-Health (SOH), and Remaining Useful Life (RUL). For the BMS to estimate these parameters, an accurate and control-oriented battery model has to work collaboratively with a robust state and parameter estimation strategy. Since battery physical parameters, such as the internal resistance and the diffusion coefficient, change with the battery state-of-life (SOL), the BMS has to adapt to accommodate this change. In this paper, an extensive battery aging study was conducted over a 12-month period on 5.4 Ah, 3.7 V lithium polymer cells. Instead of using fixed charging/discharging aging cycles at a fixed C-rate, a set of real-world driving scenarios was used to age the cells. The test was interrupted at every 5% of capacity degradation by a set of reference performance tests to assess battery degradation and track model parameters. As the batteries age, the combined model parameters are optimized and tracked in an offline mode over the entire battery lifespan. Based on the optimized model, a state and parameter estimation strategy based on the Extended Kalman Filter (EKF) and the relatively new Smooth Variable Structure Filter (SVSF) has been applied to estimate the SOC at various states of life.
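The paper's combined model and the SVSF are not reproduced here. As a much-reduced illustration, a one-state Kalman filter that blends coulomb-counting prediction with a voltage correction through a linearised, hypothetical OCV(SOC) map captures the basic SOC-estimation loop:

```python
# One-state SOC filter: coulomb-counting prediction plus a voltage update.
# All parameters are hypothetical placeholders, not the paper's model.
CAP_AS = 5.4 * 3600         # cell capacity in ampere-seconds (5.4 Ah)
OCV0, OCV_SLOPE = 3.4, 0.6  # assumed linear OCV(SOC) = OCV0 + slope * SOC
Q, R = 1e-7, 1e-3           # process / measurement noise variances

def soc_filter_step(soc, P, current_a, dt, v_measured):
    # Predict: coulomb counting (discharge current positive)
    soc_pred = soc - current_a * dt / CAP_AS
    P_pred = P + Q
    # Update: linear measurement v = OCV0 + slope * soc, so H = slope
    H = OCV_SLOPE
    K = P_pred * H / (H * P_pred * H + R)
    soc_new = soc_pred + K * (v_measured - (OCV0 + H * soc_pred))
    P_new = (1 - K * H) * P_pred
    return soc_new, P_new

# Simulate a 1 A discharge from SOC = 0.8 with a biased initial guess
true_soc, est, P = 0.8, 0.5, 0.1
for _ in range(600):                  # 600 s at 1 Hz
    true_soc -= 1.0 * 1.0 / CAP_AS
    v = OCV0 + OCV_SLOPE * true_soc   # noise-free measurement for brevity
    est, P = soc_filter_step(est, P, 1.0, 1.0, v)
```

A realistic OCV(SOC) curve is nonlinear, which is where the EKF's relinearisation at each step, and the SVSF's robustness to model error, enter.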

Keywords: lithium-ion batteries, genetic algorithm optimization, battery aging test, parameter identification

Procedia PDF Downloads 253
20572 Lineup Optimization Model of Basketball Players Based on the Prediction of Recursive Neural Networks

Authors: Wang Yichen, Haruka Yamashita

Abstract:

In recent years, decision-making in sports, such as selecting game lineups and strategies based on the analysis of accumulated sports data, has been widely attempted. In fact, in the NBA, the basketball league where the world's highest-level players gather, teams analyze data using various statistical techniques to win games. However, it is difficult to analyze game data for each play, such as ball tracking or the motion of players, because the situation of the game changes rapidly and the structure of the data is complicated. An analysis method for real-time game play data is therefore needed. In this research, we propose an analytical model for determining the optimal lineup composition using real-time play data, a task considered difficult for all coaches. Because replacing the entire lineup is too complicated, the practical questions for player replacement are whether the lineup should be changed and whether a Small Ball lineup should be adopted; we therefore propose an analytical model for the optimal player selection problem based on Small Ball lineups. In basketball, scoring data can be accumulated for each play, indicating a player's contribution to the game, and these scoring data can be treated as a time series. To compare the importance of players in different situations and lineups, we combine an RNN (Recurrent Neural Network) model, which can analyze time series data, with an NN (Neural Network) model, which can analyze the situation on the court, to build a score prediction model capable of identifying the current optimal lineup for different situations. In this research, we collected accumulated NBA data from the 2019-2020 season and applied the method to actual basketball play data to verify the reliability of the proposed model.
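The RNN-plus-NN combination can be sketched as a single forward pass; the weights below are random placeholders and the feature layout is invented, standing in for the trained model described above:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy dimensions: a scoring time series (RNN branch) plus a static
# "situation" vector such as a lineup/game-state encoding (NN branch).
T, D_SEQ, D_SIT, H = 8, 4, 3, 5
Wx = rng.normal(size=(H, D_SEQ)) * 0.1   # input-to-hidden weights
Wh = rng.normal(size=(H, H)) * 0.1       # recurrent weights
Ws = rng.normal(size=(H, D_SIT)) * 0.1   # situation-branch weights
w_out = rng.normal(size=2 * H) * 0.1     # readout over both branches

def predict_score(seq, situation):
    """Forward pass of a minimal RNN + feed-forward hybrid mapping a
    scoring time series and a situation vector to a predicted score."""
    h = np.zeros(H)
    for x_t in seq:                       # simple Elman recurrence
        h = np.tanh(Wx @ x_t + Wh @ h)
    s = np.tanh(Ws @ situation)           # situation branch
    return float(w_out @ np.concatenate([h, s]))

seq = rng.normal(size=(T, D_SEQ))         # hypothetical play-by-play features
situation = rng.normal(size=D_SIT)
score = predict_score(seq, situation)
```

Comparing `predict_score` across candidate lineups (encoded in `situation`) is the selection step; in practice the weights would be trained on the accumulated play-by-play data rather than drawn at random.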

Keywords: recurrent neural network, players lineup, basketball data, decision making model

Procedia PDF Downloads 116
20571 Identification and Management of Septic Arthritis of the Untouched Glenohumeral Joint

Authors: Sumit Kanwar, Manisha Chand, Gregory Gilot

Abstract:

Background: Septic arthritis of the shoulder has infrequently been discussed. Focus on infection of the untouched shoulder has not heretofore been described. We present four patients with glenohumeral septic arthritis. Methods: Case 1: A 59 year old male with left shoulder pain in the anterior, posterior and superior aspects. Case 2: A 60 year old male with fever, chills, and generalized muscle aches. Case 3: A 70 year old male with right shoulder pain about the anterior and posterior aspects. Case 4: A 55 year old male with global right shoulder pain, swelling, and limited ROM. Results: In case 1, the left shoulder was affected. Physical examination, swelling was notable, there was global tenderness with a painful range of motion (ROM). The lab values indicated an erythrocyte sedimentation rate (ESR) of 96, and a C-reactive protein (CRP) of 304.30. Imaging studies were performed and MRI indicated a high suspicion for an abscess with osteomyelitis of the humeral head. Our second case’s left arm was affected. He had swelling, global tenderness and painful ROM. His ESR was 38, CRP was 14.9. X-ray showed severe arthritis. Case 3 differed with the right arm being affected. Again, global tenderness and painful ROM was observed. His ESR was 94, and CRP was 10.6. X-ray displayed an eroded glenoid space. Our fourth case’s right shoulder was affected. He had global tenderness and painful, limited ROM. ESR was 108 and CRP was 2.4. X-ray was non-significant. Discussion: Monoarticular septic arthritis of the virgin glenohumeral joint is seldom diagnosed in clinical practice. Common denominators include elevated ESR, painful, limited ROM, and involvement of the dominant arm. The male population is more frequently affected with an average age of 57. Septic arthritis is managed with incision and drainage or needle aspiration of synovial fluid supplemented with 3-6 weeks of intravenous antibiotics. Due to better irrigation and joint visualization, arthroscopy is preferred. 
Open surgical drainage may be indicated if the above methods fail. Conclusion: If a middle-aged male presents with vague anterior or posterior shoulder pain, elevated inflammatory markers, and a low-grade fever, an X-ray should be performed. If this displays degenerative joint disease, the workup should be completed with advanced imaging, such as MRI, CT, or ultrasound. If these imaging modalities display anterior-space joint effusion with soft-tissue involvement, septic arthritis of the untouched glenohumeral joint can be suspected, and surgery is indicated.

Keywords: glenohumeral joint, identification, infection, septic arthritis, shoulder

Procedia PDF Downloads 409
20570 Load Comparison between Different Positions during Elite Male Basketball Games: A Sport Metabolomics Approach

Authors: Kayvan Khoramipour, Abbas Ali Gaeini, Elham Shirzad, Øyvind Sandbakk

Abstract:

Basketball has different positions with individual movement profiles, which may influence metabolic demands. Accordingly, the present study aimed to compare the movement and metabolic load between different positions during elite male basketball games. Five main players from each of 14 teams (n = 70), who participated in the 2017-18 Iranian national basketball leagues, were selected as participants. The players were defined as backcourt (posts 1-3) and frontcourt (posts 4-5). Video-based time motion analysis (VBTMA) was performed based on players’ individual running and shuffling speeds using Dartfish software. Movements were classified into high- and low-intensity running with and without the ball, as well as high- and low-intensity shuffling and static movements. Mean frequency, duration, and distance were calculated for each class, except for static movements, where only frequency was calculated. Saliva samples were collected from each player before and after 40-minute basketball games and analyzed using metabolomics. Principal component analysis (PCA) and partial least squares discriminant analysis (PLS-DA) for the metabolomics data, and independent t-tests for VBTMA, were used as statistical tests. Movement frequency, duration, and distance were higher in backcourt players (all p ≤ 0.05), while static movement frequency did not differ. Saliva samples showed that the levels of taurine, succinic acid, citric acid, pyruvate, glycerol, acetoacetic acid, acetone, and hypoxanthine were all higher in backcourt players, whereas lactate, alanine, 3-methylhistidine, and methionine were higher in frontcourt players. Based on metabolomics, we demonstrate that backcourt and frontcourt players have different metabolic profiles during games: backcourt players clearly move more and therefore rely more on aerobic energy, whereas frontcourt players rely more on anaerobic energy systems, in line with their less dynamic but more static movement patterns.
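As a rough illustration of the statistical pipeline described above (PCA for an unsupervised overview, followed by per-metabolite independent t-tests), the following sketch runs on synthetic data; the group sizes, number of metabolites, and effect sizes are invented for demonstration and do not reflect the study's measurements:

```python
import numpy as np
from sklearn.decomposition import PCA
from scipy.stats import ttest_ind

rng = np.random.default_rng(0)
# Synthetic metabolite intensities: 35 backcourt vs 35 frontcourt players,
# 12 metabolites; the backcourt group is shifted on the first four features.
back = rng.normal(0.0, 1.0, (35, 12))
back[:, :4] += 1.5
front = rng.normal(0.0, 1.0, (35, 12))
X = np.vstack([back, front])

# Unsupervised overview of group structure with PCA
scores = PCA(n_components=2).fit_transform(X)

# Per-metabolite group comparison with independent t-tests
t, p = ttest_ind(back, front, axis=0)
print("shifted metabolites with p < 0.05:", int((p[:4] < 0.05).sum()), "of 4")
```

In the real study, PLS-DA would additionally be fitted on the labelled groups; PCA plus univariate testing is shown here only as the simplest version of that workflow.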

Keywords: basketball, metabolomics, saliva, sport loadomics

Procedia PDF Downloads 101
20569 Massively Parallel Sequencing Improved Resolution for Paternity Testing

Authors: Xueying Zhao, Ke Ma, Hui Li, Yu Cao, Fan Yang, Qingwen Xu, Wenbin Liu

Abstract:

Massively parallel sequencing (MPS) technologies allow high-throughput sequencing analyses at a relatively affordable price and have gradually been applied to forensic casework. MPS technology identifies short tandem repeat (STR) loci by sequence, so repeat motif variation within STRs can be detected, which may help infer the origin of a mutation in some cases. Here, we report on one case with a three-step mismatch (D18S51) in a family trio, based on both capillary electrophoresis (CE) and MPS typing. The alleles of the alleged father (AF) are [AGAA]₁₇AGAG[AGAA]₃ and [AGAA]₁₅. The mother’s alleles are [AGAA]₁₉ and [AGAA]₉AGGA[AGAA]₃. The questioned child’s (QC) alleles are [AGAA]₁₉ and [AGAA]₁₂. Given that the sequence variants in the repeat regions of the AF and the mother are not observed in the QC’s alleles, the QC’s allele [AGAA]₁₂ was likely inherited from the AF’s allele [AGAA]₁₅ by loss of three [AGAA] repeats. In addition, two alleles of D18S51 observed in this study, [AGAA]₁₇AGAG[AGAA]₃ and [AGAA]₉AGGA[AGAA]₃, have not been reported before. All results were verified using Sanger-type sequencing. In summary, the MPS typing method can offer valuable information for forensic genetics research and play a promising role in paternity testing.
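The allele notation above can be handled programmatically. The sketch below, a hypothetical helper rather than the authors' pipeline, expands the bracketed MPS notation (written here with ASCII digits) and shows why length-based CE calls report the AF's [AGAA]17AGAG[AGAA]3 allele simply as 21, while the three-repeat loss from [AGAA]15 to [AGAA]12 is visible to both methods:

```python
import re

def expand_allele(notation: str) -> str:
    """Expand bracketed MPS notation such as '[AGAA]17AGAG[AGAA]3'
    into the full repeat-region sequence."""
    parts = re.findall(r"\[([ACGT]+)\](\d+)|([ACGT]+)", notation)
    return "".join(m * int(n) if m else lit for m, n, lit in parts)

def ce_allele(notation: str, unit: int = 4) -> int:
    """Length-based allele call, as capillary electrophoresis would report it."""
    return len(expand_allele(notation)) // unit

af = ["[AGAA]17AGAG[AGAA]3", "[AGAA]15"]
qc = ["[AGAA]19", "[AGAA]12"]
print([ce_allele(a) for a in af], [ce_allele(a) for a in qc])  # [21, 15] [19, 12]
# CE sees only lengths; MPS additionally shows that the child's 12-repeat
# allele lacks the AF's internal AGAG variant, supporting inheritance from
# [AGAA]15 with loss of three repeats.
```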

Keywords: family trios analysis, forensic casework, ion torrent personal genome machine (PGM), massively parallel sequencing (MPS)

Procedia PDF Downloads 289
20568 Support Vector Machine Based Retinal Therapeutic for Glaucoma Using Machine Learning Algorithm

Authors: P. S. Jagadeesh Kumar, Mingmin Pan, Yang Yung, Tracy Lin Huan

Abstract:

Glaucoma is a group of visual maladies characterized by progressive optic nerve neuropathy, leading to increasing narrowing of the visual field and, ultimately, loss of sight. In this paper, a novel support vector machine based retinal therapeutic for glaucoma using a machine learning algorithm is proposed. The algorithm, built on a correlation clustering mode, performs its computations in multi-dimensional space. Support vector clustering turns out to be comparable to the scale-space approach, which investigates cluster organization by means of a kernel density estimate of the likelihood distribution, where cluster centres are identified by the local maxima of the density. The proposed approach achieves a 91% attainment rate on a data set consisting of 500 realistic images of healthy and glaucomatous retinas; the computational benefit of the cluster overlapping system based on the machine learning algorithm therefore yields strong performance in glaucoma therapeutics.
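The clustering idea sketched above, cluster centres at local maxima of a kernel density estimate followed by a kernel-based classifier, can be illustrated on toy data. The sketch below uses mean shift (which likewise seeks KDE modes) and an RBF-kernel SVM on synthetic 2-D features; it is not the authors' algorithm or their retinal data:

```python
import numpy as np
from sklearn.cluster import MeanShift
from sklearn.svm import SVC

rng = np.random.default_rng(1)
# Toy 2-D stand-in for retinal image features: two density modes,
# labelled "healthy" (0) vs "glaucomatous" (1) purely for illustration.
healthy = rng.normal([0.0, 0.0], 0.5, (100, 2))
glaucoma = rng.normal([3.0, 3.0], 0.5, (100, 2))
X = np.vstack([healthy, glaucoma])
y = np.array([0] * 100 + [1] * 100)

# Mean shift places cluster centres at local maxima of a kernel density
# estimate, mirroring the scale-space view described in the abstract.
ms = MeanShift(bandwidth=1.0).fit(X)
print("density modes found:", len(ms.cluster_centers_))

# An RBF-kernel SVM then separates the two groups.
clf = SVC(kernel="rbf").fit(X, y)
print("training accuracy:", clf.score(X, y))
```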

Keywords: machine learning algorithm, correlation clustering mode, cluster overlapping system, glaucoma, kernel density estimation, retinal therapeutic

Procedia PDF Downloads 230
20567 Optimizing Irrigation Scheduling for Sustainable Agriculture: A Case Study of a Farm in Onitsha, Anambra State, Nigeria

Authors: Ejoh Nonso Francis

Abstract:

Irrigation scheduling is a critical aspect of sustainable agriculture, as it ensures optimal use of water resources, reduces water waste, and enhances crop yields. This paper presents a case study of a farm in Onitsha, Anambra State, Nigeria, where irrigation scheduling was optimized using a combination of soil moisture sensors and weather data. The study aimed to evaluate the effectiveness of this approach in improving water use efficiency and crop productivity. The results showed that the optimized irrigation scheduling approach led to a 30% reduction in water use while increasing crop yield by 20%. The study demonstrates the potential of technology-based irrigation scheduling to enhance sustainable agriculture in Nigeria and beyond.
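A minimal sketch of the decision logic behind sensor-plus-forecast scheduling might look like the following; the moisture threshold and rain cut-off are hypothetical values chosen for illustration, not those used on the Onitsha farm:

```python
def irrigation_needed(soil_moisture_pct: float,
                      forecast_rain_mm: float,
                      threshold_pct: float = 30.0,
                      rain_skip_mm: float = 5.0) -> bool:
    """Irrigate only when soil moisture falls below the crop's threshold
    AND no significant rain is forecast (hypothetical thresholds)."""
    return soil_moisture_pct < threshold_pct and forecast_rain_mm < rain_skip_mm

# Dry soil, no rain forecast -> irrigate; dry soil but rain coming -> wait.
print(irrigation_needed(22.0, 0.0))   # True
print(irrigation_needed(22.0, 12.0))  # False
```

Combining the two data sources in this way is what avoids the overwatering that a moisture threshold alone would trigger just before a rain event.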

Keywords: irrigation scheduling, sustainable agriculture, soil moisture sensors, weather data, water use efficiency, crop productivity, nigeria, onitsha, anambra state, technology-based irrigation scheduling, water resources, environmental degradation, crop water requirements, overwatering, water waste, farming systems, scalability

Procedia PDF Downloads 62
20566 Efficacy of Knowledge Management Practices in Selected Public Libraries in the Province of Kwazulu-Natal, South Africa

Authors: Petros Dlamini, Bethiweli Malambo, Maggie Masenya

Abstract:

Knowledge management practices are very important in public libraries, especially in the era of the information society. The success of public libraries depends on the recognition and application of knowledge management practices. This study investigates the value and challenges of knowledge management practices in public libraries. Three research objectives informed the study: to identify knowledge management practices in public libraries, to understand their value, and to determine the factors hampering them. The study was informed by the interpretivist research paradigm, which is associated with qualitative studies. In that light, the study collected data from eight librarians and/or library heads, who were purposively selected from public libraries. The study adopted a social anthropological approach, which thoroughly evaluated each participant's responses. Data were collected from the respondents through telephonic semi-structured interviews and assessed accordingly. Furthermore, content analysis was used for data interpretation. The chosen data analysis method allowed the study to achieve its main purpose with concrete and valid information. The study's findings showed that all six (100%) selected public libraries apply knowledge management practices, with knowledge sharing as the main practice. It was noted that public libraries employ many practices, but each library employed the practices of its choice depending on its knowledge management structure. The findings further showed that knowledge management practices in public libraries are employed through meetings, training, information sessions, and awareness campaigns, to mention a few, and that these practices make the libraries usable.
Furthermore, it was asserted that knowledge management practices in public libraries meet users’ needs and expectations and equip them with skills. All participating public libraries from Umkhanyakude district municipality valued their knowledge management practices as the pillar and foundation of their services. Noticeably, knowledge management practices improve users’ standard of living and help build an information society. The findings showed that librarians, as qualified personnel, should be responsible for the value of knowledge management practices. The results also showed that 83.35% of public libraries had factors hampering knowledge management practices, including, but not limited to, shortages of funds, resources, and space, and political interference. Several suggestions were made to improve knowledge management practices in public libraries, including improving library budgets, increasing the size of library buildings, and conducting more staff training.

Keywords: knowledge management, knowledge management practices, storage, dissemination

Procedia PDF Downloads 73
20565 Assessment of Routine Health Information System (RHIS) Quality Assurance Practices in Tarkwa Sub-Municipal Health Directorate, Ghana

Authors: Richard Okyere Boadu, Judith Obiri-Yeboah, Kwame Adu Okyere Boadu, Nathan Kumasenu Mensah, Grace Amoh-Agyei

Abstract:

Routine health information system (RHIS) quality assurance has become an important issue, not only because of its significance in promoting a high standard of patient care but also because of its impact on government budgets for the maintenance of health services. A routine health information system comprises healthcare data collection, compilation, storage, analysis, report generation, and dissemination on a routine basis in various healthcare settings. The data from an RHIS give a representation of health status, health services, and health resources, and are normally sourced from individual health records, records of services delivered, and records of health resources. Using reliable information from routine health information systems is fundamental to the healthcare delivery system. Quality assurance practices are measures put in place to ensure that the health data collected meet required quality standards and that the data generated from the system are fit for use. This study considered quality assurance practices in the RHIS processes. Methods: A cross-sectional study was conducted in eight health facilities in the Tarkwa Sub-Municipal Health Service in the Western Region of Ghana. The study examined routine quality assurance practices among 90 health staff and managers, selected from facilities in the Tarkwa Sub-Municipality, who collected or used data routinely from 24 December 2019 to 20 January 2020. Results: Generally, the Tarkwa Sub-Municipal health service appears to practice quality assurance during data collection, compilation, storage, analysis, and dissemination. The results show some achievement in quality-control performance in report dissemination (77.6%), data analysis (68.0%), data compilation (67.4%), report compilation (66.3%), data storage (66.3%), and data collection (61.1%).
Conclusions: Even though the Tarkwa Sub-Municipal Health Directorate engages in some control measures to ensure data quality, the process needs to be strengthened to achieve the targeted performance level (90.0%). There was a significant shortfall in quality assurance performance relative to the expected level, especially during data collection.

Keywords: quality assurance practices, assessment of routine health information system quality, routine health information system, data quality

Procedia PDF Downloads 60
20564 Deep Learning Based, End-to-End Metaphor Detection in Greek with Recurrent and Convolutional Neural Networks

Authors: Konstantinos Perifanos, Eirini Florou, Dionysis Goutsos

Abstract:

This paper presents and benchmarks a number of end-to-end deep learning based models for metaphor detection in Greek. We combine Convolutional Neural Networks and Recurrent Neural Networks with representation learning to bear on the metaphor detection problem for the Greek language. The models presented achieve exceptional accuracy scores, significantly improving on the previous state-of-the-art results, which had already achieved an accuracy of 0.82. Furthermore, no special preprocessing, feature engineering, or linguistic knowledge is used in this work. The methods presented achieve an accuracy of 0.92 and an F-score of 0.92 with Convolutional Neural Networks (CNNs) and bidirectional Long Short-Term Memory networks (LSTMs). Comparable results of 0.91 accuracy and 0.91 F-score are also achieved with bidirectional Gated Recurrent Units (GRUs) and Convolutional Recurrent Neural Networks (CRNNs). The models are trained and evaluated only on the basis of training tuples, the related sentences, and their labels. The outcome is a state-of-the-art collection of metaphor detection models, trained on limited labelled resources, which can be extended to other languages and similar tasks.
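A minimal sketch of the CNN plus bidirectional LSTM architecture family benchmarked here, written in PyTorch; the vocabulary size, embedding width, and hidden sizes are illustrative assumptions, not the paper's hyperparameters:

```python
import torch
import torch.nn as nn

class CNNBiLSTM(nn.Module):
    """Token-level classifier: embeddings -> 1-D convolution -> BiLSTM ->
    per-token metaphorical/literal logits (dimensions are illustrative)."""
    def __init__(self, vocab=5000, emb=64, conv=32, hidden=32):
        super().__init__()
        self.embed = nn.Embedding(vocab, emb)
        self.conv = nn.Conv1d(emb, conv, kernel_size=3, padding=1)
        self.lstm = nn.LSTM(conv, hidden, batch_first=True, bidirectional=True)
        self.out = nn.Linear(2 * hidden, 2)  # metaphorical vs literal

    def forward(self, tokens):                  # tokens: (batch, seq)
        x = self.embed(tokens).transpose(1, 2)  # (batch, emb, seq) for Conv1d
        x = torch.relu(self.conv(x)).transpose(1, 2)
        x, _ = self.lstm(x)                     # (batch, seq, 2*hidden)
        return self.out(x)                      # per-token logits

model = CNNBiLSTM()
logits = model(torch.randint(0, 5000, (4, 12)))
print(logits.shape)  # torch.Size([4, 12, 2])
```

Training on (sentence, label) tuples with cross-entropy over these logits is the end-to-end setup the abstract describes: no feature engineering, only learned representations.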

Keywords: metaphor detection, deep learning, representation learning, embeddings

Procedia PDF Downloads 136
20563 Cone Contrast Sensitivity of Normal Trichromats and Those with Red-Green Dichromacy

Authors: Tatsuya Iizuka, Takushi Kawamorita, Tomoya Handa, Hitoshi Ishikawa

Abstract:

We report normative cone contrast sensitivity values, together with sensitivity and specificity values, for a computer-based color vision test, the cone contrast test-HD (CCT-HD). The participants included 50 phakic eyes with normal color vision (NCV) and 20 dichromatic eyes (ten with protanopia and ten with deuteranopia). The CCT-HD was used to measure L-, M-, and S-CCT-HD scores (color vision deficiency: L-, M-cone logCS ≤ 1.65, S-cone logCS ≤ 0.425) to investigate the sensitivity and specificity of the CCT-HD against anomaloscope-based diagnosis of the anomaly type. The mean ± standard error L-, M-, and S-cone logCS for protanopia were 0.90 ± 0.04, 1.65 ± 0.03, and 0.63 ± 0.02, respectively; for deuteranopia, 1.74 ± 0.03, 1.31 ± 0.03, and 0.61 ± 0.06, respectively; and for age-matched NCV, 1.89 ± 0.04, 1.84 ± 0.04, and 0.60 ± 0.03, respectively, with significant differences for each group except for S-CCT-HD (Bonferroni-corrected α = 0.0167, p < 0.0167). The sensitivity and specificity of the CCT-HD were 100% for protan and deutan in diagnosing the anomaly type from 20 to 64 years of age, but the specificity decreased to 65% for protan and 55% for deutan in persons older than 65. The CCT-HD is comparable to the anomaloscope in diagnostic performance of the anomaly type for the 20-64-year-old age group. However, the results should be interpreted cautiously in those ≥ 65 years, as they are more susceptible to acquired color vision deficiencies due to yellowing of the crystalline lens and other factors.
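The sensitivity and specificity figures reported above follow from the standard contingency-table definitions. A short sketch, with illustrative counts chosen only to echo the reported pattern (perfect separation in younger eyes, reduced specificity in older eyes), not the study's actual data:

```python
def sensitivity_specificity(tp: int, fn: int, tn: int, fp: int):
    """Sensitivity = TP / (TP + FN); specificity = TN / (TN + FP)."""
    return tp / (tp + fn), tn / (tn + fp)

# Hypothetical counts: all 10 dichromats detected in both age bands;
# in the older band, 7 of 20 normal eyes are misclassified as deficient.
sens_young, spec_young = sensitivity_specificity(10, 0, 50, 0)
sens_old, spec_old = sensitivity_specificity(10, 0, 13, 7)
print(spec_young, round(spec_old, 2))  # 1.0 0.65
```

A drop in specificity with unchanged sensitivity, as seen here, is exactly the signature of false positives from acquired (age-related) deficiencies rather than missed congenital cases.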

Keywords: cone contrast test HD, color vision test, congenital color vision deficiency, red-green dichromacy, cone contrast sensitivity

Procedia PDF Downloads 86
20562 Exploring the Applications of Neural Networks in the Adaptive Learning Environment

Authors: Baladitya Swaika, Rahul Khatry

Abstract:

Computer Adaptive Tests (CATs) are one of the most efficient ways of testing the cognitive abilities of students. CATs are based on Item Response Theory (IRT), in which items are selected by maximum information (or by sampling from the posterior) and ability is estimated with maximum-likelihood (ML) or maximum a posteriori (MAP) estimators, respectively. This study aims at combining classical and Bayesian approaches to IRT to create a dataset that is then fed to a neural network, which automates the process of ability estimation, and at comparing the result to traditional CAT models designed using IRT. This study uses Python as the base coding language, PyMC for statistical modelling of the IRT, and scikit-learn for the neural network implementations. On creating the model and comparing, it is found that the neural network based model performs 7-10% worse than the IRT model for score estimation. Although it performs worse than the IRT model, the neural network model can be beneficially used in back-ends to reduce time complexity: the IRT model would have to re-calculate the ability every time it receives a request, whereas the prediction from a neural network is a single step for an already-trained regressor. This study also proposes a new kind of framework whereby the neural network model could incorporate feature sets beyond the normal IRT feature set and use a neural network’s capacity for learning unknown functions to give rise to better CAT models. Categorical features, such as test type, could be learnt and incorporated in IRT functions with the help of techniques like logistic regression, and could be used to learn functions that may not be trivial to express via equations. Such a framework, when implemented, would be highly advantageous in psychometrics and cognitive assessments.
This study gives a brief overview of how neural networks can be used in adaptive testing, not only by reducing time complexity but also by incorporating newer and better datasets, which would eventually lead to higher-quality testing.
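The ability-estimation step that the neural network is trained to approximate can be sketched directly. The following is a minimal grid-search ML estimator under the two-parameter logistic (2PL) IRT model, with invented item parameters; the study's actual models used PyMC and scikit-learn:

```python
import numpy as np

def p_correct(theta, a, b):
    """2PL item response function: P(correct | ability theta),
    with discrimination a and difficulty b."""
    return 1.0 / (1.0 + np.exp(-a * (theta - b)))

def ml_ability(responses, a, b, grid=np.linspace(-4, 4, 801)):
    """Maximum-likelihood ability estimate over a theta grid, i.e. the
    quantity a trained regressor would approximate in a single pass."""
    p = p_correct(grid[:, None], a, b)            # shape (grid, items)
    ll = (responses * np.log(p) + (1 - responses) * np.log(1 - p)).sum(axis=1)
    return grid[np.argmax(ll)]

a = np.array([1.2, 0.8, 1.5, 1.0])   # discriminations (illustrative)
b = np.array([-1.0, 0.0, 0.5, 1.0])  # difficulties (illustrative)
theta_hat = ml_ability(np.array([1, 1, 1, 0]), a, b)
print(round(float(theta_hat), 2))
```

Replacing this per-request maximization with a forward pass through a regressor trained on (response pattern, theta) pairs is the time-complexity trade-off the abstract describes.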

Keywords: computer adaptive tests, item response theory, machine learning, neural networks

Procedia PDF Downloads 163
20561 Fixed-Frequency Pulse Width Modulation-Based Sliding Mode Controller for Switching Multicellular Converter

Authors: Rihab Hamdi, Amel Hadri Hamida, Ouafae Bennis, Fatima Babaa, Sakina Zerouali

Abstract:

This paper features a sliding mode controller (SMC) for closed-loop voltage control of a DC-DC three-cell buck converter connected in parallel, operating in continuous conduction mode (CCM), based on pulse-width modulation (PWM). To maintain a constant switching frequency, the approach incorporates a pulse-width modulator that uses the equivalent control, derived by applying the SM control method, to produce a control signal that is compared with the fixed-frequency carrier within the modulator. Detailed stability and transient performance analyses have been conducted using Lyapunov stability criteria to restrict the switching frequency variation in the face of wide variations in output load, input changes, and set-point changes. The results obtained confirm the effectiveness of the proposed control scheme in achieving an enhanced output transient performance while faithfully realizing its control objective in the event of abrupt and uncertain parameter variations. Simulation studies in the MATLAB/Simulink environment are performed to confirm the idea.
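The fixed-frequency PWM-based SM control idea, deriving an equivalent-control duty command from a sliding surface and comparing it with a fixed-frequency carrier, can be sketched as below. The surface gains, input voltage, and operating point are illustrative assumptions, not the paper's converter parameters:

```python
import numpy as np

def sm_duty(v_ref, v_out, i_l, k1=0.05, k2=0.05, v_in=24.0):
    """Equivalent-control duty cycle from a linear sliding surface
    s = k1*(v_ref - v_out) + k2*i_l (gains are illustrative), clamped to [0, 1]."""
    u_eq = v_out / v_in + k1 * (v_ref - v_out) + k2 * i_l
    return min(max(u_eq, 0.0), 1.0)

def fixed_freq_switch(duty, t, f_sw=20e3):
    """Compare the duty command against a fixed-frequency sawtooth carrier:
    the switch is ON while the carrier is below the command."""
    carrier = (t * f_sw) % 1.0
    return carrier < duty

d = sm_duty(v_ref=12.0, v_out=10.0, i_l=2.0)
t = np.linspace(0.0, 1e-3, 2000)          # 20 switching periods at 20 kHz
on = fixed_freq_switch(d, t)
print(round(d, 3), round(float(on.mean()), 2))  # mean ON-time tracks the duty
```

Because the carrier frequency is fixed, the switching frequency stays constant regardless of the operating point, which is the point of PWM-based SMC over hysteresis-based SMC.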

Keywords: DC-DC converter, pulse width modulation, power electronics, sliding mode control

Procedia PDF Downloads 126
20560 Identification of Cellulose-Hydrolytic Thermophiles Isolated from Sg. Klah Hot Spring Based on 16S rDNA Gene Sequence

Authors: M. J. Norashirene, Y. Zakiah, S. Nurdiana, I. Nur Hilwani, M. H. Siti Khairiyah, M. J. Muhamad Arif

Abstract:

In this study, six bacterial isolates of slightly thermophilic organisms from the Sg. Klah hot spring, Malaysia, were successfully isolated and designated M7T55D1, M7T55D2, M7T55D3, M7T53D1, M7T53D2, and M7T53D3. The bacterial isolates were screened for their cellulose-hydrolytic ability on carboxymethylcellulose agar medium. The isolated bacterial strains were identified morphologically, biochemically, and molecularly with the aid of 16S rDNA sequencing. All of the bacteria showed optimum growth at a slightly alkaline pH of 7.5 and a temperature of 55°C. All strains were Gram-negative, non-spore-forming, strictly aerobic, catalase-positive, and oxidase-positive, with the ability to produce thermostable cellulase. Based on BLASTn results, isolates M7T55D2 and M7T53D1 showed the highest homology (97%) to Tepidimonas ignava, while isolates M7T55D1, M7T55D3, M7T53D2, and M7T53D3 showed their closest homology (97%-98%) to Tepidimonas thermarum. These cellulolytic thermophiles may have commercial potential for producing valuable thermostable cellulase.

Keywords: cellulase, cellulolytic, thermophiles, 16S rDNA gene

Procedia PDF Downloads 331
20559 Chinese Students’ Use of Corpus Tools in an English for Academic Purposes Writing Course: Influence on Learning Behaviour, Performance Outcomes and Perceptions

Authors: Jingwen Ou

Abstract:

Writing for academic purposes in a second or foreign language poses a significant challenge for non-native speakers, particularly at the tertiary level, where English academic writing for L2 students is often hindered by difficulties with academic discourse, including vocabulary, academic register, and organization. The past two decades have witnessed a rising popularity of the data-driven learning (DDL) approach in EAP writing instruction. In light of this trend, this study aims to enhance the integration of DDL into English for academic purposes (EAP) writing classrooms by investigating the perceptions of Chinese college students regarding the use of corpus tools for improving EAP writing. Additionally, the research explores their corpus consultation behaviours during training to provide insights into corpus-assisted EAP instruction for DDL practitioners. The study investigates Chinese university students’ use of corpus tools with three main foci: 1) the influence of corpus tools on learning behaviours, 2) the influence of corpus tools on students’ academic writing performance outcomes, and 3) students’ perceptions of, and potential perceptional changes towards, the use of such tools. Three corpus tools, CQPWeb, Sketch Engine, and LancsBox X, are selected for investigation due to the scarcity of empirical research on patterns of learners’ engagement with a combination of multiple corpora. The research adopts a pre-test/post-test design for the evaluation of students’ academic writing performance before and after the intervention. Twenty participants will be divided into two groups: an intervention and a non-intervention group. Three corpus training workshops will be delivered at the beginning, middle, and end of a semester.
An online survey and three separate focus group interviews are designed to investigate students’ perceptions of the use of corpus tools for improving academic writing skills, particularly the rhetorical functions in different essay sections. Insights from students’ consultation sessions indicated difficulties with DDL practice, including insufficiency of time to complete all tasks, struggle with technical set-up, unfamiliarity with the DDL approach and difficulty with some advanced corpus functions. Findings from the main study aim to provide pedagogical insights and training resources for EAP practitioners and learners.

Keywords: corpus linguistics, data-driven learning, English for academic purposes, tertiary education in China

Procedia PDF Downloads 40
20558 Side Effects of Dental Whitening: Published Data from the Literature

Authors: Ilma Robo, Saimir Heta, Emela Dalloshi, Nevila Alliu, Vera Ostreni

Abstract:

The dental whitening process, beyond being a minimally invasive dental treatment, has effects on the dental structure and on the pulp of the tooth where it is applied. An electronic search was performed using keywords to find articles published within the last 10 years on the side effects of minimally invasive dental bleaching treatment. Methodology: In the selected articles, a further aim of the study was to evaluate the side effects of bleaching according to the percentage and type of solution used, the latter evaluated on the basis of the base solution used for bleaching. Results: The side effects of bleaching are evaluated in the selected articles depending on the method of application, that is, whether bleaching is carried out with recommended solutions or with mixtures of alternative solutions or substances based on information from the Internet. Conclusion: The dental bleaching process has side effects that have not yet been definitively evaluated experimentally in large samples of individuals or animals (mice or cattle), so accurate numerical conclusions cannot yet be drawn. The trend of publications on this topic has been increasing in recent years, as the demand for aesthetic facial treatments, including dental ones, increases.

Keywords: teeth whitening, side effects, permanent teeth, formed dental apex

Procedia PDF Downloads 44
20557 Intensifying Approach for Separation of Bio-Butanol Using Ionic Liquid as Green Solvent: Moving Towards Sustainable Biorefinery

Authors: Kailas L. Wasewar

Abstract:

Biobutanol has been considered a potential alternative biofuel relative to the more popular biodiesel and bioethanol. End-product toxicity is the major problem in the commercialization of fermentation-based processes; it can be reduced to some extent by removing biobutanol simultaneously. Several techniques have been investigated for removing butanol from fermentation broth, such as stripping, adsorption, liquid-liquid extraction, pervaporation, and membrane solvent extraction. Liquid-liquid extraction can be performed with high selectivity and can be carried out inside the fermenter. Conventional solvents have drawbacks including toxicity, solvent loss, and high cost; hence, alternative solvents must be explored. Room temperature ionic liquids (RTILs), composed entirely of ions, are liquid at room temperature and have negligible vapor pressure, non-flammability, and physicochemical properties tunable for a particular application, which terms them "designer solvents". Ionic liquids (ILs) have recently gained much attention as alternatives to organic solvents in many processes, in particular as alternative solvents for liquid-liquid extraction. Their negligible vapor pressure allows the extracted products to be separated from ILs by conventional low-pressure distillation, with the potential for saving energy. Morpholinium-, imidazolium-, ammonium-, and phosphonium-based ionic liquids, among others, have been employed for the separation of biobutanol. In the present chapter, the basic concepts of ionic liquids and their application in separation are presented. Further, types of ionic liquids, including conventional, functionalized, polymeric, supported-membrane, and other ionic liquids, are explored. The effects of various performance parameters on the separation of biobutanol by ionic liquids are also discussed and compared for different cation- and anion-based ionic liquids.
The typical methodology for investigation has been to contact equal amounts of the biobutanol solution and the ionic liquid for a specific time, say 30 minutes, to ensure equilibrium. The biobutanol phase is then analyzed using GC to determine the biobutanol concentration, and a material balance is used to find the concentration in the ionic liquid phase.
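The material-balance step described above can be sketched as follows; the initial and equilibrium concentrations are hypothetical GC readings used only to illustrate how the distribution coefficient and extraction efficiency are obtained:

```python
def distribution_coefficient(c0_aq, c_aq_eq, v_aq=1.0, v_il=1.0):
    """Distribution coefficient K_D = C_IL / C_aq at equilibrium, with the
    IL-phase concentration obtained by material balance (equal volumes here)."""
    c_il = (c0_aq * v_aq - c_aq_eq * v_aq) / v_il
    return c_il / c_aq_eq

def extraction_efficiency(c0_aq, c_aq_eq):
    """Fraction of biobutanol transferred into the ionic-liquid phase."""
    return 1.0 - c_aq_eq / c0_aq

# Hypothetical readings: 50 g/L biobutanol initially, 15 g/L remaining
# in the aqueous phase after 30 minutes of contact with the IL.
kd = distribution_coefficient(50.0, 15.0)
print(round(kd, 2), round(extraction_efficiency(50.0, 15.0), 2))  # 2.33 0.7
```

Repeating this calculation across cations, anions, and operating conditions is how the performance comparisons discussed in the chapter are made.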

Keywords: biobutanol, separation, ionic liquids, sustainability, biorefinery, waste biomass

Procedia PDF Downloads 72
20556 A Fuzzy TOPSIS Based Model for Safety Risk Assessment of Operational Flight Data

Authors: N. Borjalilu, P. Rabiei, A. Enjoo

Abstract:

A Flight Data Monitoring (FDM) program assists an operator in the aviation industry to identify, quantify, assess, and address operational safety risks, in order to improve the safety of flight operations. FDM is a powerful tool for an aircraft operator when integrated into the operator’s Safety Management System (SMS), allowing it to detect, confirm, and assess safety issues and to check the effectiveness of corrective actions associated with human errors. This article proposes a model for assessing the safety risk level of flight data across different event categories based on fuzzy set values. It permits evaluation of the operational safety level from the point of view of flight activities. The main advantage of this method is the proposed qualitative safety analysis of flight data. This research applies the opinions of aviation experts, gathered through a number of questionnaires related to flight data, in four categories of occurrences that can take place during an accident or an incident: Runway Excursions (RE), Controlled Flight Into Terrain (CFIT), Mid-Air Collision (MAC), and Loss of Control in Flight (LOC-I). By weighting each category (by F-TOPSIS) and applying the weight to the number of risks of the event, the safety risk of each related event can be obtained.
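A compact sketch of the TOPSIS closeness computation after centroid defuzzification of triangular expert ratings is given below; the ratings, weights, and criteria are invented for illustration and are not the questionnaire data used in the study:

```python
import numpy as np

def defuzzify(tri):
    """Centroid defuzzification of triangular fuzzy numbers (l, m, u)."""
    return tri.mean(axis=-1)

def topsis(scores, weights):
    """TOPSIS closeness to the ideal solution on defuzzified ratings
    (alternatives in rows, criteria in columns, all benefit-type here)."""
    norm = scores / np.sqrt((scores ** 2).sum(axis=0))  # vector normalization
    v = norm * weights
    ideal, anti = v.max(axis=0), v.min(axis=0)
    d_pos = np.sqrt(((v - ideal) ** 2).sum(axis=1))
    d_neg = np.sqrt(((v - anti) ** 2).sum(axis=1))
    return d_neg / (d_pos + d_neg)

# Hypothetical expert ratings (l, m, u) for the four event categories
# (RE, CFIT, MAC, LOC-I) on two criteria, e.g. severity and likelihood.
fuzzy = np.array([
    [[5, 7, 9], [3, 5, 7]],   # RE
    [[7, 9, 9], [1, 3, 5]],   # CFIT
    [[7, 9, 9], [1, 1, 3]],   # MAC
    [[5, 7, 9], [5, 7, 9]],   # LOC-I
])
closeness = topsis(defuzzify(fuzzy), weights=np.array([0.6, 0.4]))
ranking = np.argsort(-closeness)  # highest-risk category first
print(closeness.round(3), ranking)
```

With these invented ratings, the category rated high on both criteria (LOC-I) comes out closest to the ideal, i.e. highest risk; a fully fuzzy TOPSIS would instead keep the triangular numbers through the distance step.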

Keywords: F-TOPSIS, fuzzy set, flight data monitoring (FDM), flight safety

Procedia PDF Downloads 151