Search results for: data sensitivity
24586 The Twin Terminal of Pedestrian Trajectory Based on City Intelligent Model (CIM) 4.0
Authors: Chen Xi, Liu Xuebing, Lao Xueru, Kuan Sinman, Jiang Yike, Wang Hanwei, Yang Xiaolang, Zhou Junjie, Xie Jinpeng
Abstract:
To further promote the development of smart cities, the microscopic "nerve endings" of the City Intelligent Model (CIM) are extended to become more sensitive. In this paper, we develop a pedestrian trajectory twin terminal based on CIM and CNN technology. The terminal combines 5G networks, architectural and geoinformatics technologies, and convolutional neural networks with deep learning models for human behavior recognition to provide empirical data such as pedestrian flow and human behavioral characteristics, and ultimately to form spatial performance evaluation criteria and a spatial performance warning system, making the empirical data accurate and intelligent for prediction and decision making.
Keywords: urban planning, urban governance, CIM, artificial intelligence, sustainable development
Procedia PDF Downloads 419
24585 Using Tilted Façade to Reduce Thermal Discomfort in a UK Passivhaus Dwelling for a Warming Climate
Authors: Yahya Lavafpour, Steve Sharples
Abstract:
This study investigated the potential negative impacts of future UK climate change on dwellings. In particular, the risk of overheating was considered for a Passivhaus dwelling in London. The study used dynamic simulation modelling software to investigate the potential use of building geometry to control current and future overheating risks in the dwelling for the London climate. Specifically, the focus was on the optimum inclination of a south façade, making use of the building's shape to protect itself. A range of different inclined façades was examined to test their effectiveness in reducing the overheating risk. The research found that implementing a 115° tilted façade could completely eliminate the risk of overheating in the current climate, but with some consequences for natural ventilation and daylighting. Future overheating was significantly reduced by the tilted façade; however, geometric considerations could not completely eradicate the risk of overheating, particularly by the 2080s. The study also used CFD modelling and sensitivity analysis to investigate the effect of the façade geometry on the wind pressure distributions on and around the building surface. This was done to assess natural ventilation flows for alternative façade inclinations.
Keywords: climate change, tilt façade, thermal comfort, passivhaus, overheating
Procedia PDF Downloads 763
24584 An Extended Inverse Pareto Distribution, with Applications
Authors: Abdel Hadi Ebraheim
Abstract:
This paper introduces a new extension of the Inverse Pareto distribution in the framework of the Marshall-Olkin (1997) family of distributions. This model is capable of modeling various shapes of aging and failure data. The statistical properties of the new model are discussed, and several methods are used to estimate the parameters involved. Explicit expressions are derived for the different types of moments of value in reliability analysis. Besides, the order statistics of samples from the newly proposed model are studied. Finally, the usefulness of the new model for modeling reliability data is illustrated using two real data sets alongside a simulation study.
Keywords: Pareto distribution, Marshall-Olkin, reliability, hazard functions, moments, estimation
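As a worked illustration of the construction named in this abstract, the Marshall-Olkin (1997) transform applied to an inverse Pareto baseline can be written as below; the parameterization shown is one common convention, not necessarily the paper's exact one:

```latex
% Marshall-Olkin transform of a baseline survival function \bar{F}(x) = 1 - F(x)
\bar{G}(x;\alpha) \;=\; \frac{\alpha\,\bar{F}(x)}{1-(1-\alpha)\,\bar{F}(x)}, \qquad \alpha > 0,
% with an inverse Pareto baseline CDF (one common parameterization):
F(x;\tau,\theta) \;=\; \left(\frac{x}{x+\theta}\right)^{\!\tau}, \qquad x>0,\ \tau,\theta>0.
```

Setting α = 1 recovers the baseline distribution, which is why the extra parameter lets the family capture a wider range of hazard shapes.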
Procedia PDF Downloads 82
24583 Dynamics of Bacterial Contamination and Oral Health Risks Associated with Currency Notes and Coins Circulating in Kampala City
Authors: Abdul Walusansa
Abstract:
In this study, paper notes and coins were collected from the general public in Kampala City at locations where ready-to-eat food is served, in order to survey them for bacterial contamination. The total bacterial counts and the loading of potentially pathogenic organisms on the currency were tested, and all isolated potential pathogens were tested for resistance against four of the most commonly prescribed antibiotics. 1. Bacterial counts on one hundred paper note samples ranged between 6 and 10,918 cm⁻², with a median of 141 cm⁻²; according to these data, contamination was much higher than on credit cards or Australian notes, which are made of polymer. Bacterial counts on sixty coin samples ranged between 2 and 380 cm⁻², much less than on paper notes. 2. Coliforms (65.6%), E. coli (45.9%), S. aureus (41.7%), B. cereus (67.7%), and Salmonella (19.8%) were isolated from the one hundred paper notes, while coliforms (22.4%), E. coli (5.2%), S. aureus (24.1%), B. cereus (34.5%), and Salmonella (10.3%) were isolated from the sixty coin samples. These results suggest a higher rate of contamination with potential pathogens on paper notes than on coins. 3. Antibiotic resistance was common in most of the pathogens isolated from currency. Ampicillin resistance was found in 60% of the Staphylococcus aureus isolates, as well as in 76.6% of E. coli and 40% of Salmonella. Erythromycin resistance was detected in 56.6% of S. aureus and in 80.0% of E. coli. All the pathogens isolated were sensitive to Norfloxacin; Salmonella and S. aureus were also sensitive to Cefaclor. This paper also studied the antimicrobial capability of metal coins: coins collected from different countries were tested for the ability to inhibit the growth of E. sakazakii, S. aureus, E. coli, L. monocytogenes, and S. typhimurium. 1) E. sakazakii appeared very sensitive to metal coins, followed by S. aureus, whereas E. coli, L. monocytogenes, and S. typhimurium were more resistant to the metal coin samples. 2) Coins made of nickel-brass alloy and copper-nickel alloy showed a better antimicrobial effect than other metal coins, especially in inhibiting the growth of E. sakazakii and S. aureus; all the inhibition zones produced on nutrient agar were more than 20.6 mm. Aluminium-bronze alloy showed weak antimicrobial activity against S. aureus and no effect on the other pathogens, and coins made of stainless steel also could not inhibit bacterial growth. 3) Surprisingly, one-cent coins of the USA, which are made of 97.5% zinc and 2.5% copper, showed a significant antimicrobial capability; the average inhibition zone across these five pathogens was 45.5 mm.
Keywords: antibiotic sensitivity, bacteria, currency, coins, parasites
Procedia PDF Downloads 329
24582 Potential Determinants of Research Output: Comparing Economics and Business
Authors: Osiris Jorge Parcero, Néstor Gandelman, Flavia Roldán, Josef Montag
Abstract:
This paper uses cross-country unbalanced panel data covering up to 146 countries over the period 1996 to 2015 and is the first study to identify potential determinants of a country's relative research output in Economics versus Business. More generally, it is also one of the first studies comparing Economics and Business. The results show that better policy-related data availability, higher income inequality, and lower ethnic fractionalization relatively favor economics. The findings are robust to two alternative fixed effects specifications, three alternative definitions of economics and business, two alternative measures of research output (publications and citations), and the inclusion of meaningful control variables. To the best of our knowledge, our paper is also the first to demonstrate the importance of policy-related data as a driver of economic research. Our regressions show that the availability of this type of data is the single most important factor associated with the prevalence of economics over business as a research domain. Thus, our work has policy implications, as the availability of policy-related data is partially under policy control. Moreover, it has implications for students, professionals, universities, university departments, and research-funding agencies that face choices between profiles oriented toward economics and those oriented toward business. Finally, the conclusions show potential lines for further research.
Keywords: research output, publication performance, bibliometrics, economics, business, policy-related data
Procedia PDF Downloads 134
24581 Assessment of Routine Health Information System (RHIS) Quality Assurance Practices in Tarkwa Sub-Municipal Health Directorate, Ghana
Authors: Richard Okyere Boadu, Judith Obiri-Yeboah, Kwame Adu Okyere Boadu, Nathan Kumasenu Mensah, Grace Amoh-Agyei
Abstract:
Routine health information system (RHIS) quality assurance has become an important issue, not only because of its significance in promoting a high standard of patient care but also because of its impact on government budgets for the maintenance of health services. A routine health information system comprises healthcare data collection, compilation, storage, analysis, report generation, and dissemination on a routine basis in various healthcare settings. The data from RHIS give a representation of health status, health services, and health resources, with sources normally being individual health records, records of services delivered, and records of health resources. Using reliable information from routine health information systems is fundamental in the healthcare delivery system. Quality assurance practices are measures put in place to ensure that the health data collected meet required quality standards and that the data generated from the system are fit for use. This study considered quality assurance practices in RHIS processes. Methods: A cross-sectional study was conducted in eight health facilities in the Tarkwa Sub-Municipal Health Service in the Western Region of Ghana. The study examined routine quality assurance practices among 90 health staff and management members, selected from facilities in the Tarkwa Sub-Municipality, who collected or used data routinely, from 24th December 2019 to 20th January 2020. Results: Generally, the Tarkwa Sub-Municipal health service appears to practice quality assurance during data collection, compilation, storage, analysis, and dissemination. The results show some achievement in quality control performance in report dissemination (77.6%), data analysis (68.0%), data compilation (67.4%), report compilation (66.3%), data storage (66.3%), and collection (61.1%). Conclusions: Even though the Tarkwa Sub-Municipal Health Directorate engages in some control measures to ensure data quality, there is a need to strengthen the process to achieve the targeted performance level (90.0%). There was a significant shortfall in quality assurance performance, especially during data collection, with respect to the expected performance.
Keywords: quality assurance practices, assessment of routine health information system quality, routine health information system, data quality
Procedia PDF Downloads 79
24580 SPR Immunosensor for the Detection of Staphylococcus aureus
Authors: Muhammad Ali Syed, Arshad Saleem Bhatti, Chen-zhong Li, Habib Ali Bokhari
Abstract:
Surface plasmon resonance (SPR) biosensors have emerged as a promising technique for bioanalysis as well as microbial detection and identification. Real-time, sensitive, cost-effective, and label-free detection of biomolecules from complex samples is required for early and accurate diagnosis of infectious diseases. Like many other optical techniques, SPR biosensors may also be successfully utilized for microbial detection with accurate, point-of-care, and rapid results. In the present study, we utilized a commercially available automated SPR biosensor (BI) to study microbial detection from water samples spiked with different concentrations of Staphylococcus aureus cells. The gold thin-film sensor surface was functionalized with proteins such as protein G, which was used for directed immobilization of monoclonal antibodies against Staphylococcus aureus. The results of our work reveal that this immunosensor can detect a very small number of bacterial cells with high sensitivity and specificity; in our case, 10³ cells/ml of water were successfully detected. Therefore, it may be concluded that this technique has strong potential for use in microbial detection and identification.
Keywords: surface plasmon resonance (SPR), Staphylococcus aureus, biosensors, microbial detection
Procedia PDF Downloads 475
24579 Heart Failure Identification and Progression by Classifying Cardiac Patients
Authors: Muhammad Saqlain, Nazar Abbas Saqib, Muazzam A. Khan
Abstract:
Heart failure (HF) has become a major health problem in our society. The prevalence of HF increases with patient age, and it is a major cause of the high mortality rate in adults. Successful identification and progression tracking of HF can help reduce the individual and social burden of this syndrome. In this study, we use a real data set of cardiac patients to propose a classification model for the identification and progression of HF. The data set was divided into three age groups, namely young, adult, and old, and each age group was further classified into four classes according to the patient's current physical condition. Contemporary data mining classification algorithms were applied to each individual class of every age group to identify HF. The decision tree (DT) gave the highest accuracy, 90%, and outperformed all other algorithms. Our model accurately diagnoses different stages of HF for each age group and can be very useful for the early prediction of HF.
Keywords: decision tree, heart failure, data mining, classification model
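To make the described workflow concrete, here is a minimal scikit-learn sketch of age-stratified decision-tree classification; this is not the authors' code, and the file name, column names, and age cut points are hypothetical:

```python
import pandas as pd
from sklearn.model_selection import cross_val_score
from sklearn.tree import DecisionTreeClassifier

# Hypothetical cardiac dataset: age, clinical features, and an HF stage label.
df = pd.read_csv("cardiac_patients.csv")  # assumed file layout
df["age_group"] = pd.cut(df["age"], bins=[0, 40, 60, 120],
                         labels=["young", "adult", "old"])

for group, subset in df.groupby("age_group", observed=True):
    X = subset.drop(columns=["age", "age_group", "hf_stage"])
    y = subset["hf_stage"]  # four condition classes per the abstract
    tree = DecisionTreeClassifier(random_state=0)
    scores = cross_val_score(tree, X, y, cv=5)  # per-group accuracy estimate
    print(f"{group}: mean accuracy {scores.mean():.2f}")
```

Fitting one tree per age group, as sketched here, lets each model pick age-appropriate decision thresholds instead of forcing a single global split.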
Procedia PDF Downloads 402
24578 Critically Analyzing the Application of Big Data for Smart Transportation: A Case Study of Mumbai
Authors: Tanuj Joshi
Abstract:
Smart transportation is fast emerging as a solution to modern cities' mobility issues, delayed emergency response rates, and high congestion on streets. The present-day scenario with Google Maps, Waze, Yelp, etc. demonstrates how information and communications technologies control the intelligent transportation system. This intangible and invisible infrastructure is largely guided by big data analytics. On the other side, the exponential increase in India's urban population has intensified the demand for better services and infrastructure to satisfy the transportation needs of its citizens. No doubt, India's huge internet usage is seen as an important resource for achieving this. However, with a projected number of over 40 billion objects connected to the Internet by 2025, the need for systems to handle massive volumes of data (big data) also arises. This research paper attempts to identify ways of exploiting big data variables that will aid commuters on Indian tracks. The study explores real-life inputs by conducting surveys and interviews to identify which gaps need to be targeted to better satisfy customers. Several experts at the Mumbai Metropolitan Region Development Authority (MMRDA), Mumbai Metro, and Brihanmumbai Electric Supply and Transport (BEST) were interviewed regarding the information technology (IT) systems currently in use. The interviews give relevant insights into, and the requirements of, the workings of public transportation systems, whereas the survey investigates the macro situation.
Keywords: smart transportation, mobility issue, Mumbai transportation, big data, data analysis
Procedia PDF Downloads 178
24577 End-to-End Pyramid Based Method for Magnetic Resonance Imaging Reconstruction
Authors: Omer Cahana, Ofer Levi, Maya Herman
Abstract:
Magnetic Resonance Imaging (MRI) is a lengthy medical scan, owing to a long acquisition time. Its length is mainly due to the traditional sampling theorem, which defines a lower boundary for sampling. However, it is still possible to accelerate the scan by using a different approach, such as Compressed Sensing (CS) or Parallel Imaging (PI). These two complementary methods can be combined to achieve a faster scan with high-fidelity imaging. To achieve that, two conditions must be satisfied: i) the signal must be sparse under a known transform domain, and ii) the sampling method must be incoherent. In addition, a nonlinear reconstruction algorithm must be applied to recover the signal. While the rapid advances in Deep Learning (DL) have had tremendous successes in various computer vision tasks, the field of MRI reconstruction is still in its early stages. In this paper, we present an end-to-end method for MRI reconstruction from k-space to image. Our method contains two parts. The first is sensitivity map estimation (SME), a small yet effective network that can easily be extended to a variable number of coils. The second is reconstruction, a top-down architecture with lateral connections developed for building high-level refinement at all scales. Our method holds the state of the art on the fastMRI benchmark, the largest and most diverse benchmark for MRI reconstruction.
Keywords: magnetic resonance imaging, image reconstruction, pyramid network, deep learning
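For orientation, the two conditions above refer to the standard CS-MRI recovery problem, commonly posed as the following optimization (a generic formulation, not the paper's specific training objective), where A is the coil-weighted subsampled Fourier operator, y the measured k-space data, and Ψ a sparsifying transform:

```latex
\hat{x} \;=\; \arg\min_{x}\; \tfrac{1}{2}\,\lVert \mathcal{A}x - y \rVert_2^2 \;+\; \lambda\,\lVert \Psi x \rVert_1
```

Deep reconstruction networks such as the one described can be read as learned, unrolled solvers of this kind of problem, with the hand-chosen Ψ replaced by learned image priors.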
Procedia PDF Downloads 91
24576 Scientific Linux Cluster for BIG-DATA Analysis (SLBD): A Case of Fayoum University
Authors: Hassan S. Hussein, Rania A. Abul Seoud, Amr M. Refaat
Abstract:
Scientific researchers face challenges in the analysis of very large data sets, which are growing at a noticeable rate with today's and tomorrow's technologies. Hadoop and Spark are software frameworks developed for this purpose; the Hadoop framework is suitable for many different hardware platforms. In this research, a Scientific Linux cluster for Big Data analysis (SLBD) is presented. SLBD runs open source software with large computational capacity on a high-performance cluster infrastructure. SLBD is composed of one cluster containing identical, commodity-grade computers interconnected via a small LAN. SLBD consists of a fast switch and Gigabit-Ethernet cards connecting four nodes. Cloudera Manager is used to configure and manage an Apache Hadoop stack. Hadoop is a framework that allows storing and processing big data across the cluster by using the MapReduce paradigm: MapReduce divides a task into smaller tasks, which are assigned to the network nodes, then collects the results and forms the final result data set. The SLBD clustering system allows fast and efficient processing of large amounts of data resulting from different applications. SLBD also provides high performance, high throughput, high availability, expandability, and cluster scalability.
Keywords: big data platforms, cloudera manager, Hadoop, MapReduce
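As a generic illustration of the MapReduce pattern just described (not the SLBD code), here is a minimal Hadoop Streaming word count in Python; the launch command in the comment is indicative and depends on the local Hadoop installation:

```python
#!/usr/bin/env python3
# Minimal Hadoop Streaming word count. Indicative launch (paths vary):
#   hadoop jar hadoop-streaming.jar -mapper "python3 wc.py map" \
#       -reducer "python3 wc.py reduce" -input in/ -output out/
import sys

def mapper():
    # Map step: emit (word, 1) for every word on stdin.
    for line in sys.stdin:
        for word in line.split():
            print(f"{word}\t1")

def reducer():
    # Reduce step: Hadoop sorts mapper output by key, so equal words
    # arrive consecutively and can be summed with a running counter.
    current, count = None, 0
    for line in sys.stdin:
        word, n = line.rstrip("\n").split("\t")
        if word != current:
            if current is not None:
                print(f"{current}\t{count}")
            current, count = word, 0
        count += int(n)
    if current is not None:
        print(f"{current}\t{count}")

if __name__ == "__main__":
    mapper() if sys.argv[1] == "map" else reducer()
```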
Procedia PDF Downloads 359
24575 Investigating the Effects of Data Transformations on a Bi-Dimensional Chi-Square Test
Authors: Alexandru George Vaduva, Adriana Vlad, Bogdan Badea
Abstract:
In this research, we conduct a Monte Carlo analysis of a two-dimensional χ² test, which is used to determine the minimum distance required for independent sampling in the context of chaotic signals. We investigate the impact on the χ² test of transforming initial data sets from any probability distribution into new signals with a uniform distribution, using the Spearman rank correlation. This transformation removes the randomness of the data pairs, and as a result, the observed distribution of χ² test values differs from the expected distribution. We propose a solution to this problem and evaluate it using another chaotic signal.
Keywords: chaotic signals, logistic map, Pearson's test, chi-square test, bivariate distribution, statistical independence
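A minimal sketch of the kind of procedure described, assuming a logistic-map signal, a rank transform to uniform marginals, and a k×k contingency table for the bivariate χ² test; the lag and bin counts here are illustrative, not the paper's settings:

```python
import numpy as np
from scipy import stats

# Chaotic logistic-map signal: x_{n+1} = 4 x_n (1 - x_n).
x = np.empty(5000)
x[0] = 0.3141
for i in range(1, x.size):
    x[i] = 4.0 * x[i - 1] * (1.0 - x[i - 1])

# Rank-transform to (approximately) uniform marginals.
u = stats.rankdata(x) / (x.size + 1)

# Pair samples at lag d, bin into a k x k contingency table,
# then run the bivariate chi-square test of independence.
d, k = 30, 10
a, b = u[:-d], u[d:]
table, _, _ = np.histogram2d(a, b, bins=k, range=[[0, 1], [0, 1]])
chi2, p, dof, _ = stats.chi2_contingency(table)
print(f"chi2={chi2:.1f}, dof={dof}, p={p:.3f}")
```

Sweeping the lag d and watching where the test stops rejecting independence is one way to read off the minimum sampling distance the abstract refers to.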
Procedia PDF Downloads 97
24574 Real Time Data Communication with FlightGear Using Simulink Over a UDP Protocol
Authors: Adil Loya, Ali Haider, Arslan A. Ghaffor, Abubaker Siddique
Abstract:
Simulation and modelling of Unmanned Aerial Vehicles (UAVs) has gained wide popularity within the aerospace community. The demand for designing and modelling optimized control systems for UAVs has increased tenfold over the last decade, because next-generation warfare is dependent on unmanned technologies. Therefore, this research focuses on the simulation of nonlinear UAV dynamics in Simulink and its integration with FlightGear. There has been much research on implementing optimized control using Simulink; however, fewer techniques are known for simulating these dynamics over FlightGear, and the tedious problem of acquiring data is tackled in this research. Sending data to FlightGear is easy, but receiving it back into Simulink is not straightforward, i.e., we can only receive control data on the output. In this research, however, we have managed to get the data out from FlightGear by implementing a Level-2 S-function block within Simulink. Moreover, the results captured from FlightGear over a User Datagram Protocol (UDP) communication are compared with the attitude signals that were sent previously. This provides useful information regarding the difference between the outputs obtained from Simulink and FlightGear. It was found that the values received in Simulink were in high agreement with the FlightGear output. The complete study was conducted in a discrete framework.
Keywords: aerospace, flight control, flightgear, communication, Simulink
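As a rough stand-in for the UDP exchange described (FlightGear's actual generic-protocol packet layout is configured via an XML protocol file and is not reproduced here), a minimal Python sketch with hypothetical ports and a simple packed attitude payload:

```python
import socket
import struct

# Hypothetical ports; the real layout depends on the FlightGear
# generic-protocol XML configuration, which is assumed, not shown.
SEND_ADDR = ("127.0.0.1", 5500)   # attitude data out (to the simulator)
RECV_ADDR = ("127.0.0.1", 5501)   # telemetry back in

out_sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
in_sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
in_sock.bind(RECV_ADDR)
in_sock.settimeout(1.0)

roll, pitch, yaw = 0.05, -0.02, 1.57  # example attitude, radians
out_sock.sendto(struct.pack("!ddd", roll, pitch, yaw), SEND_ADDR)

try:
    payload, _ = in_sock.recvfrom(1024)
    echoed = struct.unpack("!ddd", payload[:24])  # three big-endian doubles
    print("received attitude:", echoed)
except socket.timeout:
    print("no packet received (is the simulator running?)")
```

Comparing the sent and received triples, as the abstract does with the Level-2 S-function, quantifies the round-trip fidelity of the link.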
Procedia PDF Downloads 286
24573 An Impact of Stock Price Movements on Cross Listed Companies: A Study of Indian ADR and Domestic Stock Prices
Authors: Kanhaiya Singh
Abstract:
The Indian corporate sector has been raising resources through various international financial instruments, important among them being Global Depository Receipts (GDRs) and American Depository Receipts (ADRs). The purposes of raising resources through such instruments are manifold: a lower cost of capital, increased visibility of the company, a liberal tax environment, increased trading liquidity, etc. One significant reason is also the value addition to the company in terms of market capitalization. Obviously, the stocks of such companies are cross listed, one in India and the other on an international stock exchange. Whether the sensitivity and movements of stock prices on one exchange have any relationship with the price movements of the particular scrip on the other is an issue for study. With this in view, this study attempts to identify the extent of the impact of price movements of a scrip on one stock exchange on account of changes in its price on the counter stock exchange, and to find out the differences between the pre- and post-cross-listing behavior of domestic firms. The study also analyses the impact of exchange rate movements on stock prices.
Keywords: ADR, GDR, cross listing, liquidity, exchange rate
Procedia PDF Downloads 381
24572 Open Source, Open Hardware Ground Truth for Visual Odometry and Simultaneous Localization and Mapping Applications
Authors: Janusz Bedkowski, Grzegorz Kisala, Michal Wlasiuk, Piotr Pokorski
Abstract:
Ground-truth data is essential for the quantitative evaluation of VO (Visual Odometry) and SLAM (Simultaneous Localization and Mapping), using, e.g., ATE (Absolute Trajectory Error) and RPE (Relative Pose Error). Many open-access data sets provide raw and ground-truth data for benchmark purposes. The issue appears when one would like to validate Visual Odometry and/or SLAM approaches on data captured using the device for which the algorithm is targeted, for example a mobile phone, and to disseminate the data to other researchers. For this reason, we propose an open source, open hardware ground-truth system that provides an accurate and precise trajectory with a 3D point cloud. It is based on a Livox Mid-360 LiDAR with a non-repetitive scanning pattern, an on-board Raspberry Pi 4B computer, a battery, and software for off-line calculations (camera-to-LiDAR calibration, LiDAR odometry, SLAM, georeferencing). We show how this system can be used for the evaluation of various state-of-the-art algorithms (Stella SLAM, ORB SLAM3, DSO) in typical indoor monocular VO/SLAM.
Keywords: SLAM, ground truth, navigation, LiDAR, visual odometry, mapping
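For reference, a minimal sketch of the ATE metric mentioned above, assuming time-synchronized, pre-aligned trajectories (in practice a similarity alignment such as Umeyama's method is applied first); the toy data is synthetic:

```python
import numpy as np

def ate_rmse(gt_xyz: np.ndarray, est_xyz: np.ndarray) -> float:
    """Absolute Trajectory Error: RMSE over translational residuals.

    Assumes both (N, 3) trajectories are time-synchronized and already
    expressed in the same frame.
    """
    residuals = gt_xyz - est_xyz
    return float(np.sqrt((residuals ** 2).sum(axis=1).mean()))

# Toy example: a circular path and an estimate with accumulating drift.
t = np.linspace(0, 10, 200)
gt = np.stack([np.cos(t), np.sin(t), 0.1 * t], axis=1)
drift = np.cumsum(np.random.default_rng(0).normal(size=gt.shape), axis=0)
est = gt + 0.01 * drift
print(f"ATE RMSE: {ate_rmse(gt, est):.3f} m")
```

RPE is computed analogously but over relative pose increments, which makes it sensitive to local drift rather than global offset.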
Procedia PDF Downloads 69
24571 Prediction of Gully Erosion with Stochastic Modeling by using Geographic Information System and Remote Sensing Data in North of Iran
Authors: Reza Zakerinejad
Abstract:
Gully erosion is a serious problem threatening the sustainability of agricultural areas, rangelands, and water resources in a large part of Iran. This type of water erosion is the main source of sedimentation in many catchment areas in the north of Iran. Since many national assessment approaches have applied only qualitative models, the aim of this study is to predict the spatial distribution of gully erosion processes by means of detailed terrain analysis and GIS-based logistic regression for the loess deposits in a case study in the Golestan Province. In this study, a DEM with 25 m resolution derived from ASTER data has been used, and Landsat ETM data have been used for land-use mapping. The TreeNet model, a stochastic modeling approach, was applied to predict the areas susceptible to gully erosion; for model evaluation (ROC), 20% of the data were held out for testing. Applying GIS and satellite image analysis techniques provided the input information for these stochastic models. The result of this study is a highly accurate map of gully erosion potential.
Keywords: TreeNet model, terrain analysis, Golestan Province, Iran
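As an illustration of the GIS-based susceptibility modeling described (TreeNet itself is a commercial gradient-boosting tool, so the logistic regression variant mentioned in the abstract is sketched instead; all data below are synthetic stand-ins for terrain attributes such as slope, curvature, and flow accumulation):

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

# Synthetic per-grid-cell predictors and gully presence/absence labels.
rng = np.random.default_rng(1)
X = rng.normal(size=(2000, 5))            # stand-in terrain attributes
y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(size=2000) > 0.8).astype(int)

# Hold out 20% for testing, mirroring the split described above.
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, random_state=1)
model = LogisticRegression(max_iter=1000).fit(X_tr, y_tr)
prob = model.predict_proba(X_te)[:, 1]    # susceptibility values per cell
print(f"ROC AUC: {roc_auc_score(y_te, prob):.2f}")
```

Mapped back onto the grid, the predicted probabilities form exactly the kind of susceptibility map the study produces.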
Procedia PDF Downloads 535
24570 Data Science/Artificial Intelligence: A Possible Panacea for Refugee Crisis
Authors: Avi Shrivastava
Abstract:
In 2021, two heart-wrenching scenes, shown live on television screens across countries, painted a grim picture of refugees. One was of people clinging to an airplane's wings in their desperate attempt to flee war-torn Afghanistan; they ultimately fell to their deaths. The other was of U.S. government authorities separating children from their parents or guardians to deter migrants and refugees from coming to the U.S. These events show the desperation refugees feel when trying to leave their homes in disaster zones. The data, however, paint a grave picture of the current refugee situation and indicate that a bleak future lies ahead for refugees across the globe. Data and information are the two threads that intertwine to weave the fabric of modern society. They are often used interchangeably, but they differ considerably: information analysis reveals rationale and logic, while data analysis reveals patterns. Patterns revealed by data can enable us to create the tools necessary to combat the huge problems on our hands. Data analysis paints a clear picture so that the decision-making process becomes simple. Geopolitical and economic data can be used to predict future refugee hotspots, and accurately predicting the next refugee hotspots will allow governments and relief agencies to prepare better for future refugee crises. The refugee crisis does not have binary answers. Given the emotionally wrenching nature of the ground realities, experts often shy away from realistically stating things as they are, and this hesitancy can cost lives. When decisions are based solely on data, emotions can be removed from the decision-making process. Data also present irrefutable evidence and tell whether there is a solution or not; they can respond to a nonbinary crisis with a binary answer. Because of all that, the problem becomes easier to tackle. Data science and A.I. can predict future refugee crises. With the recent explosion of data due to the rise of social media platforms, data, and insights into data, have helped solve many social and political problems. Data science can also help solve many issues refugees face while staying in refugee camps or adopted countries. This paper looks into various ways data science can help solve refugee problems. A.I.-based chatbots can help refugees seek legal assistance in finding asylum in the country where they want to settle, and can point them to marketplaces of people willing to help. Data science and technology can also help address many of refugees' other problems, including food, shelter, employment, security, and assimilation. The refugee problem seems to be one of the most challenging for social and political reasons. Data science and machine learning can help prevent refugee crises and solve or alleviate some of the problems refugees face on their journey to a better life. With the explosion of data in the last decade, data science has made it possible to solve many geopolitical and social issues.
Keywords: refugee crisis, artificial intelligence, data science, refugee camps, Afghanistan, Ukraine
Procedia PDF Downloads 73
24569 Closed Mitral Valvotomy: A Safe and Promising Procedure
Authors: Sushil Kumar Singh, Kumar Rahul, Vivek Tewarson, Sarvesh Kumar, Shobhit Kumar
Abstract:
Objective: Rheumatic mitral stenosis continues to be a major public health problem in developing countries. Diastolic dysfunction occurs when the left atrium (LA) is unable to fill the left ventricle (LV) at normal LA pressures due to impaired relaxation and impaired compliance. The assessment of LV diastolic function and filling pressures is of clinical importance to identify underlying cardiac disease, guide its treatment, and assess prognosis. 2D echocardiography can detect diastolic dysfunction with excellent sensitivity and minimal risk when compared to the gold standard of invasive pressure-volume measurements. Material and Method: This was a one-year study of twenty-nine patients with isolated rheumatic severe mitral stenosis. Data were analyzed preoperatively and postoperatively (at one-month follow-up). Transthoracic 2D echocardiographic parameters of diastolic function are transmitral flow, pulmonary venous flow, mitral annular tissue Doppler, and color M-mode Doppler. In our study, mitral valve orifice area, ejection fraction, deceleration time, E/A ratio, E/E' ratio, myocardial performance index of the left ventricle (Tei index), and mitral inflow propagation velocity were included in the echocardiographic evaluation. The statistical analysis was performed with SPSS Version 15.0. Result: Twenty-nine patients underwent successful closed mitral commissurotomy for isolated mitral stenosis, with outcome measures observed preoperatively and at one-month follow-up. The majority of patients were in NYHA grade III (69.0%) in the preoperative period, which improved to NYHA grade I (48.3%) after closed mitral commissurotomy. Post-surgery, the mitral valve area increased from 0.77 ± 0.13 to 2.32 ± 0.26 cm², and the ejection fraction from 61.38 ± 4.61 to 64.79 ± 3.22%. There was a decrease in deceleration time from 231.55 ± 49.31 to 168.28 ± 14.30 ms, in the E/A ratio from 1.70 ± 0.54 to 0.89 ± 0.39, and in the E/E' ratio from 14.59 ± 3.34 to 8.86 ± 3.03. In addition, there was improvement in the Tei index from 0.50 ± 0.03 to 0.39 ± 0.06 and in mitral inflow propagation velocity from 47.28 ± 3.71 to 57.86 ± 3.19 cm/sec. Perioperatively and at follow-up, there was no incidence of severe mitral regurgitation (MR), no thromboembolic incident, and no mortality.
Keywords: closed mitral valvotomy, mitral stenosis, open mitral commissurotomy, balloon mitral valvotomy
Procedia PDF Downloads 85
24568 Human Performance Evaluating of Advanced Cardiac Life Support Procedure Using Fault Tree and Bayesian Network
Authors: Shokoufeh Abrisham, Seyed Mahmoud Hossieni, Elham Pishbin
Abstract:
In this paper, a hybrid method based on fault tree analysis (FTA) and Bayesian networks (BNs) is employed to evaluate the team performance quality of advanced cardiac life support (ACLS) procedures in the emergency department. Based on American Heart Association (AHA) guidelines, a categorization of staff actions leading to clinical incidents, and discussions with emergency medicine experts, a fault tree model for the ACLS procedure is obtained from the human performance perspective. The obtained FTA model is converted into BNs, and several different scenarios are defined to demonstrate the efficiency and flexibility of the presented BN model. Also, a sensitivity analysis is conducted to indicate the effects of team leader presence and of uncertainty in expert knowledge on the quality of ACLS. The proposed BN-based model shows how the results of risk analysis can be brought closer to reality in medical procedures, compared to results obtained with FTA alone.
Keywords: advanced cardiac life support, fault tree analysis, Bayesian belief networks, human performance, healthcare systems
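A minimal sketch of the FTA-to-probability step that such a model automates, using a hypothetical two-gate ACLS fault tree with illustrative basic-event probabilities (not the paper's tree or numbers):

```python
from itertools import product

# Hypothetical ACLS fault tree: the top event (a failed resuscitation step)
# occurs if EITHER the team leader is absent AND delegation fails (AND gate),
# OR drug administration is delayed (OR gate). Probabilities are illustrative.
p = {"leader_absent": 0.10, "delegation_fails": 0.30, "drug_delay": 0.05}

def top_event(leader_absent: bool, delegation_fails: bool, drug_delay: bool) -> bool:
    return (leader_absent and delegation_fails) or drug_delay

# Exact top-event probability by enumerating all basic-event states;
# this is the same factorization a Bayesian network would encode.
prob = 0.0
for states in product([True, False], repeat=3):
    weight = 1.0
    for name, s in zip(p, states):
        weight *= p[name] if s else 1.0 - p[name]
    if top_event(*states):
        prob += weight
print(f"P(top event) = {prob:.4f}")  # 0.03 + 0.05 - 0.0015 = 0.0785
```

Converting the tree into a BN replaces this brute-force enumeration with conditional probability tables, which is what makes the scenario and sensitivity analyses described above straightforward.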
Procedia PDF Downloads 147
24567 A Spatial Point Pattern Analysis to Recognize Fail Bit Patterns in Semiconductor Manufacturing
Authors: Youngji Yoo, Seung Hwan Park, Daewoong An, Sung-Shick Kim, Jun-Geol Baek
Abstract:
The yield management system is very important for producing high-quality semiconductor chips in the semiconductor manufacturing process. In order to improve the quality of semiconductors, various tests are conducted in the post-fabrication (FAB) process. During the test process, large amounts of data are collected, and the data include a lot of information about defects. In general, defects on the wafer are the main cause of yield loss; therefore, analyzing the defect data is necessary to improve the performance of yield prediction. The wafer bin map (WBM) is one kind of data collected in the test process and includes defect information such as fail bit patterns. Fail bits have the characteristics of spatial point patterns; therefore, this paper proposes a feature extraction method using spatial point pattern analysis. Actual data obtained from the semiconductor process are used for the experiments, and the experimental results show that the proposed method recognizes the fail bit patterns more accurately.
Keywords: semiconductor, wafer bin map, feature extraction, spatial point patterns, contour map
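As one concrete example of a spatial point pattern feature (not necessarily the paper's choice), the Clark-Evans nearest-neighbor index separates clustered fail bits from spatially random ones; the coordinates below are synthetic:

```python
import numpy as np
from scipy.spatial import cKDTree

# Synthetic fail-bit coordinates on a unit-square wafer region,
# drawn as a tight cluster to mimic a localized defect.
rng = np.random.default_rng(2)
clustered = rng.normal(loc=[0.3, 0.7], scale=0.05, size=(200, 2))

def clark_evans(points: np.ndarray, area: float) -> float:
    """Clark-Evans index: <1 clustered, ~1 random, >1 dispersed."""
    tree = cKDTree(points)
    # k=2 because each point's nearest neighbor at distance 0 is itself.
    d, _ = tree.query(points, k=2)
    observed = d[:, 1].mean()
    expected = 0.5 / np.sqrt(len(points) / area)  # expectation under CSR
    return observed / expected

print(f"Clark-Evans index: {clark_evans(clustered, area=1.0):.2f}")
```

Features of this kind, computed per wafer bin map, can then feed a classifier that labels the defect pattern type.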
Procedia PDF Downloads 384
24566 The Study on How Social Cues in a Scene Modulate Basic Object Recognition Process
Authors: Shih-Yu Lo
Abstract:
Stereotypes exist in almost every society, affecting how people interact with each other. However, to our knowledge, the influence of stereotypes has rarely been explored in the context of basic perceptual processes. This study aims to explore how gender stereotypes affect object recognition. Participants were presented with a series of scene pictures, each followed by a target display with a man or a woman holding a weapon or a non-weapon object. The task was to identify whether the object in the target display was a weapon or not. Although the gender of the object holder could not predict whether he or she held a weapon, and was irrelevant to the task goal, participants nevertheless tended to identify the object as a weapon more often when the holder was a man than when it was a woman. An analysis based on signal detection theory showed that the stereotype effect on object recognition mainly resulted from participants' bias toward making a 'weapon' response when a man rather than a woman was in the scene. In addition, there was a trend that participants' sensitivity in differentiating a weapon from a non-threatening object was higher when a woman rather than a man was in the scene. The results of this study suggest that irrelevant social cues implied in a visual scene can be so powerful that they modulate the basic object recognition process.
Keywords: gender stereotype, object recognition, signal detection theory, weapon
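For readers unfamiliar with the signal detection quantities involved, a minimal sketch of how sensitivity (d′) and response bias (criterion c) are computed from hit and false-alarm rates; the counts below are hypothetical, not the study's data:

```python
from scipy.stats import norm

# Hypothetical counts: "weapon" responses split by whether a weapon was shown.
hits, misses = 80, 20                 # weapon present
false_alarms, correct_rej = 35, 65    # weapon absent

hit_rate = hits / (hits + misses)
fa_rate = false_alarms / (false_alarms + correct_rej)

d_prime = norm.ppf(hit_rate) - norm.ppf(fa_rate)             # sensitivity
criterion = -0.5 * (norm.ppf(hit_rate) + norm.ppf(fa_rate))  # response bias
print(f"d' = {d_prime:.2f}, c = {criterion:.2f}  (c < 0: bias toward 'weapon')")
```

Comparing c across the man-in-scene and woman-in-scene conditions isolates the bias effect the abstract reports, independently of any change in d′.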
Procedia PDF Downloads 209
24565 The Measurement of the Multi-Period Efficiency of the Turkish Health Care Sector
Authors: Erhan Berk
Abstract:
The purpose of this study is to examine the efficiency and productivity of the health care sector in Turkey based on four years of cross-sectional health care data. Efficiency measures are calculated by a nonparametric approach known as Data Envelopment Analysis (DEA), and productivity is measured by the Malmquist index. The research shows how the DEA-based Malmquist productivity index can be used to appraise the technology and productivity changes in Turkish hospitals located all across the country.
Keywords: data envelopment analysis, efficiency, health care, Malmquist Index
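For reference, the DEA-based Malmquist productivity index between periods t and t+1 is conventionally written as below, where D₀ denotes the DEA distance function of the indicated period and values above 1 indicate productivity growth:

```latex
M_0\!\left(x^{t+1},y^{t+1},x^{t},y^{t}\right)=
\left[
\frac{D_0^{t}\!\left(x^{t+1},y^{t+1}\right)}{D_0^{t}\!\left(x^{t},y^{t}\right)}
\cdot
\frac{D_0^{t+1}\!\left(x^{t+1},y^{t+1}\right)}{D_0^{t+1}\!\left(x^{t},y^{t}\right)}
\right]^{1/2}
```

The index factors into an efficiency-change term and a technical-change term, which is what allows the multi-period comparison the abstract describes.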
Procedia PDF Downloads 335
24564 The Role of Cyfra 21-1 in Diagnosing Non Small Cell Lung Cancer (NSCLC)
Authors: H. J. T. Kevin Mozes, Dyah Purnamasari
Abstract:
Background: Lung cancer is the fourth most common cancer in Indonesia, and 85% of all lung cancer cases are Non-Small Cell Lung Cancer (NSCLC). The indistinct signs and symptoms of NSCLC sometimes lead to misdiagnosis. The gold standard assessment for the diagnosis of NSCLC is histopathological biopsy, which is invasive. Cyfra 21-1 is a tumor marker found in the intermediate filament proteins of epithelial cells. The accuracy of Cyfra 21-1 in diagnosing NSCLC is not yet known, so this report was made to answer that question. Methods: A literature search was done using the online databases ProQuest and PubMed. Literature was then selected according to inclusion and exclusion criteria, and the selected literature was appraised using the criteria of validity, importance, and applicability. Results: Of the six journals appraised, five were valid. The sensitivity reported across the literature ranges from 50% to 84.5%, while the specificity ranges from 87.8% to 94.4%. The likelihood ratio across the appraised literature ranges from 5.09 to 10.54, categorized as intermediate to high. Conclusion: Serum Cyfra 21-1 is a sensitive and very specific tumor marker for the diagnosis of non-small cell lung cancer (NSCLC).
Keywords: cyfra 21-1, diagnosis, non-small cell lung cancer, NSCLC, tumor marker
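Assuming the reported ratio is the positive likelihood ratio, the figures are internally consistent, as this small check shows (the input values are taken from the ranges above):

```python
# Positive and negative likelihood ratios from sensitivity/specificity.
sensitivity = 0.845   # upper end of the reported 50-84.5% range
specificity = 0.92    # within the reported 87.8-94.4% range

lr_positive = sensitivity / (1 - specificity)        # ~10.6, near the 10.54 ceiling
lr_negative = (1 - sensitivity) / specificity
print(f"LR+ = {lr_positive:.1f}, LR- = {lr_negative:.2f}")
```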
Procedia PDF Downloads 232
24563 Impact Deformation and Fracture Behaviour of Cobalt-Based Haynes 188 Superalloy
Authors: Woei-Shyan Lee, Hao-Chien Kao
Abstract:
The impact deformation and fracture behaviour of cobalt-based Haynes 188 superalloy are investigated by means of a split Hopkinson pressure bar. Impact tests are performed at strain rates ranging from 1×10³ s⁻¹ to 5×10³ s⁻¹ and temperatures between 25°C and 800°C. The experimental results indicate that the flow response and fracture characteristics of cobalt-based Haynes 188 superalloy are significantly dependent on the strain rate and temperature. The flow stress, work hardening rate, and strain rate sensitivity all increase with increasing strain rate or decreasing temperature. It is shown that the impact response of the Haynes 188 specimens is adequately described by the Zerilli-Armstrong fcc model. The fracture analysis results indicate that the Haynes 188 specimens fail predominantly as the result of intensive localised shearing. Furthermore, it is shown that the flow localisation effect leads to the formation of adiabatic shear bands. The fracture surfaces of the deformed Haynes 188 specimens are characterised by dimple- and/or cleavage-like structures with knobby features. The knobby features are thought to be the result of a rise in the local temperature to a value greater than the melting point.
Keywords: Haynes 188 alloy, impact, strain rate and temperature effect, adiabatic shearing
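For reference, the fcc form of the Zerilli-Armstrong model cited above is conventionally written as follows, where σ is the flow stress, ε the plastic strain, ε̇ the strain rate, T the absolute temperature, and the Cᵢ fitted material constants (the paper's exact calibrated values are not reproduced here):

```latex
\sigma \;=\; C_{0} \;+\; C_{2}\,\varepsilon^{1/2}
\exp\!\left(-C_{3}\,T \;+\; C_{4}\,T\,\ln\dot{\varepsilon}\right)
```

The coupled T·ln ε̇ term is what lets the model capture the observed increase of flow stress with strain rate and its decrease with temperature.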
Procedia PDF Downloads 359
24562 Comparison Of Data Mining Models To Predict Future Bridge Conditions
Authors: Pablo Martinez, Emad Mohamed, Osama Mohsen, Yasser Mohamed
Abstract:
Highway and bridge agencies, such as the Ministry of Transportation of Ontario, use the Bridge Condition Index (BCI), defined as the weighted condition of all bridge elements, to determine the rehabilitation priorities for their bridges. Accurate forecasting of the BCI is therefore essential for bridge rehabilitation budget planning. The large amount of data available regarding bridge conditions over several years renders traditional mathematical models infeasible as analysis methods. This research study focuses on investigating different classification models developed to predict the bridge condition index in the province of Ontario, Canada, based on publicly available data for 2,800 bridges over a period of more than 10 years. Data preparation is a key factor in developing acceptable classification models, even with the simplest one, the k-NN model. All the models were tested, compared, and statistically validated via cross-validation and t-tests. A simple k-NN model showed reasonable results (within 0.5% relative error) when predicting the bridge condition in an upcoming year.
Keywords: asset management, bridge condition index, data mining, forecasting, infrastructure, knowledge discovery in databases, maintenance, predictive models
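A minimal sketch of the kind of k-NN pipeline such a comparison evaluates; this is not the study's code, the file and column names are hypothetical, and scaling is included because k-NN is distance-based:

```python
import pandas as pd
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsClassifier
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

# Hypothetical layout: one row per bridge-year with structural attributes
# and the next year's BCI banded into condition classes.
df = pd.read_csv("ontario_bridges.csv")   # assumed file
X = df.drop(columns=["bci_class"])
y = df["bci_class"]

# Standardize features so no single attribute dominates the distance metric.
model = make_pipeline(StandardScaler(), KNeighborsClassifier(n_neighbors=5))
scores = cross_val_score(model, X, y, cv=10)
print(f"10-fold accuracy: {scores.mean():.3f} +/- {scores.std():.3f}")
```

The same cross-validation scores, collected per model, are what the study's t-tests compare to decide whether a more complex classifier actually beats k-NN.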
Procedia PDF Downloads 191
24561 Piql Preservation Services - A Holistic Approach to Digital Long-Term Preservation
Authors: Alexander Rych
Abstract:
Piql Preservation Services ("Piql") is a turnkey solution designed for secure, migration-free long-term preservation of digital data. Piql sets an open standard for long-term preservation for the future. It consists of the equipment and processes needed for writing and retrieving digital data. Exponentially growing amounts of data demand logistically effective and cost-effective processes. Digital storage media (hard disks, magnetic tape) exhibit limited lifetimes, and repetitive data migration to overcome the rapid obsolescence of hardware and software carries an accelerated risk of data loss, corruption, or even manipulation, and adds significant recurring costs for hardware and software investments. Piql stores any kind of data, in its digital as well as analog form, securely for 500 years. The medium that provides this is a film reel. Using photosensitive film on a polyester base, a very stable material known for its immutability over hundreds of years, secure and cost-effective long-term preservation can be provided. The film reel itself is stored in packaging capable of protecting the optical storage medium. These components have undergone extensive testing to ensure longevity of up to 500 years. In addition to its durability, film is a true WORM (write once, read many) medium and is therefore resistant to editing or manipulation. Being able to store any form of data on the film makes Piql a superior solution for long-term preservation: paper documents, images, video, or audio sequences can all be preserved in their native file structure. In order to restore the encoded digital data, only a film scanner, a digital camera, or any appropriate optical reading device will be needed in the future. Every film reel includes an index section describing the data saved on the film. It also contains a content section carrying metadata, enabling users in the future to rebuild software in order to read and decode the digital information.
Keywords: digital data, long-term preservation, migration-free, photosensitive film
Procedia PDF Downloads 392
24560 Statistical Correlation between Logging-While-Drilling Measurements and Wireline Caliper Logs
Authors: Rima T. Alfaraj, Murtadha J. Al Tammar, Khaqan Khan, Khalid M. Alruwaili
Abstract:
OBJECTIVE/SCOPE: Caliper logging data provides critical information about wellbore shape and deformations, such as stress-induced borehole breakouts or washouts. Multiarm mechanical caliper logs are often run using wireline, which can be time-consuming, costly, and/or challenging to run in certain formations. To minimize rig time and improve operational safety, it is valuable to develop analytical solutions that can estimate caliper logs using available Logging-While-Drilling (LWD) data without the need to run wireline caliper logs. As a first step, the objective of this paper is to perform a statistical analysis using an extensive dataset to identify important physical parameters that should be considered in developing such analytical solutions. METHODS, PROCEDURES, PROCESS: Caliper logs and LWD data of eleven wells, with a total of more than 80,000 data points, were obtained and imported into a data analytics software for analysis. Several parameters were selected to test their relationship with the measured maximum and minimum caliper logs. These parameters include gamma ray, porosity, shear and compressional sonic velocities, bulk density, and azimuthal density. The data of the eleven wells were first visualized and cleaned. Using the analytics software, several analyses were then performed, including the computation of Pearson's correlation coefficients to show the statistical relationship between the selected parameters and the caliper logs. RESULTS, OBSERVATIONS, CONCLUSIONS: The results of this statistical analysis showed that some parameters correlate well with the caliper log data. For instance, the bulk density and azimuthal directional densities showed Pearson's correlation coefficients in the range of 0.39 to 0.57, which is relatively high compared to the correlation coefficients of the caliper data with other parameters. Other parameters, such as porosity, exhibited extremely low correlation coefficients with the caliper data. Various crossplots and visualizations of the data are also presented to gain further insights from the field data. NOVEL/ADDITIVE INFORMATION: This study offers a unique and novel look into the relative importance of, and correlation between, different LWD measurements and wireline caliper logs via an extensive dataset. The results pave the way for a more informed development of new analytical solutions for estimating the size and shape of the wellbore in real time while drilling using LWD data.
Keywords: LWD measurements, caliper log, correlations, analysis
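A minimal sketch of the correlation computation described, with hypothetical column names standing in for the merged LWD/caliper dataset:

```python
import pandas as pd

# Hypothetical column names for the merged per-depth LWD/caliper table.
df = pd.read_csv("lwd_caliper.csv")  # assumed file
lwd_cols = ["gamma_ray", "porosity", "vp", "vs", "bulk_density",
            "azimuthal_density"]

# Pearson correlation of each LWD measurement with the caliper extremes.
corr = df[lwd_cols + ["caliper_max", "caliper_min"]].corr(method="pearson")
print(corr.loc[lwd_cols, ["caliper_max", "caliper_min"]].round(2))
```

Ranking the resulting coefficients is exactly the screening step that flags bulk and azimuthal densities as the strongest candidates for a caliper-estimation model.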
Procedia PDF Downloads 121
24559 F-VarNet: Fast Variational Network for MRI Reconstruction
Authors: Omer Cahana, Maya Herman, Ofer Levi
Abstract:
Magnetic resonance imaging (MRI) is a lengthy medical scan, owing to a long acquisition time. This length is mainly due to the traditional sampling theorem, which defines a lower boundary for sampling. However, it is still possible to accelerate the scan by using a different approach, such as compressed sensing (CS) or parallel imaging (PI). These two complementary methods can be combined to achieve a faster scan with high-fidelity imaging. In order to achieve that, two properties have to exist: i) the signal must be sparse under a known transform domain, and ii) the sampling method must be incoherent. In addition, a nonlinear reconstruction algorithm needs to be applied to recover the signal. While rapid advances in the deep learning (DL) field have demonstrated tremendous successes in various computer vision tasks, the field of MRI reconstruction is still at an early stage. In this paper, we present an extension of the state-of-the-art model in MRI reconstruction, VarNet. We extend VarNet by using dilated convolutions at different scales, which enlarges the receptive field to capture more contextual information. Moreover, we simplified the sensitivity map estimation (SME), since it holds many layers unnecessary for this task. These improvements yield significant decreases in computation cost as well as higher accuracy.
Keywords: MRI, deep learning, variational network, computer vision, compressed sensing
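A minimal PyTorch sketch of the multi-scale dilated-convolution idea (a generic block, not the paper's architecture): dilation widens the receptive field without adding parameters per branch or reducing resolution:

```python
import torch
import torch.nn as nn

class MultiScaleBlock(nn.Module):
    """Parallel 3x3 convolutions with dilations 1, 2, 4, fused by a 1x1 conv.
    With padding equal to the dilation, each branch preserves spatial size."""

    def __init__(self, channels: int = 32):
        super().__init__()
        self.branches = nn.ModuleList([
            nn.Conv2d(channels, channels, kernel_size=3,
                      padding=d, dilation=d)
            for d in (1, 2, 4)
        ])
        self.fuse = nn.Conv2d(3 * channels, channels, kernel_size=1)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        feats = [torch.relu(branch(x)) for branch in self.branches]
        return self.fuse(torch.cat(feats, dim=1))

x = torch.randn(1, 32, 64, 64)      # e.g., intermediate image features
print(MultiScaleBlock()(x).shape)   # torch.Size([1, 32, 64, 64])
```

Stacking such blocks inside the unrolled reconstruction lets each cascade see wider image context at unchanged cost per branch.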
Procedia PDF Downloads 162
24558 Inversion of Gravity Data for Density Reconstruction
Authors: Arka Roy, Chandra Prakash Dubey
Abstract:
Inverse problems are generally used for recovering hidden information from externally available data. Here, the vertical component of the gravity field is used to calculate the underlying density structure. The ill-posed nature of the problem is the main obstacle for any inverse problem. Linear regularization using the Tikhonov formulation is applied, with an appropriate choice of SVD and GSVD components. For handling real data, noise levels should be small relative to the signal for a reliable solution. In our study, 2D and 3D synthetic models with rectangular grids are used for gravity field calculation and the corresponding inversion for density reconstruction; a fine grid is also considered to capture any irregular structure. Keeping the algebraic ambiguity factor in mind, the number of observation points should exceed the number of model parameters. The Picard plot is presented here for choosing the appropriate, or main controlling, eigenvalues for a regularized solution. Another important study is the depth resolution plot (DRP); DRPs are generally used for studying how the inversion is influenced by regularization or discretization. Our further study involves the inversion of real gravity data from the Vredefort Dome, South Africa. We apply our method to these data, and the resulting density structure is in good agreement with the known formations in that region, which lends additional support to our method.
Keywords: depth resolution plot, gravity inversion, Picard plot, SVD, Tikhonov formulation
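A minimal NumPy sketch of the SVD-based Tikhonov solution and the Picard-plot quantities discussed, using a synthetic forward operator (not the study's gravity kernel):

```python
import numpy as np

rng = np.random.default_rng(3)
A = rng.normal(size=(120, 80))        # stand-in forward operator, m > n
x_true = rng.normal(size=80)          # density model
b = A @ x_true + 1e-3 * rng.normal(size=120)  # noisy vertical-gravity data

# Tikhonov-regularized solution via SVD filter factors:
#   x_lam = sum_i f_i * (u_i^T b / s_i) * v_i,  f_i = s_i^2 / (s_i^2 + lam^2)
U, s, Vt = np.linalg.svd(A, full_matrices=False)
lam = 0.1
f = s**2 / (s**2 + lam**2)
x_reg = Vt.T @ (f * (U.T @ b) / s)

# Picard-plot quantities: |u_i^T b| plotted against s_i reveals which
# components are noise-dominated and should be filtered out.
picard = np.abs(U.T @ b)
rel_err = np.linalg.norm(x_reg - x_true) / np.linalg.norm(x_true)
print(f"relative error: {rel_err:.3f}")
```

The filter factors f decay to zero for small singular values, which is exactly the damping of the noise-dominated components that the Picard plot is used to diagnose.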
Procedia PDF Downloads 212
24557 DeepOmics: Deep Learning for Understanding Genome Functioning and the Underlying Genetic Causes of Disease
Authors: Vishnu Pratap Singh Kirar, Madhuri Saxena
Abstract:
Advancements in sequence data generation technologies are churning out voluminous omics data and posing a massive challenge for annotating biological functional features. With so much data available, the use of machine learning methods and tools to make novel inferences has become obvious. Machine learning methods have been successfully applied to many disciplines, including computational biology and bioinformatics, and researchers in computational biology are interested in developing novel machine learning frameworks to classify the huge amounts of biological data. In this proposal, we plan to employ novel machine learning approaches to aid the understanding of how apparently innocuous mutations (in intergenic DNA and at synonymous sites) cause diseases. We are also interested in discovering novel functional sites in the genome, mutations in which can affect a phenotype of interest.
Keywords: genome wide association studies (GWAS), next generation sequencing (NGS), deep learning, omics
Procedia PDF Downloads 97