Search results for: automated
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 833

473 Design Application Procedures of a 15-Storied 3D Reinforced Concrete Shear Wall-Frame Structure

Authors: H. Nikzad, S. Yoshitomi

Abstract:

This paper presents the design application and reinforcement detailing of a 15-storied reinforced concrete shear wall-frame structure based on linear static analysis. Databases of section sizes are generated using an automated structural optimization method that applies the Active-set Algorithm on the MATLAB platform. The design constraints of allowable section sizes, capacity criteria, and seismic provisions for static loads and combinations of gravity and lateral loads are checked and determined according to the ASCE 7-10 documents and the ACI 318-14 design provisions. The results of this study illustrate the efficiency of the proposed method and are expected to provide a useful reference for the design of RC shear wall-frame structures.
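
To make the optimization step concrete, here is a minimal sketch, not the authors' code: SciPy's SLSQP solver stands in for MATLAB's Active-set Algorithm, and a single invented axial-capacity inequality stands in for the full ASCE 7-10/ACI 318-14 checks.

```python
# Illustrative sketch (not the authors' code): choose a rectangular section of
# minimum area subject to a crude capacity-style constraint, analogous to
# constrained section-size optimization with an active-set method.
import numpy as np
from scipy.optimize import minimize

P_u = 2.5e6   # assumed factored axial demand [N]
f_c = 30.0    # assumed concrete strength [N/mm^2]
phi = 0.65    # assumed strength reduction factor

def area(x):
    b, h = x                  # section width and depth [mm]
    return b * h              # objective: minimize cross-sectional area

def capacity_margin(x):
    b, h = x
    # invented axial capacity surrogate: phi * 0.8 * f_c * A_g >= P_u
    return phi * 0.8 * f_c * b * h - P_u

constraints = [
    {"type": "ineq", "fun": capacity_margin},        # demand/capacity check
    {"type": "ineq", "fun": lambda x: x[1] - x[0]},  # keep depth >= width
]
bounds = [(250, 1200), (250, 1500)]                  # allowable sizes [mm]

res = minimize(area, x0=[400, 600], bounds=bounds,
               constraints=constraints, method="SLSQP")
print("b, h [mm]:", np.round(res.x, 1))
```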

Keywords: design constraints, ETABS, linear static analysis, MATLAB, RC shear wall-frame structures, structural optimization

Procedia PDF Downloads 227
472 Interpretation and Clustering Framework for Analyzing ECG Survey Data

Authors: Irum Matloob, Shoab Ahmad Khan, Fahim Arif

Abstract:

The Indo-Pak region has been afflicted by heart disease for many decades. Many surveys have shown that the percentage of cardiac patients in Pakistan is increasing day by day, and this issue needs special attention. A framework is proposed for performing a detailed analysis of ECG survey data collected to measure the prevalence of heart disease in Pakistan. The ECG survey data are first filtered using automated Minnesota codes, and only those ECGs that fulfill the standardized conditions specified in the Minnesota codes are used for further analysis. Feature selection is then performed by applying a proposed algorithm based on the discernibility matrix to select relevant features from the database. Clustering is performed to expose natural clusters in the ECG survey data by applying a spectral clustering algorithm combined with the fuzzy c-means algorithm. The hidden patterns and interesting relationships exposed by this analysis are useful for further detailed analysis and for many other purposes.
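
A minimal sketch of the clustering step, under assumptions: the ECG feature vectors are synthetic, the spectral embedding uses an unnormalized graph Laplacian, and a small hand-rolled fuzzy c-means replaces whatever implementation the authors used.

```python
# Sketch: spectral embedding of feature vectors followed by fuzzy c-means
# on the embedded coordinates (all data synthetic).
import numpy as np

def spectral_embedding(X, k, sigma=1.0):
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    W = np.exp(-d2 / (2 * sigma ** 2))              # Gaussian affinity
    L = np.diag(W.sum(1)) - W                       # unnormalized Laplacian
    vals, vecs = np.linalg.eigh(L)
    return vecs[:, 1:k + 1]                         # skip the trivial eigenvector

def fuzzy_c_means(X, c, m=2.0, iters=100):
    rng = np.random.default_rng(0)
    U = rng.dirichlet(np.ones(c), size=len(X))      # membership matrix
    for _ in range(iters):
        Um = U ** m
        centroids = (Um.T @ X) / Um.sum(0)[:, None]
        d = np.linalg.norm(X[:, None] - centroids[None], axis=-1) + 1e-12
        U = 1.0 / (d ** (2 / (m - 1)))              # standard FCM update
        U /= U.sum(1, keepdims=True)
    return U, centroids

X = np.random.default_rng(1).normal(size=(60, 5))   # stand-in ECG features
U, centroids = fuzzy_c_means(spectral_embedding(X, k=2), c=2)
print(U.argmax(1))                                  # hard cluster labels
```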

Keywords: arrhythmias, centroids, ECG, clustering, discernibility matrix

Procedia PDF Downloads 441
471 Automated Pothole Detection Using Convolutional Neural Networks and 3D Reconstruction Using Stereovision

Authors: Eshta Ranyal, Kamal Jain, Vikrant Ranyal

Abstract:

Potholes are a severe threat to road safety and a major contributing factor to road distress. In the Indian context, they are a major road hazard. Timely detection of potholes and subsequent repair can prevent roads from deteriorating. To assist roadway authorities in the timely detection and repair of potholes, we propose a pothole detection methodology using convolutional neural networks. The YOLOv3 model is used because it is fast and accurate in comparison to other state-of-the-art models. You Only Look Once v3 (YOLOv3) is a state-of-the-art, real-time object detection system that features multi-scale detection. A mean average precision (mAP) of 73% was obtained on a training dataset of 200 images. The dataset was then increased to 500 images, resulting in an increase in mAP. We further calculated the depth of the potholes using stereoscopic vision through 3D reconstruction of the potholes. This enables calculating pothole volume and extent, which can then be used to rate pothole severity as low, moderate, or high.
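
The depth-from-stereo step can be sketched as follows; this is illustrative, not the paper's pipeline. The stereo pair is synthetic, and the focal length, baseline, and detection box are assumed values.

```python
# Sketch: disparity from block matching on a rectified pair, then Z = f * B / d.
import cv2
import numpy as np

# synthetic rectified pair: the right image is the left shifted by a known amount
rng = np.random.default_rng(0)
left = rng.uniform(0, 255, (240, 320)).astype(np.uint8)
right = np.roll(left, -8, axis=1)       # uniform 8-px disparity stand-in

stereo = cv2.StereoBM_create(numDisparities=64, blockSize=15)
disparity = stereo.compute(left, right).astype(np.float32) / 16.0  # fixed-point

f_px, baseline_m = 700.0, 0.12          # assumed focal length [px] and baseline [m]
valid = disparity > 0
depth_m = np.zeros_like(disparity)
depth_m[valid] = f_px * baseline_m / disparity[valid]   # Z = f * B / d

x, y, w, h = 100, 120, 80, 60           # hypothetical YOLOv3 pothole box
box = depth_m[y:y + h, x:x + w]
print("median depth in box [m]:", np.median(box[box > 0]))
```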

Keywords: CNN, pothole detection, pothole severity, YOLO, stereovision

Procedia PDF Downloads 110
470 A Comparative Study of Malware Detection Techniques Using Machine Learning Methods

Authors: Cristina Vatamanu, Doina Cosovan, Dragos Gavrilut, Henri Luchian

Abstract:

In the past few years, the amount of malicious software has increased exponentially, and machine learning algorithms have therefore become instrumental in distinguishing clean files from malware through semi-automated classification. When working with very large datasets, the major challenge is to reach both a very high malware detection rate and a very low false positive rate. Another challenge is to minimize the time the machine learning algorithm needs to do so. This paper presents a comparative study of different machine learning techniques such as linear classifiers, ensembles, decision trees, and various hybrids thereof. The training dataset consists of approximately 2 million clean files and 200,000 infected files, which is a realistic quantitative mixture. The paper investigates the above-mentioned methods with respect to both their performance (detection rate and false positive rate) and their practicability.
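
A comparison harness of the kind the paper describes can be sketched as below, on synthetic data with a roughly 10:1 clean-to-infected imbalance; the classifiers and metrics (detection rate, false positive rate) mirror the study's axes of comparison.

```python
# Illustrative harness (synthetic data, not the paper's corpus): compare a
# linear classifier, a decision tree, and an ensemble on TPR and FPR.
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.linear_model import SGDClassifier
from sklearn.metrics import confusion_matrix
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

# ~10:1 clean-to-malware imbalance, mimicking a realistic mixture
X, y = make_classification(n_samples=20000, n_features=50, weights=[0.9],
                           random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, stratify=y, random_state=0)

for clf in (SGDClassifier(random_state=0),            # linear classifier
            DecisionTreeClassifier(random_state=0),   # decision tree
            RandomForestClassifier(random_state=0)):  # ensemble
    y_hat = clf.fit(X_tr, y_tr).predict(X_te)
    tn, fp, fn, tp = confusion_matrix(y_te, y_hat).ravel()
    print(type(clf).__name__,
          "detection rate: %.3f" % (tp / (tp + fn)),
          "false positive rate: %.4f" % (fp / (fp + tn)))
```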

Keywords: ensembles, false positives, feature selection, one side class algorithm

Procedia PDF Downloads 261
469 Analysis of ECGs Survey Data by Applying Clustering Algorithm

Authors: Irum Matloob, Shoab Ahmad Khan, Fahim Arif

Abstract:

The Indo-Pak region has been afflicted by heart disease for many decades. Many surveys have shown that the percentage of cardiac patients in Pakistan is increasing day by day, and this issue needs special attention. A framework is proposed for performing a detailed analysis of ECG survey data collected to measure the prevalence of heart disease in Pakistan. The ECG survey data are first filtered using automated Minnesota codes, and only those ECGs that fulfill the standardized conditions specified in the Minnesota codes are used for further analysis. Feature selection is then performed by applying a proposed algorithm based on the discernibility matrix to select relevant features from the database. Clustering is performed to expose natural clusters in the ECG survey data by applying a spectral clustering algorithm combined with the fuzzy c-means algorithm. The hidden patterns and interesting relationships exposed by this analysis are useful for further detailed analysis and for many other purposes.

Keywords: arrhythmias, centroids, ECG, clustering, discernibility matrix

Procedia PDF Downloads 327
468 Understanding Mudrocks and Their Shear Strength Deterioration Associated with Inundation

Authors: Haslinda Nahazanan, Afshin Asadi, Zainuddin Md. Yusoff, Nik Nor Syahariati Nik Daud

Abstract:

Mudrocks are considered problematic materials due to their unexpected behaviour, specifically when they come into contact with water or are exposed to the atmosphere. Many instability problems of cut slopes have been found to lie in highly slaking mudrocks. This has become a major concern for geotechnical engineers, as mudrocks make up to 50% of the sedimentary rocks in the geologic record. Mudrocks display properties between those of soils and rocks, which can make them very hard to understand. Therefore, this paper aims to review the definition, mineralogy, geochemistry, classification, and engineering properties of mudrocks. As water is one of the major factors that rapidly change the behaviour of mudrocks, a review of the shear strength of mudrocks in Derbyshire has been made using a fully automated hydraulic stress path testing system under three states: dry, short-term inundated, and long-term inundated. It can be seen that the strength of the mudrocks deteriorated as their condition changed from dry to short-term inundated and finally to long-term inundated.

Keywords: mudrocks, sedimentary rocks, inundation, shear strength

Procedia PDF Downloads 211
467 Cotton Crops Vegetative Indices Based Assessment Using Multispectral Images

Authors: Muhammad Shahzad Shifa, Amna Shifa, Muhammad Omar, Aamir Shahzad, Rahmat Ali Khan

Abstract:

Many applications of remote sensing to vegetation and crop response depend on the spectral properties of individual leaves and plants. Vegetation indices are usually determined to estimate crop biophysical parameters, such as crop canopies and crop leaf area indices, with the help of remote sensing. Cotton crop assessment is performed with the help of vegetative indices. Remotely sensed images from an optical multispectral radiometer, the MSR5, are used in this study. The interpretation is based on the fact that different materials reflect and absorb light differently at different wavelengths. Non-normalized and normalized forms of these datasets are analyzed using two complementary data mining algorithms: K-means and K-nearest neighbor (KNN). Our analysis shows that the use of normalized reflectance data and vegetative indices is suitable for automated assessment and decision making.
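
A sketch of the indices-plus-KNN idea, with invented reflectance values and a hypothetical health label; NDVI is used as a representative vegetation index, since the abstract does not name the exact indices computed.

```python
# Sketch: derive NDVI from red/NIR reflectance (two of the five MSR5-style
# bands) and classify crop condition with KNN on labeled samples.
import numpy as np
from sklearn.neighbors import KNeighborsClassifier

rng = np.random.default_rng(0)
red = rng.uniform(0.05, 0.4, 200)    # stand-in red-band reflectance
nir = rng.uniform(0.3, 0.8, 200)     # stand-in near-infrared reflectance
ndvi = (nir - red) / (nir + red)     # normalized difference vegetation index

# hypothetical condition labels: healthy (1) when NDVI is high
labels = (ndvi > 0.5).astype(int)

knn = KNeighborsClassifier(n_neighbors=5)
knn.fit(np.column_stack([red, nir, ndvi]), labels)
print(knn.predict([[0.1, 0.7, (0.7 - 0.1) / (0.7 + 0.1)]]))  # expect healthy
```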

Keywords: cotton, condition assessment, KNN algorithm, clustering, MSR5, vegetation indices

Procedia PDF Downloads 291
466 Analyzing the Effectiveness of Different Testing Techniques in Ensuring Software Quality

Authors: R. M. P. C. Bandara, M. L. L. Weerasinghe, K. T. C. R. Kumari, A. G. D. R. Hansika, D. I. De Silva, D. M. T. H. Dias

Abstract:

Software testing is an essential process in software development that aims to identify defects and ensure that software functions as intended. Various testing techniques are employed to achieve this goal, but their effectiveness varies. This research paper analyzes the effectiveness of different testing techniques in ensuring software quality. The paper explores different testing techniques, including manual and automated testing, and evaluates their effectiveness in terms of identifying defects, reducing the number of defects in software, and ensuring that software meets its functional and non-functional requirements. Moreover, the paper also investigates the impact of factors such as testing time, test coverage, and testing environment on the effectiveness of these techniques. This research aims to provide valuable insights into the effectiveness of different testing techniques, enabling software development teams to make informed decisions about the testing approach best suited to their needs. By improving testing techniques, the number of defects in software can be reduced, enhancing software quality and ultimately providing better software for users.

Keywords: software testing life cycle, software testing techniques, software testing strategies, effectiveness, software quality

Procedia PDF Downloads 51
465 A Survey on the Blockchain Smart Contract System: Security Strengths and Weaknesses

Authors: Malaw Ndiaye, Karim Konate

Abstract:

Smart contracts are computer protocols that facilitate, verify, and execute the negotiation or execution of a contract, or that render a contractual term unnecessary. Blockchain and smart contracts can be used to facilitate almost any financial transaction. Thanks to these smart contracts, the settlement of dividends and coupons could be automated. Smart contracts have become lucrative and profitable targets for attackers because they can hold a great amount of money. Smart contracts, although widely used in blockchain technology, are far from perfect due to security concerns. Although there are recent studies on smart contract security, none of them systematically studies the strengths and weaknesses of smart contract security. Some have focused on analyzing program-related vulnerabilities by providing a taxonomy of vulnerabilities. Other studies list the series of attacks linked to smart contracts. Although a series of attacks is listed, there is a lack of discussion and proposals on improving security. This survey takes stock of smart contract security from a more comprehensive perspective by correlating vulnerability levels with a systematic review of security levels in smart contracts.

Keywords: blockchain, Bitcoin, smart contract, criminal smart contract, security

Procedia PDF Downloads 136
464 Modernization of the Economic Price Adjustment Software

Authors: Roger L. Goodwin

Abstract:

The US Consumer Price Indices (CPIs) measure hundreds of items in the US economy. Many social programs and government benefits are indexed to the CPIs. In the mid-to-late 1990s, a Congressional Advisory Committee conducted substantial research into changes to the CPI. One conclusion of that research is that, aside from the existence of alternative estimators for the CPI, any fundamental change to the CPI will affect many government programs. The purpose of this project is to modernize an existing process. This paper shows the development of a small, visual software product that documents the Economic Price Adjustment (EPA) for long-term contracts. The existing workbook does not provide the flexibility to calculate EPAs where the base month and the option month are different, nor does it provide automated error checking. The small, visual software product provides the additional flexibility and error checking. This paper presents feedback on the project.
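
The added flexibility can be illustrated with a short sketch: the adjustment scales a contract price by the CPI ratio between an arbitrary base month and option month, with explicit error checking. The function name and the small CPI table below are illustrative, not the product's actual code or data source.

```python
# Sketch of an EPA calculation with independent base and option months.
cpi = {"2020-01": 257.971, "2020-07": 259.101, "2021-07": 273.003}  # sample CPI-U values

def economic_price_adjustment(base_price, base_month, option_month):
    """Scale price by CPI(option month) / CPI(base month), with error checking."""
    for month in (base_month, option_month):
        if month not in cpi:
            raise KeyError(f"no CPI published for {month}")
    return base_price * cpi[option_month] / cpi[base_month]

print(round(economic_price_adjustment(100_000, "2020-01", "2021-07"), 2))
```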

Keywords: Consumer Price Index, Economic Price Adjustment, contracts, visualization tools, database, reports, forms, event procedures

Procedia PDF Downloads 291
463 ATM Location Problem and Cash Management in ATMs

Authors: M. Erol Genevois, D. Celik, H. Z. Ulukan

Abstract:

Automated teller machines (ATMs) are among the most important service facilities in the banking industry. The investment in ATMs and their impact on the banking industry are growing steadily in every part of the world. Banks take many factors into consideration, such as safety, convenience, visibility, and cost, in order to determine the optimal locations of ATMs. Today, ATMs are available not only in bank branches but also at retail locations. Another important factor is cash management in ATMs. A cash demand model for every ATM is needed in order to have an efficient cash management system. This forecasting model is based on historical cash demand data, which is highly related to the ATM's location. The location and cash management problems should therefore be considered together. Although the literature on facility location models is quite large, it is surprising that there are only a few studies that handle the ATM location and cash management problems together. In order to fill this gap, this paper provides a general review of studies, efforts, and developments in the ATM location and cash management problems.

Keywords: ATM location problem, cash management problem, ATM cash replenishment problem, literature review in ATMs

Procedia PDF Downloads 453
462 Automatic API Regression Analyzer and Executor

Authors: Praveena Sridhar, Nihar Devathi, Parikshit Chakraborty

Abstract:

As a software product changes versions across releases, its APIs and features change, and upgrades become necessary. Hence, it becomes imperative to determine the impact of upgrading the dependent components. This tool finds API changes across two versions and their impact on other APIs, followed by execution of the automated regression suites relevant to the updates and their impacted areas. The tool has a four-layer architecture, with each layer performing its own pre-assigned task and sending the required information to the next layer. The four layers are: 1) Comparator: compares the two versions of the API. 2) Analyzer: analyzes the API documentation and gives the modified class and its dependencies, along with implemented interface details. 3) Impact Filter: finds the impact of the modified class on the other API methods. 4) Auto Executor: based on the output given by the Impact Filter, runs the API regression suite. The tool reads the Javadoc and extracts the required information about classes, interfaces, and enumerations. The extracted information is saved into a data structure that shows the class details and dependencies, along with the interfaces and enumerations listed in the Javadoc.
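
A structural sketch of the four-layer pipeline, with hypothetical class and data names; the real tool operates on Javadoc, which is reduced here to plain dictionaries.

```python
class Comparator:
    """Layer 1: compare two versions of the API documentation."""
    def diff(self, old_doc, new_doc):
        return {sig for sig in new_doc if old_doc.get(sig) != new_doc[sig]}

class Analyzer:
    """Layer 2: attach dependency and interface details to each modified class."""
    def dependencies(self, changed, doc_index):
        return {sig: doc_index.get(sig, {"depends_on": [], "implements": []})
                for sig in changed}

class ImpactFilter:
    """Layer 3: collect the APIs affected by the modified classes."""
    def impacted(self, dep_map):
        return sorted({d for info in dep_map.values() for d in info["depends_on"]})

class AutoExecutor:
    """Layer 4: run the regression suites relevant to the impacted APIs."""
    def run(self, impacted_apis):
        for api in impacted_apis:
            print(f"running regression suite tagged '{api}'")  # placeholder hook

old = {"Foo.bar()": "v1-doc"}
new = {"Foo.bar()": "v2-doc", "Foo.baz()": "v1-doc"}
index = {"Foo.bar()": {"depends_on": ["Qux.quux()"], "implements": ["IFoo"]},
         "Foo.baz()": {"depends_on": [], "implements": []}}

changed = Comparator().diff(old, new)            # both Foo methods changed
impacted = ImpactFilter().impacted(Analyzer().dependencies(changed, index))
AutoExecutor().run(impacted)                     # runs the suite for Qux.quux()
```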

Keywords: automation impact regression, java doc, executor, analyzer, layers

Procedia PDF Downloads 461
461 Design of a Pneumonia Ontology for Diagnosis Decision Support System

Authors: Sabrina Azzi, Michal Iglewski, Véronique Nabelsi

Abstract:

Diagnostic error is a frequent problem and one of the most important safety problems today. One of the main objectives of our work is to propose an ontological representation that takes the diagnostic criteria into account in order to improve diagnosis. We chose pneumonia because it is one of the diseases most frequently affected by diagnostic errors, with harmful effects on patients. To achieve our aim, we use a semi-automated method to integrate diverse knowledge sources, including publicly available pneumonia disease guidelines from international repositories, biomedical ontologies, and electronic health records. We follow the principles of the Open Biomedical Ontologies (OBO) Foundry. The resulting ontology covers symptoms and signs, all the types of pneumonia, antecedents, pathogens, and diagnostic testing. The first evaluation results show that most of the terms are covered by the ontology. This work is still in progress and represents a first and major step toward the development of a diagnosis decision support system for pneumonia.

Keywords: clinical decision support system, diagnostic errors, ontology, pneumonia

Procedia PDF Downloads 159
460 Generation of Automated Alarms for Plantwide Process Monitoring

Authors: Hyun-Woo Cho

Abstract:

Early detection of incipient abnormal operation is quite necessary for plant-wide process management in order to improve product quality and process safety. Generating warning signals or alarms for operating personnel plays an important role in process automation and intelligent plant health monitoring. Various methodologies have been developed and utilized in this area, such as expert systems, mathematical model-based approaches, and multivariate statistical approaches. This work presents a nonlinear empirical monitoring methodology based on the real-time analysis of massive process data. Unfortunately, such big data include measurement noise and unwanted variations unrelated to true process behavior. The elimination of these unnecessary patterns from the data is therefore executed in a data-processing step to enhance detection speed and accuracy. The performance of the methodology was demonstrated using simulated process data. The case study showed that the detection speed and performance were improved significantly irrespective of the size and location of the abnormal events.
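
A minimal sketch of the alarm-generation idea: a moving-average filter stands in for the paper's (unspecified) noise-elimination step, and a 3-sigma control limit computed from normal operation serves as the alarm threshold; the process data are simulated.

```python
# Sketch: denoise a simulated process signal, then raise alarms when the
# cleaned signal leaves a control band estimated from normal operation.
import numpy as np

rng = np.random.default_rng(0)
signal = rng.normal(0.0, 1.0, 500)
signal[350:] += 4.0                       # simulated incipient fault

def denoise(x, window=10):
    kernel = np.ones(window) / window
    return np.convolve(x, kernel, mode="same")   # suppress measurement noise

clean = denoise(signal)
mu, sigma = clean[:300].mean(), clean[:300].std()   # limits from normal operation
alarms = np.where(np.abs(clean - mu) > 3 * sigma)[0]
print("first alarm at sample:", alarms[0] if alarms.size else None)
```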

Keywords: detection, monitoring, process data, noise

Procedia PDF Downloads 214
459 Detecting Heartbeat Architectural Tactic in Source Code Using Program Analysis

Authors: Ananta Kumar Das, Sujit Kumar Chakrabarti

Abstract:

Architectural tactics such as heartbeat, ping-echo, encapsulate, and encrypt-data are techniques used to achieve the quality attributes of a system. Detecting architectural tactics has several benefits: it can aid system comprehension (e.g., of legacy systems) and the estimation of quality attributes such as safety, security, and maintainability. Architectural tactics are typically spread over the source code and are implicit. For large codebases, manual detection is often not feasible. Therefore, there is a need for automated methods of detecting architectural tactics. This paper presents a formalization of the heartbeat architectural tactic and a program-analytic approach to detecting this tactic in source code. The proposed method is evaluated on a set of Java applications. The outcome of the experiment strongly suggests that the method compares well with a manual approach in terms of its sensitivity and specificity, and far surpasses a manual exercise in terms of its scalability.
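
A toy analogue of the detection idea: the paper formalizes the tactic over Java source, but for brevity the sketch below uses Python's ast module, flagging a function as a heartbeat candidate when a loop body contains both a periodic delay and an outbound "alive" message.

```python
# Toy AST-based heartbeat detector (Python stand-in for the Java analysis).
import ast

SOURCE = '''
def heartbeat(channel):
    while True:
        channel.send("alive")
        time.sleep(5)
'''

def calls_in(node):
    # names of attribute-style calls inside a subtree, e.g. send, sleep
    return {n.func.attr for n in ast.walk(node)
            if isinstance(n, ast.Call) and isinstance(n.func, ast.Attribute)}

def is_heartbeat(func):
    for node in ast.walk(func):
        if isinstance(node, (ast.While, ast.For)):
            names = calls_in(node)
            if "sleep" in names and "send" in names:
                return True    # periodic delay + outbound message in one loop
    return False

tree = ast.parse(SOURCE)
for func in [n for n in tree.body if isinstance(n, ast.FunctionDef)]:
    print(func.name, "->", is_heartbeat(func))
```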

Keywords: software architecture, architectural tactics, detecting architectural tactics, program analysis, AST, alias analysis

Procedia PDF Downloads 120
458 The Effects of Logistics Applications on Logistics Activities of Service Providers: An Assessment of a 3PL Company in Turkey

Authors: Fatmanur Avar, Kubra G. Kostepen, Seda Lafci

Abstract:

In today’s world, technological innovations have brought about an entirely new business understanding. Companies operating in logistics have become more responsive to business trends such as digitalization, innovation, sustainability, flexibility, and productivity. With the arrival of the fourth industrial revolution, known as the Industry 4.0 approach, logistics concepts have been redefined. By adopting automated planning, scheduling, organizing, and controlling systems such as Transportation Management Systems (TMS), Enterprise Resource Planning (ERP), and warehouse control systems, it becomes possible for businesses to stay ahead of the logistics process. The aim of this research is to reveal the effects of Logistics 4.0 applications for a third-party logistics service provider (3PL) located in Turkey. The impacts of Logistics 4.0 on key performance indicators (KPIs) are also examined within the scope of the study. As a methodology, a semi-structured interview was conducted with a global 3PL company, and the data collected from the interviews were analyzed with content analysis. At the end of the analysis, the effects of Logistics 4.0 applications on the logistics activities of the company are presented. Limitations and suggestions are also offered.

Keywords: key performance indicators, KPI, logistics activities, logistics 4.0, 3PL

Procedia PDF Downloads 142
457 Application of Artificial Neural Network in Initiating Cleaning of Photovoltaic Solar Panels

Authors: Mohamed Mokhtar, Mostafa F. Shaaban

Abstract:

Among the challenges facing solar photovoltaic (PV) systems in the United Arab Emirates (UAE), dust accumulation on solar panels is considered the most severe problem hindering the growth of solar power plants. The accumulation of dust on the solar panels significantly degrades their output. Hence, solar PV panels have to be cleaned manually or using costly automated cleaning methods. This paper focuses on initiating cleaning actions only when required, in order to reduce maintenance costs. The cleaning actions are triggered only when the dust level exceeds a threshold value. The amount of dust accumulated on the PV panels is estimated using an artificial neural network (ANN). Experiments were conducted to collect the required data, which were used to train the ANN model. The ANN model is then fed the output power from the solar panels, the ambient temperature, and the solar irradiance, and is thus able to estimate the amount of dust accumulated on the solar panels under these conditions. The model was tested on different case studies to confirm its accuracy.
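
A hedged sketch of the estimation step: a small feed-forward network (scikit-learn's MLPRegressor, standing in for whatever ANN the authors trained) maps output power, ambient temperature, and irradiance to a dust level; the data and cleaning threshold are synthetic.

```python
# Sketch: regress dust level from (power, temperature, irradiance) and trigger
# cleaning above a threshold.
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)
n = 1000
irradiance = rng.uniform(200, 1000, n)   # W/m^2
temperature = rng.uniform(20, 45, n)     # deg C
dust = rng.uniform(0, 1, n)              # normalized dust level (target)
# toy physics: dust and heat both depress panel output
power = 0.18 * irradiance * (1 - 0.6 * dust) * (1 - 0.004 * (temperature - 25))

X = np.column_stack([power, temperature, irradiance])
model = MLPRegressor(hidden_layer_sizes=(16, 16), max_iter=2000,
                     random_state=0).fit(X, dust)

CLEANING_THRESHOLD = 0.5                 # assumed trigger level
estimate = model.predict(X[:1])[0]
print("dust estimate %.2f -> clean: %s" % (estimate, estimate > CLEANING_THRESHOLD))
```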

Keywords: machine learning, dust, PV panels, renewable energy

Procedia PDF Downloads 116
456 Evaluation of Antibiotic Resistance and Extended-Spectrum β-Lactamases Production Rates of Gram Negative Rods in a University Research and Practice Hospital, 2012-2015

Authors: Recep Kesli, Cengiz Demir, Onur Turkyilmaz, Hayriye Tokay

Abstract:

Objective: Gram-negative rods are a large group of bacteria that includes many families, genera, and species. Most clinical isolates belong to the family Enterobacteriaceae. Resistance due to the production of extended-spectrum β-lactamases (ESBLs) is a difficulty in the handling of Enterobacteriaceae infections, but other mechanisms of resistance are also emerging, leading to multidrug resistance and threatening to create panresistant species. In this study, we aimed to evaluate the resistance rates of Gram-negative rods isolated from clinical specimens in the Microbiology Laboratory, Afyon Kocatepe University, ANS Research and Practice Hospital, between October 2012 and September 2015. Methods: The Gram-negative rod strains were identified by conventional methods and the VITEK 2 automated identification system (bioMérieux, Marcy l'Étoile, France). Antibiotic resistance tests were performed by both the Kirby-Bauer disk-diffusion method and the automated Antimicrobial Susceptibility Testing system (AST, bioMérieux, Marcy l'Étoile, France). Disk diffusion results were evaluated according to the standards of the Clinical and Laboratory Standards Institute (CLSI). Results: Of the 1,701 Enterobacteriaceae strains isolated, 1,434 (84.3%) were Klebsiella pneumoniae, 171 (10%) were Enterobacter spp., and 96 (5.6%) were Proteus spp.; of the 639 non-fermenting Gram-negative strains, 477 (74.6%) were identified as Pseudomonas aeruginosa, 135 (21.1%) as Acinetobacter baumannii, and 27 (4.3%) as Stenotrophomonas maltophilia. The ESBL positivity rate of the Enterobacteriaceae group as a whole was 30.4%. Antibiotic resistance rates for Klebsiella pneumoniae were as follows: amikacin 30.4%, gentamicin 40.1%, ampicillin-sulbactam 64.5%, cefepime 56.7%, cefoxitin 35.3%, ceftazidime 66.8%, ciprofloxacin 65.2%, ertapenem 22.8%, imipenem 20.5%, meropenem 20.5%, and trimethoprim-sulfamethoxazole 50.1%. For 114 Enterobacter spp. isolates they were: amikacin 26.3%, gentamicin 31.5%, cefepime 26.3%, ceftazidime 61.4%, ciprofloxacin 8.7%, ertapenem 8.7%, imipenem 12.2%, meropenem 12.2%, and trimethoprim-sulfamethoxazole 19.2%. Resistance rates for Proteus spp. were: meropenem 24.3%, imipenem 26.2%, amikacin 20.2%, cefepime 10.5%, ciprofloxacin and levofloxacin 33.3%, ceftazidime 31.6%, ceftriaxone 20%, gentamicin 15.2%, amoxicillin-clavulanate 26.6%, and trimethoprim-sulfamethoxazole 26.2%. Resistance rates of P. aeruginosa were as follows: amikacin 32%, gentamicin 42%, imipenem 43%, meropenem 43%, ciprofloxacin 50%, levofloxacin 52%, cefepime 38%, ceftazidime 63%, and piperacillin/tazobactam 85%; for Acinetobacter baumannii: amikacin 53.3%, gentamicin 56.6%, imipenem 83%, meropenem 86%, ciprofloxacin 100%, ceftazidime 100%, piperacillin/tazobactam 85%, and colistin 0%; and for S. maltophilia: levofloxacin 66.6% and trimethoprim-sulfamethoxazole 0%. Conclusions: This study showed that resistance in Gram-negative rods is a serious clinical problem in our hospital and suggests the need to perform typing of the isolated bacteria, with susceptibility testing, regularly as part of routine laboratory procedures. Such data correctly guide empirical antibiotic treatment choices, given that each hospital shows a different resistance profile.

Keywords: antibiotic resistance, gram negative rods, ESBL, VITEK 2

Procedia PDF Downloads 302
455 An Indoor Guidance System Combining Near Field Communication and Bluetooth Low Energy Beacon Technologies

Authors: Rung-Shiang Cheng, Wei-Jun Hong, Jheng-Syun Wang, Kawuu W. Lin

Abstract:

Users rely increasingly on location-based services (LBS) and automated navigation/guidance systems nowadays. However, while such services are easily implemented in outdoor environments using Global Positioning System (GPS) technology, a requirement still exists for accurate localization and guidance schemes in indoor settings. Accordingly, the present study presents a methodology based on GPS, Bluetooth Low Energy (BLE) beacons, and Near Field Communication (NFC) technology. By establishing graph information and designing the routing algorithm, this study develops an indoor and outdoor guidance system for smartphones, with the aim of providing users with a smarter everyday life. The presented system is implemented on a smartphone and evaluated in a university campus environment. The experimental results confirm the ability of the presented app to switch automatically from an outdoor mode to an indoor mode and to guide the user to the requested target destination via the shortest possible route.
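
The shortest-route step can be sketched with Dijkstra's algorithm over a small walkway graph; the graph, node names, and distances below are hypothetical.

```python
# Sketch: Dijkstra shortest path over an indoor walkway graph (weights in meters).
import heapq

graph = {
    "entrance": {"lobby": 12},
    "lobby": {"entrance": 12, "lab_A": 30, "stairs": 18},
    "stairs": {"lobby": 18, "lab_A": 25},
    "lab_A": {},
}

def dijkstra(start, goal):
    queue, seen = [(0, start, [start])], set()
    while queue:
        cost, node, path = heapq.heappop(queue)   # cheapest frontier node
        if node == goal:
            return cost, path
        if node in seen:
            continue
        seen.add(node)
        for nxt, w in graph[node].items():
            if nxt not in seen:
                heapq.heappush(queue, (cost + w, nxt, path + [nxt]))
    return float("inf"), []

print(dijkstra("entrance", "lab_A"))   # -> (42, ['entrance', 'lobby', 'lab_A'])
```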

Keywords: beacon, indoor, BLE, Dijkstra algorithm

Procedia PDF Downloads 271
454 SVID: Structured Vulnerability Intelligence for Building Deliberated Vulnerable Environment

Authors: Wenqing Fan, Yixuan Cheng, Wei Huang

Abstract:

The diversity and complexity of modern IT systems make it almost impossible for internal teams to find all vulnerabilities in software before it is officially released. The emergence of threat intelligence and vulnerability reporting policies has greatly reduced the burden on software vendors and organizations in finding vulnerabilities. However, to prove the existence of a reported vulnerability, it is necessary but difficult for a security incident response team to build a deliberated vulnerable environment from a vulnerability report containing limited and incomplete information. This paper presents a structured, standardized, machine-oriented vulnerability intelligence format that can be used to automate the orchestration of a Deliberated Vulnerable Environment (DVE). The paper highlights the important role of software configuration and proof of vulnerable specifications in vulnerability intelligence and proposes a triad model, called DIR (Dependency Configuration, Installation Configuration, Runtime Configuration), to define software configuration. Finally, a prototype system is implemented to demonstrate that the orchestration of a DVE can be automated with this intelligence.
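
A hypothetical rendering of a DIR-style record, showing how the three configuration facets named above could be expressed as a machine-oriented structure an orchestrator consumes; the field names are illustrative, not the SVID specification.

```python
# Illustrative DIR-style vulnerability intelligence record (all values invented).
import json

svid_record = {
    "vulnerability_id": "CVE-XXXX-YYYY",          # placeholder identifier
    "software": {"name": "exampled", "version": "2.4.1"},
    "dir": {
        "dependency_configuration": ["libfoo >= 1.2", "openssl < 3.0"],
        "installation_configuration": {"prefix": "/opt/exampled",
                                       "modules": ["mod_upload"]},
        "runtime_configuration": {"listen_port": 8080,
                                  "flags": ["--enable-debug"]},
    },
    "proof_of_vulnerable_spec": {"request": "POST /upload", "expect": "shell"},
}
print(json.dumps(svid_record, indent=2))
```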

Keywords: DIR triad model, DVE, vulnerability intelligence, vulnerability recurrence

Procedia PDF Downloads 97
453 Correlation of Hyperlipidemia with Platelet Parameters in Blood Donors

Authors: S. Nishat Fatima Rizvi, Tulika Chandra, Abbas Ali Mahdi, Devisha Agarwal

Abstract:

Introduction: Blood components are an unexplored area prone to numerous discoveries that influence patient care. Experiments at different levels will further change the present concept of blood banking. Hyperlipidemia is a condition of elevated plasma levels of low-density lipoprotein (LDL) and decreased plasma levels of high-density lipoprotein (HDL). Studies show that platelets play a vital role in the progression of atherosclerosis and thrombosis, a major cause of death worldwide. Platelets are activated by many triggers, such as elevated LDL in the blood, resulting in aggregation and the formation of plaques. Hyperlipidemic platelets are frequently transfused to patients with various disorders. Screening random donor platelets for hyperlipidemia and correlating the condition with other donor criteria, such as a lipid-rich diet, intake of oral contraceptive pills, weight, alcohol intake, smoking, a sedentary lifestyle, and a family history of heart disease, will inform the exclusion criteria for donor selection. This will help make transfusion safer for patients and make the donor deferral criteria more stringent, improving the quality of the blood supply. Technical evaluation and assessment will enable blood bankers to supply safe blood and improve the guidelines for blood safety. We therefore study the correlation of hyperlipidemic platelets with platelet parameters, weight, and the specific history of the donors. Methodology: This case-control study included 100 blood samples from blood donors; of these, 30 samples were found to be hyperlipidemic and were included as cases, while the rest were taken as controls. Lipid profiles were measured on a fully automated analyzer (Cobas C311, Roche Diagnostics) using the TRIGL (triglycerides), LDL-C (LDL-cholesterol plus, 2nd generation), CHOL2 (cholesterol, Gen 2), and HDLC3 (HDL-cholesterol plus, 3rd generation) assays, and platelet parameters were analyzed on the Sysmex KX-21 automated hematology analyzer. Results: A significant correlation was found with hyperlipidemic levels in single-time donors, among whom 80% had a history of heart disease, 66.66% had a sedentary lifestyle, 83.3% were smokers, 50% were alcohol users, and 63.33% had taken a lipid-rich diet. Active physical activity was found among 40% of donors. We divided the donor samples into two groups based on body weight. In group 1 (hyperlipidemic samples), platelet parameters were 75% normal and 25% abnormal for donors above 70 kg, while for donors of 50-70 kg, 90% were normal and 10% abnormal. In group 2 (non-hyperlipidemic samples), platelet parameters were 95% normal and 5% abnormal for donors above 70 kg, while for donors of 50-70 kg, 66.66% were normal and 33.33% abnormal. Conclusion: The findings indicate that the hyperlipidemic status of donors may affect platelet parameters and can be distinguished from donor history by weight, smoking, alcohol intake, sedentary lifestyle, active physical activity, lipid-rich diet, oral contraceptive pill intake, and family history of heart disease. However, further studies on a larger sample size are needed to affirm this finding.

Keywords: blood donors, hyperlipidemia, platelet, weight

Procedia PDF Downloads 279
452 Automatic Extraction of Arbitrarily Shaped Buildings from VHR Satellite Imagery

Authors: Evans Belly, Imdad Rizvi, M. M. Kadam

Abstract:

Satellite imagery is one of the emerging technologies extensively utilized in various applications such as the detection/extraction of man-made structures, the monitoring of sensitive areas, and the creation of graphic maps. The approach here is the automated detection of buildings from very high resolution (VHR) optical satellite images. Initially, the shadow, building, and non-building regions (roads, vegetation, etc.) are investigated, with the focus on building extraction. Once all landscape regions are collected, a trimming process eliminates regions arising from non-building objects. Finally, the label method is used to extract the building regions; the label method may be altered for more efficient building extraction. The images used for the analysis are those acquired from sensors with a resolution finer than 1 meter (VHR). This method provides an efficient way to produce good results. The additional overhead of intermediate processing is eliminated without compromising output quality, reducing the required processing steps and the time consumed.
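
The label method can be illustrated with connected-component labeling on a binary building mask, combined with a size threshold in the spirit of the trimming step; the mask and threshold below are synthetic.

```python
# Sketch: label connected components of a building mask and trim small specks.
import numpy as np
from scipy import ndimage

mask = np.zeros((40, 40), dtype=bool)
mask[5:15, 5:18] = True      # a building-sized blob
mask[30:32, 30:31] = True    # small non-building speck to be trimmed

labels, n = ndimage.label(mask)             # default 4-connectivity labeling
sizes = ndimage.sum(mask, labels, range(1, n + 1))
MIN_PIXELS = 20                             # assumed minimum building footprint
buildings = [i + 1 for i, s in enumerate(sizes) if s >= MIN_PIXELS]
print("components:", n, "kept as buildings:", buildings)
```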

Keywords: building detection, shadow detection, landscape generation, label, partitioning, very high resolution (VHR) satellite imagery

Procedia PDF Downloads 290
451 Unreliable Production Lines with Simultaneously Unbalanced Operation Time Means, Breakdown, and Repair Rates

Authors: Sabry Shaaban, Tom McNamara, Sarah Hudson

Abstract:

This paper investigates the benefits of deliberately unbalancing both operation time means (MTs) and unreliability (failure and repair rates) for non-automated production lines. The lines were simulated with various line lengths, buffer capacities, degrees of imbalance, and patterns of MT and unreliability imbalance. Data on two performance measures, namely throughput (TR) and average buffer level (ABL), were gathered, analyzed, and compared to those of a balanced line counterpart. A number of conclusions were drawn with respect to the ranking of configurations, as well as to the relationships among the independent design parameters and the dependent variables. It was found that the best configurations are a balanced line arrangement and a monotone decreasing MT order, coupled with either a decreasing or a bowl-shaped unreliability configuration, with the first generally resulting in a reduced TR and the second leading to a lower ABL than those of a balanced line.

Keywords: unreliable production lines, unequal mean operation times, unbalanced failure and repair rates, throughput, average buffer level

Procedia PDF Downloads 462
450 Rapid, Automated Characterization of Microplastics Using Laser Direct Infrared Imaging and Spectroscopy

Authors: Andreas Kerstan, Darren Robey, Wesam Alvan, David Troiani

Abstract:

Over the last 3.5 years, quantum cascade laser (QCL) technology has become increasingly important in infrared (IR) microscopy. The advantages over Fourier transform infrared (FTIR) spectroscopy are that large areas of a few square centimeters can be measured in minutes and that the light-intensive QCL makes it possible to obtain spectra with excellent S/N, even with just one scan. A firmly established application of the laser direct infrared (LDIR) imaging 8700 system is the analysis of microplastics. The presence of microplastics in the environment, drinking water, and food chains is gaining significant public interest. To study their presence, rapid and reliable characterization of microplastic particles is essential. Significant technical hurdles in microplastic analysis stem from the sheer number of particles to be analyzed in each sample. Total particle counts of several thousand are common in environmental samples, while well-treated bottled drinking water may contain relatively few. While visual microscopy has been used extensively, it is prone to operator error and bias and is limited to particles larger than 300 µm. As a result, vibrational spectroscopic techniques such as Raman and FTIR microscopy have become more popular; however, they are time-consuming. There is a demand for rapid and highly automated techniques that measure particle count and size and provide high-quality polymer identification. Analysis directly on the filter that often forms the last stage in sample preparation is highly desirable because, by removing a sample preparation step, it can both improve laboratory efficiency and decrease opportunities for error. Recent advances in infrared micro-spectroscopy combining a QCL with scanning optics have created a new paradigm, LDIR. It offers improved speed of analysis as well as high levels of automation. Its mode of operation, however, requires an IR-reflective background, and this has, to date, limited the ability to perform direct 'on-filter' analysis. This study explores the potential of combining the filter membrane with an IR-reflective surface. By combining an IR-reflective material or coating on a filter membrane with advanced image analysis and detection algorithms, it is demonstrated that such filters can indeed be used in this way. Vibrational spectroscopic techniques play a vital role in the investigation and understanding of microplastics in the environment and the food chain. While vibrational spectroscopy is widely deployed, improvements and novel innovations in these techniques that increase the speed of analysis and ease of use can provide pathways to higher testing rates and, hence, an improved understanding of the impacts of microplastics on the environment. Due to its capability to measure large areas in minutes, its speed, its degree of automation, and its excellent S/N, the LDIR could also be applied to various other samples, such as food adulteration, coatings, laminates, fabrics, textiles, and tissues. This presentation will highlight a few of these and focus on the benefits of the LDIR versus classical techniques.

Keywords: QCL, automation, microplastics, tissues, infrared, speed

Procedia PDF Downloads 33
449 Unauthorized License Verifier and Secure Access to Vehicle

Authors: G. Prakash, L. Mohamed Aasiq, N. Dhivya, M. Jothi Mani, R. Mounika, B. Gomathi

Abstract:

In day-to-day life, many people meet with accidents due to various reasons such as overspeeding, overloading of the vehicle, violation of traffic rules, etc. The driving license system is difficult for the government to monitor. To prevent non-licensees, who cause most of the accidents, from driving, a new system is proposed. The proposed system consists of a smart card capable of storing the license details of a particular person. Vehicles such as cars, bikes, etc., should have a card reader capable of reading the particular license. A person who wishes to drive the vehicle should insert the card (license) into the vehicle and then enter the password on the keypad. If the license data stored in the card matches the database of all license holders in the microcontroller, the driver can proceed to ignition after the automated opening of the fuel tank valve; otherwise, the user is restricted from using the vehicle. Moreover, an overload detector in the proposed system checks the load and prompts the user to avoid overloading before driving. This increases the security of vehicles and also ensures safe driving by preventing accidents.

Keywords: license, verifier, EEPROM, secure, overload detection

Procedia PDF Downloads 217
448 The Grammatical Dictionary Compiler: A System for Kartvelian Languages

Authors: Liana Lortkipanidze, Nino Amirezashvili, Nino Javashvili

Abstract:

The purpose of a grammatical dictionary is to provide information on the morphological and syntactic characteristics of the basic word in a dictionary entry. Electronic grammatical dictionaries are used as a tool for automated morphological analysis in text processing. The Georgian Grammatical Dictionary should contain grammatical information for each word: part of speech, type of declension/conjugation, grammatical forms of the word (paradigm), and alternative variants of the basic word/lemma. In this paper, we present a system for compiling the Georgian Grammatical Dictionary automatically. We propose dictionary-based methods for extending grammatical lexicons. The input lexicon contains only a small number of words with identical grammatical features. The extension is based on similarity measures between the features of words; more precisely, we add to the extended lexicon those words that are similar to words already in the grammatical dictionary. Our dictionaries are corpus-based, and for their compilation we introduce a method for the lemmatization of unknown words, i.e., words for which neither the full form nor the lemma is in the grammatical dictionary.
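
A toy sketch of the similarity-based extension, with English stand-ins for Georgian data: an unknown word inherits the paradigm of the dictionary word sharing the longest suffix, one common dictionary-based lemmatization heuristic (not necessarily the authors' exact similarity measure).

```python
# Sketch: guess the lemma of an unknown word by suffix analogy with a known entry.
grammatical_dictionary = {
    "walked": ("walk", "verb, past"),
    "dogs": ("dog", "noun, plural"),
}

def shared_suffix_len(a, b):
    n = 0
    while n < min(len(a), len(b)) and a[-1 - n] == b[-1 - n]:
        n += 1
    return n

def guess_lemma(unknown):
    best = max(grammatical_dictionary, key=lambda w: shared_suffix_len(w, unknown))
    cut = shared_suffix_len(best, unknown)          # length of the shared ending
    if cut == 0:
        return unknown, "unknown"
    lemma, features = grammatical_dictionary[best]
    # apply the analogy: swap the shared ending for the lemma's corresponding tail
    return unknown[:len(unknown) - cut] + lemma[len(best) - cut:], features

print(guess_lemma("jumped"))   # -> ('jump', 'verb, past'), by analogy with 'walked'
```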

Keywords: acquisition of lexicon, Georgian grammatical dictionary, lemmatization rules, morphological processor

Procedia PDF Downloads 116
447 Transparency of Algorithmic Decision-Making: Limits Posed by Intellectual Property Rights

Authors: Olga Kokoulina

Abstract:

Today, algorithms are assuming a leading role in various areas of decision-making. Prompted by a promise of increased economic efficiency and of fueling solutions to pressing societal challenges, algorithmic decision-making is often celebrated as an impartial and constructive substitute for human adjudication. But in the face of this implied objectivity and efficiency, the application of algorithms is also marred by mounting concerns about embedded biases, discrimination, and exclusion. In Europe, vigorous debates on the risks and adverse implications of algorithmic decision-making largely revolve around the potential of data protection laws to tackle some of the related issues. For example, one of the often-cited avenues for mitigating the impact of potentially unfair decision-making practices is the so-called 'right to explanation'. In essence, this right is derived from the provisions of the General Data Protection Regulation ('GDPR') ensuring the right of data subjects to access, and mandating the obligation of data controllers to provide, the relevant information about the existence of automated decision-making and meaningful information about the logic involved. Taking the corresponding rights and obligations in the context of the GDPR's specific provision on automated decision-making, the debates mainly focus on the efficacy and exact scope of the 'right to explanation'. In essence, the underlying logic of the argued remedy lies in a transparency imperative. Allowing data subjects to acquire as much knowledge as possible about the decision-making process means empowering individuals to take control of their data and take action. In other words, forewarned is forearmed. The related discussions and debates are ongoing, comprehensive, and often heated. However, they are also frequently misguided and isolated: embracing data protection law as the ultimate and sole lens is often not sufficient. Mandating the disclosure of the technical specifications of employed algorithms in the name of transparency for, and empowerment of, data subjects potentially encroaches on the interests and rights of the IPR holders, i.e., the business entities behind the algorithms. The study aims to push the boundaries of the transparency debate beyond the data protection regime. By systematically analysing legal requirements and current judicial practice, it assesses the limits posed on the transparency requirement and right of access by intellectual property law, namely by copyright and trade secrets. It is asserted that trade secrets, in particular, present an often insurmountable obstacle to realising the potential of the transparency requirement. In reaching that conclusion, the study explores the limits of the protection afforded by the European Trade Secrets Directive and contrasts them with the scope of the respective rights and obligations related to data access and portability enshrined in the GDPR. As shown, the far-reaching scope of the protection under trade secrecy is evidenced both through the assessment of its subject matter and through the exceptions from such protection. As a way forward, the study scrutinises several possible legislative solutions, such as a flexible interpretation of the public interest exception in trade secrets law as well as the introduction of a strict liability regime in the case of non-transparent decision-making.

Keywords: algorithms, public interest, trade secrets, transparency

Procedia PDF Downloads 101
446 Automation Test Method and HILS Environment Configuration for Hydrogen Storage System Management Unit Verification

Authors: Jaejeogn Kim, Jeongmin Hong, Jungin Lee

Abstract:

The Hydrogen Storage System Management Unit (HMU) is a controller that manages hydrogen charging and storage. It detects hydrogen leaks as well as tank pressure and temperature, calculates the charging concentration and remaining amount, and controls the opening and closing of the hydrogen tank valve. Since this role is an important part of the vehicle behavior and stability of Fuel Cell Electric Vehicles (FCEVs), verifying the HMU controller is essential. To perform verification under various conditions, it is necessary to increase time efficiency through an automated verification environment and to increase the reliability of the controller by applying numerous test cases. To this end, we introduce an automated verification method for the HMU controller that applies a hardware-in-the-loop simulation (HILS) environment and an automated test program based on the ASAM XIL standard.

Keywords: HILS, ASAM, fuel cell electric vehicle, automation test, hydrogen storage system

Procedia PDF Downloads 17
445 Modeling of Building a Conceptual Scheme for Multimodal Freight Transportation Information System

Authors: Gia Surguladze, Nino Topuria, Lily Petriashvili, Giorgi Surguladze

Abstract:

The modeling of the processes of building a multimodal freight transportation support information system is discussed on the basis of modern CASE technologies. The functional efficiencies of ports in the eastern part of the Black Sea are analyzed, taking into account their ecological, seasonal, and resource-usage parameters. By resources, we mean the capacities of berths, cranes, and automotive transport, as well as work crews and neighbouring airports. For designing the database of the computer support system for the managerial (logistics) function, the use of an Object-Role Modeling (ORM) tool (NORMA, Natural ORM Architecture) is proposed, after which an Entity-Relationship Model (ERM) is generated in an automated process. The software is developed on the basis of process-oriented and service-oriented architectures, in the Visual Studio .NET environment.

Keywords: seaport resources, business-processes, multimodal transportation, CASE technology, object-role model, entity relationship model, SOA

Procedia PDF Downloads 399
444 Characterization of Surface Suction Grippers for Continuous-Discontinuous Fiber Reinforced Semi-Finished Parts of an Automated Handling and Preforming Operation

Authors: Jürgen Fleischer, Woramon Pangboonyanon, Dominic Lesage

Abstract:

Non-metallic lightweight materials such as fiber reinforced plastics (FRP) are becoming very significant at present. Prepregs, e.g., sheet moulding compound (SMC) and unidirectional tape (UD-tape), are among the raw materials used to produce FRP. This study concerns the manufacturing steps of handling and preforming of UD-SMC and focuses on the investigation of gripper characteristics regarding gripping forces in the normal and lateral directions, in order to identify suitable operating pressures for a secure gripping operation. Reliable handling and preforming operations result in higher added value in the overall process chain. As a result, the suitable operating pressures, depending on the travelling direction, could be determined for each material type. Moreover, system boundary conditions regarding the allowable pulling forces in the normal and lateral directions during preforming could be measured.

Keywords: continuous-discontinuous fiber reinforced plastics, UD-SMC-prepreg, handling, preforming, prepregs, sheet moulding compounds, surface suction gripper

Procedia PDF Downloads 195