Search results for: Object detection
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 4492

1702 Multi-Sensor Target Tracking Using Ensemble Learning

Authors: Bhekisipho Twala, Mantepu Masetshaba, Ramapulana Nkoana

Abstract:

Multiple classifier systems combine several individual classifiers to deliver a final classification decision. However, an increasingly controversial question is whether such systems can outperform the single best classifier and, if so, what form of multiple classifier system yields the greatest benefit. Multi-target tracking using multiple sensors is also an important research field in mobile technology and military applications. In this paper, several multiple classifier systems are evaluated in terms of their ability to predict a system's failure or success for multi-sensor target tracking tasks, using the Bristol Eden project dataset. Experimental and simulation results show that the human activity identification system can fulfil the requirements of target tracking: multiple classifier systems improve sensor classification performance, with ensembles constructed using boosting achieving the highest accuracy rates.
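
As an illustration of the comparison the abstract describes, the sketch below contrasts a single classifier with a boosted ensemble; the synthetic data, features, and settings are stand-ins for the Bristol Eden sensor dataset, not the study's actual setup.

```python
# Hedged sketch: single best classifier vs. a boosting ensemble.
# Synthetic data stands in for the Bristol Eden project dataset.
from sklearn.datasets import make_classification
from sklearn.ensemble import AdaBoostClassifier
from sklearn.model_selection import cross_val_score
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=1000, n_features=20, random_state=0)

candidates = {
    "single tree": DecisionTreeClassifier(max_depth=3, random_state=0),
    "boosted ensemble": AdaBoostClassifier(n_estimators=50, random_state=0),
}
for name, clf in candidates.items():
    acc = cross_val_score(clf, X, y, cv=5, scoring="accuracy").mean()
    print(f"{name}: mean CV accuracy = {acc:.3f}")
```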

Keywords: single classifier, ensemble learning, multi-target tracking, multiple classifiers

Procedia PDF Downloads 271
1701 Botnet Detection with ML Techniques Using the BoT-IoT Dataset

Authors: Adnan Baig, Ishteeaq Naeem, Saad Mansoor

Abstract:

The Internet of Things (IoT) has advanced quickly in recent years, and the use of IoT devices is rising steadily. However, cyber-attackers can target these devices because of their distributed nature, and many IoT devices have significant security flaws in their design and implementation that leave them vulnerable to security threats. A single attack on network devices or systems can therefore cause serious losses of data security and privacy. Botnets are a significant security risk that can harm the IoT network, so sophisticated techniques are required to mitigate them. This work uses a machine learning-based method to identify IoT attacks orchestrated by botnets; the proposed technique identifies such attacks by distinguishing between legitimate and malicious traffic. The article proposes a hyperparameter tuning model to refine the method and improve on the accuracy of existing approaches. The results demonstrate an improved and more accurate indication of botnet-based cyber-attacks.
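
A minimal sketch of the hyperparameter-tuning step is given below; the feature matrix is synthetic, standing in for preprocessed BoT-IoT traffic features, and the model and parameter grid are illustrative assumptions, not the paper's configuration.

```python
# Hedged sketch of hyperparameter tuning for botnet-traffic classification.
# X, y are placeholders for BoT-IoT features and legitimate/malicious labels.
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import GridSearchCV, train_test_split

X, y = make_classification(n_samples=2000, n_features=15, random_state=1)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=1)

grid = GridSearchCV(
    RandomForestClassifier(random_state=1),
    param_grid={"n_estimators": [100, 300], "max_depth": [None, 10, 20]},
    cv=3,
    scoring="accuracy",
)
grid.fit(X_tr, y_tr)
print("best parameters:", grid.best_params_)
print("held-out accuracy:", round(grid.score(X_te, y_te), 3))
```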

Keywords: Internet of Things, Botnet, BoT-IoT dataset, ML techniques

Procedia PDF Downloads 15
1700 Automatic Diagnosis of Electrical Equipment Using Infrared Thermography

Authors: Y. Laib Dit Leksir, S. Bouhouche

Abstract:

Analyzing and processing the databases that result from infrared thermal measurements of electrical installations requires the development of new tools to obtain correct information beyond what visual inspection provides. Methods based on the capture of infrared digital images therefore show great potential and are used increasingly in various fields, yet there is still an enormous need for effective techniques to analyse these databases and extract relevant information about the state of the equipment. Our goal is to introduce recent modeling techniques based on image and signal processing in order to develop mathematical models in this field. The aim of this work is to capture the anomalies present in electrical equipment during an inspection of several machines using a FLIR A40 camera. We then apply binarisation techniques to select the region of interest in the thermal images obtained and compare the methods to choose the best one.
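
As a sketch of the binarisation comparison, the snippet below applies two common thresholding methods to a grayscale thermogram; the file name and parameter values are hypothetical, and the paper's own choice of binarisation methods may differ.

```python
# Hedged sketch: two binarisation methods for selecting hot regions of interest.
# "thermogram.png" is a hypothetical grayscale export from the FLIR A40.
import cv2

img = cv2.imread("thermogram.png", cv2.IMREAD_GRAYSCALE)

# Global Otsu threshold: one level separating the two intensity classes.
_, roi_otsu = cv2.threshold(img, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)

# Adaptive mean threshold: local neighbourhoods, robust to uneven backgrounds.
roi_adaptive = cv2.adaptiveThreshold(
    img, 255, cv2.ADAPTIVE_THRESH_MEAN_C, cv2.THRESH_BINARY, 31, -5)

cv2.imwrite("roi_otsu.png", roi_otsu)
cv2.imwrite("roi_adaptive.png", roi_adaptive)
```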

Keywords: infrared thermography, defect detection, troubleshooting, electrical equipment

Procedia PDF Downloads 477
1699 Metrology in Egyptian Architecture and Its Interrelation with Archaeology

Authors: Monica M. Marcos

Abstract:

Within the framework of archaeological research and heritage conservation and restoration, the object of study is the metrology applied in the composition of religious architecture in ancient Egypt and its usefulness in archaeology. The objective is to determine the geometric and metrological relations in architectural models and the module used in the initial design of the buildings. The study and data collection of religious buildings, tombs and temples of ancient Egypt is completed with plans. Systematizing the measurements and modulating the buildings makes it possible to establish common compositional parameters, with a module determined by the unit of measurement used. The measurement system corresponding to the main period of Egyptian history was the Egyptian royal cubit. The analysis of the units of measurement used in architectural design yields exact figures for the dimensions of buildable spaces, allows proportional relationships between them to be established, and reveals a geometric composition module on which the original design was based; this responds to a philosophical and functional concept of the projected spaces. In the field of heritage rehabilitation and restoration, knowledge of metrology helps in the excavation, reconstruction and restoration of construction elements. The correct use of metrology contributes to the identification of possible work areas, helping to locate damaged or missing zones, and in restoration projects it is useful for reordering and relocating decontextualized parts of buildings. Converting measurements taken in the current International System into ancient Egyptian units makes it possible to understand a building's conceptual purpose and functionality, which facilitates archaeological intervention. In work carried out on archaeological excavations, metrology is an essential tool for locating sites and establishing work zones.
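
A small worked example of the unit conversion the abstract mentions: the royal cubit is taken here as 0.5236 m, a commonly cited value (surviving cubit rods vary by a few millimetres), and the wall lengths are hypothetical.

```python
# Hedged sketch: convert SI measurements to royal cubits and look for
# near-integer multiples, the signature of a modular design.
ROYAL_CUBIT_M = 0.5236  # assumed value; attested cubit rods vary slightly

def to_cubits(metres: float) -> float:
    return metres / ROYAL_CUBIT_M

for length_m in (10.47, 26.18, 52.36):  # hypothetical room dimensions
    cubits = to_cubits(length_m)
    print(f"{length_m:6.2f} m = {cubits:6.2f} cubits (nearest: {round(cubits)})")
```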

Keywords: Egyptology, metrology, archaeology, measurements, Egyptian cubit

Procedia PDF Downloads 26
1698 Biosensors for Parathion Based on Au-Pd Nanoparticles Modified Electrodes

Authors: Tian-Fang Kang, Chao-Nan Ge, Rui Li

Abstract:

An electrochemical biosensor for the determination of organophosphorus pesticides was developed based on the electrochemical co-deposition of Au and Pd nanoparticles on a glassy carbon electrode (GCE). Energy dispersive spectroscopy (EDS) was used to characterize the surface structure. Scanning electron microscopy (SEM) demonstrates that the films are uniform and the nanoclusters are homogeneously distributed on the GCE surface. Acetylcholinesterase (AChE) was immobilized on the Au-Pd nanoparticle modified electrode (Au-Pd/GCE) by cross-linking with glutaraldehyde. The electrochemical behavior of thiocholine at the biosensor (AChE/Au-Pd/GCE) was studied, and the biosensor exhibited a substantial electrocatalytic effect on the oxidation of thiocholine. The peak current of linear scan voltammetry (LSV) of thiocholine at the biosensor is proportional to the concentration of acetylthiocholine chloride (ATCl) over the range of 2.5 × 10⁻⁶ to 2.5 × 10⁻⁴ M in 0.1 M phosphate buffer solution (pH 7.0). The percent inhibition of acetylcholinesterase was proportional to the logarithm of parathion concentration in the range of 4.0 × 10⁻⁹ to 1.0 × 10⁻⁶ M, with a detection limit of 2.6 × 10⁻⁹ M. The proposed method exhibited high sensitivity and good reproducibility.
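
The inhibition-versus-log-concentration relationship lends itself to a short calibration sketch; the data points below are hypothetical values within the reported range, not the paper's measurements.

```python
# Hedged sketch: fit percent inhibition against log10(parathion concentration)
# and invert the fit to estimate an unknown sample's concentration.
import numpy as np

conc_M = np.array([4.0e-9, 1.0e-8, 1.0e-7, 1.0e-6])   # within reported range
inhibition = np.array([15.0, 28.0, 52.0, 78.0])        # %, hypothetical

slope, intercept = np.polyfit(np.log10(conc_M), inhibition, 1)
print(f"inhibition = {slope:.2f} * log10(C) + {intercept:.2f}")

measured = 40.0                                        # % inhibition of a sample
est_conc = 10 ** ((measured - intercept) / slope)
print(f"estimated concentration: {est_conc:.2e} M")
```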

Keywords: acetylcholinesterase, Au-Pd nanoparticles, electrochemical biosensors, parathion

Procedia PDF Downloads 407
1697 Use of Predictive Food Microbiology to Determine the Shelf-Life of Foods

Authors: Fatih Tarlak

Abstract:

Predictive microbiology is an important field of food microbiology in which predictive models are used to describe microbial growth in different food products. Predictive models estimate the growth of microorganisms quickly, efficiently, and cost-effectively compared with traditional enumeration methods, which are laborious, expensive, and time-consuming. The mathematical models used in predictive microbiology are mainly categorised as primary and secondary models. Primary models are the equations that describe the growth data as a function of time under constant environmental conditions. Secondary models describe the effects of environmental factors, such as temperature, pH, and water activity (aw), on the parameters of the primary models, including the maximum specific growth rate and the lag phase duration, which are the most critical growth kinetic parameters. The combination of primary and secondary models provides valuable information for setting limits for the quantitative detection of microbial spoilage and for assessing product shelf-life.
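
As a concrete illustration of the primary/secondary pairing, the sketch below combines a modified Gompertz primary model with a Ratkowsky-type square-root secondary model; all parameter values are illustrative, not fitted to any dataset.

```python
# Hedged sketch of a primary/secondary model pair in predictive microbiology.
import numpy as np

def gompertz(t, n0, nmax, mu_max, lag):
    """Modified Gompertz primary model: log10 counts vs. time (h)."""
    a = nmax - n0
    return n0 + a * np.exp(-np.exp(mu_max * np.e / a * (lag - t) + 1))

def ratkowsky_mu(temp_c, b=0.03, t_min=-2.0):
    """Square-root secondary model: growth rate vs. storage temperature."""
    return (b * (temp_c - t_min)) ** 2

mu = ratkowsky_mu(8.0)                        # storage at 8 degrees C
t = np.linspace(0, 96, 9)                     # hours
log_counts = gompertz(t, n0=2.0, nmax=8.0, mu_max=mu, lag=12.0)
print(np.round(log_counts, 2))                # counts climb toward nmax
```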

Keywords: shelf-life, growth model, predictive microbiology, simulation

Procedia PDF Downloads 214
1696 Face Tracking and Recognition Using a Deep Learning Approach

Authors: Degale Desta, Cheng Jian

Abstract:

The most important factor in identifying a person is their face; even identical twins have their own distinct faces. As a result, identification and face recognition are needed to tell one person from another. A face recognition system is a verification tool used to establish a person's identity using biometrics. Nowadays, face recognition is a common technique used in a variety of applications, including home security systems, criminal identification, and phone unlocking. Such a system is more secure because it only requires a facial image instead of other dependencies, like a key or card. Face detection and face identification are the two phases that typically make up a human recognition system. This paper explains the idea behind designing and creating a face recognition system using deep learning with Azure ML and Python's OpenCV. Face recognition is a task that can be accomplished using deep learning, and given the accuracy of this method, it appears to be a suitable approach. To show how accurate the suggested face recognition system is, experimental results are given: the system reached 98.46% accuracy using Fast-RCNN, and the performance of the algorithms under different training conditions is reported.
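
A minimal sketch of the face-detection phase only is shown below. The paper uses Fast-RCNN; as a self-contained stand-in, this uses OpenCV's bundled Haar cascade detector, and the input file name is hypothetical.

```python
# Hedged sketch of the detection phase with OpenCV's Haar cascade
# (a stand-in for the Fast-RCNN detector used in the paper).
import cv2

cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

frame = cv2.imread("person.jpg")  # hypothetical input image
gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
faces = cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)

for (x, y, w, h) in faces:
    cv2.rectangle(frame, (x, y), (x + w, y + h), (0, 255, 0), 2)
cv2.imwrite("detected.jpg", frame)
```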

Keywords: deep learning, face recognition, identification, fast-RCNN

Procedia PDF Downloads 140
1695 Hardware Error Analysis and Severity Characterization in Linux-Based Server Systems

Authors: Nikolaos Georgoulopoulos, Alkis Hatzopoulos, Konstantinos Karamitsios, Konstantinos Kotrotsios, Alexandros I. Metsai

Abstract:

In modern server systems, business-critical applications run on different types of infrastructure, such as cloud systems, physical machines, and virtualization. Often, due to high load and over time, various hardware faults occur in servers that translate into errors, resulting in malfunction or even server breakdown. CPU, RAM, and hard drives (HDD) are the hardware parts that concern server administrators most regarding errors. In this work, selected RAM, HDD, and CPU errors, which have been observed or can be simulated in kernel ring buffer log files from two groups of Linux servers, are investigated, and a severity characterization is given for each error type. A better understanding of such errors can lead to more efficient analysis of the kernel logs that are usually exploited for fault diagnosis and prediction. In addition, this work summarizes ways of simulating hardware errors in RAM and HDD in order to test the error detection and correction mechanisms of a Linux server.
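
A hedged sketch of the log-scanning idea follows; the regular expressions are common example signatures (EDAC for RAM, ata/I-O errors for disks, MCE for CPU), not the paper's actual pattern set.

```python
# Hedged sketch: scan the kernel ring buffer for hardware-error signatures.
# Reading dmesg may require elevated privileges on some systems.
import re
import subprocess

PATTERNS = {
    "RAM": re.compile(r"EDAC .*(CE|UE)", re.I),
    "HDD": re.compile(r"(ata\d+.*error|I/O error)", re.I),
    "CPU": re.compile(r"(Machine Check|mce:)", re.I),
}

log = subprocess.run(["dmesg"], capture_output=True, text=True).stdout
for line in log.splitlines():
    for component, pattern in PATTERNS.items():
        if pattern.search(line):
            print(f"[{component}] {line}")
```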

Keywords: hardware errors, kernel logs, Linux servers, RAM, hard disk, CPU

Procedia PDF Downloads 158
1694 Data Quality Enhancement with String Length Distribution

Authors: Qi Xiu, Hiromu Hota, Yohsuke Ishii, Takuya Oda

Abstract:

Recently, the volume of collectable manufacturing data has been increasing rapidly. At the same time, large-scale product recalls are becoming a serious social problem. Under such circumstances, there is a growing need to prevent such recalls through defect analysis, such as root cause analysis and anomaly detection, that makes use of manufacturing data. However, classifying the strings in manufacturing data with traditional methods takes too long to meet the requirements of quick defect analysis. We therefore present the String Length Distribution Classification (SLDC) method to classify strings correctly in a short time. The method learns character features, especially the string length distribution, from the Product IDs and Machine IDs in the BOM and the asset list. By applying the proposal to strings in actual manufacturing data, we verified that the classification time can be reduced by 80%. As a result, it can be expected that the requirement of quick defect analysis can be fulfilled.
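
The core idea can be sketched in a few lines: characterise each string class by its length distribution learned from reference lists, then assign new strings to the most probable class. The reference IDs and class names below are hypothetical, and the real SLDC method learns further character features beyond length.

```python
# Hedged sketch of length-distribution-based string classification.
from collections import Counter

def length_model(samples):
    counts = Counter(len(s) for s in samples)
    total = sum(counts.values())
    return {length: n / total for length, n in counts.items()}

# Hypothetical reference lists drawn from a BOM / asset list.
models = {
    "product_id": length_model(["P-10023", "P-10384", "P-20411"]),
    "machine_id": length_model(["M01", "M02", "M17"]),
}

def classify(s, eps=1e-6):
    # Pick the class whose length distribution makes len(s) most probable.
    return max(models, key=lambda cls: models[cls].get(len(s), eps))

print(classify("P-55555"))  # -> product_id
print(classify("M99"))      # -> machine_id
```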

Keywords: string classification, data quality, feature selection, probability distribution, string length

Procedia PDF Downloads 319
1693 Rapid Method for the Determination of Acid Dyes by Capillary Electrophoresis

Authors: Can Hu, Huixia Shi, Hongcheng Mei, Jun Zhu, Hongling Guo

Abstract:

Textile fibers are important trace evidence and are frequently encountered in criminal investigations. A significant aspect of fiber evidence examination is the determination of fiber dyes. Although several instrumental methods have been developed for dye detection, their analysis speed is still not fast enough; a rapid dye analysis method is needed to further improve the efficiency of case handling. Capillary electrophoresis has the advantages of high separation speed and high separation efficiency, making it an ideal method for the rapid analysis of fiber dyes. In this paper, acid dyes used for dyeing protein fibers were determined by a short-end injection capillary electrophoresis technique developed for the purpose. Five acid red dyes with similar structures were successfully baseline-separated within 5 min. The separation reproducibility is good, as the relative standard deviation of the retention time is 0.51%. The established method is rapid and accurate and has great potential for application in forensic settings.

Keywords: acid dyes, capillary electrophoresis, fiber evidence, rapid determination

Procedia PDF Downloads 147
1692 A Network-Theoretical Perspective on Music Analysis

Authors: Alberto Alcalá-Alvarez, Pablo Padilla-Longoria

Abstract:

The present paper describes a framework for constructing mathematical networks that encode relevant musical information from a music score for structural analysis. These graphs capture statistical information about music elements such as notes, chords, rhythms, and intervals, and the relations among them, and so become helpful for visualizing and understanding important stylistic features of a music fragment. To build such networks, musical data is parsed out of a digital symbolic music file. This data undergoes different analytical procedures from graph theory, such as measuring the centrality of nodes, community detection, and entropy calculation. The resulting networks reflect important structural characteristics of the fragment in question: predominant elements, the connectivity between them, and the complexity of the information contained in it. Music pieces in different styles are analyzed, and the results are contrasted with the outcomes of traditional analysis in order to show the consistency and potential utility of this method for music analysis.
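
A compact sketch of the pipeline follows: build a graph whose nodes are pitches and whose edges link consecutive notes, then compute centrality, communities, and entropy. The note sequence is a hypothetical stand-in for a parsed symbolic score.

```python
# Hedged sketch: note-transition network with centrality, community
# detection, and entropy, on a toy note sequence.
import math
import networkx as nx
from networkx.algorithms.community import greedy_modularity_communities

notes = ["C4", "E4", "G4", "E4", "C4", "G4", "A4", "G4", "E4"]

G = nx.DiGraph()
for a, b in zip(notes, notes[1:]):
    w = G.edges[a, b]["weight"] + 1 if G.has_edge(a, b) else 1
    G.add_edge(a, b, weight=w)

print("degree centrality:", nx.degree_centrality(G))
communities = greedy_modularity_communities(G.to_undirected())
print("communities:", [sorted(c) for c in communities])

# Shannon entropy of the note distribution, a rough complexity measure.
freqs = [notes.count(n) / len(notes) for n in set(notes)]
print("entropy:", -sum(p * math.log2(p) for p in freqs))
```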

Keywords: computational musicology, mathematical music modelling, music analysis, style classification

Procedia PDF Downloads 104
1691 A Passive Digital Video Authentication Technique Using Wavelet Based Optical Flow Variation Thresholding

Authors: R. S. Remya, U. S. Sethulekshmi

Abstract:

Detecting the authenticity of a video is an important issue in digital forensics, as video is used as silent evidence in court in cases such as child pornography, movie piracy, insurance claims, scientific fraud, and traffic monitoring. The biggest threat to video data is the availability of modern open video editing tools, which enable easy editing of videos without leaving any trace of tampering. In this paper, we propose an efficient passive method for detecting inter-frame video tampering, together with its type and location, by estimating the optical flow of wavelet features of adjacent frames and thresholding the variation in the estimated feature. The performance of the algorithm is compared with z-score thresholding, and an efficiency above 95% is achieved on all the tested databases. The proposed method works well for videos with dynamic (forensic) as well as static (surveillance) backgrounds.
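
A minimal sketch of the thresholding idea: compute dense optical flow between consecutive frames, track a summary statistic, and flag frames where its variation jumps. The wavelet feature extraction step is omitted for brevity, the input file is hypothetical, and the 3-sigma threshold is an illustrative stand-in for the paper's rule.

```python
# Hedged sketch: flag candidate tampering points from optical-flow variation.
import cv2
import numpy as np

cap = cv2.VideoCapture("clip.mp4")            # hypothetical input video
ok, prev = cap.read()
prev_gray = cv2.cvtColor(prev, cv2.COLOR_BGR2GRAY)

magnitudes = []
while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    flow = cv2.calcOpticalFlowFarneback(prev_gray, gray, None,
                                        0.5, 3, 15, 3, 5, 1.2, 0)
    magnitudes.append(np.linalg.norm(flow, axis=2).mean())
    prev_gray = gray

# Threshold the frame-to-frame variation of mean flow magnitude.
var = np.abs(np.diff(magnitudes))
threshold = var.mean() + 3 * var.std()
print("suspect frame pairs:", np.where(var > threshold)[0])
```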

Keywords: discrete wavelet transform, optical flow, optical flow variation, video tampering

Procedia PDF Downloads 360
1690 A Network Approach to Analyzing Financial Markets

Authors: Yusuf Seedat

Abstract:

The necessity of understanding global financial markets has increased following the unfortunate spread of the recent financial crisis around the world. Financial markets are considered complex systems consisting of highly volatile movements whose indexes fluctuate without any clear pattern. Analytic methods for stock prices have been proposed in which financial markets are modeled using common network analysis tools and methods. It has been found that two key components of social network analysis are relevant to modeling financial markets, allowing accurate predictions of stock prices within the financial market. Financial markets have a number of interacting components, leading to complex behavioral patterns. This paper describes a social network approach to analyzing financial markets as a viable way of studying how complex stock markets function. We also look at how social network analysis techniques and metrics are used to gauge the evolution of financial markets, and at how community detection can be used to qualify and quantify influence within a network.
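
The sketch below shows one common way to cast a market as a network: build a correlation graph from return series and run community detection on it. The return series and the correlation threshold are synthetic placeholders, not the paper's data or parameters.

```python
# Hedged sketch: correlation network of stocks with community detection.
import numpy as np
import networkx as nx
from networkx.algorithms.community import greedy_modularity_communities

rng = np.random.default_rng(0)
tickers = ["A", "B", "C", "D", "E"]
returns = rng.normal(size=(250, len(tickers)))   # placeholder daily returns
corr = np.corrcoef(returns.T)

G = nx.Graph()
threshold = 0.02   # illustrative; real data would warrant a higher cutoff
for i in range(len(tickers)):
    for j in range(i + 1, len(tickers)):
        if abs(corr[i, j]) >= threshold:
            G.add_edge(tickers[i], tickers[j], weight=abs(corr[i, j]))

print("degree centrality:", nx.degree_centrality(G))
print("communities:", [sorted(c) for c in greedy_modularity_communities(G)])
```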

Keywords: network analysis, social networks, financial markets, stocks, nodes, edges, complex networks

Procedia PDF Downloads 192
1689 Artificial Intelligence-Based Detection of Individuals Suffering from Vestibular Disorder

Authors: Dua Hişam, Serhat İkizoğlu

Abstract:

Identifying the problem behind a balance disorder is one of the most interesting topics in the medical literature. This study advances the development of artificial intelligence (AI) algorithms by applying multiple machine learning (ML) models to gait sensor data collected from humans in order to distinguish healthy individuals from those suffering from vestibular system (VS) problems. Although AI is widely utilized as a diagnostic tool in medicine, AI models have not previously been used to perform feature extraction and identify VS disorders through training on raw data. In this study, three ML models, the Random Forest classifier (RF), Extreme Gradient Boosting (XGB), and K-Nearest Neighbor (KNN), were trained to detect VS disorder, and the algorithms were compared using accuracy, recall, precision, and F1-score. With an accuracy of 95.28%, the Random Forest classifier was the most accurate model.
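
A hedged sketch of the model comparison follows; synthetic data stands in for the gait sensor features and normal/VS labels, and XGBoost is left as a comment to keep the sketch dependency-free.

```python
# Hedged sketch: compare classifiers on accuracy, recall, precision, F1.
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.neighbors import KNeighborsClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import classification_report

X, y = make_classification(n_samples=600, n_features=12, random_state=7)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=7)

models = {
    "RF": RandomForestClassifier(random_state=7),
    "KNN": KNeighborsClassifier(n_neighbors=5),
    # xgboost.XGBClassifier would slot in here as the third model (XGB).
}
for name, clf in models.items():
    clf.fit(X_tr, y_tr)
    print(name)
    print(classification_report(y_te, clf.predict(X_te), digits=3))
```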

Keywords: vestibular disorder, machine learning, random forest classifier, k-nearest neighbor, extreme gradient boosting

Procedia PDF Downloads 70
1688 Trinary Affinity—Mathematic Verification and Application (1): Construction of Formulas for the Composite and Prime Numbers

Authors: Liang Ming Zhong, Yu Zhong, Wen Zhong, Fei Fei Yin

Abstract:

Trinary affinity is a description of existence: every object exists as it is known and spoken of, in a system of 2 differences (denoted dif₁, dif₂) and 1 similarity (Sim), equivalently expressed as dif₁ / Sim / dif₂ and kn / 0 / tkn (kn = the known, tkn = the 'to be known', 0 = the zero point of knowing). These are mathematically verified and illustrated in this paper by arranging all integers in 3 columns, where each number exists as a difference in relation to another number as another difference, with the 2 difs arbitrated by a third number as the Sim, resulting in a trinary affinity, or trinity, of 3 numbers, of which one is the known, another the 'to be known', and the third the zero (0) from which both the kn and the tkn are measured and specified. Consequently, any number is horizontally specified as 3n, '3n - 1', or '3n + 1', and vertically as 'Cn + c', so that any number occurs at the intersection of its X and Y axes and is represented by its X and Y coordinates, as any point on Earth's surface is by its latitude and longitude. Technically, i) primes are viewed and treated as progenitors, and composites as descending from them, forming families of composites, each capable of being measured and specified from its own zero, called in this paper the realistic zero (denoted 0r, as contrasted with the mathematic zero, 0m), which corresponds to the constant c and whose nature separates the composite and prime numbers; and ii) any number is considered as having a magnitude as well as a position, so that a number is verified as a prime first by referring to its descriptive formula and then by making sure that no composite number can possibly occur at its position, by dividing it by the factors provided by the composite number formulas. The paper consists of 3 parts: 1) a brief explanation of the trinary affinity of things, 2) the 8 formulas that represent ALL the primes, and 3) families of composite numbers, each represented by a formula. A composite number family is described as 3n + f₁·f₂. Since there are infinitely many composite number families, to verify the primality of a large probable prime, we have to divide it by several or many an f₁ from a range of composite number formulas, a procedure that is as laborious as it is the surest way to verify a great number's primality. (So, it is possible to substitute planned division for trial division.)
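
Two of the mechanical claims can be checked with a few lines of code: the three-column arrangement (every integer is 3n, 3n - 1, or 3n + 1) and primality verification by division. The sketch below uses plain trial division as a stand-in for the paper's planned division by composite-family factors.

```python
# Hedged sketch: column assignment mod 3, plus primality by trial division
# (an approximation of the paper's division by composite-formula factors).
def column(n: int) -> str:
    return {0: "3n", 1: "3n + 1", 2: "3n - 1"}[n % 3]

def is_prime(n: int) -> bool:
    if n < 2:
        return False
    f = 2
    while f * f <= n:
        if n % f == 0:
            return False
        f += 1
    return True

for n in (29, 30, 31, 49):
    print(n, column(n), "prime" if is_prime(n) else "composite")
```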

Keywords: trinary affinity, difference, similarity, realistic zero

Procedia PDF Downloads 212
1687 GPU Based High Speed Error Protection for Watermarked Medical Image Transmission

Authors: Md Shohidul Islam, Jongmyon Kim, Ui-pil Chong

Abstract:

Medical images are an integral part of e-health care and e-diagnosis systems. Medical image watermarking is widely used to protect patients' information from malicious alteration and manipulation. The watermarked medical images are transmitted over the internet among patients and primary and referring physicians. The images are highly prone to corruption in the wireless transmission medium due to various noises, deflection, and refraction. Distortion in the received images leads to faulty watermark detection and inappropriate disease diagnosis. To address the issue, this paper incorporates an error correction code (ECC), an (8, 4) Hamming code, into an existing watermarking system. In addition, we implement the computationally complex ECC on a graphics processing unit (GPU) to accelerate it and meet real-time requirements. Experimental results show that the GPU achieves a considerable speedup over the sequential CPU implementation while maintaining 100% ECC efficiency.
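
For reference, a pure-Python sketch of (8, 4) extended Hamming encode/decode follows: single-error correction with double-error detection per 4-bit nibble. The GPU version would parallelise this per-nibble logic across the bitstream; the code below is only a CPU illustration of the code itself.

```python
# Hedged sketch of (8,4) extended Hamming: SEC-DED on one nibble.
def encode(nibble):
    d = [(nibble >> i) & 1 for i in range(4)]          # data bits d1..d4
    p1 = d[0] ^ d[1] ^ d[3]
    p2 = d[0] ^ d[2] ^ d[3]
    p3 = d[1] ^ d[2] ^ d[3]
    bits = [p1, p2, d[0], p3, d[1], d[2], d[3]]        # positions 1..7
    overall = 0
    for b in bits:
        overall ^= b                                   # overall parity bit p8
    return bits + [overall]

def decode(bits):
    s = 0
    for pos in range(1, 8):
        if bits[pos - 1]:
            s ^= pos                 # syndrome = XOR of positions holding a 1
    overall = 0
    for b in bits:
        overall ^= b
    if s and overall:                # single error: syndrome points at it
        bits = bits[:]
        bits[s - 1] ^= 1
    elif s and not overall:          # two errors: detectable, not correctable
        raise ValueError("double error detected")
    d = [bits[2], bits[4], bits[5], bits[6]]
    return d[0] | d[1] << 1 | d[2] << 2 | d[3] << 3

word = encode(0b1011)
word[5] ^= 1                         # inject a single-bit error
assert decode(word) == 0b1011        # error corrected
```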

Keywords: medical image watermarking, e-health system, error correction, Hamming code, GPU

Procedia PDF Downloads 292
1686 Comparison of Concentration of Heavy Metals in PM2.5 Analyzed in Three Different Global Research Institutions Using X-Ray Fluorescence

Authors: Sungroul Kim, Yeonjin Kim

Abstract:

This study compared the concentrations of heavy metals analyzed from the same samples with X-ray fluorescence (XRF) spectrometers at three different global research institutions: PAN (a branch of Malvern Panalytical, Seoul, South Korea), RTI (Research Triangle Institute, NC, U.S.A.), and the aerosol laboratory at Harvard University, Boston, U.S.A. To achieve our research objectives, indoor air filter samples were collected at the homes (n=24) of adult or child asthmatics and then analyzed at PAN, followed consecutively by Harvard University and RTI. Descriptive statistics as well as correlation and simple regression analyses were computed for the data comparison, using R version 4.0.3. As a result, the detection rates of most heavy metals analyzed at the three institutions were about 90%. Of the 25 elements commonly analyzed by the institutions, 16 elements showed an R² (coefficient of determination) of 0.7 or higher (10 components were 0.9 or higher). The findings of this study demonstrate that XRF is a useful device, ensuring reproducibility and compatibility, for measuring heavy metals in PM2.5 collected from the indoor air of asthmatics' homes.

Keywords: heavy metals, indoor air quality, PM2.5, X-ray fluorescence

Procedia PDF Downloads 201
1685 A Review of HVDC Modular Multilevel Converters Subjected to DC and AC Faults

Authors: Jude Inwumoh, Adam P. R. Taylor, Kosala Gunawardane

Abstract:

Modular multilevel converters (MMC) exhibit a highly scalable and modular characteristic with good voltage/power expansion, fault tolerance capability, low output harmonic content, good redundancy, and a flexible front-end configuration. Fault detection, location, and isolation, as well as maintaining fault ride-through (FRT), are major challenges to MMC reliability and power supply sustainability. Various papers have been reviewed to seek the best MMC configuration with fault-handling capability. DC faults are the most common fault type, while the probability of an AC fault occurring in a modular multilevel converter is low; however, the consequences of AC faults are severe. This paper reviews several MMC topologies and modulation techniques for tackling faults. These fault control strategies are compared based on cost, complexity, controllability, and power loss. A meshed network of half-bridge (HB) MMC topology was found optimal in providing fault ride-through compared with other MMC topologies, but only when combined with DC circuit breakers (CBs), AC CBs, and fault current limiters (FCL).

Keywords: MMC-HVDC, DC faults, fault current limiters, control scheme

Procedia PDF Downloads 140
1684 Single Cell and Spatial Transcriptomics: A Beginner's Viewpoint from the Conceptual Pipeline

Authors: Leo Nnamdi Ozurumba-Dwight

Abstract:

Messenger ribonucleic acid (mRNA) molecules encode proteins. These protein-encoding mRNA molecules (which collectively constitute the transcriptome), when analyzed by RNA sequencing (RNA-seq), unveil the nature of gene expression in the RNA. The gene expression obtained provides clues to cellular traits and their dynamics in presentation, which can be studied in relation to function and responses. RNA-seq is a practical concept in genomics, as it enables the detection and quantitative analysis of mRNA molecules. Single cell and spatial transcriptomics both present avenues for exposing the genomic characteristics of single cells and pooled cells in disease conditions such as cancer, autoimmune diseases, and hematopoietic diseases, among others, from investigated biological tissue samples. Single cell transcriptomics supports direct assessment of each building unit of a tissue (the cell) during diagnosis and molecular gene expression studies. A typical technique to achieve this is single-cell RNA sequencing (scRNA-seq), which enables high-throughput gene expression studies. However, this technique generates expression data for several cells that lack the cells' positional coordinates within the tissue. As science develops, the complementary use of pre-established tissue reference maps built with molecular and bioinformatics techniques has sprung forth innovatively and is now used to resolve this setback, producing both levels of data in one shot of scRNA-seq analysis. This is an emerging conceptual approach for integrative and progressively dependable transcriptomics analysis. It can support in-situ analysis for a better understanding of tissue functional organization, unveil new biomarkers for early-stage detection of diseases and for therapeutic targets in drug development, and expose the nature of cell-to-cell interactions; these are vital genomic signatures and characterizations for clinical applications. Over the past decades, RNA-seq has generated a wide array of information that is igniting bespoke breakthroughs and innovations in biomedicine. Spatial transcriptomics, on the other side, is tissue-level based and is utilized to study biological specimens with heterogeneous features. It exposes the gross identity of the investigated mammalian tissues, which can then be used to study cell differentiation, track cell-line trajectory patterns and behavior, and examine regulatory homeostasis in disease states. It also requires referenced positional analysis to assemble the genomic signatures assessed from the single cells in the tissue sample. Given these two approaches to RNA transcriptomics in varying quantities of cell lines, with avenues for appropriate resolution, both have made the study of gene expression from mRNA molecules interesting, progressive, and developmental, helping to tackle health challenges head-on.

Keywords: transcriptomics, RNA sequencing, single cell, spatial, gene expression

Procedia PDF Downloads 124
1683 Value of Unilateral Spinal Anaesthesia for Hip Fracture Surgery in the Elderly (75 Cases)

Authors: Fedili Benamar, Beloulou Mohamed Lamine, Ouahes Hassane, Ghattas Samir

Abstract:

Background and aims: While unilateral spinal anesthesia has long been widely practiced in Western countries, it remains little known in the local anesthesia community and has not been the object of many studies. It is, however, a simple, practical, and effective technique. Our objective was to evaluate this practice in the emergency anesthesia management of frail patients and to compare it with conventional spinal anesthesia. Methods: This is a prospective, observational, comparative study of hypobaric unilateral versus conventional spinal anaesthesia for hip fracture surgery, carried out in the operating room of the Staoueli university military hospital over a 12-month period from 2019 to 2020. The parameters analyzed were hemodynamic variations, vasopressor use, block efficiency, postoperative adverse events, and postoperative morphine consumption. Results: 75 cases were included (mean age 72 ± 14 years). Group 1 comprised 41 patients (54.6%; ASA1 = 14.6%, ASA2 = 60.98%, ASA3 = 24.39%) receiving single-shot spinal anaesthesia, and group 2 comprised 34 patients (45.3%; ASA1 = 2.9%, ASA2 = 26.4%, ASA3 = 61.7%, ASA4 = 8.8%) receiving unilateral hypobaric spinal anesthesia. Hemodynamic variations were more severe in group 1 (51% hypotension versus 30% in group 2; RR = 1.69, odds ratio = 2.4), and these variations were more marked in the ASA3 subgroup (70% hypotension in group 1 versus 30% in group 2; RR = 2.33, odds ratio = 5.44). 39% of group 1 required vasoactive drugs (15 mg ± 11) versus 32% of group 2 (8 mg ± 6.49). There was no difference in postoperative morphine use. Conclusions: Within the limits of the population studied, this work demonstrates the clinical value of unilateral spinal anesthesia in orthopedic trauma surgery in the frail patient.
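
The reported effect sizes follow directly from the two hypotension rates; the check below reproduces them from the 51% and 30% figures (1.70 versus the reported 1.69 reflects rounding).

```python
# Worked check of the reported relative risk and odds ratio.
p1, p2 = 0.51, 0.30  # hypotension rates in group 1 and group 2

rr = p1 / p2
odds_ratio = (p1 / (1 - p1)) / (p2 / (1 - p2))
print(f"RR = {rr:.2f}")          # -> 1.70, matching the reported 1.69
print(f"OR = {odds_ratio:.2f}")  # -> 2.43, matching the reported 2.4
```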

Keywords: spinal anaesthesia, vasopressor, morphine, hypobaric unilateral spinal anesthesia, ropivacaine, hip surgery, elderly, hemodynamics

Procedia PDF Downloads 76
1682 Detecting Cyberbullying, Spam and Bot Behavior and Fake News in Social Media Accounts Using Machine Learning

Authors: M. D. D. Chathurangi, M. G. K. Nayanathara, K. M. H. M. M. Gunapala, G. M. R. G. Dayananda, Kavinga Yapa Abeywardena, Deemantha Siriwardana

Abstract:

Given the growing popularity of social media platforms, there are various concerns at present, most notably cyberbullying, spam, bot accounts, and the spread of incorrect information. To develop a risk score calculation system as a thorough method for deciphering and exposing unethical social media profiles, this research explores the algorithms most suitable, to the best of our knowledge, for detecting these concerns. Various models, such as Naïve Bayes, CNN, KNN, Stochastic Gradient Descent, and the Gradient Boosting classifier, were examined, and the best results were taken into the development of the risk score system. For cyberbullying, the Logistic Regression algorithm achieved an accuracy of 84.9%, while the spam-detecting MLP model reached 98.02% accuracy. The Random Forest algorithm identified bot accounts with 91.06% accuracy, and 84% accuracy was achieved for fake news detection using an SVM.
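
A hedged sketch of the risk-score idea follows: each detector outputs a probability, and a weighted sum yields one profile-level score. The weights and the per-concern probabilities are illustrative placeholders; the paper does not specify this exact combination rule.

```python
# Hedged sketch: combine per-concern detector outputs into one risk score.
WEIGHTS = {"cyberbullying": 0.3, "spam": 0.2, "bot": 0.25, "fake_news": 0.25}

def risk_score(probabilities: dict) -> float:
    """Combine per-concern probabilities (0..1) into a 0..100 score."""
    return 100 * sum(WEIGHTS[k] * p for k, p in probabilities.items())

# Hypothetical detector outputs for one profile.
profile = {"cyberbullying": 0.82, "spam": 0.10, "bot": 0.65, "fake_news": 0.20}
print(f"risk score: {risk_score(profile):.1f} / 100")
```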

Keywords: cyberbullying, spam behavior, bot accounts, fake news, machine learning

Procedia PDF Downloads 40
1681 Tax Evasion with Mobility between the Regular and Irregular Sectors

Authors: Xavier Ruiz Del Portal

Abstract:

This paper incorporates mobility between the legal and black economies into a model of tax evasion with endogenous labor supply, in which underreporting is possible in one sector but impossible in the other. We find that the effects along the extensive margin (the number of evaders) are more robust and conclusive than those along the intensive margin (hours of illegal work) usually considered in the literature. In particular, it is shown that the following policies reduce the number of evaders: (a) larger and more progressive evasion penalties; (b) higher detection probabilities; (c) an increase in the legal sector wage rate; (d) a decrease in the moonlighting wage rate; (e) higher costs of creating opportunities to evade; (f) fewer opportunities to evade; and (g) greater psychological costs of tax evasion. When tax concealment and illegal work are also taken into account, the effects do not vary significantly under the assumptions in Cowell (1985), except that policies (a) and (b) hold only for low- and middle-income groups and policies (e) and (f) only for high-income groups.

Keywords: income taxation, tax evasion, extensive margin responses, the penalty system

Procedia PDF Downloads 156
1680 Grating Scale Thermal Expansion Error Compensation for Large Machine Tools Based on Multiple Temperature Detection

Authors: Wenlong Feng, Zhenchun Du, Jianguo Yang

Abstract:

To decrease the thermal expansion error of the grating scale, a novel method based on multiple temperature detection is proposed. Several temperature sensors are installed on the grating scale, and their readings are recorded. The temperature at every point on the grating scale is calculated by interpolating between adjacent sensors. According to the thermal expansion principle, the grating scale thermal expansion error model can be established by integrating the thermal strain over the variations of position and temperature. A novel compensation method is proposed in this paper. By applying the established error model, the grating scale thermal expansion error is decreased by 90% compared with no compensation. The residual positioning error of the grating scale is less than 15 μm per 10 m, and the accuracy of the machine tool is significantly improved.
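
A numerical sketch of the error model follows: interpolate the sensor temperatures along the scale, then integrate the thermal strain. The expansion coefficient and all readings are illustrative assumptions, not values from the paper.

```python
# Hedged sketch: interpolate sensor temperatures, integrate thermal strain.
import numpy as np

ALPHA = 11.5e-6                                      # 1/K, assumed steel scale
T_REF = 20.0                                         # reference temp, deg C
sensor_x = np.array([0.0, 2.5, 5.0, 7.5, 10.0])      # sensor positions, m
sensor_t = np.array([20.8, 21.5, 22.1, 21.9, 21.2])  # readings, deg C

x = np.linspace(0.0, 10.0, 1001)
t = np.interp(x, sensor_x, sensor_t)                 # temperature at every point

# Expansion error at the far end: integral of strain (rectangle rule).
dx = x[1] - x[0]
error_m = np.sum(ALPHA * (t - T_REF) * dx)
print(f"expansion error over 10 m: {error_m * 1e6:.1f} um")
```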

Keywords: thermal expansion error of grating scale, error compensation, machine tools, integral method

Procedia PDF Downloads 368
1679 Research on the Renewal and Utilization of Space under the Bridge in Chongqing Based on Spatial Potential Evaluation

Authors: Xvelian Qin

Abstract:

Urban "organic renewal" based on the development of existing resources in high-density urban areas has become the mainstream of urban development in the new era. The space under bridges is an important stock resource of public space in high-density urban areas, and remodeling its value is an effective way to alleviate the shortage of public space resources. However, due to the lack of an evaluation step in the renewal process, a large number of under-bridge spaces have been left idle, facing problems of low conversion efficiency, imprecise development decision-making, and functional positioning poorly adapted to citizens' needs. It is therefore of great practical significance to construct an evaluation system for the renewal potential of under-bridge space and to explore renewal modes. In this paper, some of the under-bridge spaces in the main urban area of Chongqing are selected as the research object. Through questionnaire interviews with users of successfully renewed under-bridge spaces, twenty-two potential evaluation indexes are selected, organized into three factor types (objective demand, construction feasibility, and construction suitability) and six levels (land resources, infrastructure, accessibility, safety, space quality, and ecological environment). The analytic hierarchy process and expert scoring are used to determine the index weights, an evaluation system for the renewal potential of under-bridge space in high-density urban areas of Chongqing is constructed, and suitable directions for renewal and utilization are explored, providing a feasible theoretical basis and scientific decision support for the future use of under-bridge space.
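
A minimal sketch of the analytic hierarchy process step follows: index weights are derived from a pairwise-comparison matrix via its principal eigenvector. The 3x3 matrix comparing the three factor types is a hypothetical illustration, not the paper's expert judgments.

```python
# Hedged AHP sketch: weights from the principal eigenvector, plus a
# consistency check (random index RI = 0.58 for n = 3).
import numpy as np

# A[i, j] = how much more important criterion i is than j (Saaty scale).
A = np.array([[1.0, 3.0, 2.0],
              [1/3, 1.0, 1/2],
              [1/2, 2.0, 1.0]])

eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(np.real(eigvals))
w = np.real(eigvecs[:, k])
w = w / w.sum()

labels = ["objective demand", "construction feasibility", "construction suitability"]
for name, weight in zip(labels, w):
    print(f"{name}: {weight:.3f}")

ci = (np.real(eigvals[k]) - 3) / 2       # consistency index
print(f"CR = {ci / 0.58:.3f}  (should be < 0.1)")
```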

Keywords: high-density urban area, potential evaluation, space under bridge, renewal and utilization

Procedia PDF Downloads 71
1678 Nondestructive Testing for Reinforced Concrete Buildings with Active Infrared Thermography

Authors: Huy Q. Tran, Jungwon Huh, Kiseok Kwak, Choonghyun Kang

Abstract:

The infrared thermography (IRT) technique has been proven to be a good method for the nondestructive evaluation of concrete. In buildings, it has been used for a broad range of applications, such as subsurface defect inspection, energy loss assessment, and moisture detection. The purpose of this research is to assess, qualitatively and quantitatively, the performance of active infrared thermography in detecting deterioration in reinforced concrete. An experiment with three different heating regimes was conducted on a concrete slab in the laboratory. The thermal characteristics of the IRT method, i.e., absolute contrast and observation time, were investigated. A linear relationship between the observation time and the real depth was established, with a regression R-squared of 0.931. The results showed that the absolute contrast above a defective area increases with the size of the delamination and the heating time. In addition, the depth of the delamination can be predicted using the relationship proposed in this study.
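
The depth-prediction step reduces to fitting and inverting a line; the sketch below does this on hypothetical calibration points consistent with a linear trend, not the study's measurements.

```python
# Hedged sketch: fit observation time vs. known depth, then invert the fit.
import numpy as np

depth_mm = np.array([10, 20, 30, 40, 50])        # known delamination depths
obs_time_s = np.array([35, 62, 95, 118, 150])    # observed times, hypothetical

slope, intercept = np.polyfit(depth_mm, obs_time_s, 1)
pred = slope * depth_mm + intercept
ss_res = np.sum((obs_time_s - pred) ** 2)
ss_tot = np.sum((obs_time_s - obs_time_s.mean()) ** 2)
print(f"t = {slope:.2f} * depth + {intercept:.2f}, R^2 = {1 - ss_res/ss_tot:.3f}")

# Estimate an unknown depth from a new observation time.
print(f"estimated depth at t = 80 s: {(80 - intercept) / slope:.1f} mm")
```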

Keywords: concrete building, infrared thermography, nondestructive evaluation, subsurface delamination

Procedia PDF Downloads 283
1677 Performance Evaluation of Cement Mortar with Crushed Stone Dust as Fine Aggregates

Authors: Pradeep Kumar

Abstract:

The present work is based on the application of cement mortar with natural sand and discontinuous steel fiber, through which the bending behavior of skinny beams was evaluated. This research studies the effects of combining reinforcing steel meshes (continuous steel reinforcement) with discontinuous fibers as reinforcement in skinny-walled Portland cement mortar with crushed stone dust as the fine aggregate. The term 'skinny' means the thickness of the beams is less than 25 mm. The main idea behind this combination is to satisfy the ultimate strength limit state through the steel mesh reinforcement (as the main reinforcement) and to control cracking under service loads through the fiber (Recron 3s) reinforcement (as secondary reinforcement). The main object of this study is the bending behavior of mortar-reinforced thin beams with only one layer of steel mesh (with various transverse wire spacings) and with Recron 3s fibers. A wide experimental program of bending tests was undertaken. The following variables were investigated: (a) the reference mesh size, 25.4 x 25.4 mm and 50.8 x 50.8 mm; (b) the transverse wire spacing, 25.4 mm, 50.8 mm, and no transverse wires; (c) the type of fiber, Reliance Recron 3s of 6 mm length; and (d) the fiber volume fraction, 0.1% and 0.25%. Some of the main conclusions are: (a) the use of Recron 3s fibers leads to a slightly better overall performance than no fibers; (b) an increase in equivalent stress is observed when 0.1% and 0.25% fibers are used; (c) when 25.4 x 50.8 mm steel mesh is used, no noticeable change in behavior is observed in comparison with specimens without fibers; and (d) for specimens with no fibers and with 0.1% fibers, the transverse wire spacing has little effect on the equivalent stress, while for the fiber-reinforced specimens the transverse wires have no influence but the equivalent stresses are increased.

Keywords: cement mortar, crushed stone dust, fibre, steel mesh

Procedia PDF Downloads 314
1676 Emotional Analysis for Text Search Queries on the Internet

Authors: Gemma García López

Abstract:

The goal of this study is to analyze whether search queries entered in search engines such as Google can offer emotional information about the user who performs them. Knowing the emotional state of the Internet user can be key to achieving maximum personalization of content and detecting worrying behaviors. To this end, two studies were carried out using tools with advanced natural language processing techniques. The first study determines whether a query can be classified as positive, negative, or neutral, while the second extracts emotional content from words and applies the categorical and dimensional models for the representation of emotions. In addition, we use search queries in Spanish and English to establish similarities and differences between the two languages. The results revealed that text search queries performed by users on the Internet can be classified emotionally. This allows us to better understand the emotional state of the user at the time of the search, which could mean adapting the technology and personalizing the responses to different emotional states.
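
A hedged sketch of the first study's classification step follows, using a simple bag-of-words classifier; the training queries and labels are tiny illustrative placeholders, not the study's corpus or its NLP toolchain.

```python
# Hedged sketch: positive/negative/neutral classification of search queries.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

queries = ["how to be happy again", "I feel alone and sad",
           "best vacation destinations", "why does everything go wrong",
           "great recipes for dinner", "weather tomorrow"]
labels = ["positive", "negative", "positive", "negative", "positive", "neutral"]

clf = make_pipeline(TfidfVectorizer(), LogisticRegression(max_iter=1000))
clf.fit(queries, labels)
print(clf.predict(["I am feeling really sad today"]))
```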

Keywords: emotion classification, text search queries, emotional analysis, sentiment analysis in text, natural language processing

Procedia PDF Downloads 142
1675 Formalizing a Procedure for Generating Uncertain Resource Availability Assumptions Based on Real Time Logistic Data Capturing with Auto-ID Systems for Reactive Scheduling

Authors: Lars Laußat, Manfred Helmus, Kamil Szczesny, Markus König

Abstract:

As one result of the project "Reactive Construction Project Scheduling using Real Time Construction Logistic Data and Simulation", a procedure for using data about uncertain resource availability assumptions in reactive scheduling processes has been developed. Prediction data about resource availability is generated in a formalized way using real-time monitoring data, e.g., from auto-ID systems on the construction site and in the supply chains. The paper focuses on formalizing the procedure for monitoring construction logistic processes, for detecting disturbances, and for generating new, uncertain scheduling assumptions for the reactive, resource-constrained simulation procedure that is described further in other papers.

Keywords: auto-ID, construction logistic, fuzzy, monitoring, RFID, scheduling

Procedia PDF Downloads 516
1674 On-Line Data-Driven Multivariate Statistical Prediction Approach to Production Monitoring

Authors: Hyun-Woo Cho

Abstract:

Detecting incipient abnormal events in production processes is important for improving the safety and reliability of manufacturing operations and reducing losses caused by failures. The construction of calibration models for predicting faulty conditions is essential to making decisions on when to perform preventive maintenance. This paper presents a multivariate calibration monitoring approach based on the statistical analysis of process measurement data, in which the calibration model is used to predict faulty conditions from historical reference data. The approach utilizes variable selection techniques, and the predictive performance of several prediction methods is evaluated using real data. The results show that, in this work, the calibration model based on a supervised probabilistic model yielded the best performance. By adopting a proper variable selection scheme in calibration models, the prediction performance can be improved by excluding non-informative variables from the model building steps.
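
A hedged sketch of the pipeline follows: select informative variables, then fit a probabilistic classifier on reference data. Synthetic data and Gaussian naive Bayes stand in for the paper's process data and its specific supervised probabilistic model.

```python
# Hedged sketch: variable selection feeding a probabilistic calibration model.
from sklearn.datasets import make_classification
from sklearn.feature_selection import SelectKBest, f_classif
from sklearn.model_selection import cross_val_score
from sklearn.naive_bayes import GaussianNB
from sklearn.pipeline import make_pipeline

# y: normal vs. faulty condition; X: process measurements (synthetic here).
X, y = make_classification(n_samples=500, n_features=30, n_informative=8,
                           random_state=3)

model = make_pipeline(SelectKBest(f_classif, k=10), GaussianNB())
print("CV accuracy:", cross_val_score(model, X, y, cv=5).mean().round(3))
```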

Keywords: calibration model, monitoring, quality improvement, feature selection

Procedia PDF Downloads 357
1673 Charging-Vacuum Helium Mass Spectrometer Leak Detection Technology in the Application of Space Products Leak Testing and Error Control

Authors: Jijun Shi, Lichen Sun, Jianchao Zhao, Lizhi Sun, Enjun Liu, Chongwu Guo

Abstract:

Because of the consistency of the pressure direction, shorter cycle times, and high sensitivity, charging-vacuum helium mass spectrometer leak testing is the most popular leak testing technology for the seal testing of spacecraft parts, especially small and medium-sized ones. Usually, an auxiliary pump is used, and the minimum detectable leak rate can reach 5×10⁻⁹ Pa·m³/s, or even better on certain occasions. Relative error is most important when evaluating the results. The choice of reference leak, the helium background level, and the record format all affect the measured leak rate. Within the linear range of the leak testing system, the relative error is reduced by 10% if a reference leak with a larger leak rate is used, and it is reduced further if the helium background is kept low, a decimal record format is used, and more stable data are recorded.

Keywords: leak testing, spacecraft parts, relative error, error control

Procedia PDF Downloads 456