Search results for: events detection
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 5336

2366 Automatic Diagnosis of Electrical Equipment Using Infrared Thermography

Authors: Y. Laib Dit Leksir, S. Bouhouche

Abstract:

Analysis and processing of databases resulting from infrared thermal measurements made on electrical installations requires the development of new tools in order to obtain correct information complementing visual inspections. Consequently, methods based on the capture of infrared digital images show great potential and are increasingly employed in various fields. However, there is an enormous need for effective techniques to analyse these databases in order to extract relevant information on the state of the equipment. Our goal is to introduce recent modeling techniques based on new image- and signal-processing methods to develop mathematical models in this field. The aim of this work is to capture the anomalies present in electrical equipment during an inspection of several machines using a FLIR A40 camera. We then apply binarisation techniques to select the region of interest and compare the methods on the thermal images obtained in order to choose the best one.
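The binarisation step described above can be sketched with Otsu's method, which picks the grey-level threshold that maximises between-class variance. This is only a hedged illustration: the abstract does not specify which binarisation technique performed best, and the toy "image" below is hypothetical.

```python
def otsu_threshold(pixels):
    """Return the Otsu threshold for a flat list of 8-bit grey levels.

    Maximises the between-class variance over all candidate thresholds.
    """
    n = len(pixels)
    hist = [0] * 256
    for p in pixels:
        hist[p] += 1
    total_sum = sum(i * h for i, h in enumerate(hist))
    w_bg, sum_bg, best_t, best_var = 0, 0, 0, -1.0
    for t in range(256):
        w_bg += hist[t]          # background pixel count so far
        if w_bg == 0:
            continue
        w_fg = n - w_bg          # remaining foreground pixels
        if w_fg == 0:
            break
        sum_bg += t * hist[t]
        mean_bg = sum_bg / w_bg
        mean_fg = (total_sum - sum_bg) / w_fg
        var_between = w_bg * w_fg * (mean_bg - mean_fg) ** 2
        if var_between > best_var:
            best_var, best_t = var_between, t
    return best_t

def binarise(pixels, threshold):
    """Mark pixels above the threshold (hot spots) as 1, the rest as 0."""
    return [1 if p > threshold else 0 for p in pixels]

# Toy "thermal image": a cool background with a small hot anomaly.
image = [30] * 90 + [200] * 10
t = otsu_threshold(image)
mask = binarise(image, t)
```

The mask isolates the hot region of interest that would then be compared across methods.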

Keywords: infrared thermography, defect detection, troubleshooting, electrical equipment

Procedia PDF Downloads 463
2365 Flood Hazard and Risk Mapping to Assess Ice-Jam Flood Mitigation Measures

Authors: Karl-Erich Lindenschmidt, Apurba Das, Joel Trudell, Keanne Russell

Abstract:

In this presentation, we explore options for mitigating ice-jam flooding along the Athabasca River in western Canada. We consider not only flood hazard, expressed in this case as the probability of flood depths and extents being exceeded, but also flood risk, in which annual expected damages are calculated. Calculating flood risk allows a cost-benefit analysis to be made, so that decisions on the best mitigation options are based not solely on flood hazard but also on the costs related to flood damages and the benefits of mitigation. A river ice model is used to simulate extreme ice-jam flood events, with which scenarios are run to determine flood exposure and damages in flood-prone areas along the river. We concentrate on four mitigation options – the placement of a dike, artificial breakage of the ice cover along the river, the installation of an ice-control structure, and the construction of a reservoir. However, no mitigation option is totally failsafe. For example, dikes can still be overtopped and breached, and ice jams may still occur in areas of the river where ice covers have been artificially broken up. Hence, for all options, it is recommended that zoning of building developments away from greater flood hazard areas be upheld. Flood mitigation can have the negative effect of giving inhabitants a false sense of security that flooding may not happen again, leading to zoning policies being relaxed. (Text adapted from Lindenschmidt [2022] "Ice Destabilization Study - Phase 2", submitted to the Regional Municipality of Wood Buffalo, Alberta, Canada)
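The "annual expected damages" quantity mentioned above is conventionally approximated by integrating damage over annual exceedance probability. The sketch below illustrates that calculation; the scenario probabilities and damage figures are invented for illustration and are not taken from the study.

```python
def expected_annual_damage(events):
    """Approximate expected annual damage (EAD) by trapezoidal
    integration of damage over annual exceedance probability (AEP).

    events: list of (aep, damage) pairs; e.g. a 1-in-100-year flood
    has aep = 0.01.
    """
    pts = sorted(events)                      # ascending AEP
    ead = 0.0
    for (p0, d0), (p1, d1) in zip(pts, pts[1:]):
        ead += (p1 - p0) * (d0 + d1) / 2.0    # trapezoid area
    return ead

# Hypothetical damage estimates for three ice-jam flood scenarios.
scenarios = [(0.01, 50e6), (0.1, 10e6), (0.5, 0.0)]
ead = expected_annual_damage(scenarios)
```

Comparing the EAD with and without a mitigation measure, against the measure's cost, yields the cost-benefit figure the abstract refers to.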

Keywords: ice jam, flood hazard, flood risk, river ice modelling

Procedia PDF Downloads 159
2364 Biosensors for Parathion Based on Au-Pd Nanoparticles Modified Electrodes

Authors: Tian-Fang Kang, Chao-Nan Ge, Rui Li

Abstract:

An electrochemical biosensor for the determination of organophosphorus pesticides was developed based on the electrochemical co-deposition of Au and Pd nanoparticles on a glassy carbon electrode (GCE). Energy dispersive spectroscopy (EDS) analysis was used for characterization of the surface structure. Scanning electron microscopy (SEM) demonstrates that the films are uniform and the nanoclusters are homogeneously distributed on the GCE surface. Acetylcholinesterase (AChE) was immobilized on the Au-Pd nanoparticle modified electrode (Au-Pd/GCE) by cross-linking with glutaraldehyde. The electrochemical behavior of thiocholine at the biosensor (AChE/Au-Pd/GCE) was studied. The biosensor exhibited a substantial electrocatalytic effect on the oxidation of thiocholine. The peak current of linear scan voltammetry (LSV) of thiocholine at the biosensor is proportional to the concentration of acetylthiocholine chloride (ATCl) over the range of 2.5 × 10⁻⁶ to 2.5 × 10⁻⁴ M in 0.1 M phosphate buffer solution (pH 7.0). The percent inhibition of acetylcholinesterase was proportional to the logarithm of the parathion concentration in the range of 4.0 × 10⁻⁹ to 1.0 × 10⁻⁶ M. The detection limit of parathion was 2.6 × 10⁻⁹ M. The proposed method exhibited high sensitivity and good reproducibility.
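The reported linearity of percent inhibition against the logarithm of concentration is what makes such a sensor quantitative: one fits a calibration line and inverts it. The sketch below shows that step with an ordinary least-squares fit; the calibration points are hypothetical, not the paper's data.

```python
import math

def fit_line(x, y):
    """Ordinary least-squares fit y = slope*x + intercept."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    slope = (sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) /
             sum((xi - mx) ** 2 for xi in x))
    return slope, my - slope * mx

# Hypothetical calibration points: % inhibition vs parathion molarity.
conc = [4e-9, 1e-8, 1e-7, 1e-6]
inhibition = [10.0, 20.0, 45.0, 70.0]

log_c = [math.log10(c) for c in conc]
slope, intercept = fit_line(log_c, inhibition)

def concentration_from_inhibition(i):
    """Invert the calibration line to estimate concentration (M)."""
    return 10 ** ((i - intercept) / slope)
```

A measured inhibition value is then mapped back through the line to an estimated parathion concentration.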

Keywords: acetylcholinesterase, Au-Pd nanoparticles, electrochemical biosensors, parathion

Procedia PDF Downloads 390
2363 Use of Predictive Food Microbiology to Determine the Shelf-Life of Foods

Authors: Fatih Tarlak

Abstract:

Predictive microbiology can be considered an important field of food microbiology that uses predictive models to describe microbial growth in different food products. Predictive models estimate the growth of microorganisms quickly, efficiently, and cost-effectively compared to traditional methods of enumeration, which are laborious, expensive, and time-consuming. The mathematical models used in predictive microbiology are mainly categorised as primary and secondary models. Primary models are mathematical equations that define the growth data as a function of time under a constant environmental condition. Secondary models describe the effects of environmental factors, such as temperature, pH, and water activity (aw), on the parameters of the primary models, including the maximum specific growth rate and the lag phase duration, which are the most critical growth kinetic parameters. The combination of primary and secondary models provides valuable information to set limits for the quantitative detection of microbial spoilage and to assess product shelf-life.
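The primary/secondary-model combination described above can be sketched with two widely used examples: Zwietering's modified Gompertz equation as the primary model and Ratkowsky's square-root model as the secondary model. This is a hedged illustration; the abstract does not name specific model forms, and the parameter values below are hypothetical.

```python
import math

def gompertz(t, A, mu_max, lag):
    """Zwietering's modified Gompertz primary model.

    Returns y(t) = ln(N(t)/N0); A is the asymptotic increase,
    mu_max the maximum specific growth rate, lag the lag time.
    """
    return A * math.exp(-math.exp(mu_max * math.e / A * (lag - t) + 1.0))

def ratkowsky_mu(T, b, T_min):
    """Ratkowsky square-root secondary model: sqrt(mu_max) = b*(T - T_min)."""
    return (b * (T - T_min)) ** 2

# Hypothetical parameters for a spoilage organism stored at 10 °C.
mu = ratkowsky_mu(T=10.0, b=0.03, T_min=-5.0)
y_24h = gompertz(24.0, A=8.0, mu_max=mu, lag=6.0)
```

Shelf-life is then estimated as the time at which the predicted growth crosses a spoilage limit.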

Keywords: shelf-life, growth model, predictive microbiology, simulation

Procedia PDF Downloads 187
2362 Face Tracking and Recognition Using Deep Learning Approach

Authors: Degale Desta, Cheng Jian

Abstract:

The most important factor in identifying a person is their face. Even identical twins have their own distinct faces. As a result, identification and face recognition are needed to tell one person from another. A face recognition system is a verification tool used to establish a person's identity using biometrics. Nowadays, face recognition is a common technique used in a variety of applications, including home security systems, criminal identification, and phone unlock systems. Such a system is more secure because it only requires a facial image instead of other dependencies like a key or card. Face detection and face identification are the two phases that typically make up a human recognition system. This paper explains the idea behind designing and creating a face recognition system using deep learning with Azure ML and Python's OpenCV. Face recognition is a task that can be accomplished using deep learning, and given the accuracy of this method, it appears to be a suitable approach. To show how accurate the suggested face recognition system is, experimental results are given: the system achieves 98.46% accuracy using Fast R-CNN, with the performance of the algorithms evaluated under different training conditions.

Keywords: deep learning, face recognition, identification, fast-RCNN

Procedia PDF Downloads 118
2361 Hardware Error Analysis and Severity Characterization in Linux-Based Server Systems

Authors: Nikolaos Georgoulopoulos, Alkis Hatzopoulos, Konstantinos Karamitsios, Konstantinos Kotrotsios, Alexandros I. Metsai

Abstract:

In modern server systems, business-critical applications run on different types of infrastructure, such as cloud systems, physical machines, and virtualized environments. Often, due to high load and over time, various hardware faults occur in servers that translate into errors, resulting in malfunction or even server breakdown. The CPU, RAM, and hard drive (HDD) are the hardware parts that concern server administrators the most regarding errors. In this work, selected RAM, HDD, and CPU errors that have been observed or can be simulated in kernel ring buffer log files from two groups of Linux servers are investigated. Moreover, a severity characterization is given for each error type. A better understanding of such errors can lead to more efficient analysis of kernel logs, which are usually exploited for fault diagnosis and prediction. In addition, this work summarizes ways of simulating hardware errors in RAM and HDD in order to test the error detection and correction mechanisms of a Linux server.
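Classifying kernel ring buffer lines by component and severity, as the work describes, can be sketched with a pattern table. The patterns below are illustrative examples only (real EDAC, ATA, and MCE messages vary by kernel version and hardware), and the severity labels are assumptions, not the paper's characterization.

```python
import re

# Hedged sketch: example patterns for selected RAM, HDD, and CPU errors
# as they might appear in the kernel ring buffer.
ERROR_PATTERNS = [
    (re.compile(r"EDAC .*(UE|uncorrected)", re.I), ("RAM", "critical")),
    (re.compile(r"EDAC .*(CE|corrected) memory", re.I), ("RAM", "warning")),
    (re.compile(r"ata\d+.*(I/O error|UNC)", re.I), ("HDD", "critical")),
    (re.compile(r"Machine Check Exception", re.I), ("CPU", "critical")),
]

def classify(line):
    """Return (component, severity) for a kernel log line, or None."""
    for pattern, result in ERROR_PATTERNS:
        if pattern.search(line):
            return result
    return None

sample = "EDAC MC0: 1 CE memory read error on CPU_SrcID#0"
```

Uncorrected-error patterns are checked before corrected-error ones because "uncorrected" contains the substring "corrected".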

Keywords: hardware errors, Kernel logs, Linux servers, RAM, hard disk, CPU

Procedia PDF Downloads 140
2360 Data Quality Enhancement with String Length Distribution

Authors: Qi Xiu, Hiromu Hota, Yohsuke Ishii, Takuya Oda

Abstract:

Recently, collectable manufacturing data have been increasing rapidly. At the same time, mega recalls are becoming a serious social problem. Under such circumstances, there is an increasing need to prevent mega recalls through defect analysis, such as root cause analysis and anomaly detection utilizing manufacturing data. However, the time needed to classify strings in manufacturing data by traditional methods is too long to meet the requirement of quick defect analysis. Therefore, we present the String Length Distribution Classification (SLDC) method to correctly classify strings in a short time. This method learns character features, especially the string length distribution, from Product IDs and Machine IDs in BOMs and asset lists. By applying the proposal to strings in actual manufacturing data, we verified that the classification time can be reduced by 80%. As a result, it can be estimated that the requirement of quick defect analysis can be fulfilled.
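The core idea of classifying strings by their length distribution can be sketched as follows. This is a hedged simplification: the published method also learns other character features, and the classifier, class names, and example IDs below are hypothetical.

```python
from collections import Counter

class StringLengthClassifier:
    """Minimal sketch: learn a string-length distribution per known
    class and label new strings by likelihood under each distribution."""

    def __init__(self):
        self.dists = {}  # class name -> {length: probability}

    def fit(self, labelled):
        for label, strings in labelled.items():
            counts = Counter(len(s) for s in strings)
            total = sum(counts.values())
            self.dists[label] = {k: v / total for k, v in counts.items()}

    def predict(self, s, floor=1e-6):
        """Pick the class whose length distribution makes len(s) likeliest."""
        return max(self.dists,
                   key=lambda label: self.dists[label].get(len(s), floor))

clf = StringLengthClassifier()
clf.fit({"product_id": ["P-0001", "P-0002", "P-0003"],
         "machine_id": ["M12", "M34", "M56"]})
```

Because only a length histogram per class is learned and looked up, classification is far cheaper than character-by-character comparison.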

Keywords: string classification, data quality, feature selection, probability distribution, string length

Procedia PDF Downloads 305
2359 Azan in Funeral: A Local Islamic Tradition in Indonesia

Authors: Muhajirin Gafar

Abstract:

In Indonesia, the Azan is not only used as a call to prayer; it is also recited at the birth of a child, following the direction of the Prophet Muhammad (PBUH), and has become part of a tradition echoed at obsequies or funerals – there is even a tradition in which the Azan is echoed at the four corners of the grave. This tradition has become customary and part of the local Islamic culture preserved from generation to generation, although the legal basis underlying it cannot be clearly established. Based on this phenomenon, this paper proposes three research objectives: 1) to describe the history of the tradition of the Azan at funerals, 2) to analyze some of the postulates supporting the occurrence of the tradition, and 3) to identify the postulates/hadith arranged in accordance with the instructions of the Prophet Muhammad (PBUH) about the rules of funerals. To reconstruct the history of the emergence of the Azan tradition at funerals, this research used the historical method, while the second and third objectives used library research. Data and facts were systematically processed and analyzed so as to answer the questions of what, who, where, when, how, and why an event occurred. Finally, this research used Takhrij al-hadith, a method for examining the validity of the arguments of a hadith. The results show that the tradition of the Azan at funerals has existed since the arrival of Islam in Indonesia. This tradition continued and became a local Islamic culture that spread throughout almost all of Indonesia and is even considered part of religious guidance, although there are no decisive postulates which can be cited to account for it, except 'qiyas' postulates which are not appropriate. Most Indonesian Muslims put the Azan as the first priority at funerals while overlooking other compulsory things that must be recited when laying down the corpse. They tend to assume that this tradition is part of local Islamic culture.

Keywords: Azan, tradition, qiyas, local Islamic culture, hadith

Procedia PDF Downloads 492
2358 Rapid Method for the Determination of Acid Dyes by Capillary Electrophoresis

Authors: Can Hu, Huixia Shi, Hongcheng Mei, Jun Zhu, Hongling Guo

Abstract:

Textile fibers are important trace evidence and are frequently encountered in criminal investigations. A significant aspect of fiber evidence examination is the determination of fiber dyes. Although several instrumental methods have been developed for dye detection, the analysis speed is not yet fast enough. A rapid dye analysis method is still needed to further improve the efficiency of case handling. Capillary electrophoresis has the advantages of high separation speed and high separation efficiency and is an ideal method for the rapid analysis of fiber dyes. In this paper, acid dyes used for protein fiber dyeing were determined by a newly developed short-end injection capillary electrophoresis technique. Five acid red dyes with similar structures were successfully baseline-separated within 5 min. The separation reproducibility is fairly good, as the relative standard deviation of the retention time is 0.51%. The established method is rapid and accurate and has great potential to be applied in forensic settings.

Keywords: acid dyes, capillary electrophoresis, fiber evidence, rapid determination

Procedia PDF Downloads 129
2357 Exploring the Power of Words: Domesticating the Competence/Competency Concept in Ugandan Organisations

Authors: John C. Munene, Florence Nansubuga

Abstract:

The study set out to examine a number of theories that directly or indirectly imply that words are potent but that the potency depends on the context or practice in which they are utilised. The theories include the Freudian theory of cathexis, which directly suggests that ambiguous events become potent when named, as does the word used to name them. We briefly examine psychological differentiation, which submits that ambiguity is often a result of failure to distinguish figure from ground. We then investigate Prospect Theory, which suggests that when people have to make decisions, they have the option of utilising intuition or reasoned judgment. It suggests that, more often than not, the tendency is to utilise intuition, especially when generic heuristics such as representativeness and similarity are available. The usage of these heuristics may depend on a lack of salience or accessibility of the situation due to ambiguity. We also examine Activity Theory, which proposes that the meaning of words emerges directly and dialectically from the activities in which they are used. The paper argues that the power of words will depend on any or all of the theories mentioned above. To examine this general proposition, we test the utilisation of a generic competence framework in a local setting. The assumption is that generic frameworks are inherently ambiguous and lack the potency normally associated with the competence concept in the management of human resources. A number of case studies provide initial supporting evidence for the general proposition.

Keywords: competence, meaning, operationalisation, power of words

Procedia PDF Downloads 386
2356 Spectacles of the City: An Analysis of the Effects of Festivals in the Formation of New Urban Identities

Authors: Anusmita Das

Abstract:

In the post-industrial scenario, cities in India have become critical sites of negotiation and are expected to become some of the largest urban agglomerations of the twenty-first century. This has created a pluralist identity, resulting in a new multifarious urbanism pervading the entire urban landscape. There is an ambiguity regarding the character of present-day Indian cities, with new meanings emerging and no methodical study to understand them. More than an abstract diagram, the present-day city can be looked at as an ensemble of meanings. One of the ways in which this meaning is reflected is through events. Festivals such as Diwali, Dussehra, Durga Puja, Ganesh Chaturthi, etc., have emerged as phenomena of the city, and their presence in the everyday landscape weaves itself through the urban fabric, dominating the popular visual culture of Indian cities. Festivals influence people's idea of a city. Rituals, festivals, and celebrations are important in the shaping of the urban environment and in their influence on the intangible aspects of the urban setting. These festivals pertaining to the city in motion have emerged as the symbolic image of the emerging urban Indian condition, giving birth to new urban identities. The study undertaken to understand the present context of the temporality of Indian cities is important in analyzing the process of their formation and transformation. This study aims to review the evolution of new dimensions of urbanism in India as well as their implication for the identity of cities.

Keywords: urban identities, urban design, festivals, rituals, celebrations, inter-disciplinary study

Procedia PDF Downloads 241
2355 A Network-Theoretical Perspective on Music Analysis

Authors: Alberto Alcalá-Alvarez, Pablo Padilla-Longoria

Abstract:

The present paper describes a framework for constructing mathematical networks that encode relevant musical information from a music score for structural analysis. These graphs encompass statistical information about musical elements such as notes, chords, rhythms, and intervals, and the relations among them, and so become helpful in visualizing and understanding important stylistic features of a music fragment. In order to build such networks, musical data is parsed out of a digital symbolic music file. This data undergoes different analytical procedures from graph theory, such as measuring the centrality of nodes, community detection, and entropy calculation. The resulting networks reflect important structural characteristics of the fragment in question: predominant elements, the connectivity between them, and the complexity of the information contained in it. Music pieces in different styles are analyzed, and the results are contrasted with the outcome of traditional analysis in order to show the consistency and potential utility of this method for music analysis.
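Two of the building blocks named above – a network built from score data and an entropy calculation – can be sketched in a few lines. This is a hedged illustration: the paper's exact graph construction is not specified, so the sketch uses a simple note-transition network, and the seven-note fragment is invented.

```python
import math
from collections import Counter, defaultdict

def transition_network(notes):
    """Build a directed network whose edge weights count consecutive-note pairs."""
    return Counter(zip(notes, notes[1:]))

def degree_centrality(edges):
    """Weighted out-degree of each node, a simple centrality measure."""
    deg = defaultdict(int)
    for (src, _dst), w in edges.items():
        deg[src] += w
    return dict(deg)

def note_entropy(notes):
    """Shannon entropy (bits) of the note distribution in the fragment."""
    counts = Counter(notes)
    n = len(notes)
    return -sum(c / n * math.log2(c / n) for c in counts.values())

# Hypothetical fragment (note names only, no durations).
fragment = ["C", "E", "G", "C", "E", "G", "C"]
edges = transition_network(fragment)
```

High-centrality nodes correspond to predominant elements, while the entropy summarises the complexity of the information in the fragment.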

Keywords: computational musicology, mathematical music modelling, music analysis, style classification

Procedia PDF Downloads 77
2354 Instrumentation for Engine Start Cycle Characterization at Cold Weather High Altitude Condition

Authors: Amit Kumar Gupta, Rohit Vashistha, G. P. Ravishankar, Mahesh P. Padwale

Abstract:

A cold-soaked gas turbine engine has known starting problems in high altitude and low temperature conditions. High altitude results in lower ambient temperature, pressure, and density. Soaking at low temperature leads to higher oil viscosity, increasing the torque requirement of the engine starter system. Low temperature soaks also result in a cold compressor rotor and casing. Since the thermal mass of the rotor is higher than that of the casing, the casing expands faster, thereby increasing the blade-casing tip clearance. The low pressure flow over the compressor blades, coupled with the secondary flow through the compressor tip clearance during start, results in stall inception. The present study discusses the engine instrumentation required for capturing the stall inception event. The engine fan exit and combustion chamber were instrumented with dynamic pressure probes to capture the pressure characteristics, and a clamp-on current meter on the primary igniter cable captured the ignition event during the start cycle. The experiment was carried out at 10,500 ft pressure altitude and -15°C ambient temperature. High pressure compressor stall events were recorded during the starts.

Keywords: compressor inlet, dynamic pressure probe, engine start cycle, flight test instrumentation

Procedia PDF Downloads 303
2353 A Passive Digital Video Authentication Technique Using Wavelet Based Optical Flow Variation Thresholding

Authors: R. S. Remya, U. S. Sethulekshmi

Abstract:

Detecting the authenticity of a video is an important issue in digital forensics, as video is used as silent evidence in court, for example in child pornography and movie piracy cases, insurance claims, cases involving scientific fraud, traffic monitoring, etc. The biggest threat to video data is the availability of modern open video editing tools, which enable easy editing of videos without leaving any trace of tampering. In this paper, we propose an efficient passive method for detecting inter-frame video tampering, its type, and its location by estimating the optical flow of wavelet features of adjacent frames and thresholding the variation in the estimated feature. The performance of the algorithm is compared with z-score thresholding and achieves an efficiency above 95% on all the tested databases. The proposed method works well for videos with dynamic (forensics) as well as static (surveillance) backgrounds.
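The z-score thresholding baseline the abstract compares against can be sketched as follows: frames whose feature value deviates strongly from the sequence mean are flagged as candidate tampering points. The per-frame values below are hypothetical, and the threshold is an assumption for illustration.

```python
import math

def zscore_anomalies(values, threshold=3.0):
    """Flag indices whose z-score exceeds the threshold."""
    n = len(values)
    mean = sum(values) / n
    std = math.sqrt(sum((v - mean) ** 2 for v in values) / n)
    if std == 0:
        return []
    return [i for i, v in enumerate(values) if abs(v - mean) / std > threshold]

# Hypothetical per-frame optical-flow variation with one spike (frame 5),
# as might result from a frame insertion or deletion at that point.
flow_variation = [1.0, 1.1, 0.9, 1.0, 1.05, 9.0, 1.0, 0.95, 1.1, 1.0]
suspect_frames = zscore_anomalies(flow_variation, threshold=2.0)
```

The paper's method replaces this global z-score test with a threshold on the variation of wavelet-based optical-flow features.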

Keywords: discrete wavelet transform, optical flow, optical flow variation, video tampering

Procedia PDF Downloads 343
2352 A Network Approach to Analyzing Financial Markets

Authors: Yusuf Seedat

Abstract:

The necessity to understand global financial markets has increased following the unfortunate spread of the recent financial crisis around the world. Financial markets are considered complex systems consisting of highly volatile movements whose indexes fluctuate without any clear pattern. Analytic methods for stock prices have been proposed in which financial markets are modeled using common network analysis tools and methods. It has been found that two key components of social network analysis are relevant to modeling financial markets, allowing us to make accurate predictions of stock prices within the financial market. Financial markets have a number of interacting components, leading to complex behavioral patterns. This paper describes a social network approach to analyzing financial markets as a viable way of studying how complex stock markets function. We also look at how social network analysis techniques and metrics are used to gauge an understanding of the evolution of financial markets, as well as how community detection can be used to qualify and quantify influence within a network.
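A common way to turn market data into a network, before applying centrality or community detection, is to connect stocks whose return series are strongly correlated. The sketch below shows that construction; the abstract does not specify the paper's exact construction, and the three return series are invented.

```python
def pearson(a, b):
    """Pearson correlation coefficient of two equal-length series."""
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    cov = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    va = sum((x - ma) ** 2 for x in a) ** 0.5
    vb = sum((y - mb) ** 2 for y in b) ** 0.5
    return cov / (va * vb)

def correlation_network(returns, threshold=0.8):
    """Connect two stocks when their return correlation exceeds threshold."""
    names = list(returns)
    edges = set()
    for i, u in enumerate(names):
        for v in names[i + 1:]:
            if pearson(returns[u], returns[v]) > threshold:
                edges.add((u, v))
    return edges

# Hypothetical daily returns for three stocks; A and B move together.
returns = {"A": [0.01, -0.02, 0.03, -0.01],
           "B": [0.012, -0.018, 0.028, -0.011],
           "C": [-0.01, 0.02, -0.03, 0.01]}
edges = correlation_network(returns)
```

Community detection on such a graph then groups co-moving stocks, which is one way influence within the market can be quantified.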

Keywords: network analysis, social networks, financial markets, stocks, nodes, edges, complex networks

Procedia PDF Downloads 174
2351 Artificial Intelligence-Based Detection of Individuals Suffering from Vestibular Disorder

Authors: Dua Hişam, Serhat İkizoğlu

Abstract:

Identifying the problem behind a balance disorder is one of the most interesting topics in the medical literature. This study advances the development of artificial intelligence (AI) algorithms by applying multiple machine learning (ML) models to sensory gait data collected from humans in order to distinguish normal subjects from those suffering from vestibular system (VS) problems. Although AI is widely utilized as a diagnostic tool in medicine, AI models have not previously been used to perform feature extraction and identify VS disorders through training on raw data. In this study, three ML models – the Random Forest classifier (RF), Extreme Gradient Boosting (XGB), and K-Nearest Neighbors (KNN) – were trained to detect VS disorder, and the performance of the algorithms was compared using accuracy, recall, precision, and F1-score. With an accuracy of 95.28%, the Random Forest classifier was the most accurate model.
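The four evaluation metrics named above follow directly from a confusion-matrix count, as sketched below. The labels are hypothetical toy data, not the study's results.

```python
def scores(y_true, y_pred, positive=1):
    """Accuracy, precision, recall, and F1 for a binary classifier."""
    tp = sum(t == positive and p == positive for t, p in zip(y_true, y_pred))
    fp = sum(t != positive and p == positive for t, p in zip(y_true, y_pred))
    fn = sum(t == positive and p != positive for t, p in zip(y_true, y_pred))
    acc = sum(t == p for t, p in zip(y_true, y_pred)) / len(y_true)
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    f1 = (2 * precision * recall / (precision + recall)
          if precision + recall else 0.0)
    return {"accuracy": acc, "precision": precision,
            "recall": recall, "f1": f1}

# Hypothetical labels: 1 = vestibular disorder, 0 = normal gait.
y_true = [1, 1, 1, 0, 0, 0, 1, 0]
y_pred = [1, 1, 0, 0, 0, 1, 1, 0]
m = scores(y_true, y_pred)
```

Reporting all four metrics matters because accuracy alone can be misleading when the disorder and normal classes are imbalanced.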

Keywords: vestibular disorder, machine learning, random forest classifier, k-nearest neighbor, extreme gradient boosting

Procedia PDF Downloads 53
2350 The Effect of Honeycomb Core Thickness on the Repeated Low-Velocity Impact Behavior of Sandwich Beams

Authors: S. H. Abo Sabah, A. B. H. Kueh, M. A. Megat Johari, T. A. Majid

Abstract:

In a recent study, a new bio-inspired honeycomb sandwich beam (BHSB) mimicking the head configuration of the woodpecker was developed. The beam consists of two carbon/epoxy composite face sheets, an aluminum honeycomb core, and a rubber core to enhance the repeated low-velocity impact resistance of sandwich structures. This paper aims to numerically enhance the repeated low-velocity impact resistance of the BHSB by optimizing the aluminum honeycomb core thickness. The beam was investigated with three core thicknesses, 20 mm, 25 mm, and 30 mm, at three impact energy levels (13.5 J, 15.55 J, 21.43 J). The results revealed that increasing the thickness of the aluminum honeycomb core to a certain level enhances the sandwich beam's stiffness. The beam with the 25 mm honeycomb core was the only beam that could sustain five repeated impacts, achieving the highest impact resistance efficiency index, especially at high energy levels. Furthermore, the bottom face sheet of this beam developed the lowest stresses, indicating that this thickness performs relatively better during impact events since it allows minimal stress to reach the bottom face sheet. Overall, increasing the aluminum core thickness increases the height of its cells, making them susceptible to buckling. Therefore, this study suggests that the optimal thickness of the aluminum honeycomb core should be 65% of the overall thickness of the sandwich beam to achieve the best impact resistance.

Keywords: sandwich beams, core thickness, impact behavior, finite element analysis, modeling

Procedia PDF Downloads 138
2349 AI-Driven Forecasting Models for Anticipating Oil Market Trends and Demand

Authors: Gaurav Kumar Sinha

Abstract:

The volatility of the oil market, influenced by geopolitical, economic, and environmental factors, presents significant challenges for stakeholders in predicting trends and demand. This article explores the application of artificial intelligence (AI) in developing robust forecasting models to anticipate changes in the oil market more accurately. We delve into various AI techniques, including machine learning, deep learning, and time series analysis, that have been adapted to analyze historical data and current market conditions to forecast future trends. The study evaluates the effectiveness of these models in capturing complex patterns and dependencies in market data, which traditional forecasting methods often miss. Additionally, the paper discusses the integration of external variables such as political events, economic policies, and technological advancements that influence oil prices and demand. By leveraging AI, stakeholders can achieve a more nuanced understanding of market dynamics, enabling better strategic planning and risk management. The article concludes with a discussion on the potential of AI-driven models in enhancing the predictive accuracy of oil market forecasts and their implications for global economic planning and strategic resource allocation.

Keywords: AI forecasting, oil market trends, machine learning, deep learning, time series analysis, predictive analytics, economic factors, geopolitical influence, technological advancements, strategic planning

Procedia PDF Downloads 14
2348 GPU Based High Speed Error Protection for Watermarked Medical Image Transmission

Authors: Md Shohidul Islam, Jongmyon Kim, Ui-pil Chong

Abstract:

Medical images are an integral part of e-health care and e-diagnosis systems. Medical image watermarking is widely used to protect patients' information from malicious alteration and manipulation. The watermarked medical images are transmitted over the internet among patients and primary and referred physicians. The images are highly prone to corruption in the wireless transmission medium due to various noises, deflections, and refractions. Distortion in the received images leads to faulty watermark detection and inappropriate disease diagnosis. To address the issue, this paper utilizes an error correction code (ECC), the (8, 4) Hamming code, in an existing watermarking system. In addition, we implement the computationally complex ECC on a graphics processing unit (GPU) to accelerate it and support real-time requirements. Experimental results show that the GPU achieves a considerable speedup over the sequential CPU implementation, while maintaining 100% ECC efficiency.
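The (8, 4) Hamming code named above adds three parity bits plus an overall parity bit to every 4 data bits, allowing any single-bit error per codeword to be corrected. A plain-Python sketch of one common bit layout is shown below (the paper's exact bit ordering and GPU kernel are not specified; this CPU reference version only illustrates the code itself).

```python
def hamming84_encode(nibble):
    """Encode 4 data bits [d1,d2,d3,d4] into an 8-bit extended
    Hamming codeword [p1,p2,d1,p3,d2,d3,d4,p0] (p0 = overall parity)."""
    d1, d2, d3, d4 = nibble
    p1 = d1 ^ d2 ^ d4
    p2 = d1 ^ d3 ^ d4
    p3 = d2 ^ d3 ^ d4
    code = [p1, p2, d1, p3, d2, d3, d4]
    overall = 0
    for bit in code:
        overall ^= bit
    code.append(overall)
    return code

def hamming84_decode(code):
    """Correct any single-bit error and return the 4 data bits."""
    c = list(code)
    s1 = c[0] ^ c[2] ^ c[4] ^ c[6]   # checks codeword positions 1,3,5,7
    s2 = c[1] ^ c[2] ^ c[5] ^ c[6]   # checks codeword positions 2,3,6,7
    s3 = c[3] ^ c[4] ^ c[5] ^ c[6]   # checks codeword positions 4,5,6,7
    syndrome = s1 + 2 * s2 + 4 * s3  # nonzero syndrome = error position
    if syndrome:
        c[syndrome - 1] ^= 1         # flip the erroneous bit
    return [c[2], c[4], c[5], c[6]]
```

Each watermarked-image nibble is encoded before transmission; on the GPU, many codewords are decoded in parallel, which is where the reported speedup comes from.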

Keywords: medical image watermarking, e-health system, error correction, Hamming code, GPU

Procedia PDF Downloads 273
2347 Comparison of Concentration of Heavy Metals in PM2.5 Analyzed in Three Different Global Research Institutions Using X-Ray Fluorescence

Authors: Sungroul Kim, Yeonjin Kim

Abstract:

This study compared the concentrations of heavy metals analyzed from the same samples with three X-ray fluorescence (XRF) spectrometers at three different global research institutions: PAN (a branch of Malvern Panalytical, Seoul, South Korea), RTI (Research Triangle Institute, NC, U.S.A.), and the aerosol laboratory at Harvard University, Boston, U.S.A. To achieve our research objectives, indoor air filter samples were collected at the homes (n=24) of adults or child asthmatics and then analyzed at PAN, followed by Harvard University and RTI consecutively. Descriptive statistics were computed for data comparison, as well as correlation and simple regression analyses, using R version 4.0.3. As a result, the detection rates of most heavy metals analyzed at the three institutions were about 90%. Of the 25 elements commonly analyzed among those institutions, 16 elements showed an R² (coefficient of determination) of 0.7 or higher (10 components were 0.9 or higher). The findings of this study demonstrate that XRF is a useful device ensuring reproducibility and comparability for measuring heavy metals in PM2.5 collected from the indoor air of asthmatics' homes.
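The R² agreement statistic used above comes from a simple regression of one instrument's readings on another's. The sketch below shows that computation (the study used R 4.0.3; this Python version is only illustrative, and the two series of lead concentrations are invented).

```python
def r_squared(x, y):
    """Coefficient of determination of the least-squares line y ~ x."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxx = sum((xi - mx) ** 2 for xi in x)
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    slope = sxy / sxx
    intercept = my - slope * mx
    ss_res = sum((yi - (slope * xi + intercept)) ** 2
                 for xi, yi in zip(x, y))
    ss_tot = sum((yi - my) ** 2 for yi in y)
    return 1.0 - ss_res / ss_tot

# Hypothetical lead (Pb) concentrations (ng/m^3) for the same filters
# measured on two XRF instruments.
inst_a = [5.0, 8.0, 12.0, 20.0, 33.0]
inst_b = [5.4, 7.9, 12.5, 19.2, 34.0]
agreement = r_squared(inst_a, inst_b)
```

An R² near 1 means the two instruments rank and scale the samples almost identically, which is the comparability criterion the study reports element by element.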

Keywords: heavy metals, indoor air quality, PM2.5, X-ray fluorescence

Procedia PDF Downloads 181
2346 A Review of HVDC Modular Multilevel Converters Subjected to DC and AC Faults

Authors: Jude Inwumoh, Adam P. R. Taylor, Kosala Gunawardane

Abstract:

Modular multilevel converters (MMCs) exhibit highly scalable and modular characteristics with good voltage/power expansion, fault tolerance capability, low output harmonic content, good redundancy, and a flexible front-end configuration. Fault detection, location, and isolation, as well as maintaining fault ride-through (FRT), are major challenges to MMC reliability and power supply sustainability. Several papers have been reviewed to seek the best MMC configuration with fault capability. DC faults are the most common fault type; the probability that an AC fault occurs in an MMC is low, though the consequences of AC faults are severe. This paper reviews several MMC topologies and modulation techniques for tackling faults. These fault control strategies are compared based on cost, complexity, controllability, and power loss. A meshed network of half-bridge (HB) MMC topology was optimal in providing fault ride-through compared with other MMC topologies, but only when combined with DC circuit breakers (CBs), AC CBs, and fault current limiters (FCLs).

Keywords: MMC-HVDC, DC faults, fault current limiters, control scheme

Procedia PDF Downloads 122
2345 Single Cell and Spatial Transcriptomics: A Beginners Viewpoint from the Conceptual Pipeline

Authors: Leo Nnamdi Ozurumba-Dwight

Abstract:

Messenger ribooxynucleic acid (mRNA) molecules are compositional, protein-based. These proteins, encoding mRNA molecules (which collectively connote the transcriptome), when analyzed by RNA sequencing (RNAseq), unveils the nature of gene expression in the RNA. The obtained gene expression provides clues of cellular traits and their dynamics in presentations. These can be studied in relation to function and responses. RNAseq is a practical concept in Genomics as it enables detection and quantitative analysis of mRNA molecules. Single cell and spatial transcriptomics both present varying avenues for expositions in genomic characteristics of single cells and pooled cells in disease conditions such as cancer, auto-immune diseases, hematopoietic based diseases, among others, from investigated biological tissue samples. Single cell transcriptomics helps conduct a direct assessment of each building unit of tissues (the cell) during diagnosis and molecular gene expressional studies. A typical technique to achieve this is through the use of a single-cell RNA sequencer (scRNAseq), which helps in conducting high throughput genomic expressional studies. However, this technique generates expressional gene data for several cells which lack presentations on the cells’ positional coordinates within the tissue. As science is developmental, the use of complimentary pre-established tissue reference maps using molecular and bioinformatics techniques has innovatively sprung-forth and is now used to resolve this set back to produce both levels of data in one shot of scRNAseq analysis. This is an emerging conceptual approach in methodology for integrative and progressively dependable transcriptomics analysis. This can support in-situ fashioned analysis for better understanding of tissue functional organization, unveil new biomarkers for early-stage detection of diseases, biomarkers for therapeutic targets in drug development, and exposit nature of cell-to-cell interactions. 
These are also vital genomic signatures and characterizations for clinical applications. Over the past decades, RNAseq has generated a wide array of information that is igniting bespoke breakthroughs and innovations in biomedicine. In contrast, spatial transcriptomics is tissue-level based and is used to study biological specimens with heterogeneous features. It exposits the gross identity of the investigated mammalian tissues, which can then be used to study cell differentiation, track cell-lineage trajectory patterns and behavior, and examine regulatory homeostasis in disease states. It also requires referenced positional analysis to assemble the genomic signatures assessed from the single cells in the tissue sample. Given these two approaches to RNA transcriptomics, applicable to differing quantities of cells and with avenues for appropriate resolution, both have made the study of gene expression from mRNA molecules engaging and progressive, and are helping to tackle health challenges head-on.
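The reference-map idea described above can be illustrated with a deliberately simplified sketch: each dissociated cell from an scRNAseq run is assigned to the reference-map spot whose expression profile it most resembles, recovering approximate positional coordinates. This is not the authors' pipeline; the gene vectors, spot names, and cosine-similarity matching rule are all illustrative assumptions.

```python
import math

def cosine(u, v):
    # Cosine similarity between two expression vectors
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(b * b for b in v))
    return dot / (nu * nv) if nu and nv else 0.0

def map_cells_to_spots(cell_profiles, spot_profiles):
    """Assign each dissociated cell to the reference spot whose expression
    profile it most resembles (highest cosine similarity)."""
    assignments = {}
    for cell_id, profile in cell_profiles.items():
        best = max(spot_profiles, key=lambda s: cosine(profile, spot_profiles[s]))
        assignments[cell_id] = best
    return assignments

# Toy expression vectors over three hypothetical genes
cells = {"cell_1": [9.0, 1.0, 0.5], "cell_2": [0.2, 8.0, 7.5]}
spots = {"spot_A": [10.0, 0.8, 0.3], "spot_B": [0.1, 9.0, 8.0]}
print(map_cells_to_spots(cells, spots))  # cell_1 -> spot_A, cell_2 -> spot_B
```

Real integration methods use thousands of genes and probabilistic deconvolution rather than a hard nearest-profile assignment, but the core matching step has this shape.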

Keywords: transcriptomics, RNA sequencing, single cell, spatial, gene expression

Procedia PDF Downloads 108
2344 Content Analysis of Images Shared on Twitter during 2017 Iranian Protests

Authors: Maryam Esfandiari, Bohdan Fridrich

Abstract:

On December 28, 2017, a wave of protests erupted in several Iranian cities. Protesters demonstrated against the president, Hassan Rouhani, and the theocratic nature of the regime. Iran has a recent history of protest movements, such as the Green Movement, responsible for the demonstrations after the 2009 Iranian presidential election. However, the 2017/2018 protests differ from the previous ones in terms of organization and agenda. The events show little to no central organization and appear to have been sparked by grassroots movements and by citizens' fatigue with government corruption, authoritarianism, and the economic problems of the country. Social media has played an important role in communicating the protests to the outside world and also in general coordination. Using content analysis, this paper analyzes the visual content of Twitter posts published during the protests. It aims to find the correlation between their decentralized nature and the nature of the tweets, whether emotionally arousing or efficacy-eliciting. Pictures are searched by hashtags and coded by their content, such as 'crowds,' 'protest activities,' 'symbols of unity,' 'violence,' 'iconic figures,' etc. The study determines what type of content prevails and what type is the most impactful in terms of reach. This study contributes to understanding the role of social media, both as a tool and a space, in protest organization and portrayal in countries with limited Internet access.

Keywords: twitter, Iran, collective action, protest

Procedia PDF Downloads 131
2343 An Investigation of the Influence of the Iranian 1979 Revolution on Tehran’s Public Art

Authors: M. Sohrabi Narciss

Abstract:

Urban spaces of Tehran, the capital of Iran, have witnessed many revolts, movements, and protests during the past few decades. After the Iranian Constitutional Revolution, the 1979 Revolution has had a profound impact on Tehran's urban space. In 1979, the world watched as Iranians demonstrated en masse against the Pahlavi dynasty, which eventually led to its overthrow. Tehran's public space is replete with images and artwork depicting the overthrow of the Pahlavi regime and the establishment of an Islamic government in Iran. The public artworks related to the 1979 Islamic Revolution reflect the riots, protests, and strikes that Iranians underwent during the revolution. Many of these artworks try to revitalize the events of the 1970s by means of collective memory. Almost four decades have passed since the revolution, and ever since, the public artwork has been affected, either directly or indirectly, by the Iran-Iraq War, the Green Movement, and the rise and fall of various political forces. The present study investigates Tehran's urban artwork, such as the urban sculptures and mural paintings organized and supervised by the government, and the graffiti drawn by critics or opposition groups. To this end, in addition to the available documents, field research and questionnaires were used to qualitatively analyze the data. This paper addresses the following questions: 1) What changes have occurred in Tehran's urban art? 2) Does the public, revolution-related artwork have an effect on people's vitality? 3) Do Iranians find these artworks appealing?

Keywords: public space, Tehran, public art, movement, Islamic revolution

Procedia PDF Downloads 173
2342 Standalone Docking Station with Combined Charging Methods for Agricultural Mobile Robots

Authors: Leonor Varandas, Pedro D. Gaspar, Martim L. Aguiar

Abstract:

One of the biggest concerns in the field of agriculture is the energy efficiency of the robots that will perform agricultural activities, and their charging methods. In this paper, two different charging methods for agricultural standalone docking stations are presented, taking into account variables such as field size and its irregularities, the nature of the work the robot will perform, and the deadlines that have to be respected, among others. Their features also depend on the orchard, the season, and the battery type, its technical specifications, and cost. The first charging base method focuses on wireless charging, presenting more benefits for small fields. The second charging base method relies on battery replacement, being more suitable for large fields, thus avoiding stopping the robot to recharge. Among the many methods for charging a battery, constant current-constant voltage (CC-CV) was considered the most appropriate for both its simplicity and effectiveness. The choice of the battery for agricultural purposes is of utmost importance. While the most commonly used battery is the Li-ion battery, this study also discusses the use of new graphene-based batteries with 45% higher capacity than Li-ion. A Battery Management System (BMS) is applied for battery balancing. All these approaches combined proved to be a promising way to improve much technical agricultural work, not just in terms of planting and harvesting but also regarding techniques to prevent harmful events like pests and weeds, or even to reduce crop time and cost.
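The CC-CV logic mentioned in the abstract can be sketched as a simple simulation: charge at constant current until the terminal voltage reaches the limit, then hold that voltage while the current tapers to a cutoff. All parameters below (capacity, voltages, internal resistance, and the linear open-circuit-voltage model) are hypothetical illustrations, not values from the paper.

```python
def simulate_cc_cv(capacity_ah=10.0, i_cc=5.0, v_max=4.2, v0=3.0,
                   i_cutoff=0.25, r_int=0.05, dt_h=0.01):
    """Simulate a simplified CC-CV charge cycle.

    The open-circuit voltage is modeled as rising linearly with state of
    charge; the terminal voltage adds an IR drop across the internal
    resistance. Returns (elapsed_hours, final_state_of_charge).
    """
    soc = 0.0   # state of charge, 0..1
    t = 0.0
    while True:
        v_oc = v0 + (v_max - v0) * soc          # crude linear OCV model
        if v_oc + i_cc * r_int < v_max:
            i = i_cc                            # CC phase: full current
        else:
            i = max((v_max - v_oc) / r_int, 0.0)  # CV phase: current tapers
        if i <= i_cutoff:                       # charge-complete criterion
            return t, soc
        soc = min(soc + i * dt_h / capacity_ah, 1.0)
        t += dt_h

hours, soc = simulate_cc_cv()
print(round(hours, 2), round(soc, 3))  # charge time (h) and final SoC
```

A real controller would also enforce temperature and over-voltage protections through the BMS; this sketch only shows the two-phase current/voltage logic.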

Keywords: agricultural mobile robot, charging methods, battery replacement method, wireless charging method

Procedia PDF Downloads 129
2341 Statistical Analysis of Extreme Flow (Regions of Chlef)

Authors: Bouthiba Amina

Abstract:

The estimation of statistics related to precipitation represents a vast domain that poses numerous challenges to meteorologists and hydrologists. Sometimes it is necessary to estimate extreme events, and their return periods, for sites where there is little or no data. The search for a frequency model of daily rainfall depths is of great importance in operational hydrology: it establishes a basis for predicting the frequency and intensity of floods by estimating the amount of precipitation in past years. The best-known and most common approach is the statistical one. It consists of looking for the probability law that best fits the observed values of the random variable 'daily maximum rainfall,' after a comparison of various probability laws and estimation methods by means of goodness-of-fit tests. Therefore, a frequency analysis of the annual series of daily maximum rainfall was carried out on the data of 54 pluviometric stations of the high and middle Chlef basin, over the period from 1970 to 2013, and used to forecast quantiles. Five laws usually applied to the study and analysis of maximum daily rainfall were considered: the three-parameter generalized extreme value (GEV) law, the two-parameter extreme value laws (Gumbel and log-normal), and the three-parameter Pearson type III and Log-Pearson III laws. In Algeria, Gumbel's law has long been used to estimate the quantiles of maximum flows; here, we check this practice and choose the most reliable law.
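As a minimal illustration of the frequency-analysis step, the sketch below fits a two-parameter Gumbel law to a hypothetical annual-maximum rainfall series by the method of moments and forecasts a return-period quantile. The sample values and the choice of the method-of-moments estimator are assumptions for illustration, not the authors' data or fitting procedure.

```python
import math
import statistics

EULER_GAMMA = 0.5772156649015329

def fit_gumbel_moments(annual_maxima):
    """Fit Gumbel parameters (location mu, scale beta) by the method of
    moments: beta = sqrt(6)*s/pi, mu = mean - gamma*beta."""
    mean = statistics.mean(annual_maxima)
    s = statistics.stdev(annual_maxima)
    beta = math.sqrt(6.0) * s / math.pi
    mu = mean - EULER_GAMMA * beta
    return mu, beta

def gumbel_quantile(mu, beta, return_period):
    """Rainfall depth exceeded, on average, once every `return_period`
    years: x_T = mu - beta * ln(-ln(1 - 1/T))."""
    p = 1.0 - 1.0 / return_period
    return mu - beta * math.log(-math.log(p))

# Hypothetical annual daily-maximum rainfall series (mm)
sample = [42.0, 55.3, 38.1, 61.7, 47.9, 70.2, 52.4, 45.0, 58.6, 49.3]
mu, beta = fit_gumbel_moments(sample)
print(round(gumbel_quantile(mu, beta, 100), 1))  # 100-year quantile, mm
```

A full study would fit all five candidate laws (GEV, Gumbel, log-normal, Pearson III, Log-Pearson III) and compare them with goodness-of-fit tests before forecasting quantiles.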

Keywords: return period, extreme flow, statistics laws, Gumbel, estimation

Procedia PDF Downloads 58
2340 Detecting Cyberbullying, Spam and Bot Behavior and Fake News in Social Media Accounts Using Machine Learning

Authors: M. D. D. Chathurangi, M. G. K. Nayanathara, K. M. H. M. M. Gunapala, G. M. R. G. Dayananda, Kavinga Yapa Abeywardena, Deemantha Siriwardana

Abstract:

Due to the growing popularity of social media platforms at present, there are various concerns, mostly cyberbullying, spam, bot accounts, and the spread of incorrect information. To develop a risk score calculation system as a thorough method for deciphering and exposing unethical social media profiles, this research explores the algorithms most suitable, to the best of our knowledge, for detecting the mentioned concerns. Multiple models, such as Naïve Bayes, CNN, KNN, Stochastic Gradient Descent, Gradient Boosting Classifier, etc., were examined, and the best results were incorporated into the development of the risk score system. For cyberbullying, the Logistic Regression algorithm achieved an accuracy of 84.9%, while the spam-detecting MLP model reached 98.02% accuracy. For identifying bot accounts, the Random Forest algorithm obtained 91.06% accuracy, and 84% accuracy was achieved for fake news detection using SVM.
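Naïve Bayes is one of the model families the abstract says was examined; a toy bag-of-words version with Laplace smoothing can be sketched as follows. The training texts and labels are invented for illustration and stand in for a real labeled spam corpus.

```python
import math
from collections import Counter

class NaiveBayesText:
    """Minimal bag-of-words Naive Bayes with add-one (Laplace) smoothing --
    a toy stand-in for the classifiers compared in the paper."""

    def fit(self, texts, labels):
        self.word_counts = {}           # label -> Counter of word frequencies
        self.doc_counts = Counter(labels)
        self.vocab = set()
        for text, label in zip(texts, labels):
            counts = self.word_counts.setdefault(label, Counter())
            for word in text.lower().split():
                counts[word] += 1
                self.vocab.add(word)
        return self

    def predict(self, text):
        n_docs = sum(self.doc_counts.values())
        scores = {}
        for label, counts in self.word_counts.items():
            total = sum(counts.values())
            # log prior + sum of smoothed log likelihoods
            score = math.log(self.doc_counts[label] / n_docs)
            for word in text.lower().split():
                score += math.log((counts[word] + 1) / (total + len(self.vocab)))
            scores[label] = score
        return max(scores, key=scores.get)

# Hypothetical toy data
texts = ["win free prize now", "free money click now",
         "meeting agenda attached", "lunch at noon tomorrow"]
labels = ["spam", "spam", "ham", "ham"]
clf = NaiveBayesText().fit(texts, labels)
print(clf.predict("claim your free prize"))  # -> spam
```

The paper's reported accuracies come from far larger feature sets and stronger models (MLP, Random Forest, SVM); this sketch only shows the classification mechanic behind one of the simpler baselines.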

Keywords: cyberbullying, spam behavior, bot accounts, fake news, machine learning

Procedia PDF Downloads 17
2339 Tax Evasion with Mobility between the Regular and Irregular Sectors

Authors: Xavier Ruiz Del Portal

Abstract:

This paper incorporates mobility between the legal and black economies into a model of tax evasion with endogenous labor supply, in which underreporting is possible in one sector but impossible in the other. We find that the effects along the extensive margin (the number of evaders) are more robust and conclusive than those along the intensive margin (hours of illegal work) usually considered in the literature. In particular, it is shown that the following policies reduce the number of evaders: (a) larger and more progressive evasion penalties; (b) higher detection probabilities; (c) an increase in the legal-sector wage rate; (d) a decrease in the moonlighting wage rate; (e) higher costs of creating opportunities to evade; (f) fewer opportunities to evade; and (g) greater psychological costs of tax evasion. When tax concealment and illegal work are also taken into account, the effects do not vary significantly under the assumptions in Cowell (1985), except that policies (a) and (b) hold only for low- and middle-income groups, and policies (e) and (f) only for high-income groups.

Keywords: income taxation, tax evasion, extensive margin responses, the penalty system

Procedia PDF Downloads 141
2338 Grating Scale Thermal Expansion Error Compensation for Large Machine Tools Based on Multiple Temperature Detection

Authors: Wenlong Feng, Zhenchun Du, Jianguo Yang

Abstract:

To decrease the grating scale thermal expansion error, a novel method based on multiple temperature detection is proposed. Several temperature sensors are installed on the grating scale, and their temperatures are recorded. The temperature at every point on the grating scale is calculated by interpolating between adjacent sensors. According to the thermal expansion principle, the grating scale thermal expansion error model can be established by integrating over the variations of position and temperature. A novel compensation method is proposed in this paper. By applying the established error model, the grating scale thermal expansion error is decreased by 90% compared with no compensation. The residual positioning error of the grating scale is less than 15 μm/10 m, and the accuracy of the machine tool is significantly improved.
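The interpolation-plus-integration scheme can be sketched as follows: temperatures are linearly interpolated between sensors, and the expansion error at a position x is the integral of α·(T(u) − T_ref) along the scale up to x. The sensor positions, temperatures, and the steel expansion coefficient below are illustrative assumptions, not the authors' measurements.

```python
import bisect

ALPHA_STEEL = 11.5e-6  # typical thermal expansion coefficient of steel, 1/K

def interp_temperature(positions, temps, x):
    """Linearly interpolate temperature at position x (mm) between sensors."""
    if x <= positions[0]:
        return temps[0]
    if x >= positions[-1]:
        return temps[-1]
    i = bisect.bisect_right(positions, x)
    x0, x1 = positions[i - 1], positions[i]
    t0, t1 = temps[i - 1], temps[i]
    return t0 + (t1 - t0) * (x - x0) / (x1 - x0)

def expansion_error(positions, temps, x, t_ref=20.0, steps=1000):
    """Accumulated thermal expansion error (mm) at position x (mm) by
    trapezoidal integration of alpha*(T(u) - t_ref) along the scale."""
    h = x / steps
    total = 0.0
    for k in range(steps):
        u0, u1 = k * h, (k + 1) * h
        f0 = ALPHA_STEEL * (interp_temperature(positions, temps, u0) - t_ref)
        f1 = ALPHA_STEEL * (interp_temperature(positions, temps, u1) - t_ref)
        total += 0.5 * (f0 + f1) * h
    return total

# Hypothetical sensor layout on a 2 m scale: positions in mm, temps in deg C
positions = [0.0, 500.0, 1000.0, 1500.0, 2000.0]
temps = [21.0, 22.5, 23.0, 22.0, 21.5]
err_mm = expansion_error(positions, temps, 2000.0)
print(round(err_mm * 1000, 1))  # error in micrometers -> 50.3
```

Compensation then subtracts this modeled error from the scale reading at each commanded position, which is how the 90% reduction reported in the abstract would be realized.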

Keywords: thermal expansion error of grating scale, error compensation, machine tools, integral method

Procedia PDF Downloads 346
2337 Resilient Analysis as an Alternative to Conventional Seismic Analysis Methods for the Maintenance of a Socioeconomical Functionality of Structures

Authors: Sara Muhammad Elqudah, Vigh László Gergely

Abstract:

Catastrophic events, such as earthquakes, are sudden, short, and devastating, threatening lives, demolishing futures, and causing huge economic losses. Current seismic analysis and design standards are based on life-safety levels, where only some residual strength and stiffness are left in the structure, leaving it beyond economical repair. Consequently, it has become necessary to introduce and implement the concept of resilient design. Resilient design is about designing for ductility over time: resisting, absorbing, and recovering from the effects of a hazard in an appropriate and timely manner while maintaining the functionality of the structure in the aftermath of the incident. Resilient analysis is mainly based on fragility, vulnerability, and functionality curves, from which a resilience index is eventually generated; the higher this index, the better the performance of the structure. In this paper, the seismic performance of a simple two-story reinforced concrete building, located in a moderate seismic region, has been evaluated using the conventional seismic analysis methods, namely the linear static analysis, the response spectrum analysis, and the pushover analysis, and the results of these methods are compared to those of the resilient analysis. Results highlight that resilient analysis was the most suitable method for generating a more ductile and functional structure from a socio-economic perspective, in comparison to the standard seismic analysis methods.
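One common way to compute a resilience index, sketched here as an illustration of the general idea rather than the authors' exact formulation, is as the normalized area under the functionality curve Q(t) over the control horizon. The recovery trajectory below is invented for illustration.

```python
def resilience_index(times, functionality, horizon=None):
    """Resilience index as the normalized area under the functionality
    curve Q(t) (values in 0..1) over the control horizon, computed with
    the trapezoidal rule. An index of 1.0 means no loss of function."""
    if horizon is None:
        horizon = times[-1] - times[0]
    area = 0.0
    for i in range(len(times) - 1):
        dt = times[i + 1] - times[i]
        area += 0.5 * (functionality[i] + functionality[i + 1]) * dt
    return area / horizon

# Hypothetical recovery: full function, sudden drop at the event, gradual repair
times = [0, 10, 10, 20, 40, 60, 90]          # days
q     = [1.0, 1.0, 0.4, 0.5, 0.7, 0.9, 1.0]  # functionality fraction
print(round(resilience_index(times, q), 3))  # -> 0.789
```

In a full resilient analysis, Q(t) itself would be derived from the fragility and vulnerability curves for each hazard intensity, and the index would be compared across design alternatives.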

Keywords: conventional analysis methods, functionality, resilient analysis, seismic performance

Procedia PDF Downloads 86