Search results for: method detection limit
20392 Ontology based Fault Detection and Diagnosis system Querying and Reasoning examples
Authors: Marko Batic, Nikola Tomasevic, Sanja Vranes
Abstract:
One of the strongholds of the widespread effort toward energy conservation and improved energy efficiency is the retrofit of high energy consumers in buildings. In general, HVAC systems are the largest energy consumers in buildings. However, they often suffer from mal-operation and/or malfunction, causing even higher energy consumption than necessary. Various Fault Detection and Diagnosis (FDD) systems can be successfully employed for this purpose, especially at the single device/unit level. In the case of more complex systems, where multiple devices operate in the context of the same building, significant energy efficiency improvements can only be achieved through comprehensive FDD systems that rely on additional higher-level knowledge, such as geographical location, served area, and intra- and inter-system dependencies. This paper presents a comprehensive FDD system that relies on a common knowledge repository storing all critical information. The discussed system is deployed as a test-bed platform at the Fiumicino and Malpensa airports in Italy. The paper presents the advantages of implementing the knowledge base as an ontology and illustrates the improved functionality of such a system through examples of typical queries and reasoning that enable the derivation of high-level energy conservation measures (ECMs). Key SPARQL queries and SWRL rules, based on the two instantiated airport ontologies, are elaborated. The detection of high-level irregularities in the operation of airport heating/cooling plants is discussed, and an estimation of energy savings is reported.
Keywords: airport ontology, knowledge management, ontology modeling, reasoning
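The kind of SPARQL query the abstract refers to can be sketched as a simple query builder. Every IRI, class, and property name below (ap:HVACUnit, ap:servesArea, ap:hasMeasuredSupplyTemp, ...) is a hypothetical placeholder, not the vocabulary of the actual airport ontologies:

```python
# Sketch of a fault-detection query in the spirit of the paper: find HVAC
# units whose measured supply temperature deviates from the setpoint while
# the served area is occupied. All names here are illustrative assumptions.
def build_deviation_query(threshold_celsius=3.0):
    return f"""
PREFIX ap: <http://example.org/airport-ontology#>
SELECT ?unit ?area ?measured ?setpoint
WHERE {{
  ?unit a ap:HVACUnit ;
        ap:servesArea ?area ;
        ap:hasMeasuredSupplyTemp ?measured ;
        ap:hasSetpointTemp ?setpoint .
  ?area ap:isOccupied true .
  FILTER (ABS(?measured - ?setpoint) > {threshold_celsius})
}}
"""

query = build_deviation_query(3.0)
print(query)
```

In practice such a query would run against the instantiated airport ontology via a SPARQL endpoint or an RDF library.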
Procedia PDF Downloads 541
20391 A Machine Learning Framework Based on Biometric Measurements for Automatic Fetal Head Anomalies Diagnosis in Ultrasound Images
Authors: Hanene Sahli, Aymen Mouelhi, Marwa Hajji, Amine Ben Slama, Mounir Sayadi, Farhat Fnaiech, Radhwane Rachdi
Abstract:
Fetal abnormality remains a public health problem affecting both mother and baby. Head defects are among the highest-risk fetal deformities. Fetal head categorization is a sensitive task that demands close attention from neurological experts. Biometric measurements can be extracted by gynecologists and compared with reference growth charts to identify normal or abnormal growth. Fetal head biometric measurements such as the Biparietal Diameter (BPD), Occipito-Frontal Diameter (OFD) and Head Circumference (HC) need to be monitored, and experts usually carry out their delineation manually. This work proposes a new approach to automatically compute BPD, OFD and HC based on morphological characteristics extracted from the head shape. The studied fetal ultrasound (US) images, selected at the same Gestational Age (GA), are classified into two categories: normal and abnormal. The abnormal subjects include hydrocephalus, microcephaly and dolichocephaly anomalies. Using a support vector machine (SVM) classifier, this study achieved high classification accuracy for the automated detection of anomalies. The proposed method is promising and does not require expert intervention.
Keywords: biometric measurements, fetal head malformations, machine learning methods, US images
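As a small illustration of how such biometric measurements relate, head circumference is often approximated from BPD and OFD with an ellipse-perimeter formula. The constant 1.62 and the percentile thresholds below are assumptions for illustration, not values from the paper:

```python
# Ellipse-perimeter approximation sometimes used in obstetric biometry:
# HC ≈ 1.62 * (BPD + OFD). Treat the constant and the thresholds below
# as illustrative assumptions, not the paper's values.
def head_circumference_mm(bpd_mm, ofd_mm):
    return 1.62 * (bpd_mm + ofd_mm)

def flag_growth(measured_mm, p5_mm, p95_mm):
    # Normal/abnormal rule against hypothetical reference-chart
    # percentiles for the same gestational age.
    if measured_mm < p5_mm:
        return "below range"    # e.g., microcephaly risk
    if measured_mm > p95_mm:
        return "above range"    # e.g., hydrocephalus risk
    return "within range"

hc = head_circumference_mm(90.0, 110.0)   # about 324 mm
print(hc, flag_growth(hc, 300.0, 360.0))
```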
Procedia PDF Downloads 290
20390 Digital Phase Shifting Holography in a Non-Linear Interferometer using Undetected Photons
Authors: Sebastian Töpfer, Marta Gilaberte Basset, Jorge Fuenzalida, Fabian Steinlechner, Juan P. Torres, Markus Gräfe
Abstract:
This work introduces a combination of digital phase-shifting holography with a non-linear interferometer using undetected photons. Non-linear interferometers can be used with a measurement scheme called quantum imaging with undetected photons, which allows the wavelength used for sampling an object to be separated from the wavelength detected by the imaging sensor. This method has recently attracted increasing attention, as it allows the use of exotic wavelengths (e.g., mid-infrared, ultraviolet) for object interaction while keeping detection in spectral regions with highly developed, comparatively low-cost imaging sensors. The object information, including its transmission and phase influence, is recorded in the form of an interferometric pattern. To record these patterns, this work combines quantum imaging with undetected photons and digital phase-shifting holography with minimal sampling of the interference. This extends the measurement capabilities of the quantum imaging scheme and brings it one step closer to application. Quantum imaging with undetected photons uses correlated photons generated by spontaneous parametric down-conversion in a non-linear interferometer to create indistinguishable photon pairs, which leads to an effect called induced coherence without induced emission. Placing an object inside the interferometer changes the interferometric pattern depending on the object’s properties. Digital phase-shifting holography records multiple images of the interference at determined phase shifts to reconstruct the complete interference shape, which can afterward be used to analyze the changes introduced by the object and deduce its properties. An extensive characterization of this method was carried out using a proof-of-principle setup. The measured spatial resolution, phase accuracy, and transmission accuracy are compared for different combinations of camera exposure times and numbers of interference sampling steps.
The current limits of this method are identified, indicating where further improvements are possible. In summary, this work presents an alternative holographic measurement method that combines non-linear interferometers with quantum imaging, enabling new measurement capabilities and motivating continued research.
Keywords: digital holography, quantum imaging, quantum holography, quantum metrology
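The phase-retrieval step behind phase-shifting holography can be sketched with the classical four-step scheme on synthetic data (a standard textbook method; the paper's minimal-sampling variant may differ):

```python
import numpy as np

# Four-step phase-shifting reconstruction: record the interference at
# phase shifts 0, pi/2, pi, 3*pi/2 and recover the object phase with an
# arctangent formula. All data here are synthetic.
rng = np.random.default_rng(0)
true_phase = rng.uniform(-np.pi, np.pi, size=(8, 8))  # synthetic object phase
A, B = 1.0, 0.5                                       # background, modulation

I1, I2, I3, I4 = (A + B * np.cos(true_phase + d)
                  for d in (0.0, np.pi / 2, np.pi, 3 * np.pi / 2))

# I1 - I3 = 2B*cos(phi) and I4 - I2 = 2B*sin(phi), so:
recovered = np.arctan2(I4 - I2, I1 - I3)
print(float(np.max(np.abs(recovered - true_phase))))  # ~0 (numerical error)
```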
Procedia PDF Downloads 95
20389 Post-Earthquake Damage Detection Using System Identification with a Pair of Seismic Recordings
Authors: Lotfi O. Gargab, Ruichong R. Zhang
Abstract:
A wave-based framework is presented for modeling seismic motion in multistory buildings and using the measured response for system identification, which can be utilized to extract important information regarding structural integrity. With one pair of building responses at two locations, a generalized model response is formulated based on wave propagation features and expressed as frequency and time response functions denoted, respectively, as GFRF and GIRF. In particular, the GIRF is fundamental in tracking arrival times of impulsive wave motion initiated at the response level, which depend on local model properties. Matching model and measured-structure responses helps identify model parameters and infer building properties. To show the effectiveness of this approach, the Millikan Library in Pasadena, California, is identified with recordings of the Yorba Linda earthquake of September 3, 2002.
Keywords: system identification, continuous-discrete mass modeling, damage detection, post-earthquake
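The GFRF/GIRF idea can be illustrated on synthetic data: dividing the spectra of two recordings gives a frequency response whose inverse FFT reveals the wave travel time. This toy sketch assumes an idealized delayed-and-attenuated relationship between the two records, which real building data only approximate:

```python
import numpy as np

# Toy GFRF/GIRF illustration: the upper-sensor record is modeled as a
# delayed, attenuated copy of the lower-sensor record. The spectral ratio
# is a frequency response (GFRF-like); its inverse FFT is an impulse
# response (GIRF-like) whose peak marks the wave travel time.
n = 256
rng = np.random.default_rng(1)
base = rng.standard_normal(n)           # recording at the lower location
delay, atten = 5, 0.6                   # "unknowns" to be recovered
roof = atten * np.roll(base, delay)     # recording at the upper location

gfrf = np.fft.fft(roof) / np.fft.fft(base)   # well-conditioned: white input
girf = np.real(np.fft.ifft(gfrf))

print(int(np.argmax(girf)))   # travel time in samples (here: 5)
```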
Procedia PDF Downloads 371
20388 Bayesian System and Copula for Event Detection and Summarization of Soccer Videos
Authors: Dhanuja S. Patil, Sanjay B. Waykar
Abstract:
Event detection is one of the key components for many kinds of applications built on video data systems. It has recently attracted extensive interest from both practitioners and academics in different areas. While video event detection has been studied extensively, considerably less existing work has considered multi-modal data and efficiency-related issues. During soccer matches, ambiguous situations arise that cannot easily be judged by the referee committee. A framework that objectively checks image sequences would prevent incorrect interpretations caused by errors or by the high speed of events. Bayesian networks provide a structure for dealing with this uncertainty by combining a simple graphical structure with probability calculus. We propose an efficient framework for the analysis and summarization of soccer videos using object-based features. The proposed work uses the t-cherry junction tree, a very recent advance in probabilistic graphical models, to create a compact representation and a good approximation of an otherwise intractable model. This approach has several advantages: first, the t-cherry tree gives the best approximation within the class of junction trees; second, the construction of a t-cherry junction tree can be largely parallelized; and finally, inference can be performed using distributed computation. Experimental results demonstrate the effectiveness, adequacy, and robustness of the proposed work on a comprehensive data set comprising several soccer videos captured at different venues.
Keywords: summarization, detection, Bayesian network, t-cherry tree
Procedia PDF Downloads 327
20387 Neural Network based Risk Detection for Dyslexia and Dysgraphia in Sinhala Language Speaking Children
Authors: Budhvin T. Withana, Sulochana Rupasinghe
Abstract:
The educational system faces a significant concern regarding Dyslexia and Dysgraphia, learning disabilities that impact reading and writing abilities. This is particularly challenging for children who speak the Sinhala language due to its complexity and uniqueness. Commonly used methods to detect the risk of Dyslexia and Dysgraphia rely on subjective assessments, leading to limited coverage and time-consuming processes; consequently, diagnoses can be delayed and opportunities for early intervention missed. To address this issue, the project developed a hybrid model that incorporates various deep learning techniques to detect the risk of Dyslexia and Dysgraphia. Specifically, ResNet50, VGG16, and YOLOv8 models were integrated to identify handwriting issues. The outputs of these models were then combined with other input data and fed into an MLP model. Hyperparameters of the MLP model were fine-tuned using Grid Search CV, enabling the identification of optimal values for the model. This approach proved highly effective in predicting the risk of Dyslexia and Dysgraphia, providing a valuable tool for early detection and intervention. The ResNet50 model exhibited a training accuracy of 0.9804 and a validation accuracy of 0.9653. The VGG16 model achieved a training accuracy of 0.9991 and a validation accuracy of 0.9891. The MLP model demonstrated a training accuracy of 0.99918, a testing accuracy of 0.99223, and a loss of 0.01371. These outcomes showcase the high accuracy achieved by the proposed hybrid model in predicting the risk of Dyslexia and Dysgraphia.
Keywords: neural networks, risk detection system, dyslexia, dysgraphia, deep learning, learning disabilities, data science
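The hyperparameter-tuning step can be sketched without any ML library. The grid values and the stand-in scoring function below are invented for illustration; in the paper's setting, the score would be the MLP's cross-validated accuracy on the combined CNN outputs and other inputs:

```python
import itertools

# A minimal grid search in the spirit of Grid Search CV, written without
# external libraries. The scoring function is a deterministic stand-in.
param_grid = {
    "hidden_units": [32, 64, 128],   # illustrative values, not the paper's
    "learning_rate": [0.01, 0.001],
}

def crossval_score(params):
    # Stand-in score that peaks at (64, 0.001); a real implementation
    # would train and validate the MLP here.
    penalty = abs(params["hidden_units"] - 64) / 128
    penalty += abs(params["learning_rate"] - 0.001) * 10
    return 1.0 - penalty

keys = list(param_grid)
best_score, best_params = -1.0, None
for values in itertools.product(*(param_grid[k] for k in keys)):
    params = dict(zip(keys, values))
    score = crossval_score(params)
    if score > best_score:
        best_score, best_params = score, params

print(best_params)   # {'hidden_units': 64, 'learning_rate': 0.001}
```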
Procedia PDF Downloads 67
20386 Clustering and Modelling Electricity Conductors from 3D Point Clouds in Complex Real-World Environments
Authors: Rahul Paul, Peter Mctaggart, Luke Skinner
Abstract:
Maintaining public safety and network reliability are the core objectives of all electricity distributors globally. For many electricity distributors, managing vegetation clearances from their above-ground assets (poles and conductors) is the most important and costly risk mitigation control employed to meet these objectives. Light Detection and Ranging (LiDAR) is widely used by utilities as a cost-effective method to inspect their spatially distributed assets at scale, often captured using high-powered LiDAR scanners attached to fixed-wing or rotary aircraft. The resulting 3D point cloud model is used by these utilities to perform engineering-grade measurements that guide the prioritisation of vegetation cutting programs. Advances in computer vision and machine-learning approaches are increasingly applied to increase automation and reduce inspection costs and time; however, real-world LiDAR capture variables (e.g., aircraft speed and height) create complexity, noise, and missing data, reducing the effectiveness of these approaches. This paper proposes a method for identifying each conductor from LiDAR data via clustering methods that can precisely reconstruct conductors in complex real-world configurations in the presence of high levels of noise. It proposes 3D catenary models for individual clusters, fitted to the captured LiDAR data points using a least-squares method. An iterative learning process is used to identify potential conductor models between pole pairs. The proposed method identifies the optimum parameters of the catenary function and then fits the LiDAR points to reconstruct the conductors.
Keywords: point cloud, LiDAR data, machine learning, computer vision, catenary curve, vegetation management, utility industry
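A simplified stand-in for the catenary-fitting step: for the model y = c + a*cosh((x - x0)/a), the vertical offset c is linear and can be solved in closed form, while a and x0 are found here by a coarse grid search (the paper uses an iterative least-squares scheme; all numbers below are synthetic):

```python
import numpy as np

# Fit y = c + a*cosh((x - x0)/a) to noiseless synthetic "conductor"
# points between two poles. Closed-form offset, grid search for (a, x0).
def fit_catenary(x, y, a_grid, x0_grid):
    best_sse, best_params = np.inf, None
    for a in a_grid:
        for x0 in x0_grid:
            model = a * np.cosh((x - x0) / a)
            c = np.mean(y - model)            # closed-form least-squares offset
            sse = np.sum((y - model - c) ** 2)
            if sse < best_sse:
                best_sse, best_params = sse, (a, x0, c)
    return best_params

x = np.linspace(-40.0, 40.0, 81)
y = 10.0 + 100.0 * np.cosh((x - 5.0) / 100.0)

a, x0, c = fit_catenary(x, y,
                        a_grid=np.arange(80.0, 121.0, 1.0),
                        x0_grid=np.arange(-10.0, 11.0, 1.0))
print(a, x0, c)   # close to 100.0, 5.0, 10.0
```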
Procedia PDF Downloads 102
20385 Low-Cost Image Processing System for Evaluating Pavement Surface Distress
Authors: Keerti Kembhavi, M. R. Archana, V. Anjaneyappa
Abstract:
Most asphalt pavement condition evaluations use rating frameworks in which asphalt pavement distress is estimated by type, extent, and severity. Rating is carried out via the pavement condition rating (PCR), which is tedious and expensive. This paper presents the development of a low-cost technique for pavement distress image analysis that permits the identification of potholes and cracks. The paper explores the application of image processing tools for the detection of potholes and cracks. Longitudinal cracks and potholes are detected using Fuzzy C-Means (FCM) clustering followed by a spectral clustering algorithm. The framework comprises three phases: image acquisition, processing, and feature extraction. A digital camera (GoPro) with a holder is used to capture pavement distress images from a moving vehicle. The FCM classifier and spectral clustering algorithms are used to compute features and classify longitudinal cracks and potholes. The MATLAB R2016a image processing toolkit is used to analyze pavement distress on selected urban stretches of Bengaluru city, India. The image evaluation with the semi-automated image processing framework identified the features of longitudinal cracks and potholes with an accuracy of about 80%. Further, the detected images are validated against the actual dimensions, and the dimensional variability is about 0.46. The linear regression model y = 1.171x - 0.155 is obtained by relating the existing (field-measured) and experimental (image-processing) areas. The R² value obtained from the best-fit line is 0.807, which indicates a large positive linear association.
Keywords: crack detection, pothole detection, spectral clustering, fuzzy-c-means
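The regression validation reported above (y = 1.171x - 0.155, R² = 0.807) can be reproduced mechanically. The helper below shows how such a fit and R² are computed; the data points are made-up collinear values that recover the published line exactly:

```python
# Ordinary least-squares line fit plus R², from the textbook formulas.
def linear_fit(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sxx = sum((x - mx) ** 2 for x in xs)
    slope = sxy / sxx
    intercept = my - slope * mx
    ss_res = sum((y - (slope * x + intercept)) ** 2 for x, y in zip(xs, ys))
    ss_tot = sum((y - my) ** 2 for y in ys)
    r2 = 1.0 - ss_res / ss_tot
    return slope, intercept, r2

# Exactly collinear toy data recovers the published line with R² = 1;
# the paper's scattered field data gave R² = 0.807.
xs = [0.5, 1.0, 1.5, 2.0]
ys = [1.171 * x - 0.155 for x in xs]
print(linear_fit(xs, ys))   # close to (1.171, -0.155, 1.0)
```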
Procedia PDF Downloads 183
20384 Adaptive Decision Feedback Equalizer Utilizing Fixed-Step Error Signal for Multi-Gbps Serial Links
Authors: Alaa Abdullah Altaee
Abstract:
This paper presents an adaptive decision feedback equalizer (ADFE) for multi-Gbps serial links utilizing a fixed-step error signal extracted from cross-points of received data symbols. The extracted signal is generated based on violations of the minimum detection requirements of received data symbols at the clock and data recovery (CDR) stage. The iterations of the adaptation process search for the optimum feedback tap coefficients to maximize the data eye opening and minimize the adaptation convergence time. The effectiveness of the proposed architecture is validated using simulation results of a serial link designed in an IBM 130 nm 1.2 V CMOS technology. The data link with variable channel lengths is analyzed using Spectre from Cadence Design Systems with BSIM4 device models.
Keywords: adaptive DFE, CMOS equalizer, error detection, serial links, timing jitter, wire-line communication
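A behavioural sketch of fixed-step (sign-sign) tap adaptation for a one-tap DFE; the channel model, step size, and convergence target below are illustrative assumptions, not the paper's circuit parameters:

```python
import numpy as np

# The channel adds a single post-cursor ISI term h1; the feedback tap
# should converge to h1 so that subtracting tap * previous decision
# cancels the ISI. Sign-sign updates use only the error sign, mimicking
# a fixed-step hardware adaptation loop.
rng = np.random.default_rng(2)
bits = rng.choice([-1.0, 1.0], size=5000)
h1 = 0.4
rx = bits + h1 * np.concatenate(([0.0], bits[:-1]))  # add post-cursor ISI

tap, mu = 0.0, 0.002         # fixed adaptation step
prev = 0.0                   # previous decision
for k in range(len(bits)):
    eq = rx[k] - tap * prev             # feedback equalisation
    decision = 1.0 if eq >= 0 else -1.0
    err = eq - decision                 # fixed-step error signal
    tap += mu * np.sign(err) * prev     # sign-sign update (prev is +/-1)
    prev = decision

print(round(tap, 2))   # converges near h1 = 0.4
```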
Procedia PDF Downloads 125
20383 A Rapid and Greener Analysis Approach Based on Carbonfiber Column System and MS Detection for Urine Metabolomic Study After Oral Administration of Food Supplements
Authors: Zakia Fatima, Liu Lu, Donghao Li
Abstract:
The analysis of biological fluid metabolites holds significant importance in various areas, such as medical research, food science, and public health. Investigating the levels and distribution of nutrients and their metabolites in biological samples allows researchers and healthcare professionals to determine nutritional status, find hypovitaminosis or hypervitaminosis, and monitor the effectiveness of interventions such as dietary supplementation. Moreover, analysis of nutrient metabolites provides insight into their metabolism, bioavailability, and physiological processes, aiding in the clarification of their health roles. Hence, the exploration of a distinct, efficient, eco-friendly, and simpler methodology is of great importance to evaluate the metabolic content of complex biological samples. In this work, a green and rapid analytical method based on an automated online two-dimensional microscale carbon fiber/activated carbon fiber fractionation system and time-of-flight mass spectrometry (2DμCFs-TOF-MS) was used to evaluate metabolites of urine samples after oral administration of food supplements. The automated 2DμCFs instrument consisted of a microcolumn system with bare carbon fibers and modified carbon fiber coatings. Carbon fibers and modified carbon fibers exhibit different surface characteristics and retain different compounds accordingly. Three kinds of mobile-phase solvents were used to elute the compounds of varied chemical heterogeneities. The 2DμCFs separation system has the ability to effectively separate different compounds based on their polarity and solubility characteristics. No complicated sample preparation method was used prior to analysis, which makes the strategy more eco-friendly, practical, and faster than traditional analysis methods. For optimum analysis results, mobile phase composition, flow rate, and sample diluent were optimized. 
Water-soluble vitamins, fat-soluble vitamins, and amino acids, as well as 22 vitamin metabolites and 11 vitamin metabolic pathway-related metabolites, were found in the urine samples. All water-soluble vitamins except vitamin B12 and vitamin B9 were detected. However, no fat-soluble vitamin was detected, and only one metabolite of vitamin A was found. Comparison with a blank urine sample showed a considerable difference in metabolite content; for example, the vitamin metabolites and three related metabolites were not detected in blank urine. The complete single-run screening was carried out in 5.5 minutes with minimal consumption of toxic organic solvent (0.5 ml). The analytical method was evaluated in terms of greenness, with an analytical greenness (AGREE) score of 0.72. The method’s practicality was investigated using the Blue Applicability Grade Index (BAGI) tool, obtaining a score of 77. The findings illustrate that the 2DµCFs-TOF-MS approach could emerge as a fast, sustainable, practical, high-throughput, and promising analytical tool for screening and accurate detection of various metabolites, pharmaceuticals, and ingredients in dietary supplements as well as biological fluids.
Keywords: metabolite analysis, sustainability, carbon fibers, urine
Procedia PDF Downloads 31
20382 Enhancing Code Security with AI-Powered Vulnerability Detection
Authors: Zzibu Mark Brian
Abstract:
As software systems become increasingly complex, ensuring code security is a growing concern. Traditional vulnerability detection methods often rely on manual code reviews or static analysis tools, which can be time-consuming and prone to errors. This paper presents a distinct approach to enhancing code security by leveraging artificial intelligence (AI) and machine learning (ML) techniques. Our proposed system utilizes a combination of natural language processing (NLP) and deep learning algorithms to identify and classify vulnerabilities in real-world codebases. By analyzing vast amounts of open-source code data, our AI-powered tool learns to recognize patterns and anomalies indicative of security weaknesses. We evaluated our system on a dataset of over 10,000 open-source projects, achieving an accuracy rate of 92% in detecting known vulnerabilities. Furthermore, our tool identified previously unknown vulnerabilities in popular libraries and frameworks, demonstrating its potential for improving software security.
Keywords: AI, machine learning, code security, deep learning
Procedia PDF Downloads 41
20381 Suggestion of Methodology to Detect Building Damage Level Collectively with Flood Depth Utilizing Geographic Information System at Flood Disaster in Japan
Authors: Munenari Inoguchi, Keiko Tamura
Abstract:
In 2019, Japan suffered earthquake, typhoon, and flood disasters. In particular, 38 of 47 prefectures were affected by Typhoon No. 1919, which occurred in October 2019. This disaster left 99 people dead, three missing, and 484 injured. Furthermore, 3,081 buildings totally collapsed and 24,998 were half-collapsed. Once a disaster occurs, local responders have to inspect the damage level of each building themselves in order to certify building damage for survivors, who need the certificate to start their life reconstruction process. In this disaster, the total number of buildings to be inspected was very high. Given this situation, the Cabinet Office of Japan approved an efficient way to determine building damage levels, namely collective detection. However, it provided only a guideline, and local responders had to establish a concrete and reliable method by themselves. To address this issue, we decided to establish an effective and efficient methodology to detect building damage levels collectively from flood depth. Since flood depth depends on land elevation, we decided to utilize GIS (Geographic Information System) to analyze elevation spatially. We focused on spatial interpolation, an analysis tool usually used to survey groundwater levels. In establishing the methodology, we considered four key points: 1) how to satisfy the conditions defined in the guideline approved by the Cabinet Office for determining building damage levels, 2) how to satisfy survivors with the resulting building damage levels, 3) how to maintain equity and fairness, because the detection of building damage levels is executed by a public institution, and 4) how to reduce time and human-resource costs, because responders do not have enough of either for disaster response. We then proposed a five-step methodology for collectively detecting building damage levels from flood depth utilizing GIS.
First, obtain the boundary of the flooded area. Second, collect actual flood depths as samples over the flooded area. Third, perform spatial interpolation on the sampled flood depths to derive the two-dimensional flood depth extent. Fourth, divide the area into blocks along road lines according to four flood-depth categories (non-flooded, above floor level to 100 cm, 100 cm to 180 cm, and over 180 cm), so that the result is acceptable to survivors. Fifth, assign a flood depth level to each building. In Koriyama city, Fukushima prefecture, we proposed the collective detection methodology described above, and local responders decided to adopt it for Typhoon No. 1919 in 2019. We and the local responders then collectively detected the building damage levels of over 1,000 buildings. We received positive feedback that the methodology was simple and reduced time and human-resource costs.
Keywords: building damage inspection, flood, geographic information system, spatial interpolation
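Steps three and four of the methodology can be sketched with inverse-distance weighting (IDW), a common spatial-interpolation choice in GIS tools. The abstract does not name the exact interpolation method, and the sample points below are invented for illustration:

```python
# Hypothetical flood-depth samples as ((x, y), depth in cm).
samples = [((0.0, 0.0), 150.0), ((100.0, 0.0), 90.0),
           ((0.0, 100.0), 60.0), ((100.0, 100.0), 20.0)]

def idw_depth(x, y, samples, power=2.0):
    # Inverse-distance-weighted interpolation at (x, y).
    num = den = 0.0
    for (sx, sy), depth in samples:
        d2 = (x - sx) ** 2 + (y - sy) ** 2
        if d2 == 0.0:
            return depth                 # exact hit on a sample point
        w = 1.0 / d2 ** (power / 2.0)
        num += w * depth
        den += w
    return num / den

def damage_band(depth_cm):
    # The four categories used in step four of the methodology.
    if depth_cm <= 0.0:
        return "non-flooded"
    if depth_cm < 100.0:
        return "over the floor to 100 cm"
    if depth_cm < 180.0:
        return "100 cm to 180 cm"
    return "over 180 cm"

d = idw_depth(50.0, 50.0, samples)   # equidistant point -> plain mean
print(d, damage_band(d))
```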
Procedia PDF Downloads 127
20380 Enhancement Dynamic Cars Detection Based on Optimized HOG Descriptor
Authors: Mansouri Nabila, Ben Jemaa Yousra, Motamed Cina, Watelain Eric
Abstract:
Research and development efforts in intelligent Advanced Driver Assistance Systems (ADAS) seek to save lives and reduce the number of on-road fatalities. For traffic and emergency monitoring, the essential but challenging task is vehicle detection and tracking within a reasonably short time. This first of all requires a powerful dynamic car-detector model. This paper presents an optimized HOG process based on the fusion of shape and motion parameters. Our proposed approach computes HOG features block by block from foreground blobs, using a configurable search window and pathway, in order to overcome the computing-time shortcoming of the HOG descriptor and improve its performance in dynamic applications. Indeed, we show that the block-wise HOG descriptor combined with motion parameters is a very suitable car detector, reaching a satisfactory recognition rate in record time in dynamic outdoor areas and outperforming several popular works without using sophisticated and expensive architectures such as GPUs and FPGAs.
Keywords: car detector, HOG, motion, computing time
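The core HOG computation for a single cell can be sketched from scratch; the cell size, bin count, and test pattern below are illustrative, not the paper's configuration:

```python
import numpy as np

# One HOG cell: gradient magnitudes binned by unsigned orientation. A
# full descriptor concatenates such histograms over the blocks of the
# search window.
def hog_cell(cell, n_bins=9):
    gx = np.zeros_like(cell)
    gy = np.zeros_like(cell)
    gx[:, 1:-1] = cell[:, 2:] - cell[:, :-2]      # centred differences
    gy[1:-1, :] = cell[2:, :] - cell[:-2, :]
    mag = np.hypot(gx, gy)
    ang = np.rad2deg(np.arctan2(gy, gx)) % 180.0  # unsigned orientation
    bins = np.minimum((ang / (180.0 / n_bins)).astype(int), n_bins - 1)
    hist = np.zeros(n_bins)
    for b, m in zip(bins.ravel(), mag.ravel()):
        hist[b] += m
    return hist / (np.linalg.norm(hist) + 1e-9)   # L2 normalisation

# A vertical edge gives horizontal gradients, so the 0-degree bin wins.
cell = np.zeros((8, 8))
cell[:, 4:] = 1.0
print(int(np.argmax(hog_cell(cell))))   # 0
```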
Procedia PDF Downloads 325
20379 Bacteriophage Lysis of Physiologically Stressed Listeria monocytogenes in a Simulated Seafood Processing Environment
Authors: Geevika J. Ganegama Arachchi, Steve H. Flint, Lynn McIntyre, Cristina D. Cruz, Beatrice M. Dias-Wanigasekera, Craig Billington, J. Andrew Hudson, Anthony N. Mutukumira
Abstract:
In seafood processing plants, Listeria monocytogenes (L. monocytogenes) likely exists in a metabolically stressed state due to the nutrient-deficient environment; processing treatments such as heating, curing, drying, and freezing; and exposure to detergents and disinfectants. Stressed L. monocytogenes cells have been shown to be as pathogenic as unstressed cells. This study investigated the lytic efficacy of three phages (LiMN4L, LiMN4p, and LiMN17), previously characterized as virulent, against physiologically stressed cells of three seafood-borne L. monocytogenes strains (19CO9, 19DO3, and 19EO3). Physiologically compromised cells of the L. monocytogenes strains were prepared by aging cultures in Trypticase Soy Broth at 15±1°C for 72 h; heat-injuring cultures at 54±1 - 55±1°C for 40 - 60 min; starving cultures in Milli-Q water incubated at 25±1°C in darkness for three weeks; and salt-stressing cultures in 9% (w/v) NaCl at 15±1°C for 72 h. Low concentrations of physiologically compromised cells of the three L. monocytogenes strains were challenged in vitro with a high titre of the three phages in separate experiments using Fish Broth medium (aqueous fish extract) at 15°C in order to mimic the environment of a seafood processing plant. Each phage, when present at ≈9 log10 PFU/ml, reduced late-exponential-phase cells of L. monocytogenes suspended in fish protein broth at ≈2-3 log10 CFU/ml to a non-detectable level (< 10 CFU/ml). Each phage, when present at ≈8.5 log10 PFU/ml, reduced both heat-injured cells present at 2.5-3.6 log10 CFU/ml and starved cells, which showed a coccoid shape, present at ≈2-3 log10 CFU/ml to < 10 CFU/ml after 30 min. Phages also reduced salt-stressed cells present at ≈3 log10 CFU/ml by > 2 log10. L. monocytogenes (≈8 log10 CFU/ml) were reduced to below the detection limit (1 CFU/ml) by three successive phage infections over 16 h, indicating that the emergence of spontaneous phage resistance was infrequent.
The three virulent phages showed high decontamination potential for physiologically stressed L. monocytogenes strains in seafood processing environments.
Keywords: physiologically stressed L. monocytogenes, heat injured, seafood processing environment, virulent phage
Procedia PDF Downloads 136
20378 Fiber Based Pushover Analysis of Reinforced Concrete Frame
Authors: Shewangizaw Tesfaye Wolde
Abstract:
The engineering community has developed a method called performance-based seismic design, in which structures are designed to meet predefined performance levels set by the parties involved. Since structures are designed economically for the maximum actions expected during their life, they go beyond their elastic limit, which calls for nonlinear analysis. In this paper, conventional pushover analysis (nonlinear static analysis) is used for the performance assessment of a case-study reinforced concrete (RC) frame building located in Addis Ababa, Ethiopia, where the peak ground acceleration proposed by the RADIUS 1999 project and others is more than twice that of EBCS-8:1995, taking the critical planar frame. A fiber beam-column model with a tension-stiffening effect is used to capture material nonlinearity. The reliability of the fiber model and the validation of the software outputs are checked in the verification chapter. The aim of this paper is thus to propose an approach for the structural performance assessment and design check of existing reinforced concrete frame buildings.
Keywords: seismic, performance, fiber model, tension stiffening, reinforced concrete
Procedia PDF Downloads 78
20377 Position of the Constitutional Court of the Russian Federation on the Matter of Restricting Constitutional Rights of Citizens Concerning Banking Secrecy
Authors: A. V. Shashkova
Abstract:
The aim of the present article is to analyze the position of the Constitutional Court of the Russian Federation on restricting the constitutional rights of citizens to the inviolability of professional and banking secrecy when controlling activities are carried out. The methodological basis of the article is the dialectic method applied to socio-political, legal, and organizational processes, together with the principles of development, integrity, and consistency. Consistency analysis is used in researching the object of study, and certain public-private research methods, namely the formal-logical method and the comparative legal method, are used to compare understandings of the concept of 'secrecy'. The article seeks the root of the problem and gives recommendations for its solution. The result of the research is the author's conclusion on the necessity of political will to improve Russian legislation so that it complies with the provisions of the Constitution. It is also necessary to establish a clear balance between the constitutional rights of the individual and the limits of these rights when various control activities are carried out by public authorities. Attempts by banks to 'overdo' anti-money-laundering requirements under threat of severe regulatory sanctions have actually disrupted normal economic activity; as a result, individuals face serious problems with clearing-based payments as well as with cash withdrawals. The Bank of Russia sets excessively high requirements for banks implementing Federal Law No. 115-FZ, and political will is needed here. At the same time, recent changes in Russian legislation, e.g., allowing banks to unilaterally refuse to open accounts, have simplified banking activities in the country. The article focuses on different theoretical approaches to the concept of 'secrecy'.
The author gives an overview of the practices of Spain, Switzerland, and the United States of America on restricting the constitutional rights of citizens to the inviolability of professional and banking secrecy when controlling activities are carried out. The Constitutional Court of the Russian Federation, relying on the Constitution of the Russian Federation, has its own understanding of the issue, which should be supported by further legislative development in the Russian Federation.
Keywords: constitutional court, restriction of constitutional rights, bank secrecy, control measures, money laundering, financial control, banking information
Procedia PDF Downloads 18720376 Automated Detection of Related Software Changes by Probabilistic Neural Networks Model
Authors: Yuan Huang, Xiangping Chen, Xiaonan Luo
Abstract:
Modern software is continuously updated. The change between two versions usually involves multiple program entities (e.g., packages, classes, methods, attributes) with multiple purposes (e.g., changed requirements, bug fixing). It is hard for developers to understand which changes are made for the same purpose. Whether two changes are related is not decided by the relationship between the two entities in the program alone. In this paper, we summarize 4 coupling rules (16 instances) and 4 state-combination types at the class, method and attribute levels for software changes. A Related Change Vector (RCV) is defined based on the coupling rules and state-combination types, and applied to classify related software changes using a Probabilistic Neural Network during a software update.Keywords: PNN, related change, state-combination, logical coupling, software entity
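For readers unfamiliar with the classifier named above, a Probabilistic Neural Network amounts to a Parzen-window density estimate per class. The sketch below illustrates that idea only; the feature values, the smoothing parameter, and the two-dimensional "change vectors" are invented for illustration and are not the paper's actual RCV features:

```python
import numpy as np

def pnn_classify(x, X_train, y_train, sigma=0.5):
    """Probabilistic Neural Network (Parzen-window) classifier.

    Each training vector acts as a pattern-layer neuron contributing a
    Gaussian kernel; the class with the largest averaged kernel
    response (estimated density at x) wins.
    """
    classes = np.unique(y_train)
    scores = []
    for c in classes:
        Xc = X_train[y_train == c]
        d2 = np.sum((Xc - x) ** 2, axis=1)  # squared distances to x
        scores.append(np.mean(np.exp(-d2 / (2 * sigma ** 2))))
    return classes[int(np.argmax(scores))]

# Invented two-feature "related change vectors" for change pairs:
X = np.array([[0.9, 0.8], [0.8, 0.9], [0.1, 0.2], [0.2, 0.1]])
y = np.array([1, 1, 0, 0])  # 1 = related changes, 0 = unrelated
print(pnn_classify(np.array([0.85, 0.85]), X, y))  # prints 1
```

A new change pair close to the "related" training examples receives the higher class-1 density and is labeled related.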
Procedia PDF Downloads 44020375 Triplex Detection of Pistacia vera, Arachis hypogaea and Pisum sativum in Processed Food Products Using Probe Based PCR
Authors: Ergün Şakalar, Şeyma Özçirak Ergün, Emrah Yalazi̇, Emine Altinkaya, Cengiz Ataşoğlu
Abstract:
In recent years, food allergies, which cause serious health problems, have affected public health around the world. Foodstuffs containing allergens are either intentionally used as ingredients or are present as contaminants in food products. The prevalence of clinical allergy to peanuts and nuts is estimated at about 0.4%-1.1% of the adult population, with allergy to pistachio representing 7% of the cases of tree nuts causing allergic reactions. In order to protect public health and enforce the legislation, methods for sensitive analysis of pistachio and peanut contents in food are required. Pea, pistachio and peanut are used together to reduce the cost in the production of foods such as baklava and snacks. DNA technology-based methods in food analysis are well-established and well-rounded tools for species differentiation and allergen detection. In particular, the probe-based TaqMan real-time PCR assay can amplify target DNA with efficiency, specificity, and sensitivity. In this study, pistachio, peanut and pea were finely ground, and three separate series of triplet mixtures containing 0.1, 1, 10, 100, 1000, 10,000 and 100,000 mg kg-1 of each sample were prepared for each series, to a final weight of 100 g. DNA from reference samples and industrial products was successfully extracted with the GIDAGEN® Multi-Fast DNA Isolation Kit. TaqMan probes were designed for triplex determination of the ITS, Ara h 3 and pea lectin genes, which are specific regions for identification of pistachio, peanut and pea, respectively. The quantitative real-time PCR detected pistachio, peanut and pea in these mixtures down to the lowest investigated levels of 0.1, 0.1 and 1 mg kg-1, respectively. The methods reported here are also capable of detecting as little as a 0.001% level of peanut DNA, a 0.000001% level of pistachio DNA and a 0.000001% level of pea DNA. 
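Quantification in such a TaqMan assay typically relies on a standard curve of Ct values against the logarithm of the dilution series. A minimal sketch, assuming invented Ct values (the study's actual measurements are not reproduced here):

```python
import numpy as np

# Hypothetical calibration: Ct values for a 10-fold dilution series
# (mg/kg). These Ct numbers are invented for illustration only.
conc = np.array([0.1, 1, 10, 100, 1000, 10000, 100000])
ct = np.array([36.5, 33.2, 29.9, 26.6, 23.3, 20.0, 16.7])

# Fit the standard curve Ct = slope * log10(conc) + intercept
slope, intercept = np.polyfit(np.log10(conc), ct, 1)
efficiency = 10 ** (-1 / slope) - 1  # ~1.0 means ~100% amplification

def quantify(ct_sample):
    """Interpolate an unknown sample's concentration from its Ct."""
    return 10 ** ((ct_sample - intercept) / slope)

print(round(slope, 2))        # -3.3 (close to the ideal ~-3.32)
print(round(quantify(33.2)))  # 1 (mg/kg)
```

A slope near -3.32 per decade corresponds to close to 100% amplification efficiency, which is what makes quantification down to the stated detection limits credible.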
We conclude that the quantitative triplex real-time PCR method developed in this study can be applied to detect pistachio, peanut and pea traces for three allergens at once in commercial food products.Keywords: allergens, DNA, real-time PCR, TaqMan probe
Procedia PDF Downloads 26020374 Design and Optimization Fire Alarm System to Protect Gas Condensate Reservoirs With the Use of Nano-Technology
Authors: Hefzollah Mohammadian, Ensieh Hajeb, Mohamad Baqer Heidari
Abstract:
In this paper, for the protection and safety of gas tanks (flammable materials), and also because of the considerable economic value of the reservoir, a new system for protection, conservation and firefighting has been developed. The system consists of several parts: sensors to detect heat and fire built with nanotechnology (nano sensors), a barrier for isolation and protection between two electronic zones, an analyzer for detecting and accurately locating the point of fire, and a main electronic board to announce fire, diagnose faults in different locations, raise the relevant alarms and activate the devices for fire extinguishing and announcement. An important feature of this system is its high speed: the fire detection system responds to an ambient temperature threshold that can be adjusted. Another advantage of this system is that it is autonomous and does not require a human operator on site. Using nanotechnology, in addition to speeding up the work, reduces the cost of constructing the sensor as well as the notification and fire extinguishing system.Keywords: analyser, barrier, heat resistance, general fault, general alarm, nano sensor
Procedia PDF Downloads 45720373 Case Study: Hybrid Mechanically Stabilized Earth Wall System Built on Basal Reinforced Raft
Authors: S. Kaymakçı, D. Gündoğdu, H. Özçelik
Abstract:
The truck park of a warehouse for a supermarket chain was going to be constructed on poor ground. Rather than using a piled foundation, the client was convinced that ground improvement using a reinforced foundation raft, also known as “basal reinforcement”, would work. The retaining structures supporting the truck park area were designed as a hybrid structure made up of the Terramesh® Wall System and MacGrid™ high strength geogrids. The total wall surface area is nearly 2740 sq.m, reaching a maximum height of 13.00 meters. The area is located in the first degree seismic zone of Turkey, and the design seismic acceleration is high. The design of the walls was carried out using the pseudo-static method (limit equilibrium), taking into consideration different loading conditions according to Eurocode 7. For each standard approach, stability analyses under seismic conditions were performed. The paper presents the detailed design of the reinforced soil structure, the basal reinforcement and the construction methods; the advantages of using such a system for the project are discussed.Keywords: basal reinforcement, geogrid, reinforced soil raft, reinforced soil wall, soil reinforcement
Procedia PDF Downloads 30520372 A Comparison of Bias Among Relaxed Divisor Methods Using 3 Bias Measurements
Authors: Sumachaya Harnsukworapanich, Tetsuo Ichimori
Abstract:
The apportionment method is used by many countries to calculate the distribution of seats in political bodies. For example, this method is used in the United States (U.S.) to distribute House seats proportionally based on the population of each electoral district. Famous apportionment methods include the divisor methods called the Adams Method, Dean Method, Hill Method, Jefferson Method and Webster Method. Sometimes the results from the implementation of these divisor methods are unfair and include errors. Therefore, it is important to examine the optimization of these methods by using a bias measurement to obtain precise and fair results. In this research, we investigate the bias of divisor methods in the U.S. House of Representatives toward large and small states by applying the Stolarsky Mean Method. We compare the bias of the apportionment methods by using two famous bias measurements: the Balinski and Young measurement and the Ernst measurement. Both measurements have a formula for large and small states. The third measurement, however, which was created by the researchers, does not factor the element of large and small states into the formula. All three measurements are compared, and the results show that our measurement produces results similar to the other two famous measurements.Keywords: apportionment, bias, divisor, fair, measurement
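For readers unfamiliar with divisor methods, the Jefferson and Webster methods can both be expressed as highest-averages procedures that differ only in their divisor sequence. A small sketch with invented state populations (the large-state bias that such studies measure is visible in the Jefferson result):

```python
import heapq

def highest_averages(populations, seats, divisor):
    """Allocate seats by a highest-averages divisor method.

    divisor(k) is the divisor applied once a state already holds k
    seats: Jefferson uses k + 1, Webster uses k + 0.5.
    """
    alloc = [0] * len(populations)
    # max-heap of priority quotients (negated for Python's min-heap)
    heap = [(-p / divisor(0), i) for i, p in enumerate(populations)]
    heapq.heapify(heap)
    for _ in range(seats):
        _, i = heapq.heappop(heap)
        alloc[i] += 1
        heapq.heappush(heap, (-populations[i] / divisor(alloc[i]), i))
    return alloc

pops = [57000, 26000, 17000]  # invented state populations
print(highest_averages(pops, 10, lambda k: k + 1))    # Jefferson: [6, 3, 1]
print(highest_averages(pops, 10, lambda k: k + 0.5))  # Webster:   [5, 3, 2]
```

On the same populations Jefferson hands the largest state an extra seat at the smallest state's expense, which is exactly the kind of systematic bias the measurements above are designed to quantify.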
Procedia PDF Downloads 36820371 Innovation in PhD Training in the Interdisciplinary Research Institute
Authors: B. Shaw, K. Doherty
Abstract:
The Cultural Communication and Computing Research Institute (C3RI) is a diverse multidisciplinary research institute including art, design, media production, communication studies, computing and engineering. Across these disciplines it can seem like there are enormous differences in research practice and convention, including differing positions on objectivity and subjectivity, certainty and evidence, and different political and ethical parameters. These differences sit within often unacknowledged histories, codes, and communication styles of specific disciplines, and it is all these aspects that can make understanding research practice across disciplines difficult. To explore this, a one-day event was orchestrated, testing how a PhD community might communicate and share research in progress in a multidisciplinary context. Instead of presenting results at a conference, research students were tasked to articulate their method of inquiry. A working party of students from across disciplines had to design a conference call, visual identity and an event framework that would work for students across all disciplines. The process of establishing the shape and identity of the conference was revealing. Even finding a linguistic frame that would meet the expectations of different disciplines for the conference call was challenging. The first abstracts submitted either resorted to reporting findings or only described method briefly. It took several weeks of supported intervention for research students to get ‘inside’ their method and to understand their research practice as a process rich with philosophical and practical decisions and implications. In response to the abstracts, the conference committee generated key methodological categories for conference sessions, including sampling, capturing ‘experience’, ‘making models’, researcher identities, and ‘constructing data’. 
Each session involved presentations by visual artists, communications students and computing researchers, with interdisciplinary dialogue facilitated by alumni Chairs. The apparently simple focus on method illuminated the research process as a site of creativity, innovation and discovery, and also built epistemological awareness, drawing attention to what is being researched and how it can be known. It was surprisingly difficult to limit students to discussing method, and it was apparent that the vocabulary available for method is sometimes limited. However, by focusing on method rather than results, the genuine process of research, rather than one constructed for approval, could be captured. In unlocking the twists and turns of planning and implementing research, and the impact of circumstance and contingency, students had to reflect frankly on successes and failures. This level of self- and public critique emphasised the degree of critical thinking and rigour required in executing research and demonstrated that honest reportage of research, faults and all, is good, valid research. The process also revealed the degree to which disciplines can learn from each other: the computing students gained insights from the sensitive social contextualizing generated by communications and art and design students, and art and design students gained understanding from the greater ‘distance’ and emphasis on application that computing students applied to their subjects. Finding the means to develop dialogue across disciplines makes researchers better equipped to devise and tackle research problems across disciplines, potentially laying the ground for more effective collaboration.Keywords: interdisciplinary, method, research student, training
Procedia PDF Downloads 20720370 An Observation Approach of Reading Order for Single Column and Two Column Layout Template
Authors: In-Tsang Lin, Chiching Wei
Abstract:
Reading order is an important task in many digitization scenarios involving the preservation of the logical structure of a document. From the literature survey, we find that state-of-the-art algorithms cannot reliably obtain the correct reading order in portable document format (PDF) files with rich formats and diverse layout arrangements. In recent years, most studies on reading-order analysis have targeted the specific problem of associating layout components with logical labels, while less attention has been paid to the problem of detecting reading-order relationships between logical components, such as cross-references. Over 3 years of development, the company Foxit has refined the layout recognition (LR) engine, in revision 20601, with the aim of improving the accuracy of the reading order. The bounding box of each paragraph can be obtained correctly by the Foxit LR engine, but the reading-order result is not always correct for single-column and two-column layout formats, due to table, formula, multiple mini separated bounding box, and footer issues. Thus, an algorithm is developed to improve the accuracy of the reading order based on the Foxit LR structure. In this paper, a creative observation method (here called the MESH method) is provided to open a new avenue in reading-order research. Two important parameters are introduced: one is the number of bounding boxes to the right of the present bounding box (NRight), and the other is the number of bounding boxes under the present bounding box (NUnder). The normalized x-value (x / the whole width) and normalized y-value (y / the whole height) of each bounding box, as well as the x- and y-position of each bounding box, were also taken into consideration. 
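A minimal sketch of how the two MESH parameters might be computed from paragraph bounding boxes. The exact adjacency criteria used by the authors are not specified, so the overlap tests below are our assumption:

```python
from dataclasses import dataclass

@dataclass
class Box:
    x: float  # left edge (origin at the top-left of the page)
    y: float  # top edge
    w: float  # width
    h: float  # height

def mesh_features(boxes, page_w, page_h):
    """For each paragraph box, count boxes strictly to its right that
    overlap it vertically (NRight) and boxes strictly below it that
    overlap it horizontally (NUnder), plus normalized x and y."""
    feats = []
    for b in boxes:
        n_right = sum(1 for o in boxes if o is not b
                      and o.x >= b.x + b.w
                      and o.y < b.y + b.h and o.y + o.h > b.y)
        n_under = sum(1 for o in boxes if o is not b
                      and o.y >= b.y + b.h
                      and o.x < b.x + b.w and o.x + o.w > b.x)
        feats.append((n_right, n_under, b.x / page_w, b.y / page_h))
    return feats

# Toy two-column page: two boxes in the left column, one in the right
boxes = [Box(50, 50, 400, 100), Box(50, 200, 400, 100),
         Box(550, 50, 400, 100)]
print(mesh_features(boxes, 1000, 1000)[0])  # (1, 1, 0.05, 0.05)
```

The top-left box "sees" one box to its right and one below it; a paragraph with NRight > 0 and a small normalized x is a strong candidate for the left column, which is the kind of cue the MESH method exploits.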
Initial experimental results for the single-column layout format demonstrate a 19.33% absolute improvement in reading-order accuracy over 7 PDF files (150 pages in total) using our proposed method based on the LR structure, compared with the baseline method using the LR structure in revision 20601, whose reading-order accuracy is 72%. For the two-column layout format, preliminary results demonstrate a 44.44% absolute improvement in reading-order accuracy over 2 PDF files (18 pages in total) using our proposed method, compared with the baseline method in revision 20601, whose reading-order accuracy is 0%. So far, the footer issue and part of the multiple mini separated bounding box issue can be solved using the MESH method. However, three issues remain unsolved: the table issue, the formula issue, and random multiple mini separated bounding boxes. The detection of the table position and the recognition of the table structure are out of the scope of this paper and require separate research. In future work, the chosen tasks are how to detect the table position on the page and how to extract the content of the table.Keywords: document processing, reading order, observation method, layout recognition
Procedia PDF Downloads 18220369 Worst-Case Load Shedding in Electric Power Networks
Authors: Fu Lin
Abstract:
We consider the worst-case load-shedding problem in electric power networks where a number of transmission lines are to be taken out of service. The objective is to identify a prespecified number of line outages that lead to the maximum interruption of power generation and load at the transmission level, subject to the active power-flow model, the load and generation capacity of the buses, and the phase-angle limit across the transmission lines. For this nonlinear model with binary constraints, we show that all decision variables are separable except for the nonlinear power-flow equations. We develop an iterative decomposition algorithm, which converts the worst-case load shedding problem into a sequence of small subproblems. We show that the subproblems are either convex problems that can be solved efficiently or nonconvex problems that have closed-form solutions. Consequently, our approach is scalable for large networks. Furthermore, we prove the convergence of our algorithm to a critical point, and the objective value is guaranteed to decrease throughout the iterations. Numerical experiments with IEEE test cases demonstrate the effectiveness of the developed approach.Keywords: load shedding, power system, proximal alternating linearization method, vulnerability analysis
Procedia PDF Downloads 14220368 Effect of Gaseous Imperfections on the Supersonic Flow Parameters for Air in Nozzles
Authors: Merouane Salhi, Toufik Zebbiche
Abstract:
When the stagnation pressure of a perfect gas increases, the specific heats and their ratio no longer remain constant and start to vary with this pressure. The gas no longer remains perfect: its state equation changes, and it must be treated as a real gas. In this case, the effects of molecular size and intermolecular attraction forces intervene to correct the state equation. The aim of this work is to show and discuss the effect of stagnation pressure on supersonic thermodynamic, physical and geometrical flow parameters, in order to obtain a general treatment for real gas. Under the assumption that Berthelot’s state equation accounts for the molecular size and intermolecular force effects, expressions are developed for analyzing supersonic flow of a thermally and calorically imperfect gas below the molecular dissociation threshold. The design parameters of a supersonic nozzle, such as the thrust coefficient, depend directly on the stagnation parameters of the combustion chamber. The application is for air. An error computation is made in this case to give the limit of validity of the perfect gas model compared to the real gas model.Keywords: supersonic flow, real gas model, Berthelot’s state equation, Simpson’s method, condensation function, stagnation pressure
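For reference, Berthelot’s state equation mentioned above is usually written as follows. This is the standard textbook form, not a formula reproduced from the paper; a and b are substance-specific constants correcting for intermolecular attraction and molecular size, respectively:

```latex
% Berthelot's equation of state (classical form, per unit mole):
p = \frac{RT}{v - b} - \frac{a}{T\,v^{2}}
% It reduces to the perfect-gas law  p v = R T  when a = b = 0,
% and the attraction correction a/(T v^2) weakens as temperature rises.
```

The pressure-dependent departure of the second term from zero is what drives the variation of the specific heats with stagnation pressure discussed in the abstract.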
Procedia PDF Downloads 44820367 Progress in Accuracy, Reliability and Safety in Firedamp Detection
Authors: José Luis Lorenzo Bayona, Ljiljana Medic-Pejic, Isabel Amez Arenillas, Blanca Castells Somoza
Abstract:
The communication presents the results of a study carried out by the Official Laboratory J. M. Madariaga (LOM) of the Polytechnic University of Madrid to analyze the reliability of methane detection systems used in underground mining. Poor firedamp control at work can cause anything from production stoppages to fatal accidents, and since there is currently a great variety of equipment with different functional characteristics, a study is needed to indicate which measurement principles offer the highest degree of confidence. For the development of the project, a series of fixed, transportable and portable methane detectors with different measurement principles were selected and subjected to laboratory tests following the methods described in the applicable regulations. The test equipment was that usually used in the certification and calibration of these devices, subject to the LOM quality system, and the tests were carried out on detectors available on the market. The conclusions establish the main advantages and disadvantages of the equipment according to the measurement principle used: catalytic combustion, interferometry and infrared absorption.Keywords: ATEX standards, gas detector, methane meter, mining safety
Procedia PDF Downloads 14020366 Life Cycle Cost Evaluation of Structures Retrofitted with Damped Cable System
Authors: Asad Naeem, Mohamed Nour Eldin, Jinkoo Kim
Abstract:
In this study, the seismic performance and life cycle cost (LCC) of a structure retrofitted with the damped cable system (DCS) are evaluated. The DCS is a seismic retrofit system composed of a high-strength steel cable and pressurized viscous dampers. The analysis model of the system is first derived using various link elements in SAP2000, and fragility curves of the structure retrofitted with the DCS and with viscous dampers are obtained using incremental dynamic analyses. The analysis results show that the residual displacements of the structure equipped with the DCS are smaller than those of the structure retrofitted with only conventional viscous dampers, due to the enhanced stiffness/strength and self-centering capability of the damped cable system. The fragility analysis shows that the structure retrofitted with the DCS has the lowest probability of reaching the specified limit states, compared to the bare structure and the structure with viscous dampers. It is also observed that the initial cost of the DCS required for the seismic retrofit is smaller than that of the viscous dampers, and that the LCC of the structure equipped with the DCS is smaller than that of the structure with viscous dampers.Keywords: damped cable system, fragility curve, life cycle cost, seismic retrofit, self-centering
Procedia PDF Downloads 55420365 Assessment of Cellular Metabolites and Impedance for Early Diagnosis of Oral Cancer among Habitual Smokers
Authors: Ripon Sarkar, Kabita Chaterjee, Ananya Barui
Abstract:
Smoking is one of the leading causes of oral cancer. Cigarette smoke affects various cellular parameters and alters the molecular metabolism of cells. With long-term exposure to cigarette smoke, epithelial cells lose their cytoskeletal structure, membrane integrity and cellular polarity, which subsequently initiates an epithelial-to-mesenchymal transition. Smoking changes normal cellular metabolic activity, which induces oxidative stress and enhances the formation of reactive oxygen species (ROS). Excessive ROS and the associated oxidative stress are considered to be a driving force behind alterations in cellular phenotype, polarity distribution and mitochondrial metabolism. Noninvasive assessment of such parameters plays an essential role in the development of a routine screening system for early diagnosis of oral cancer. Electric cell-substrate impedance sensing (ECIS) is one such method, applied for the detection of cellular membrane impedance, which can be correlated to cell membrane integrity. The present study intends to explore the alteration in cellular impedance, along with the expression of cellular polarity molecules and cytoskeleton distribution, in the oral epithelial cells of habitual smokers, and to correlate the outcome with that of clinically diagnosed oral leukoplakia and oral squamous cell carcinoma patients. In total, 80 subjects were categorized into four study groups: nonsmokers (NS), cigarette smokers (CS), oral leukoplakia (OLPK) and oral squamous cell carcinoma (OSCC). Cytoskeleton distribution was analyzed by staining of actin filaments, and the generation of ROS was measured using an assay kit following the standard protocol. Cell impedance was measured through the ECIS method at different frequencies. The expression of E-cadherin and protease-activated receptor (PAR) proteins was observed through the immunofluorescence method. The distribution of actin filaments is well organized in the NS group; however, the distribution pattern is grossly altered in the CS, OLPK and OSCC groups. 
ROS generation was low in NS and subsequently increased towards OSCC. The expression of E-cadherin and the change in cellular electrical impedance across the study groups indicated the hallmarks of cancer progression from NS to OSCC. The expression of E-cadherin and PAR proteins and the cell impedance decreased from NS to CS, and further to OSCC. Generally, oral epithelial cells exhibit apico-basal polarity; however, with cancer progression these cells lose their characteristic polarity distribution. In this study, the expression of polarity molecules and the ECIS observations indicate such an altered pattern of polarity among the smoker group. Overall, the present study monitored the alterations in intracellular ROS generation, cell metabolic function and membrane integrity in the oral epithelial cells of cigarette smokers. The present study thus has clinical significance and may help in developing a noninvasive technique for early diagnosis of oral cancer among susceptible individuals.Keywords: cigarette smoking, early oral cancer detection, electric cell-substrate impedance sensing, noninvasive screening
Procedia PDF Downloads 17820364 Detection of Important Biological Elements in Drug-Drug Interaction Occurrence
Authors: Reza Ferdousi, Reza Safdari, Yadollah Omidi
Abstract:
Drug-drug interactions (DDIs) are a main cause of adverse drug reactions, and the functional and molecular complexity of drug behavior in the human body makes them hard to prevent and treat. With the aid of new technologies derived from mathematical and computational science, the DDI problem can be addressed with minimum cost and effort. Market basket analysis is known as a powerful method for identifying the co-occurrence of items in order to discover patterns and the frequency of elements. In this research, we used market basket analysis to identify important bio-elements in DDI occurrence. For this, we collected all known DDIs from DrugBank. The obtained data were analyzed by the market basket analysis method. We investigated all drug-enzyme, drug-carrier, drug-transporter and drug-target associations. To determine the importance of the extracted bio-elements, the extracted rules were evaluated in terms of confidence and support. Market basket analysis of the over 45,000 known DDIs reveals more than 300 important rules that can be used to identify DDIs; the CYP450 family was the most frequently shared bio-element. We applied the extracted rules to over 2,000,000 unknown drug pairs, leading to the discovery of more than 200,000 potential DDIs. Analysis of the underlying reason behind the DDI phenomenon can help to predict and prevent DDI occurrence. Ranking the extracted rules based on their strength can be a supportive tool to predict the outcome of an unknown DDI.Keywords: drug-drug interaction, market basket analysis, rule discovery, important bio-elements
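The support and confidence measures used to evaluate the rules can be illustrated with a toy example. The bio-element "baskets" below are invented for illustration and are not drawn from DrugBank:

```python
def support_confidence(transactions, antecedent, consequent):
    """Support and confidence of the rule antecedent -> consequent
    over a list of transactions (sets of items)."""
    n = len(transactions)
    a = frozenset(antecedent)
    both = a | frozenset(consequent)
    n_a = sum(1 for t in transactions if a <= t)        # antecedent hits
    n_both = sum(1 for t in transactions if both <= t)  # full-rule hits
    return n_both / n, (n_both / n_a if n_a else 0.0)

# Invented "baskets": bio-elements shared by interacting drug pairs
baskets = [frozenset(b) for b in (
    {"CYP3A4", "P-gp"}, {"CYP3A4", "CYP2D6"},
    {"CYP3A4", "P-gp", "OATP1B1"}, {"CYP2C9"},
)]
s, c = support_confidence(baskets, {"CYP3A4"}, {"P-gp"})
print(round(s, 2), round(c, 2))  # 0.5 0.67
```

Here the rule "shared CYP3A4 implies shared P-gp" holds in 2 of 4 baskets (support 0.5) and in 2 of the 3 baskets containing CYP3A4 (confidence 0.67); the same counting applied at DrugBank scale yields the reported rules.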
Procedia PDF Downloads 31520363 Solution for Thick Plate Resting on Winkler Foundation by Symplectic Geometry Method
Authors: Mei-Jie Xu, Yang Zhong
Abstract:
Based on the symplectic geometry method, the theory of Hamiltonian systems can be applied to the analysis of problems in the theory of elasticity and to the solution of elliptic partial differential equations. With this technique, this paper derives the theoretical solution for a thick rectangular plate with four free edges supported on a Winkler foundation by the variable separation method. In this method, the governing equation of the thick plate is first transformed into state equations in the Hamilton space. The theoretical solution of the problem is then obtained by applying the method of variable separation based on the Hamiltonian system. Compared with traditional theoretical solutions for rectangular plates, this method has the advantage of not having to assume the form of the deflection functions in the solution process. Numerical examples are presented to verify the validity of the proposed solution method.Keywords: symplectic geometry method, Winkler foundation, thick rectangular plate, variable separation method, Hamilton system
Procedia PDF Downloads 307