Search results for: automated container terminal
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 1368

558 Application of Hydrological Model in Support of Streamflow Allocation in Arid Watersheds in Northwestern China

Authors: Chansheng He, Lanhui Zhang, Baoqing Zhang

Abstract:

Spatial heterogeneity of the landscape significantly affects watershed hydrological processes, particularly in high-elevation, cold mountainous watersheds such as the inland river (terminal lake) basins of Northwest China, where the upper-reach mountainous areas are the main source of streamflow for the downstream agricultural oases and desert ecosystems. It is therefore essential to account for spatial variations of hydrological processes when allocating streamflow at the watershed scale. This paper adapts the Distributed Large Basin Runoff Model (DLBRM) to the Heihe River Watershed, the second largest inland river basin in Northwest China with a drainage area of about 128,000 km², to understand how glacier melt and snowmelt, surface runoff, evapotranspiration, and groundwater recharge are transferred and partitioned among the upper, middle, and lower reaches of the study area. Results indicate that the upper-reach Qilian Mountain area is the main source of streamflow for the middle-reach agricultural oasis and the downstream desert areas. Large withdrawals for agricultural irrigation in the middle reach have significantly depleted river flow to the lower-reach desert ecosystems. Innovative conservation and enforcement programs need to be undertaken to ensure the successful implementation of the State Council's water allocation plan of delivering 0.95 × 10⁹ m³ of water downstream annually in the Heihe River Watershed.

Keywords: DLBRM, Northwestern China, spatial variation, water allocation

Procedia PDF Downloads 286
557 High-Throughput Screening and Selection of Electrogenic Microbial Communities Using Single Chamber Microbial Fuel Cells Based on 96-Well Plate Array

Authors: Lukasz Szydlowski, Jiri Ehlich, Igor Goryanin

Abstract:

We demonstrate a single-chamber, 96-well-plate-based Microbial Fuel Cell (MFC) with printed electronic components. The invention is aimed at robust selection of electrogenic microbial communities under specific conditions, e.g., electrode potential, pH, nutrient concentration, and salt concentration, which can be altered across the 96-well array within a homogeneous reactor, allowing comparative analysis. It can be used as a standalone technique or in conjunction with other selective processes, e.g., flow cytometry or microfluidic-based dielectrophoretic trapping. Mobile conductive elements, such as carbon paper, carbon sponge, activated charcoal granules, or metal mesh, can be inserted to increase the anode surface area in order to collect electrogenic microorganisms and transfer them into new reactors or to other analytical work. The 96-well-plate array also allows the device to be operated by automated pipetting stations.

Keywords: bioengineering, electrochemistry, electromicrobiology, microbial fuel cell

Procedia PDF Downloads 127
556 Machine Learning-Driven Prediction of Cardiovascular Diseases: A Supervised Approach

Authors: Thota Sai Prakash, B. Yaswanth, Jhade Bhuvaneswar, Marreddy Divakar Reddy, Shyam Ji Gupta

Abstract:

Chronic diseases are widespread across the globe, and heart disease stands out as one of the most perilous. Sadly, many lives are lost to this condition, even though early intervention could prevent such tragedies. However, identifying heart disease in its initial stages is not easy. To address this challenge, we propose an automated system for predicting the presence of heart disease using advanced techniques, empowering individuals with the knowledge needed to take proactive measures against this potentially fatal illness. Our approach involves meticulous data preprocessing and the development of predictive models using classification algorithms such as Support Vector Machines (SVM), Decision Tree, and Random Forest. We assess the efficiency of every model using metrics such as accuracy, ensuring that we select the most reliable option. Additionally, we conduct thorough data analysis to reveal the importance of the different attributes. Among the models considered, Random Forest emerges as the standout performer, with an accuracy of 96.04% in our study.
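
A minimal sketch of the model-comparison step described above, using scikit-learn (the dataset file and column names are hypothetical, not from the paper):

```python
# Sketch: compare SVM, Decision Tree, and Random Forest on a heart-disease
# dataset, as in the abstract. File name and columns are assumed examples.
import pandas as pd
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC
from sklearn.tree import DecisionTreeClassifier
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score

df = pd.read_csv("heart.csv")                     # hypothetical file
X, y = df.drop(columns="target"), df["target"]    # "target": 1 = disease
X_tr, X_te, y_tr, y_te = train_test_split(
    X, y, test_size=0.25, stratify=y, random_state=42)

scaler = StandardScaler().fit(X_tr)               # SVM benefits from scaling
models = {
    "SVM": SVC(kernel="rbf"),
    "Decision Tree": DecisionTreeClassifier(random_state=42),
    "Random Forest": RandomForestClassifier(n_estimators=200, random_state=42),
}
for name, model in models.items():
    if name == "SVM":
        model.fit(scaler.transform(X_tr), y_tr)
        pred = model.predict(scaler.transform(X_te))
    else:
        model.fit(X_tr, y_tr)
        pred = model.predict(X_te)
    print(f"{name}: accuracy = {accuracy_score(y_te, pred):.4f}")
```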

Keywords: support vector machines, decision tree, random forest

Procedia PDF Downloads 22
555 A Visual Inspection System for Automotive Sheet Metal Chassis Parts Produced with the Cold-Forming Method

Authors: İmren Öztürk Yılmaz, Abdullah Yasin Bilici, Yasin Atalay Candemir

Abstract:

The system consists of four main elements: a motion system, an image acquisition system, image processing software, and a control interface. Parts coming off the production line enter the image processing system on a conveyor belt at the end of the line. A 3D scan of the produced part is performed with the laser scanning system integrated at the system's entry side. The 3D scan determines the position and angle at which each part enters the system; from these data, the designed software calculates parameters such as the part origin and conveyor speed and informs the robot of the position where it will pick up the part. The robot then takes the part from the belt conveyor and presents it to high-resolution cameras for quality control. Experiments showed that measurements are carried out with a maximum error of 20 microns.

Keywords: quality control, industry 4.0, image processing, automated fault detection, digital visual inspection

Procedia PDF Downloads 91
554 Design of Cartesian Robot for Electric Vehicle Wireless Charging Systems

Authors: Kaan Karaoglu, Raif Bayir

Abstract:

In this study, a Cartesian robot is developed to improve the performance and efficiency of wireless charging of electric vehicles. The Cartesian robot has three axes, each of which moves linearly. Magnetic positioning is used to align the transmitter charging pad carried by the robot. There are two wireless charging methods for electric vehicles, static and dynamic. The current state of charge (SOC) and location information are received wirelessly from the electric vehicle. Based on this information, the power to be transmitted is determined, and the transmitter and receiver charging pads are aligned for maximum efficiency. With this study, a fully automated Cartesian robot structure is used to charge electric vehicles with the highest possible efficiency. Through the wireless communication established between the electric vehicle and the charging station, the charging status is monitored in real time. The Cartesian robot developed in this study is a fully automatic system that can be easily used in static wireless charging systems with vehicle-machine communication.
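
A control-loop sketch of the alignment-and-charge sequence; every interface name, the tolerance, the gain, and the power policy are hypothetical placeholders, since the paper does not publish its software interface:

```python
# Sketch: align the transmitter pad under the receiver, then pick a charging
# power from the reported SOC. All names and numbers here are assumptions.
TOLERANCE_MM = 2.0        # assumed acceptable misalignment
GAIN = 0.5                # proportional step toward the measured offset

def align_and_charge(robot, positioning, vehicle):
    while True:
        dx, dy = positioning.read_offset_mm()    # receiver minus transmitter
        if abs(dx) < TOLERANCE_MM and abs(dy) < TOLERANCE_MM:
            break                                 # pads aligned
        robot.move_relative(GAIN * dx, GAIN * dy)
    soc = vehicle.read_soc_percent()              # received wirelessly
    power_kw = 11.0 if soc < 80 else 3.7          # taper near full charge
    robot.start_charging(power_kw)
```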

Keywords: electric vehicle, wireless charging systems, energy efficiency, cartesian robot, location detection, trajectory planning

Procedia PDF Downloads 51
553 Experimental Simulation Set-Up for Validating Out-Of-The-Loop Mitigation when Monitoring High Levels of Automation in Air Traffic Control

Authors: Oliver Ohneiser, Francesca De Crescenzio, Gianluca Di Flumeri, Jan Kraemer, Bruno Berberian, Sara Bagassi, Nicolina Sciaraffa, Pietro Aricò, Gianluca Borghini, Fabio Babiloni

Abstract:

An increasing degree of automation in air traffic will also change the role of the air traffic controller (ATCO). ATCOs will fulfill significantly more monitoring tasks than today. However, this rather passive role may lead to Out-Of-The-Loop (OOTL) effects comprising vigilance decrement and reduced situation awareness. The project MINIMA (Mitigating Negative Impacts of Monitoring high levels of Automation) has conceived a system to control and mitigate such OOTL phenomena. In order to demonstrate the MINIMA concept, an experimental simulation set-up has been designed. This set-up consists of two parts: 1) a Task Environment (TE) comprising a Terminal Maneuvering Area (TMA) simulator, and 2) a Vigilance and Attention Controller (VAC) based on neurophysiological data recording, such as electroencephalography (EEG) and eye-tracking devices. The current vigilance level and attention focus of the controller are measured during the ATCO's active work in front of the human-machine interface (HMI). The derived vigilance level and attention focus trigger adaptive automation functionalities in the TE to avoid OOTL effects. This paper describes the full-scale experimental set-up and the component development work towards it. It encompasses a pre-test whose results influenced the development of the VAC as well as the functionalities of the final TE and the VAC's two sub-components.
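
A sketch of the vigilance-triggered adaptation logic; the beta/(theta+alpha) band-power ratio and the threshold are common EEG heuristics assumed here for illustration, not MINIMA's published algorithm:

```python
# Sketch: derive a simple EEG vigilance index and decide whether to trigger
# adaptive automation in the Task Environment. Ratio and threshold are
# assumed heuristics, not taken from the MINIMA project.
import numpy as np

def band_power(x, fs, lo, hi):
    """Mean spectral power of 1-D signal x in the band [lo, hi) Hz."""
    freqs = np.fft.rfftfreq(len(x), d=1.0 / fs)
    psd = np.abs(np.fft.rfft(x)) ** 2
    return psd[(freqs >= lo) & (freqs < hi)].mean()

def vigilance_index(eeg_window, fs=256):
    theta = band_power(eeg_window, fs, 4, 8)
    alpha = band_power(eeg_window, fs, 8, 13)
    beta = band_power(eeg_window, fs, 13, 30)
    return beta / (theta + alpha)          # lower values suggest drowsiness

def should_adapt(eeg_window, fs=256, threshold=0.5):
    """True -> trigger the TE's re-engagement functionalities."""
    return vigilance_index(eeg_window, fs) < threshold

print(should_adapt(np.random.randn(2560)))   # 10 s of noise at 256 Hz
```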

Keywords: automation, human factors, air traffic controller, MINIMA, OOTL (Out-Of-The-Loop), EEG (Electroencephalography), HMI (Human Machine Interface)

Procedia PDF Downloads 365
552 Control of the Sustainability of Fresh Cheese in Order to Extend the Shelf-Life of the Product

Authors: Radovan Čobanović, Milica Rankov Šicar

Abstract:

Fresh cheese is a perishable food that cannot be kept for a long period of time. A sustainability study was conducted in order to extend the product's shelf life, which was 15 days. According to the sustainability plan, 35 samples were to be stored for 30 days at 2°C-6°C and analyzed every seventh day from the day of reception until the 30th day. The declared shelf life of the cheese thus expired during the study, between the 15th and 30th day of analyses. Cheese samples were subjected to sensory analysis (appearance, odor, taste, color, aroma) and bacteriological analyses (Listeria monocytogenes, Salmonella spp., Bacillus cereus, Staphylococcus aureus, and total plate count) according to Serbian state regulations. All analyses were performed according to ISO methodology: sensory analysis ISO 6658, Listeria monocytogenes ISO 11290-1, Salmonella spp. ISO 6579, Bacillus cereus ISO 7932, Staphylococcus aureus ISO 6888-1, and total plate count ISO 4833. The analyses showed that after fifteen days of storage at the temperature defined by the manufacturer, i.e., within the product's shelf life, the cheese did not show any noticeable changes in sensory characteristics: smell and taste were unaffected, there was no separation of whey, and no strange smell or taste was present. As far as the microbiological analyses are concerned, no pathogen was detected, and the total plate count was at the level of 10³ cfu/g. After expiry of the shelf life, in the period between the 15th and 30th day of storage, the analyses showed separation of whey on the surface, a dried part of cheese along the edge of the container, and only very weakly expressed sour-milky smell and taste. Concerning the microbiological analyses, there were still no positive results for pathogenic microorganisms, but the total plate count rose to a level of 10⁶ cfu/g. Based on the obtained results, it can be concluded that this product cannot have a longer shelf life than the one already defined, because the sensory changes observed would certainly influence customers' decisions when purchasing this product.

Keywords: sustainability, fresh cheese, shelf-life, product

Procedia PDF Downloads 361
551 The Dynamic Cone Penetration Test: A Review of Its Correlations and Applications

Authors: Abdulrahman M. Hamid

Abstract:

The Dynamic Cone Penetration Test (DCPT) is widely used for field quality assessment of soils. Its application to predicting the engineering properties of soil is promoted globally by the fact that undisturbed soil samples are difficult to obtain, especially in loose or submerged sandy soils. A detailed discussion is presented on the current development of DCPT correlations with resilient modulus, relative density, California Bearing Ratio (CBR), unconfined compressive strength, and shear strength, developed for different materials in both the laboratory and the field, as well as on the use of the DCPT in quality control of compacted earth fills and performance evaluation of pavement layers. In addition, the relationship of the DCPT to other instruments, such as the falling weight deflectometer, nuclear gauge, soil stiffness gauge, and plate load test, is reported. Lastly, the application of the DCPT in Saudi Arabia in recent years is addressed.
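
As an example of the kind of correlation reviewed, one widely used DCPT-CBR relationship (the ASTM D6951 form, quoted here for orientation rather than taken from this abstract) is:

```latex
% CBR (%) from the DCP index (DCPI, mm/blow), per ASTM D6951:
\mathrm{CBR} = \frac{292}{\mathrm{DCPI}^{1.12}}
```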

Keywords: dynamic cone penetration test, falling weight deflectometer, nuclear gauge, soil stiffness gauge, plate load test, automated dynamic cone penetration

Procedia PDF Downloads 404
550 Design of LabVIEW-Based DAQ System

Authors: Omar A. A. Shaebi, Matouk M. Elamari, Salaheddin Allid

Abstract:

The Information Computing System of Monitoring (ICSM) for the research reactor of the Tajoura Nuclear Research Centre (TNRC) has been out of service since early 1991. According to the regulations, the computer is necessary to operate the reactor up to its maximum power (10 MW). Funding was secured via the IAEA to develop a modern computer-based data acquisition system to replace the old computer. This paper presents the development of a LabVIEW-based data acquisition system that allows automated measurements using National Instruments hardware and its LabVIEW software. The developed system consists of an SCXI 1001 chassis; the chassis houses four SCXI 1100 modules, each of which can handle 32 variables. The chassis is interfaced with the PC using an NI PCI-6023 DAQ card. LabVIEW, developed by National Instruments, is used to run and operate the DAQ system. LabVIEW is a graphical programming environment suited to high-level design; it allows different signal-processing components or subsystems to be integrated within a graphical framework. The results demonstrated the system's capabilities in monitoring variables and acquiring and saving data, as well as LabVIEW's ability to control the DAQ hardware.
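
For readers prototyping a comparable acquisition loop outside LabVIEW, a sketch against NI's Python driver nidaqmx is shown below; the device and channel names are hypothetical, and the system described above is implemented in LabVIEW, not Python:

```python
# Sketch: multi-channel analog acquisition with the nidaqmx Python package.
# Device/channel names are hypothetical examples.
import nidaqmx
from nidaqmx.constants import AcquisitionType

with nidaqmx.Task() as task:
    task.ai_channels.add_ai_voltage_chan("Dev1/ai0:15")   # 16 assumed channels
    task.timing.cfg_samp_clk_timing(rate=1000.0,          # 1 kHz sample clock
                                    sample_mode=AcquisitionType.CONTINUOUS)
    task.start()
    data = task.read(number_of_samples_per_channel=1000)  # ~1 s of data
    print(len(data), "channels read,", len(data[0]), "samples each")
```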

Keywords: data acquisition, labview, signal conditioning, national instruments

Procedia PDF Downloads 481
549 Importance of New Policies of Process Management for Internet of Things Based on Forensic Investigation

Authors: Venkata Venugopal Rao Gudlur

Abstract:

The proposed policies, referred to as "SOPs", for Internet of Things (IoT) based forensic investigation in process management are the latest development for saving time and giving investigators quicker solutions. The forensic investigation process has been developed over many years; from time to time it has yielded the required information, but with no policies governing the investigation processes. This research observes that current IoT-based forensic investigation in process management is increasingly connected to devices, the latest revolution, and correspondingly needs policies. All future developments in real-time information gathering and monitoring will evolve with smart sensor-based technologies connected directly to the IoT. This paper presents a conceptual framework for process management. Smart devices are leading the way in terms of the automated forensic models and frameworks established by different scholars. These models and frameworks have mostly focused on offering a roadmap for performing forensic operations with no policies in place; proposing policies on top of these initiatives would bring tremendous benefit to process management and IoT forensic investigators. With such policies, the forensic investigation process may offer more security and reduced data losses and vulnerabilities.

Keywords: Internet of Things, Process Management, Forensic Investigation, M2M Framework

Procedia PDF Downloads 82
548 A Study on the Impact of Artificial Intelligence on Human Society and the Necessity for Setting up the Boundaries on AI Intrusion

Authors: Swarna Pundir, Prabuddha Hans

Abstract:

As AI has already stepped into the daily life of human society, one cannot ignore the data it collects and uses to provide quality services tailored to individuals' choices. It also helps with decision-making and choice selection through calculations based on the history of our search criteria. Over the past decade or so, the way Artificial Intelligence (AI) has impacted society is undoubtedly large. AI has changed the way we shop, the way we entertain and challenge ourselves, the way information is handled, and has automated some sections of our life. We have answered what AI is, but not why one may see it as useful. AI is useful because it is capable of learning and predicting outcomes, using Machine Learning (ML) and Deep Learning (DL) with the help of Artificial Neural Networks (ANN). AI can also be a system that acts like humans. One of the major impacts is joblessness through AI-driven automation, seen mostly in manufacturing sectors, especially in routine manual and blue-collar occupations and among those without a college degree. This raises serious concerns about AI with regard to reduced employment, the ethics of moral decision-making, individual privacy, human judgment, natural emotions, biased decisions, and discrimination. So the question arises: if an error occurs, who will be responsible, or will it simply be waved off as a "machine error" with no one taking responsibility for any wrongdoing? It is essential to form rules for using AI where both machines and humans are involved.

Keywords: AI, ML, DL, ANN

Procedia PDF Downloads 73
547 CT Image-Based Dense Facial Soft Tissue Thickness Measurement by Open-Source Tools in a Chinese Population

Authors: Ye Xue, Zhenhua Deng

Abstract:

Objectives: Facial soft tissue thickness (FSTT) data can be obtained from CT scans by measuring face-to-skull distances at sparsely distributed anatomical landmarks manually located on the face and skull. However, automated measurement over dense points on 3D facial and skull models using open-source software has become a viable option thanks to the development of computer-assisted imaging technologies. By utilizing dense FSTT information, it becomes feasible to generate plausible automated facial approximations. Establishing a comprehensive, detailed, and densely calculated FSTT database is therefore crucial to enhancing the accuracy of facial approximation. Materials and methods: This study utilized head CT scans from 250 Chinese adults of Han ethnicity, with 170 participants born and residing in northern China and 80 in southern China. Participants ranged in age from 14 to 82 years, and all samples were divided into five non-overlapping age groups. Samples were also divided into three categories based on BMI. The 3D Slicer software was used to segment bone and soft tissue at different Hounsfield Unit (HU) thresholds, and surface models of the face and skull were reconstructed for all samples from the CT data. The following procedures were then performed using MeshLab: converting the face models into hollowed, cropped surface models and automatically measuring the Hausdorff distance (taken as the FSTT) between the skull and face models. Hausdorff point clouds were colorized by depth value and exported as PLY files. A histogram of the depth distribution could be viewed and subdivided into smaller increments. All PLY files were visualized with the Hausdorff distance value at each vertex. Basic descriptive statistics (mean, maximum, minimum, standard deviation, etc.) and the distribution of FSTT were analyzed with respect to sex, age, BMI, and birthplace. Statistical methods included multiple regression analysis, ANOVA, and principal component analysis (PCA). Results: The distribution of FSTT is mainly influenced by BMI and sex, as further supported by the PCA results. Additionally, FSTT values exceeding 30 mm were found to be more sensitive to sex. Birthplace-related differences were observed in the forehead, orbital, mandibular, and zygoma regions; specifically, there are distribution variances in the 20-30 mm depth range, particularly in the mandibular region. Northern males exhibit thinner FSTT in the frontal region of the forehead than southern males, while females show fewer north-south differences, except in the zygoma region. The observed variance in the orbital region could be attributed to differences in orbital size and shape. Discussion: This study provides a database of the FSTT distribution of Chinese individuals and suggests that open-source tools function well for FSTT measurement. By incorporating birthplace as an influential factor in the distribution of FSTT, a greater level of detail can be achieved in facial approximation.
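
A simplified sketch of the core face-to-skull distance computation, approximating the MeshLab Hausdorff filter with a nearest-neighbor query over mesh vertices (trimesh/SciPy as stand-ins for the tools named above; file names are hypothetical PLY exports):

```python
# Sketch: per-vertex face-to-skull distances (the one-sided Hausdorff
# measure used as FSTT). File names are hypothetical exports from 3D Slicer.
import numpy as np
import trimesh
from scipy.spatial import cKDTree

face = trimesh.load("face.ply")
skull = trimesh.load("skull.ply")

tree = cKDTree(skull.vertices)           # index the skull vertices
fstt, _ = tree.query(face.vertices)      # nearest skull point per face vertex
print("mean FSTT %.2f, max %.2f (model units)" % (fstt.mean(), fstt.max()))

# depth histogram in 2 mm increments, analogous to MeshLab's quality histogram
counts, edges = np.histogram(fstt, bins=np.arange(0, 42, 2))
print(dict(zip(edges[:-1].tolist(), counts.tolist())))
```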

Keywords: forensic anthropology, forensic imaging, cranial facial reconstruction, facial soft tissue thickness, CT, open-source tool

Procedia PDF Downloads 47
546 Classifications of Images for the Recognition of People’s Behaviors by SIFT and SVM

Authors: Henni Sid Ahmed, Belbachir Mohamed Faouzi, Jean Caelen

Abstract:

Behavior recognition has been studied for driver-assistance systems and automated navigation, and is an important field of study for intelligent buildings. In this paper, a method for recognizing behavior from real images was studied. Images were divided into several categories according to the actual weather, distance, and angle of view. SIFT (Scale-Invariant Feature Transform) was first used to detect and describe key points, because SIFT features are invariant to image scale and rotation and robust to changes in viewpoint and illumination. Our goal is to develop a robust and reliable system composed of two fixed cameras in every room of the intelligent building, connected to a computer for the acquisition of video sequences. Using these video sequences as input, SIFT represents the images, and SVM-Light (support vector machine) is used as the classification tool, in order to classify people's behaviors in the intelligent building and provide maximum comfort with optimized energy consumption.
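
A sketch of the SIFT bag-of-visual-words plus SVM pipeline in OpenCV/scikit-learn terms (scikit-learn's SVC standing in for SVM-Light; the frame paths and behavior labels are hypothetical):

```python
# Sketch: SIFT descriptors -> visual vocabulary -> histogram features -> SVM.
# Frame paths and labels are invented placeholders; images must exist to run.
import cv2
import numpy as np
from sklearn.cluster import KMeans
from sklearn.svm import SVC

train_paths = ["room1_f001.png", "room1_f002.png"]   # hypothetical frames
train_labels = ["walking", "sitting"]                # hypothetical behaviors

sift = cv2.SIFT_create()
per_image = []
for path in train_paths:
    img = cv2.imread(path, cv2.IMREAD_GRAYSCALE)
    _, desc = sift.detectAndCompute(img, None)
    per_image.append(desc if desc is not None else np.empty((0, 128), np.float32))

k = 50                                                # visual vocabulary size
vocab = KMeans(n_clusters=k, n_init=3).fit(np.vstack(per_image))
X = np.array([np.bincount(vocab.predict(d), minlength=k) if len(d)
              else np.zeros(k) for d in per_image], dtype=float)
clf = SVC(kernel="rbf").fit(X, train_labels)          # behavior classifier
```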

Keywords: video analysis, people behavior, intelligent building, classification

Procedia PDF Downloads 363
545 A Review of Deep Learning Methods in Computer-Aided Detection and Diagnosis Systems based on Whole Mammogram and Ultrasound Scan Classification

Authors: Ian Omung'a

Abstract:

Breast cancer remains one of the deadliest cancers for women worldwide, with the risk of developing tumors being as high as 50 percent in Sub-Saharan African countries like Kenya. With as many as 42 percent of these cases set to be diagnosed late, when the cancer has metastasized and/or the prognosis has become terminal, Full Field Digital [FFD] Mammography remains an effective screening technique that leads to early detection, where in most cases successful interventions can be made to control or eliminate the tumors altogether. FFD mammograms have been shown to be considerably more effective when used together with Computer-Aided Detection and Diagnosis [CADe] systems, which rely on algorithmic implementations of Deep Learning techniques in Computer Vision to carry out deep pattern recognition comparable to the level of a human radiologist, deciphering whether specific areas of interest in the mammogram image portray abnormalities and whether any such abnormalities are indicative of a benign or malignant tumor. Within this paper, we review emergent Deep Learning techniques that will prove relevant to the development of state-of-the-art FFD mammogram CADe systems. These techniques span self-supervised learning for context-encoded occlusion, self-supervised learning for pre-processing and labeling automation, and the creation of a standardized large-scale mammography dataset as a benchmark for evaluating CADe systems. Finally, comparisons are drawn between existing practices that pre-date these techniques and how the development of CADe systems that incorporate them will differ.

Keywords: breast cancer diagnosis, computer aided detection and diagnosis, deep learning, whole mammogram classification, ultrasound classification, computer vision

Procedia PDF Downloads 78
544 Miniaturizing the Volumetric Titration of Free Nitric Acid in U(VI) Solutions: On the Lookout for a More Sustainable Radioanalytical Chemistry Process through Titration-on-a-Chip

Authors: Jose Neri, Fabrice Canto, Alastair Magnaldo, Laurent Guillerme, Vincent Dugas

Abstract:

A miniaturized and automated approach to the volumetric titration of free nitric acid in U(VI) solutions is presented. Free acidity measurement refers to quantifying the acidity of solutions containing hydrolysable heavy metal ions, such as U(VI), U(IV) or Pu(IV), without counting the acidity contributed by the hydrolysis of those metal ions. It is an operation with an essential role in controlling the nuclear fuel recycling process. The main objectives behind the technical optimization of the current 'beaker' method were to reduce the amount of radioactive substance handled by laboratory personnel, to ease instrument adjustment within a glove-box environment, and to allow high-throughput analysis for more cost-effective operations. The measurement technique is based on Taylor-Aris dispersion, used to create a linear concentration gradient inside a 200 μm × 5 cm circular cylindrical micro-channel in less than a second. The proposed analytical methodology relies on actinide complexation with a pH 5.6 sodium oxalate solution and subsequent alkalimetric titration of nitric acid with sodium hydroxide. The titration process is followed with a CCD camera for fluorescence detection; the neutralization boundary can be visualized in a detection range of 500-600 nm thanks to the addition of a pH-sensitive fluorophore. The operating principle of the device allows the active generation of linear concentration gradients in a single cylindrical micro-channel. This feature simplifies fabrication and ease of use, as no complex micro-channel network or passive mixers are needed to generate the chemical gradient. Moreover, since the linear gradient is determined by the input pressure of the liquid reagents, it can be generated in under a second, a more time-efficient process than in other source-sink passive diffusion devices. The resulting linear gradient generator was therefore adapted to perform, for the first time, a volumetric titration on a chip, where the amount of reagents used is fixed by the total volume of the micro-channel, avoiding the substantial waste generated by other flow-based titration techniques. The associated analytical method is automated, and its linearity has been proven for free acidity determination in U(VI) samples containing up to 0.5 M of actinide ion and nitric acid in the 0.5 M to 3 M concentration range. In addition to automation, the developed methodology and technique greatly improve on the standard off-line oxalate complexation and alkalimetric titration method by reducing the required sample volume a thousand-fold, the nuclear waste per analysis forty-fold, and the analysis time eight-fold. The device therefore represents a great step towards an easy-to-handle nuclear-related application, which in the short term could improve laboratory safety as much as reduce the environmental impact of the radioanalytical chain.
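
For context, the classical Taylor-Aris result underlying the gradient generation gives the effective axial dispersion coefficient for laminar flow in a circular tube (a standard textbook relation, quoted here for orientation rather than taken from this paper):

```latex
% Taylor-Aris effective axial dispersion in a circular tube of radius a,
% mean velocity u, and molecular diffusivity D_m:
D_{\mathrm{eff}} = D_m + \frac{a^{2} u^{2}}{48\, D_m}
```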

Keywords: free acidity, lab-on-a-chip, linear concentration gradient, Taylor-Aris dispersion, volumetric titration

Procedia PDF Downloads 375
543 Urban Big Data: An Experimental Approach to Building-Value Estimation Using Web-Based Data

Authors: Sun-Young Jang, Sung-Ah Kim, Dongyoun Shin

Abstract:

Current real-estate value estimation is difficult for laymen and usually performed by specialists. This paper presents an automated estimation process, based on big data and machine-learning technology, that calculates the influence of building conditions on real-estate prices. The present study analyzed actual building sales data for Nonhyeon-dong, Gangnam-gu, Seoul, Korea, measuring the major influencing factors among the various building conditions. From that analysis, a prediction model was established and applied using RapidMiner Studio, a graphical user interface (GUI)-based tool for deriving machine-learning prototypes. The prediction model is formulated from previous examples; when new examples are applied, it analyses and predicts accordingly. The analysis process discerns the crucial factors affecting price increases by calculating weighted values. The model was verified, and its accuracy determined, by comparing its predicted values with actual price increases.
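
A sketch of an equivalent workflow outside RapidMiner, fitting a regression model and reading off weighted factor importances (the file and column names are hypothetical, not the study's actual attributes):

```python
# Sketch: estimate building value from building-condition features and rank
# their influence, analogous to the RapidMiner workflow described above.
import pandas as pd
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split
from sklearn.metrics import r2_score

df = pd.read_csv("sales.csv")          # hypothetical sales records
features = ["floor_area", "age_years", "floor", "distance_to_subway_m"]
X, y = df[features], df["price_per_m2"]
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

model = RandomForestRegressor(n_estimators=300, random_state=0).fit(X_tr, y_tr)
print("R^2 on held-out sales:", r2_score(y_te, model.predict(X_te)))
# weighted influence of each building condition on price
for name, w in sorted(zip(features, model.feature_importances_),
                      key=lambda t: -t[1]):
    print(f"{name}: {w:.3f}")
```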

Keywords: apartment complex, big data, life-cycle building value analysis, machine learning

Procedia PDF Downloads 357
542 Design and Development of an Algorithm for Prioritizing Test Cases Using a Neural Network as Classifier

Authors: Amit Verma, Simranjeet Kaur, Sandeep Kaur

Abstract:

Test Case Prioritization (TCP) has gained widespread acceptance, as it often results in good-quality software free from defects. Due to the increasing rate of faults in software, traditional prioritization techniques result in increased cost and time. The main challenge in TCP is the difficulty of manually validating the priorities of test cases in large test suites, and little emphasis has been placed on automating the TCP process. The objective of this paper is to detect the priorities of test cases using an artificial neural network, which predicts the correct priorities with the help of the backpropagation algorithm. In the proposed work, priorities are first assigned to test cases based on their frequency; the ANN then predicts whether the correct priority has been assigned to each test case and raises an interrupt when a wrong priority is detected. Classifiers are used to distinguish the different priority classes of test cases. The proposed algorithm is effective: it reduces complexity with robust efficiency and automates the prioritization of test cases.
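
A minimal sketch of the TF-IDF-plus-ANN idea reflected in the keywords below, with scikit-learn's MLPClassifier as the backpropagation network (the sample test cases and priority scale are invented for illustration):

```python
# Sketch: learn test-case priorities with a small backpropagation network
# over TF-IDF features. Test cases and labels are synthetic examples.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.neural_network import MLPClassifier

tests = ["login with valid password", "login with invalid password",
         "export report to pdf", "checkout with empty cart"]
priority = [1, 1, 3, 2]                 # 1 = run first (assumed scale)

X = TfidfVectorizer().fit_transform(tests)
nn = MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000,
                   random_state=0).fit(X, priority)

# flag cases whose assigned priority disagrees with the network's prediction
for t, p, pred in zip(tests, priority, nn.predict(X)):
    if p != pred:
        print(f"check priority of: {t!r} (assigned {p}, predicted {pred})")
```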

Keywords: test case prioritization, classification, artificial neural networks, TF-IDF

Procedia PDF Downloads 371
541 Genomic and Evolutionary Diversity of Long Terminal Repeat (LTR) Retrotransposons in Date Palm (Phoenix dactylifera)

Authors: Faisal Nouroz, Mukaramin Mukaramin

Abstract:

Of the transposable elements (TEs), retrotransposons are the most copious elements identified in many sequenced genomes. They have played a major role in genome evolution, rearrangement, and expansion through their copy-and-paste mode of proliferation. They are further divided into LTR and non-LTR retrotransposons. The purpose of the current study was to identify LTR retrotransposons in the sequenced Phoenix dactylifera genome and to study their structural diversity. A total of 150 P. dactylifera BAC sequences of > 60 kb were randomly retrieved from the National Center for Biotechnology Information (NCBI) database and screened for the presence of LTR retrotransposons. Seven bacterial artificial chromosome (BAC) sequences showed full-length LTR retrotransposons, with four Copia and three Gypsy families having variable copy numbers. The reverse transcriptase (RT) domain was found to be the most conserved domain among the Copia and Gypsy superfamilies and was used for the evolutionary analysis. The amino acid residues among the various RT sequences varied in their percentages, indicating post-divergence evolution. Leucine was found in the highest proportion, followed by lysine, while methionine and tryptophan occurred at the lowest percentages. The phylogenetic analysis based on RT domains confirmed that, although the RT regions are the most conserved, several evolutionary events caused nucleotide polymorphisms, clustering the Gypsy and Copia superfamilies into their respective lineages. The study will be helpful for the identification and annotation of these elements in other species and genera and for mapping their distribution patterns on chromosomes by fluorescence in situ hybridization.

Keywords: transposable elements, Phoenix dactylifera, retrotransposons, phylogenetic analysis

Procedia PDF Downloads 117
540 Classification of Random Doppler-Radar Targets during the Surveillance Operations

Authors: G. C. Tikkiwal, Mukesh Upadhyay

Abstract:

During surveillance operations in war or peacetime, the radar operator gets a scatter of targets over the screen. A target may be a tracked vehicle such as a tank (T72, BMP, etc.), a wheeled vehicle (ALS, TATRA, 2.5-tonne, Shaktiman), moving troops, or a moving convoy. The radar operator selects one of the promising targets for single target tracking (STT) mode. Once the target is locked, the operator hears a characteristic audible signal in his headphones. Drawing on experience and training gained over time, the operator then identifies the target. This process is cumbersome, however, and depends solely on the skills of the operator, which may lead to misclassification of the object. In this paper, we present a technique using mathematical and statistical methods, namely the fast Fourier transform (FFT) and principal component analysis (PCA), to identify such targets. Classification is based on transforming the audible signature of the target into musical octave notes. The whole methodology is then automated in software. This automation increases the efficiency of target identification by reducing the chance of misclassification. The whole study is based on live data.
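
A minimal sketch of the FFT-to-octave-note front end described above; the signal is synthetic, and the note mapping uses the standard MIDI relation, assumed here rather than taken from the paper:

```python
# Sketch: convert an audible Doppler signature into its nearest octave note
# via FFT, as a front end to PCA-based classification. Signal is synthetic.
import numpy as np

NOTES = ["C", "C#", "D", "D#", "E", "F", "F#", "G", "G#", "A", "A#", "B"]

def dominant_note(signal, fs):
    """Name the musical note nearest to the strongest spectral peak."""
    spectrum = np.abs(np.fft.rfft(signal * np.hanning(len(signal))))
    freqs = np.fft.rfftfreq(len(signal), 1.0 / fs)
    f = freqs[spectrum[1:].argmax() + 1]        # skip the DC bin
    midi = int(round(69 + 12 * np.log2(f / 440.0)))
    return NOTES[midi % 12] + str(midi // 12 - 1), f

fs = 8000
t = np.arange(fs) / fs
sig = np.sin(2 * np.pi * 440 * t) + 0.3 * np.random.randn(fs)  # stand-in audio
print(dominant_note(sig, fs))    # ('A4', ~440 Hz)
```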

Keywords: radar target, FFT, principal component analysis, eigenvector, octave-notes, DSP

Procedia PDF Downloads 379
539 The Implementation of the Multi-Agent Classification System (MACS) in Compliance with FIPA Specifications

Authors: Mohamed R. Mhereeg

Abstract:

The paper discusses the implementation of the Multi-Agent Classification System (MACS) and its use to provide automated and accurate classification of end users developing applications in the spreadsheet domain. Different technologies have been brought together to build MACS. The strength of the system is the integration of agent technology with the FIPA specifications together with other technologies: .NET Windows-service-based agents, Windows Communication Foundation (WCF) services, the Service-Oriented Architecture (SOA), and Oracle Data Mining (ODM). Microsoft's .NET Windows-service-based agents were used to develop the monitoring agents of MACS, while the .NET WCF services, together with the SOA approach, allowed distribution of and communication between agents over the WWW. The Monitoring Agents (MAs) were configured to execute automatically and monitor Excel spreadsheet development activities by content. Data gathered by the Monitoring Agents from various resources over a period of time is collected and filtered by a Database Updater Agent (DUA) residing in the system's .NET client application. This agent then transfers and stores the data in an Oracle server database via Oracle stored procedures for further processing, which leads to the classification of the end-user developers.

Keywords: MACS, implementation, multi-agent, SOA, autonomous, WCF

Procedia PDF Downloads 261
538 High-Frequency Cryptocurrency Portfolio Management Using Multi-Agent System Based on Federated Reinforcement Learning

Authors: Sirapop Nuannimnoi, Hojjat Baghban, Ching-Yao Huang

Abstract:

Over the past decade, with the fast development of blockchain technology since the birth of Bitcoin, there has been a massive increase in the usage of cryptocurrencies. Cryptocurrencies are often not seen as a sound investment opportunity due to the market's erratic behavior and high price volatility. With the recent success of deep reinforcement learning (DRL), portfolio management can be modeled and automated. In this paper, we propose a novel DRL-based multi-agent system to automatically make proper trading decisions on multiple cryptocurrencies and gain profits in the highly volatile cryptocurrency market. We also extend this multi-agent system with horizontal federated transfer learning to better adapt to the inclusion of new cryptocurrencies in our portfolio; through the concept of diversification, we can therefore maximize profits and minimize trading risks. Experimental results across multiple simulation scenarios reveal that the proposed algorithmic trading system offers three promising advantages over other systems: maximized profits, minimized risks, and adaptability.
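
A sketch of the federated aggregation step that lets per-currency agents share what they have learned (plain federated averaging over parameter arrays; the parameter shapes are placeholders, and the paper's actual transfer scheme may differ):

```python
# Sketch: horizontal federated averaging (FedAvg) across per-currency agents.
# The tiny two-layer parameter sets are placeholders, not the paper's networks.
import numpy as np

def federated_average(agent_params):
    """Average corresponding parameter arrays across agents."""
    return [np.mean(np.stack(layers), axis=0) for layers in zip(*agent_params)]

# three agents (e.g., BTC, ETH, a newly listed coin), two parameter arrays each
rng = np.random.default_rng(7)
agents = [[rng.normal(size=(4, 8)), rng.normal(size=8)] for _ in range(3)]
global_params = federated_average(agents)

# broadcast the averaged model back, warm-starting a new coin's agent
new_agent = [p.copy() for p in global_params]
print([p.shape for p in new_agent])    # [(4, 8), (8,)]
```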

Keywords: cryptocurrency portfolio management, algorithmic trading, federated learning, multi-agent reinforcement learning

Procedia PDF Downloads 102
537 Royal Tourism: Conscious Perspicacity of Dubai

Authors: Aarti Suryawanshi

Abstract:

Royal tourism has always been a popular niche activity for many tourists around the world. The United Kingdom, at the heart of it, has been a pioneering nation for royal tourists. Though many other countries with monarchies, such as India, Thailand, Japan, Spain, and the Netherlands, have attracted tourists motivated to see and experience their nations' royalty, Middle Eastern countries have never really been an attraction for royal tourists. Royalty in the Middle East is fast emerging as a tourist product and is paving the way for marketing opportunities that may increase the popularity of the region's royal houses. Dubai has been taking centre stage for futuristic developments, economic growth initiatives, and continuous urbanisation efforts, which has brought the global limelight onto the royal house of Al Maktoum, with the younger royal members extensively recognised and appreciated for their public and private adventures shared through various social media platforms. The objective of this paper is to analyse the popularity of His Highness Sheikh Hamdan Bin Mohammed Bin Rashid Al Maktoum through social media platforms and the possibility of inducing royal tourism in Dubai. An empirical study has been performed to describe the automated repositioning of the city of Dubai as a royal tourism hub.

Keywords: royalty, royal tourism, monarchy, marketing strategy, repositioning

Procedia PDF Downloads 80
536 Logistic Model Tree and Expectation-Maximization for Pollen Recognition and Grouping

Authors: Endrick Barnacin, Jean-Luc Henry, Jack Molinié, Jimmy Nagau, Hélène Delatte, Gérard Lebreton

Abstract:

Palynology is a field of interest for many disciplines. It has multiple applications, such as chronological dating, climatology, allergy treatment, and even honey characterization. Unfortunately, the analysis of a pollen slide is a complicated and time-consuming task that requires the intervention of experts in the field, who are becoming increasingly rare due to economic and social conditions. Automation of this task is therefore a necessity. Pollen slide analysis is mainly a visual process, as it is carried out with the naked eye, which is why a primary route to automating palynology is digital image processing. This method has the lowest cost and relatively good accuracy in pollen retrieval. In this work, we propose a system combining recognition and grouping of pollen. It uses a Logistic Model Tree to classify pollen already known to the system while detecting any unknown species; the unknown pollen species are then grouped using a cluster-based approach. Good recognition rates were achieved for known species, and the automated clustering appears to be a promising approach.
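
A sketch of the recognize-then-group idea with scikit-learn stand-ins: logistic regression in place of the Logistic Model Tree (which scikit-learn does not provide) and a Gaussian mixture fitted by EM; the features, labels, and rejection threshold are synthetic assumptions:

```python
# Sketch: classify known pollen types, reject low-confidence grains as
# unknown, then group the unknowns with EM (Gaussian mixture).
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(0)
X_known = rng.normal(size=(200, 8))          # descriptors of identified grains
y_known = rng.integers(0, 3, 200)            # three known species (synthetic)
X_new = rng.normal(size=(50, 8))             # grains from a fresh slide

clf = LogisticRegression(max_iter=1000).fit(X_known, y_known)
unknown = clf.predict_proba(X_new).max(axis=1) < 0.6   # rejection threshold

gm = GaussianMixture(n_components=2, random_state=0).fit(X_new[unknown])
groups = gm.predict(X_new[unknown])          # candidate new-species clusters
print(f"{int(unknown.sum())} unknown grains in {len(np.unique(groups))} groups")
```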

Keywords: pollen recognition, logistic model tree, expectation-maximization, local binary pattern

Procedia PDF Downloads 163
535 The Role of Named Entity Recognition for Information Extraction

Authors: Girma Yohannis Bade, Olga Kolesnikova, Grigori Sidorov

Abstract:

Named entity recognition (NER) is a building block for information extraction. Though the information extraction process has been automated using a variety of techniques to find and extract relevant information from unstructured documents, the discovery of targeted knowledge still poses a number of research difficulties because of the variability and lack of structure in Web data. NER, a subtask of information extraction (IE), emerged to ease these difficulties. It deals with finding proper names (named entities) in a document, such as persons, countries, locations, organizations, dates, and events, and assigning them predetermined labels, an initial step in IE tasks. This survey paper presents the roles and importance of NER to IE from the perspective of different algorithms and application domains. It summarizes how researchers have applied NER in particular application areas such as finance, medicine, defense, business, food science, and archeology. It also outlines three types of sequence-labeling algorithms for NER: feature-based, neural-network-based, and rule-based. Finally, the state of the art and evaluation metrics of NER are presented.
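
As a concrete illustration of NER output, here is a short example using spaCy, one common open-source implementation (the example sentence and model choice are ours, not the survey's):

```python
# Sketch: off-the-shelf NER with spaCy's small English model.
# Requires: python -m spacy download en_core_web_sm
import spacy

nlp = spacy.load("en_core_web_sm")
doc = nlp("Apple opened a new office in Addis Ababa on 12 March 2023.")
for ent in doc.ents:
    print(ent.text, "->", ent.label_)   # e.g., Apple -> ORG, Addis Ababa -> GPE
```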

Keywords: the role of NER, named entity recognition, information extraction, sequence labeling algorithms, named entity application area

Procedia PDF Downloads 63
534 Computer-Aided Diagnosis of Polycystic Kidney Disease Using ANN

Authors: G. Anjan Babu, G. Sumana, M. Rajasekhar

Abstract:

Many inherited diseases and non-hereditary disorders contribute to the development of renal cystic diseases. Polycystic kidney disease (PKD) is a disorder in which groups of cysts filled with a water-like fluid develop within the kidneys. PKD is responsible for 5-10% of cases of end-stage renal failure treated by dialysis or transplantation. New experimental models and the application of molecular biology techniques have provided new insights into the pathogenesis of PKD. Researchers are showing keen interest in developing automated, computer-aided systems for the diagnosis of diseases. In this paper, a multi-layered feed-forward neural network with one hidden layer is constructed, trained, and tested by applying the backpropagation learning rule for the diagnosis of PKD, based on physical symptoms and urinalysis results collected from individual patients. Data collected from 50 patients are used to train and test the network. Among these samples, 75% of the data is used for training and the remaining 25% for testing. The trained network is then applied to new samples, and its output indicates whether the patient is normal or abnormal.
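
A minimal sketch of the network setup described above, with synthetic stand-in data (scikit-learn's MLPClassifier as the one-hidden-layer backpropagation network; the feature count and labels are invented, while the 75/25 split follows the abstract):

```python
# Sketch: one-hidden-layer feed-forward network trained by backpropagation
# on symptom/urinalysis features. Data is synthetic; only the split is real.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(1)
X = rng.normal(size=(50, 6))            # 50 patients, 6 assumed features
y = (X[:, 0] + X[:, 3] > 0).astype(int) # 1 = abnormal (synthetic rule)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, train_size=0.75, random_state=1)
net = MLPClassifier(hidden_layer_sizes=(10,), solver="sgd", max_iter=5000,
                    random_state=1).fit(X_tr, y_tr)   # backprop via SGD
print("test accuracy:", net.score(X_te, y_te))
```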

Keywords: dialysis, hereditary, transplantation, polycystic, pathogenesis

Procedia PDF Downloads 361
533 SVM-RBN Model with Attentive Feature Culling Method for Early Detection of Fruit Plant Diseases

Authors: Piyush Sharma, Devi Prasad Sharma, Sulabh Bansal

Abstract:

Diseases are fairly common in fruits and vegetables because of changing climatic and environmental circumstances. Crop diseases, which are frequently difficult to control, interfere with the growth and output of crops. Accurate disease detection and timely control measures are required to guarantee high production standards and good quality. In India, apples are a common crop that may be afflicted by a variety of diseases on the fruit, stem, and leaves; fungi, bacteria, and viruses trigger the early symptoms of leaf diseases. To assist farmers in taking appropriate action, it is important to develop an automated system that can detect the type of disease. Machine-learning-based image processing can serve this purpose: this research proposes a system that automatically identifies diseases in apple fruit and apple plants. The system utilizes a hybrid SVM-RBN model, which produces effective results in terms of accuracy, precision, recall, and F1 score, with respective values of 96%, 99%, 94%, and 93%.

Keywords: fruit plant disease, crop disease, machine learning, image processing, SVM-RBN

Procedia PDF Downloads 41
532 Definition and Core Components of the Role-Partner Allocation Problem in Collaborative Networks

Authors: J. Andrade-Garda, A. Anguera, J. Ares-Casal, M. Hidalgo-Lorenzo, J.-A. Lara, D. Lizcano, S. Suárez-Garaboa

Abstract:

In the current, constantly changing economic context, collaborative networks allow partners to undertake projects that would not be possible if attempted individually. These projects usually involve the performance of a group of tasks (named roles) that have to be distributed among the partners. Thus, an allocation/matching problem arises that will be referred to as the Role-Partner Allocation problem. In real life, this situation is addressed by negotiation between partners in order to reach ad hoc agreements. Besides taking a long time and being hard work, such an approach is not recommended, as both historical evidence and economic analysis show. Instead, the allocation process should be automated by means of a centralized matching scheme. However, as a preliminary step in the search for such a matching mechanism (or even the development of a new one), the problem and its core components must be specified. To this end, this paper establishes (i) the definition of the problem and its constraints, (ii) the key features of the involved elements (i.e., roles and partners), and (iii) how to create preference lists for both roles and partners. Only in this way will it be possible to conduct subsequent methodological research on the solution method.
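
To make the role/partner preference lists concrete, below is a sketch of one classical centralized matching scheme (role-proposing deferred acceptance, i.e., Gale-Shapley); this illustrates the kind of mechanism the paper motivates, not a method the paper itself selects:

```python
# Sketch: deferred acceptance over preference lists, one role per partner.
# Illustrative only; the paper defines the problem, not this mechanism.
def deferred_acceptance(role_prefs, partner_prefs):
    """role_prefs/partner_prefs: dicts mapping each side to ranked lists."""
    rank = {p: {r: i for i, r in enumerate(prefs)}
            for p, prefs in partner_prefs.items()}
    next_pick = {r: 0 for r in role_prefs}      # next partner to propose to
    held = {}                                   # partner -> role currently held
    free = list(role_prefs)
    while free:
        role = free.pop()
        partner = role_prefs[role][next_pick[role]]
        next_pick[role] += 1
        current = held.get(partner)
        if current is None:
            held[partner] = role
        elif rank[partner][role] < rank[partner][current]:
            held[partner] = role
            free.append(current)                # displaced role re-proposes
        else:
            free.append(role)                   # rejected, tries next choice
    return {r: p for p, r in held.items()}

print(deferred_acceptance(
    {"design": ["A", "B"], "testing": ["A", "B"]},
    {"A": ["testing", "design"], "B": ["design", "testing"]}))
```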

Keywords: collaborative network, matching, partner, preference list, role

Procedia PDF Downloads 210
531 Analyzing the Impact of Code Commenting on Software Quality

Authors: Thulya Premathilake, Tharushi Perera, Hansi Thathsarani, Tharushi Nethmini, Dilshan De Silva, Piyumika Samarasekara

Abstract:

One of the most efficient ways to assist developers in grasping source code is to make use of the comments found throughout the code. Good-quality comments are a fundamental requirement in fields such as software development, particularly when tackling problems in programs that have already been built. It is essential for the intention of the source code to be made crystal clear in the comments added to it. This assists programmers in better comprehending the programs they are working on and enables them to complete software maintenance jobs in a more timely manner. Although comments and documentation are meant to improve readability and maintainability, the vast majority of programmers focus primarily on the code itself. This study provides a complete and comprehensive overview of the previous research on code comments. It focuses on four main topics: automated comment generation, comment consistency, comment classification, and comment quality rating. Analyzing the approaches used across these topics provides a more complete foundation for subsequent inquiries into this issue.

Keywords: code commenting, source code, software quality, quality assurance

Procedia PDF Downloads 66
530 An Intelligent Traffic Management System Based on the WiFi and Bluetooth Sensing

Authors: Hamed Hossein Afshari, Shahrzad Jalali, Amir Hossein Ghods, Bijan Raahemi

Abstract:

This paper introduces an automated clustering solution that applies to WiFi/Bluetooth sensing data and is later used for traffic management applications. The paper initially summarizes a number of clustering approaches and thereafter shows their performance for noise removal. In this context, clustering is used to recognize WiFi and Bluetooth MAC addresses that belong to passengers traveling on a public urban transit bus. The main objective is to build an intelligent system that automatically filters out MAC addresses belonging to persons located outside the bus, for different routes in the city of Ottawa. The proposed system alleviates the need to define restrictive thresholds, which would otherwise reduce both the accuracy and the range of applicability of the solution across routes. The paper moreover discusses the performance benefits of the presented clustering approaches in terms of accuracy, time and space complexity, and ease of use. Clustering results can further be used for origin-destination estimation of individual passengers, traffic-load prediction, and intelligent management of urban bus schedules.
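
A sketch of the clustering idea on synthetic per-MAC features, with DBSCAN as one representative density-based approach; the features, parameters, and data are assumptions for illustration, not the paper's dataset:

```python
# Sketch: separate on-board MAC addresses from roadside noise by clustering
# per-MAC detection features instead of hand-tuned thresholds.
import numpy as np
from sklearn.cluster import DBSCAN
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(3)
# feature per MAC: [number of consecutive stops detected, mean signal strength]
passengers = rng.normal([12, -55], [3, 4], size=(40, 2))
roadside = rng.normal([1.5, -80], [1, 6], size=(60, 2))
X = StandardScaler().fit_transform(np.vstack([passengers, roadside]))

labels = DBSCAN(eps=0.5, min_samples=5).fit_predict(X)
print("clusters found:", set(labels))   # -1 marks noise / outside-bus devices
```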

Keywords: WiFi-Bluetooth sensing, cluster analysis, artificial intelligence, traffic management

Procedia PDF Downloads 222
529 Performance Optimization of Low-Cost Solar Dryer Using Modified PI Controller

Authors: Rajesh Kondareddy, Prakash Kumar Nayak, Maunash Das, Vrinatri Velentina Boro

Abstract:

Today, there is a huge global concern for sustainable development, which includes minimizing the consumption of non-renewable energies without affecting the basic global economy. Solar drying is one of the important processes used for extending the shelf life of agricultural products. The performance of a low-cost automated solar dryer fitted with a cascade control scheme and a modified PI controller for drying chilli was investigated. The dryer was composed of a solar collector (air heater) fitted with cylindrical pipes to improve air velocity, integrated with a solar drying chamber containing a rack of two cheesecloth (net) trays. Air admitted through the inlet is heated in the solar collector and channelled through the drying chamber, where it removes moisture from the loaded food substance or agricultural produce. To maintain the temperature in the heating chambers and to improve performance, a modified PI (Proportional-Integral) controller was used due to its simplicity and robustness. The time to dry chilli from an initial moisture content of 88.5% (wb) to 7.3% (wb) was estimated at 14 hours in the solar dryer, whereas 32 hours was observed for open sun drying.
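
A minimal sketch of a discrete PI loop with a simple anti-windup modification, one plausible reading of a "modified PI controller"; all gains, the setpoint, and the toy thermal model are assumed values, not the dryer's tuning:

```python
# Sketch: discrete PI temperature control of a drying chamber, with integral
# back-off as the anti-windup "modification". All numbers are assumptions.
class PIController:
    def __init__(self, kp, ki, out_min=0.0, out_max=1.0):
        self.kp, self.ki = kp, ki
        self.integral = 0.0
        self.out_min, self.out_max = out_min, out_max

    def update(self, setpoint, measured, dt):
        error = setpoint - measured
        self.integral += error * dt
        u = self.kp * error + self.ki * self.integral
        if u > self.out_max or u < self.out_min:
            self.integral -= error * dt          # undo windup when saturated
            u = min(max(u, self.out_min), self.out_max)
        return u          # e.g., damper position or fan duty cycle (0..1)

pi = PIController(kp=0.08, ki=0.01)
temp = 30.0                              # chamber temperature, degC
for _ in range(600):                     # 600 one-second steps
    u = pi.update(setpoint=60.0, measured=temp, dt=1.0)
    temp += (u * 45.0 - (temp - 30.0) * 0.05) * 0.02   # toy thermal model
print(round(temp, 1))                    # settles near the 60 degC setpoint
```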

Keywords: cascade control, chilli, PI controller, solar dryer

Procedia PDF Downloads 274