Search results for: automatic assembling

504 Fault Diagnosis by Thresholding and Decision Tree with Neuro-Fuzzy System

Authors: Y. Kourd, D. Lefebvre

Abstract:

The monitoring of industrial processes is required to ensure the operating conditions of industrial systems through automatic detection and isolation of faults. This paper proposes a fault diagnosis method based on a neuro-fuzzy hybrid structure that combines threshold selection with a decision tree. The method is validated on the DAMADICS benchmark. In the first phase, a model representing the normal state of the system is constructed for fault detection. Fault signatures are obtained through residual analysis and the selection of appropriate thresholds; these signatures yield groups of non-separable faults. In the second phase, we build faulty-mode models for the faults that cannot be isolated in the first phase. In the last phase, we construct the decision tree that isolates these faults.

Keywords: decision tree, residuals analysis, ANFIS, fault diagnosis

Procedia PDF Downloads 609
503 Enhancing the Performance of Automatic Logistic Centers by Optimizing the Assignment of Material Flows to Workstations and Flow Racks

Authors: Sharon Hovav, Ilya Levner, Oren Nahum, Istvan Szabo

Abstract:

In modern large-scale logistic centers (e.g., big automated warehouses), complex logistic operations performed by human staff (pickers) need to be coordinated with the operations of automated facilities (robots, conveyors, cranes, lifts, flow racks, etc.). The efficiency of advanced logistic centers strongly depends on optimizing picking technologies in sync with the facility/product layout, as well as on the optimal distribution of material flows (products) in the system. The challenge is to develop a mathematical operations research (OR) tool that will optimize system cost-effectiveness. In this work, we propose a model that describes an automatic logistic center consisting of a set of workstations located at several galleries (floors), with each station containing a known number of flow racks. The requirements of each product and the working capacity of the stations served by a given set of workers (pickers) are assumed to be known in advance. The goal of the model is to maximize system efficiency. The proposed model includes two echelons. The first is the setting of the (optimal) number of workstations needed to create the total processing/logistic system, subject to picker capacities. The second echelon deals with the assignment of the products to the workstations and flow racks, aimed at achieving maximal throughput of picked products over the entire system given picker capacities and budget constraints. The solutions to the problems at the two echelons interact to balance the overall load in the flow racks and maximize overall efficiency. We have developed an operations research model for each echelon. In the first echelon, the problem of calculating the optimal number of workstations is formulated as a non-standard bin-packing problem with capacity constraints for each bin. The problem arising in the second echelon is presented as a constrained product-workstation-flow rack assignment problem with a non-standard min-max criterion, in which the workload maximum is taken across all workstations in the center and the exterior minimum is taken across all possible product-workstation-flow rack assignments. The OR problems arising in each echelon are proved to be NP-hard. Consequently, we develop heuristic and approximation algorithms based on exploiting and improving local optima. The logistic center (LC) model considered in this work is highly dynamic and is recalculated periodically based on updated demand forecasts that reflect market trends, technological changes, seasonality, and the introduction of new items. The suggested two-echelon approach and the min-max balancing scheme are shown to work effectively on illustrative examples and real-life logistic data.
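
The first-echelon problem (choosing the number of workstations subject to picker capacity) is described as a non-standard bin-packing problem. The abstract does not disclose the authors' heuristic, so the sketch below only illustrates the general idea with a plain first-fit-decreasing packing of product workloads into capacity-limited workstations; the products, workloads and capacity are invented.

```python
# Illustrative first-fit-decreasing packing of product workloads into
# capacity-limited workstations. This is not the authors' algorithm, only a
# generic heuristic for the kind of bin-packing problem the abstract describes.

def first_fit_decreasing(workloads, capacity):
    """Return a list of workstations, each a list of (product, load) pairs."""
    stations = []          # each entry: [remaining_capacity, [(product, load), ...]]
    for product, load in sorted(workloads.items(), key=lambda kv: kv[1], reverse=True):
        for st in stations:
            if st[0] >= load:           # first station with enough spare capacity
                st[0] -= load
                st[1].append((product, load))
                break
        else:                           # no station fits: open a new one
            stations.append([capacity - load, [(product, load)]])
    return stations

if __name__ == "__main__":
    # Hypothetical picking workloads (lines per hour) and picker capacity per station.
    demo_workloads = {"A": 70, "B": 45, "C": 40, "D": 30, "E": 25, "F": 20}
    for i, (spare, items) in enumerate(first_fit_decreasing(demo_workloads, capacity=100), 1):
        print(f"workstation {i}: {items} (spare capacity {spare})")
```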

Keywords: logistics center, product-workstation, assignment, maximum performance, load balancing, fast algorithm

Procedia PDF Downloads 211
502 Edge Detection Using Multi-Agent System: Evaluation on Synthetic and Medical MR Images

Authors: A. Nachour, L. Ouzizi, Y. Aoura

Abstract:

Recent developments in multi-agent systems have opened a new research field in image processing. Several algorithms are used simultaneously and improved in different applications while new methods are investigated. This paper presents a new automatic method for edge detection using several agents and many different actions. The proposed multi-agent system is based on parallel agents that locally perceive their environment, that is to say, pixels and additional environmental information. This environment is built using Vector Field Convolution, which attracts free agents to the edges. Problems of partial edges, hidden edges and edge linking are solved through cooperation between agents. The presented method was implemented and evaluated on several synthetic and medical images. The experimental results confirm the efficiency and accuracy of the detected edges.
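
A minimal sketch of the agent idea follows, using a Gaussian-smoothed Sobel gradient magnitude as a crude stand-in for the Vector Field Convolution environment: free agents repeatedly move toward the strongest local attraction and mark edge evidence. The toy image, agent count and thresholds are assumptions, not the authors' design.

```python
# Minimal sketch of agents climbing an edge-attraction field. The field here is
# a Gaussian-smoothed Sobel gradient magnitude, used only as a stand-in for the
# Vector Field Convolution environment described in the abstract.
import numpy as np
from scipy import ndimage

rng = np.random.default_rng(0)

# Toy image: a bright square on a dark background (assumption for illustration).
img = np.zeros((128, 128))
img[32:96, 32:96] = 1.0
img += 0.05 * rng.standard_normal(img.shape)

gx, gy = ndimage.sobel(img, axis=1), ndimage.sobel(img, axis=0)
field = ndimage.gaussian_filter(np.hypot(gx, gy), sigma=2)   # edge-attraction field

edges = np.zeros(img.shape, dtype=bool)
agents = rng.integers(1, 127, size=(200, 2))                 # 200 free agents

for _ in range(60):                                          # agents act in parallel
    for k, (r, c) in enumerate(agents):
        # Perceive the 3x3 neighbourhood and move toward the strongest attraction.
        win = field[r - 1:r + 2, c - 1:c + 2]
        dr, dc = np.unravel_index(np.argmax(win), win.shape)
        r, c = np.clip(r + dr - 1, 1, 126), np.clip(c + dc - 1, 1, 126)
        agents[k] = (r, c)
        if field[r, c] > 0.5 * field.max():                  # local evidence of an edge
            edges[r, c] = True

print("edge pixels marked by agents:", int(edges.sum()))
```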

Keywords: edge detection, medical MR images, multi-agent systems, vector field convolution

Procedia PDF Downloads 372
501 TRAC: A Software-Based New Track Circuit for Traffic Regulation

Authors: Jérôme de Reffye, Marc Antoni

Abstract:

Following the development of the ERTMS system, we think it is worthwhile to develop another software-based track circuit system which would fit secondary railway lines, with an easy-to-implement design and a low sensitivity to rail-wheel impedance variations. We called this track circuit 'Track Railway by Automatic Circuits' (TRAC). To be internationally implemented, this system must not have any mechanical component and must be compatible with existing track circuit systems. For example, the system is independent from the French 'Joints Isolants Collés' that isolate track sections from one another, and it is equally independent from the axle counters used in Germany ('Counting Axles', in French 'compteur d’essieux'). This track circuit is fully interoperable. Such universality is obtained by replacing the mechanical train detection system with a space-time filtering of the train position. The various track sections are defined by the frequency of a continuous signal, and the set of frequencies related to the track sections is a set of orthogonal functions in a Hilbert space. Thus the failure probability of track-section separation can be calculated precisely from the signal-to-noise ratio (SNR), which is a function of the level of traction current conducted by the rails. This is why we developed a very powerful algorithm to reject noise and jamming and obtain an SNR compatible with the precision required for the track circuit and with SIL 4. The SIL 4 level is thus reachable by an adjustment of the set of orthogonal functions. Our major contributions to railway signalling engineering are: i) train space localization precisely defined by a calibration system; the operation bypasses the GSM-R radio system of ERTMS, the track circuit is naturally protected against radio-type jammers, and after the calibration operation the track circuit is autonomous; ii) a mathematical topology adapted to train space localization, following the train through linear time filtering of the received signal; track sections are numerically defined and can be modified with a software update. The system was numerically simulated, and the results were beyond our expectations: we achieved a precision of one meter. Rail-ground and rail-wheel impedance sensitivity analyses gave excellent results. The results are now complete and ready to be published. This work was initiated as a research project of the French Railways, developed by the Pi-Ramses Company under SNCF contract, and required five years to obtain the results. This track circuit is already at Level 3 of the ERTMS system, and it will be much cheaper to implement and to operate. Traffic regulation is based on variable-length track sections: as traffic grows, the maximum speed is reduced and the track section lengths decrease. This is possible if the elementary track section is correctly defined for the minimum speed and if every track section is able to emit at variable frequencies.
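
The abstract above identifies track sections by the frequency of a continuous signal, with the frequency set forming orthogonal functions in a Hilbert space. The toy sketch below illustrates only that detection principle: sinusoids at multiples of 1/T are orthogonal over a window of length T, so the active frequency can be found by correlating the noisy received signal against each candidate. The frequencies, window length and noise level are invented and are not TRAC project values.

```python
# Toy illustration: identify which track-section frequency is present in a noisy
# signal by correlating against a set of mutually orthogonal sinusoids.
import numpy as np

fs, T = 8000.0, 0.1                      # sample rate and analysis window (assumed)
t = np.arange(0, T, 1 / fs)
section_freqs = [100, 110, 120, 130]     # multiples of 1/T = 10 Hz -> orthogonal over T

rng = np.random.default_rng(1)
true_freq = 120
received = np.sin(2 * np.pi * true_freq * t) + 1.5 * rng.standard_normal(t.size)  # noisy rails

# Correlate against each candidate (projection onto the orthogonal basis).
scores = {f: abs(np.dot(received, np.exp(-2j * np.pi * f * t))) for f in section_freqs}
detected = max(scores, key=scores.get)
print("correlation scores:", {f: round(s, 1) for f, s in scores.items()})
print("detected section frequency:", detected, "Hz")
```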

Keywords: track section, track circuits, space-time crossing, adaptive track section, automatic railway signalling

Procedia PDF Downloads 314
500 Medical Neural Classifier Based on Improved Genetic Algorithm

Authors: Fadzil Ahmad, Noor Ashidi Mat Isa

Abstract:

This study introduces an improved genetic algorithm that focuses the search around near-optimal solutions corresponding to a group of elite chromosomes. This is achieved through a novel crossover technique known as Segmented Multi Chromosome Crossover. It preserves the highly important information contained in the gene segments of elite chromosomes and allows an offspring to carry information from the gene segments of multiple chromosomes. In this way, the algorithm is better able to explore the solution space effectively. The improved GA is applied to the automatic and simultaneous parameter optimization and feature selection of an artificial neural network for the pattern recognition of medical problems, namely cancer and diabetes. The experimental results show that the average classification accuracy on the cancer and diabetes datasets improves by 0.1% and 0.3%, respectively, with the new algorithm.
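
The abstract describes Segmented Multi Chromosome Crossover only at a high level (an offspring carries gene segments from several elite chromosomes). The snippet below is one possible reading of that description, with invented segment boundaries and elite parents; it is not the authors' published operator.

```python
# One possible reading of a segmented multi-chromosome crossover: the offspring
# is assembled segment by segment, each segment copied from a randomly chosen
# elite parent. Segment boundaries and parents are invented for illustration.
import random

random.seed(0)

def segmented_multi_crossover(elite_parents, n_segments):
    """Build one offspring from gene segments of several elite chromosomes."""
    length = len(elite_parents[0])
    bounds = [round(i * length / n_segments) for i in range(n_segments + 1)]
    offspring = []
    for lo, hi in zip(bounds, bounds[1:]):
        donor = random.choice(elite_parents)   # each segment may come from a different elite
        offspring.extend(donor[lo:hi])
    return offspring

elites = [[0, 0, 0, 0, 0, 0, 0, 0],
          [1, 1, 1, 1, 1, 1, 1, 1],
          [2, 2, 2, 2, 2, 2, 2, 2]]
print(segmented_multi_crossover(elites, n_segments=4))
```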

Keywords: genetic algorithm, artificial neural network, pattern classification, classification accuracy

Procedia PDF Downloads 457
499 Cursive Handwriting in an Internet Age

Authors: Karen Armstrong

Abstract:

Recent concerns about the value of teaching cursive handwriting in the classroom are based on the belief that cursive handwriting or penmanship is an outdated and unnecessary skill in today’s online world. The discussion of this issue begins with a description of current initiatives to eliminate handwriting instruction in schools. This is followed by a brief history of cursive writing through the ages. Next considered is a description of its benefits as a preliminary process for younger children as compared with immediate instruction in keyboarding, particularly in the areas of vision, cognition, motor skills and automatic fluency. Also considered is cursive’s companion, paper itself, and the impact of a paperless, “screen and keyboard” environment. The discussion concludes with a consideration of the unique contributions of cursive and keyboarding as written forms of communication, along with their respective surfaces, paper and screen. Finally, an assessment of the practical utility of each skill is followed by an informal assessment of what is lost and what remains as we move from a predominantly paper and pen world of handwriting to texting and keyboarding in an environment of screens.

Keywords: asemic writing, cursive, handwriting, keyboarding, paper

Procedia PDF Downloads 262
498 Text Similarity in Vector Space Models: A Comparative Study

Authors: Omid Shahmirzadi, Adam Lugowski, Kenneth Younge

Abstract:

Automatic measurement of semantic text similarity is an important task in natural language processing. In this paper, we evaluate the performance of different vector space models on this task. We address the real-world problem of modeling patent-to-patent similarity and compare TFIDF (and related extensions), topic models (e.g., latent semantic indexing), and neural models (e.g., paragraph vectors). Contrary to expectations, the added computational cost of text embedding methods is justified only when: 1) the target text is condensed; and 2) the similarity comparison is trivial. Otherwise, TFIDF performs surprisingly well: in particular for longer and more technical texts or for making finer-grained distinctions between nearest neighbors. Unexpectedly, extensions to the TFIDF method, such as adding noun phrases or calculating term weights incrementally, were not helpful in our context.
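
As a concrete reference point for the TFIDF baseline discussed above, the sketch below computes pairwise cosine similarity between a few toy 'patent' texts with scikit-learn. The texts are invented and the vectorizer uses default settings rather than the authors' configuration.

```python
# TFIDF baseline for text similarity: vectorize the documents and compare them
# with cosine similarity. Toy texts and default settings, for illustration only.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

docs = [
    "A rotor blade with a cooling channel for a gas turbine engine.",
    "Gas turbine engine blade having internal cooling passages.",
    "A method for encrypting messages exchanged between mobile devices.",
]

tfidf = TfidfVectorizer(stop_words="english")
X = tfidf.fit_transform(docs)              # sparse document-term matrix
sim = cosine_similarity(X)                 # pairwise similarity in [0, 1]

for i in range(len(docs)):
    for j in range(i + 1, len(docs)):
        print(f"doc{i} vs doc{j}: {sim[i, j]:.2f}")
```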

Keywords: big data, patent, text embedding, text similarity, vector space model

Procedia PDF Downloads 154
497 Development of Orbital TIG Welding Robot System for the Pipe

Authors: Dongho Kim, Sung Choi, Kyowoong Pee, Youngsik Cho, Seungwoo Jeong, Soo-Ho Kim

Abstract:

This study presents an orbital TIG welding robot system that travels on a guide rail installed on the pipe and welds and tracks the pipe seam using laser vision sensor (LVS) joint profile data. The orbital welding robot system consists of the robot, welder, controller, and LVS. Moreover, we define the relationship between welding travel speed and wire feed speed and build a linear equation using the maximum and minimum amounts of weld metal. Using this linear equation, we can accurately determine the welding travel speed and the wire feed speed corresponding to the weld area captured by the LVS. We applied this orbital TIG welding robot system to stainless steel and duplex pipes at the DSME (Daewoo Shipbuilding and Marine Engineering Co., Ltd.) shipyard, and the result of the radiographic test is almost perfect (defect rate: 0.033%).
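
The abstract only states that travel speed and wire feed speed follow a linear equation built from the maximum and minimum amounts of weld metal. A minimal sketch of such a mapping is given below; the speed and area limits are invented placeholders, not DSME process parameters.

```python
# Minimal sketch: linearly map the weld area measured by the LVS onto a welding
# travel speed and a wire feed speed between known min/max limits. All numbers
# are invented placeholders, not real process parameters.

def linear_map(value, v_min, v_max, out_min, out_max):
    """Linear interpolation of value from [v_min, v_max] onto [out_min, out_max]."""
    value = min(max(value, v_min), v_max)              # clamp to the calibrated range
    ratio = (value - v_min) / (v_max - v_min)
    return out_min + ratio * (out_max - out_min)

AREA_MIN, AREA_MAX = 10.0, 40.0        # weld cross-section area, mm^2 (assumed)
for area in (12.0, 25.0, 38.0):
    travel = linear_map(area, AREA_MIN, AREA_MAX, 180.0, 90.0)   # mm/min: larger weld -> slower travel
    feed = linear_map(area, AREA_MIN, AREA_MAX, 800.0, 2400.0)   # mm/min: larger weld -> more wire
    print(f"area {area:5.1f} mm^2 -> travel {travel:5.1f} mm/min, wire feed {feed:6.1f} mm/min")
```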

Keywords: adaptive welding, automatic welding, pipe welding, orbital welding, laser vision sensor, LVS, welding D/B

Procedia PDF Downloads 665
496 Evolving Knowledge Extraction from Online Resources

Authors: Zhibo Xiao, Tharini Nayanika de Silva, Kezhi Mao

Abstract:

In this paper, we present an evolving knowledge extraction system named AKEOS (Automatic Knowledge Extraction from Online Sources). AKEOS consists of two modules: a one-time learning module and an evolving learning module. The one-time learning module takes in a user query and automatically harvests knowledge from online unstructured resources in an unsupervised way. The output of the one-time learning is a structured vector representing the harvested knowledge. The evolving learning module automatically schedules and performs repeated one-time learning to extract the newest information and track the development of an event. In addition, the evolving learning module summarizes the knowledge learned at different time points to produce a final knowledge vector about the event. With the evolving learning, we are able to visualize the key information of the event, discover the trends, and track the development of an event.

Keywords: evolving learning, knowledge extraction, knowledge graph, text mining

Procedia PDF Downloads 445
495 Urban Noise and Air Quality: Correlation between Air and Noise Pollution; Sensors, Data Collection, Analysis and Mapping in Urban Planning

Authors: Massimiliano Condotta, Paolo Ruggeri, Chiara Scanagatta, Giovanni Borga

Abstract:

Architects and urban planners, when designing and renewing cities, have to face a complex set of problems, including the issues of noise and air pollution, which are considered hot topics (e.g., the Clean Air Act of London and the Soundscape definition). It is usually taken for granted that these problems go together, because the noise pollution present in cities is often linked to traffic and industries, and these produce air pollutants as well. Traffic congestion can create both noise pollution and air pollution, because NO₂ is mostly created from the oxidation of NO, and these two are notoriously produced by processes of combustion at high temperatures (e.g., car engines or thermal power stations). We can see the same process for industrial plants as well. What has to be investigated, and is the topic of this paper, is whether or not there really is a correlation between noise pollution and air pollution (taking into account NO₂) in urban areas. To evaluate whether there is a correlation, some low-cost methodologies will be used. For noise measurements, the OpeNoise App will be installed on an Android phone. The smartphone will be positioned inside a waterproof box, to stay outdoors, with an external battery to allow it to collect data continuously. The box will have a small hole for an external microphone, connected to the smartphone, which will be calibrated to collect the most accurate data. For air pollution measurements, the AirMonitor device will be used: an Arduino board to which the sensors and all the other components are plugged. After assembling the sensors, they will be coupled (one noise and one air sensor) and placed in different critical locations in the area of Mestre (Venice) to map the existing situation. The sensors will collect data for a fixed period of time, providing input for both weekdays and weekend days; in this way it will be possible to see how the situation changes during the week. The novelty is that the data will be compared to check whether there is a correlation between the two pollutants, using graphs that show the percentage of pollution instead of the raw values obtained from the sensors. To do so, the data will be converted to fit on a scale that goes up to 100% and will be shown through a mapping of the measurements using GIS methods. Another relevant aspect is that this comparison can help to choose the right mitigation solutions to be applied in the area of analysis, because it will make it possible to address both the noise and the air pollution problem with a single intervention. The mitigation solutions must consider not only the health aspect but also how to create a more livable space for citizens. The paper will describe in detail the methodology and the technical solutions adopted for the realization of the sensors, the data collection, and the noise and pollution mapping and analysis.
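
Once paired noise and NO₂ series are collected, the comparison described above (rescaling both to a 0-100% scale and checking whether they move together) takes only a few lines; the sketch below uses invented readings and a Pearson coefficient as one way to quantify the correlation.

```python
# Rescale paired noise and NO2 readings to a 0-100% scale and quantify how
# strongly they move together with a Pearson correlation. Readings are invented.
import numpy as np

noise_db = np.array([55.0, 62.0, 70.0, 68.0, 59.0, 52.0, 73.0, 66.0])   # dB(A)
no2_ugm3 = np.array([28.0, 35.0, 52.0, 47.0, 31.0, 25.0, 58.0, 44.0])   # ug/m3

def to_percent(x):
    """Map a series onto 0-100% of its own observed range."""
    return 100.0 * (x - x.min()) / (x.max() - x.min())

noise_pct, no2_pct = to_percent(noise_db), to_percent(no2_ugm3)
r = np.corrcoef(noise_pct, no2_pct)[0, 1]     # Pearson correlation coefficient
print("noise %:", np.round(noise_pct, 1))
print("NO2   %:", np.round(no2_pct, 1))
print(f"Pearson r = {r:.2f}")
```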

Keywords: air quality, data analysis, data collection, NO₂, noise mapping, noise pollution, particulate matter

Procedia PDF Downloads 199
494 A Cross-Gender Statistical Analysis of Tuvinian Intonation Features in Comparison With Uzbek and Azerbaijani

Authors: Daria Beziakina, Elena Bulgakova

Abstract:

The paper deals with cross-gender and cross-linguistic comparison of pitch characteristics for Tuvinian with two other Turkic languages - Uzbek and Azerbaijani, based on the results of statistical analysis of pitch parameter values and intonation patterns used by male and female speakers. The main goal of our work is to obtain the ranges of pitch parameter values typical for Tuvinian speakers for the purpose of automatic language identification. We also propose a cross-gender analysis of declarative intonation in the poorly studied Tuvinian language. The ranges of pitch parameter values were obtained by means of specially developed software that deals with the distribution of pitch values and allows us to obtain statistical language-specific pitch intervals.

Keywords: speech analysis, statistical analysis, speaker recognition, identification of person

Procedia PDF Downloads 331
493 Applications of Analytical Probabilistic Approach in Urban Stormwater Modeling in New Zealand

Authors: Asaad Y. Shamseldin

Abstract:

The analytical probabilistic approach is an innovative approach for urban stormwater modeling. It can provide information about the long-term performance of a stormwater management facility without being computationally very demanding. This paper explores the application of the analytical probabilistic approach in New Zealand. It presents the results of a case study aimed at developing an objective way of identifying what constitutes a rainfall storm event and at estimating the corresponding statistical properties of storms, using two selected automatic rainfall stations located in the Auckland region of New Zealand. The storm identification and the estimation of the storm statistical properties are regarded as the first step in the development of the analytical probabilistic models. The paper provides a recommendation about the definition of the storm inter-event time to be used in conjunction with the analytical probabilistic approach.
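
The key step described above, splitting a continuous rainfall record into discrete storm events once a minimum inter-event time is chosen, can be sketched as follows. The hourly record and the 6-hour threshold are illustrative assumptions, not the values adopted for the Auckland stations.

```python
# Split an hourly rainfall record into storm events: a dry spell of at least
# `iet_hours` ends the current event. Record and threshold are illustrative.

def split_storms(rain_mm, iet_hours):
    """Return a list of events, each a (start_hour, end_hour, total_depth_mm) tuple."""
    events, start, dry, total = [], None, 0, 0.0
    for hour, depth in enumerate(rain_mm):
        if depth > 0:
            if start is None:
                start, total = hour, 0.0
            total += depth
            dry = 0
            end = hour
        elif start is not None:
            dry += 1
            if dry >= iet_hours:                 # dry spell long enough: close the event
                events.append((start, end, round(total, 1)))
                start = None
    if start is not None:
        events.append((start, end, round(total, 1)))
    return events

record = [0, 2, 5, 0, 0, 1, 0, 0, 0, 0, 0, 0, 3, 4, 0, 0]   # mm per hour (assumed)
print(split_storms(record, iet_hours=6))
```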

Keywords: hydrology, rainfall storm, storm inter-event time, New Zealand, stormwater management

Procedia PDF Downloads 324
492 Analyzing Test Data Generation Techniques Using Evolutionary Algorithms

Authors: Arslan Ellahi, Syed Amjad Hussain

Abstract:

Software testing is a vital process in the software development life cycle, and the quality of software is assured only after it has passed through the testing phase. Automatic test data generation is a key research area of software testing, aiming at test automation that can eventually decrease testing time. In this paper, we review some of the approaches presented in the literature which use evolutionary search-based algorithms, such as the Genetic Algorithm and Particle Swarm Optimization (PSO), to validate the test data generation process. We also look into the quality of the generated test data, which increases or decreases the efficiency of testing. We propose test data generation techniques for model-based testing and work on the tuning and fitness function of the PSO algorithm.
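
To make the PSO-based approach concrete, the sketch below runs a textbook particle swarm over a two-input search space with a toy branch-distance fitness that rewards inputs driving a hypothetical predicate x*x + y == 100 toward its true branch. The fitness, bounds and PSO coefficients are assumptions, not the tuned values mentioned in the abstract.

```python
# Textbook PSO minimizing a toy branch-distance fitness: we look for inputs that
# make the (hypothetical) predicate x*x + y == 100 true. All coefficients are
# illustrative defaults, not the authors' tuned values.
import random

random.seed(42)
W, C1, C2 = 0.7, 1.5, 1.5           # inertia and acceleration coefficients (assumed)
LOW, HIGH = -20.0, 20.0

def fitness(pos):
    x, y = pos
    return abs(x * x + y - 100)      # branch distance: 0 means the branch is covered

swarm = [[random.uniform(LOW, HIGH), random.uniform(LOW, HIGH)] for _ in range(30)]
vel = [[0.0, 0.0] for _ in swarm]
pbest = [p[:] for p in swarm]
gbest = min(pbest, key=fitness)

for _ in range(200):
    for i, pos in enumerate(swarm):
        for d in range(2):
            r1, r2 = random.random(), random.random()
            vel[i][d] = (W * vel[i][d]
                         + C1 * r1 * (pbest[i][d] - pos[d])
                         + C2 * r2 * (gbest[d] - pos[d]))
            pos[d] = min(max(pos[d] + vel[i][d], LOW), HIGH)
        if fitness(pos) < fitness(pbest[i]):
            pbest[i] = pos[:]
    gbest = min(pbest, key=fitness)

print("best test input:", [round(v, 3) for v in gbest], "fitness:", round(fitness(gbest), 4))
```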

Keywords: search based, evolutionary algorithm, particle swarm optimization, genetic algorithm, test data generation

Procedia PDF Downloads 169
491 Voxel Models as Input for Heat Transfer Simulations with Siemens NX Based on X-Ray Microtomography Images of Random Fibre Reinforced Composites

Authors: Steven Latré, Frederik Desplentere, Ilya Straumit, Stepan V. Lomov

Abstract:

A method is proposed in order to create a three-dimensional finite element model representing fibre reinforced insulation materials for the simulation software Siemens NX. VoxTex software, a tool for quantification of µCT images of fibrous materials, is used for the transformation of microtomography images of random fibre reinforced composites into finite element models. An automatic tool was developed to execute the import of the models to the thermal solver module of Siemens NX. The paper describes the numerical tools used for the image quantification and the transformation and illustrates them on several thermal simulations of fibre reinforced insulation blankets filled with low thermal conductive fillers. The calculation of thermal conductivity is validated by comparison with the experimental data.

Keywords: analysis, modelling, thermal, voxel

Procedia PDF Downloads 275
490 Efficient Manageability and Intelligent Classification of Web Browsing History Using Machine Learning

Authors: Suraj Gururaj, Sumantha Udupa U.

Abstract:

Browsing the Web has emerged as the de facto activity performed on the Internet. Although browsing gets tracked, the manageability of Web browsing history is very poor. In this paper, we present a workable solution, implemented using machine learning and natural language processing techniques, for efficient management of a user’s browsing history. The significance of adding such a capability to a Web browser is that it ensures efficient and quick information retrieval from browsing history, which is currently very challenging. Our solution guarantees that any important websites visited in the past remain easily accessible thanks to intelligent and automatic classification. In a nutshell, this paper provides an implementation as a browser extension that intelligently classifies the browsing history into the most relevant categories automatically, without any user intervention. This guarantees that no information is lost and increases productivity by saving the time spent revisiting websites of importance.

Keywords: adhoc retrieval, Chrome extension, supervised learning, tile, Web personalization

Procedia PDF Downloads 354
489 A Portable Device for Pulse Wave Velocity Measurements

Authors: Chien-Lin Wang, Cha-Ling Ko, Tainsong Chen

Abstract:

The pulse wave velocity (PWV) of blood flow provides important information on vessel properties and blood pressure, which can be used to assess cardiovascular disease. However, such measurements usually require expensive equipment, such as Doppler ultrasound, MRI, or angiography. Photoplethysmography (PPG) signals are commonly utilized to detect blood volume changes. In this study, two infrared (IR) probes are designed and placed a fixed distance apart, at the finger base and the fingertip. An analog circuit with automatic gain adjustment is implemented to obtain stable original PPG signals from these two IR probes. In order to obtain the time delay between the two PPG signals precisely, we derive the pulse transit time from the second derivative of the original PPG signals. To obtain a portable, wireless and low-power PWV measurement device, Bluetooth 4.0 Low Energy (BLE) and a Cortex™-M3 microprocessor are used in this study. The PWV is highly correlated with blood pressure, so this portable device has the potential to be used for continuous blood pressure monitoring.
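
A minimal numerical sketch of the measurement chain described above: two synthetic PPG channels separated by a known transit time, second derivatives of both, and the pulse transit time (PTT) recovered from their relative lag, with PWV taken as probe distance over PTT. The waveform shape, sampling rate and probe spacing are assumptions.

```python
# Sketch: estimate pulse wave velocity from two PPG channels. Synthetic pulses,
# second derivatives, and the lag between them give the pulse transit time (PTT);
# PWV = probe distance / PTT. All signal parameters are assumptions.
import numpy as np
from scipy.signal import correlate

fs = 1000.0                                   # Hz (assumed sampling rate)
distance_m = 0.12                             # finger base to fingertip probe spacing (assumed)
t = np.arange(0, 2.0, 1 / fs)

def ppg(delay_s):
    """Synthetic PPG: a train of Gaussian-shaped pulses at ~75 bpm, shifted by delay_s."""
    beats = np.arange(0.2, 2.0, 0.8)
    return sum(np.exp(-((t - b - delay_s) ** 2) / (2 * 0.03 ** 2)) for b in beats)

true_ptt = 0.025                              # 25 ms transit time used to build the test data
sig_base, sig_tip = ppg(0.0), ppg(true_ptt)

# Second derivatives (acceleration plethysmogram), as in the abstract.
d2_base = np.gradient(np.gradient(sig_base))
d2_tip = np.gradient(np.gradient(sig_tip))

# Lag of the fingertip channel relative to the finger-base channel.
xcorr = correlate(d2_tip, d2_base, mode="full")
lag_samples = np.argmax(xcorr) - (len(d2_base) - 1)
ptt = lag_samples / fs
print(f"estimated PTT = {ptt * 1000:.1f} ms, PWV = {distance_m / ptt:.2f} m/s")
```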

Keywords: pulse wave velocity, photoplethysmography, portable device, biomedical engineering

Procedia PDF Downloads 513
488 A Smart Visitors’ Notification System with Automatic Secure Door Lock Using Mobile Communication Technology

Authors: Rabail Shafique Satti, Sidra Ejaz, Madiha Arshad, Marwa Khalid, Sadia Majeed

Abstract:

The paper presents the development of an automated security system that automates the entry of visitors, providing more flexibility in managing their records and securing homes or workplaces. Face recognition is part of this system and is used to authenticate visitors. A cost-effective, SMS-based door security module has been developed, integrated with the GSM network, and made part of this system to allow communication between the system and the owner. The system functions in real time: when a visitor arrives, it detects and recognizes his or her face and, based on the result of the face recognition process, either opens the door for authorized visitors or notifies the owner and allows further action in the case of an unauthorized visitor. The proposed system has been developed and successfully ensures security, manages records and operates the gate without physical interaction by the owner.

Keywords: SMS, e-mail, GSM modem, authenticate, face recognition, authorized

Procedia PDF Downloads 768
487 X-Corner Detection for Camera Calibration Using Saddle Points

Authors: Abdulrahman S. Alturki, John S. Loomis

Abstract:

This paper discusses a corner detection algorithm for camera calibration. Calibration is a necessary step in many computer vision and image processing applications. Robust corner detection for an image of a checkerboard is required to determine intrinsic and extrinsic parameters. In this paper, an algorithm for fully automatic and robust X-corner detection is presented. Checkerboard corner points are automatically found in each image without user interaction or any prior information regarding the number of rows or columns. The approach represents each X-corner with a quadratic fitting function. Using the fact that the X-corners are saddle points, the coefficients in the fitting function are used to identify each corner location. The automation of this process greatly simplifies calibration. Our method is robust against noise and different camera orientations. Experimental analysis shows the accuracy of our method using actual images acquired at different camera locations and orientations.
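
The saddle-point test at the heart of the approach can be sketched as follows: fit a quadratic surface to a patch, check that the Hessian determinant is negative (a saddle), and solve for the sub-pixel location where the gradient vanishes. The synthetic checkerboard patch and window size are assumptions, and a real pipeline would first smooth the image and select candidate windows.

```python
# Fit z = a*x^2 + b*x*y + c*y^2 + d*x + e*y + f to an image patch by least
# squares; a saddle (X-corner) has a Hessian with negative determinant, and the
# sub-pixel corner is where the gradient of the fit vanishes.
import numpy as np

def fit_saddle(patch):
    h, w = patch.shape
    ys, xs = np.mgrid[0:h, 0:w]
    x, y, z = xs.ravel().astype(float), ys.ravel().astype(float), patch.ravel().astype(float)
    A = np.column_stack([x * x, x * y, y * y, x, y, np.ones_like(x)])
    a, b, c, d, e, f = np.linalg.lstsq(A, z, rcond=None)[0]
    if 4 * a * c - b * b >= 0:
        return None                      # Hessian determinant non-negative: not a saddle
    # Solve grad = 0:  [2a b; b 2c] [x; y] = -[d; e]
    cx, cy = np.linalg.solve([[2 * a, b], [b, 2 * c]], [-d, -e])
    return cx, cy

# Synthetic X-corner: the sign of x*y changes in each quadrant, like a checkerboard crossing.
u = np.linspace(-1, 1, 11)
patch = np.sign(np.outer(u, u)) * 0.5 + 0.5
print("sub-pixel saddle location:", fit_saddle(patch))   # expected near the patch centre (5, 5)
```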

Keywords: camera calibration, corner detector, edge detector, saddle points

Procedia PDF Downloads 390
486 Generating Real-Time Visual Summaries from Located Sensor-Based Data with Chorems

Authors: Z. Bouattou, R. Laurini, H. Belbachir

Abstract:

This paper describes a new approach for the automatic generation of visual summaries, dealing with cartographic visualization methods and real-time sensor data modeling. The concept of chorems seems an interesting candidate for visualizing real-time geographic database summaries. Chorems were defined by Roger Brunet (1980) as schematized visual representations of territories. However, time information is not yet handled in existing chorematic map approaches, an issue discussed in this paper. Our approach is based on spatial analysis: by interpolating the values recorded at the same time by the available sensors, we obtain a number of distributed observations over the study areas and use spatial interpolation methods to derive the concentration fields. From these fields, and by applying some spatial data mining procedures on the fly, it is possible to extract important patterns as geographic rules. Those patterns are then visualized as chorems.
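
As a small illustration of the interpolation step described above, the snippet below derives a concentration field from a handful of scattered sensor readings with inverse-distance weighting, one common spatial interpolation choice; the sensor positions, values and IDW power are invented, and the paper does not commit to this particular method.

```python
# Inverse-distance-weighted (IDW) interpolation of scattered sensor readings
# onto a regular grid, as one common way to build the concentration fields the
# abstract mentions. Sensor positions and values are invented.
import numpy as np

sensors = np.array([[2.0, 3.0], [8.0, 1.0], [5.0, 7.0], [9.0, 9.0]])   # x, y positions
values = np.array([12.0, 30.0, 18.0, 45.0])                            # e.g. pollutant level

def idw(points, vals, grid_x, grid_y, power=2.0, eps=1e-12):
    """Interpolate vals known at points onto the (grid_x, grid_y) mesh."""
    gx, gy = np.meshgrid(grid_x, grid_y)
    d = np.sqrt((gx[..., None] - points[:, 0]) ** 2 + (gy[..., None] - points[:, 1]) ** 2)
    w = 1.0 / (d ** power + eps)                 # closer sensors weigh more
    return (w * vals).sum(axis=-1) / w.sum(axis=-1)

field = idw(sensors, values, grid_x=np.linspace(0, 10, 6), grid_y=np.linspace(0, 10, 6))
print(np.round(field, 1))
```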

Keywords: geovisualization, spatial analytics, real-time, geographic data streams, sensors, chorems

Procedia PDF Downloads 385
485 Coal Mining Safety Monitoring Using WSN

Authors: Somdatta Saha

Abstract:

The main purpose of this work was to provide an implementable design scenario for underground coal mines using wireless sensor networks (WSNs), the main reason being that, given the intricacies of the physical structure of a coal mine, only low-power WSN nodes can produce accurate surveillance and accident detection data. The work mainly concentrated on designing and simulating various alternative scenarios for a typical mine and comparing them based on the obtained results to arrive at a final design. In the era of embedded technology, Zigbee protocols are used in more and more applications. Because of the rapid development of sensors, microcontrollers, and network technology, a reliable technological basis is now available for automatic real-time monitoring of coal mines. The underground system collects temperature, humidity and methane values of the coal mine through sensor nodes in the mine; it also counts the personnel inside the mine with the help of an IR sensor, and then transmits the data to an ARM-based information processing terminal.

Keywords: ARM, embedded board, wireless sensor network (Zigbee)

Procedia PDF Downloads 325
484 Improvement of Transient Voltage Response Using PSS-SVC Coordination Based on ANFIS-Algorithm in a Three-Bus Power System

Authors: I Made Ginarsa, Agung Budi Muljono, I Made Ari Nrartha

Abstract:

A transient voltage response appears in power system operation when additional loading is applied to a load bus of the power system. In this research, the transient voltage response is improved by using power system stabilizer-static var compensator (PSS-SVC) coordination based on an adaptive neuro-fuzzy inference system (ANFIS) algorithm. The main function of the PSS is to add a damping component to damp rotor oscillation through the automatic voltage regulator (AVR) and excitation system. The learning process of the ANFIS is carried out off-line, where the training data used to train the ANFIS model are obtained by simulating the conventional PSS-SVC. The ANFIS model uses 7 Gaussian membership functions for each of the two inputs and 49 rules at the output. The ANFIS-PSS and ANFIS-SVC models are then applied to the power system. The simulation results show that the transient voltage response is improved, with a settling time of 4.25 s.

Keywords: improvement, transient voltage, PSS-SVC, ANFIS, settling time

Procedia PDF Downloads 555
483 A Novel Probabilistic Spatial Locality of Reference Technique for Automatic Cleansing of Digital Maps

Authors: A. Abdullah, S. Abushalmat, A. Bakshwain, A. Basuhail, A. Aslam

Abstract:

GIS (Geographic Information System) applications require geo-referenced data; these data may be available as databases or in the form of digital or hard-copy agro-meteorological maps. These parameter maps are color-coded, with different regions corresponding to different parameter values, and converting such maps into a database is not very difficult. However, the text and various planimetric elements overlaid on these maps make an accurate image-to-database conversion a challenging problem, because it is almost impossible to exactly reconstruct what was underneath the text or icons; this points to the need for inpainting. In this paper, we propose a probabilistic inpainting approach that uses the probability of spatial locality of colors in the map to replace overlaid elements with the underlying color. We tested the limits of our proposed technique using non-textual simulated data and compared text removal results with a popular image editing tool using public domain data, with promising results.
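
A stripped-down sketch of the idea described above: masked (overlaid) pixels are filled with the colour that is most frequent, hence most probable, among their already-known neighbours, sweeping until the mask is empty. The toy map, mask and neighbourhood size are assumptions; the authors' probabilistic model is more elaborate.

```python
# Fill masked pixels with the most frequent (most probable) colour among their
# known neighbours, repeating until nothing is masked. Toy data; the real method
# described in the abstract is more elaborate.
import numpy as np
from collections import Counter

# Toy colour-coded map: two regions (codes 1 and 2), with -1 marking overlaid text.
m = np.array([
    [1, 1, 1, 2, 2],
    [1, -1, -1, 2, 2],
    [1, -1, -1, -1, 2],
    [1, 1, 1, 2, 2],
])

def inpaint(grid, mask_value=-1):
    grid = grid.copy()
    while (grid == mask_value).any():
        for r, c in zip(*np.where(grid == mask_value)):
            neigh = grid[max(r - 1, 0):r + 2, max(c - 1, 0):c + 2].ravel()
            known = [v for v in neigh if v != mask_value]
            if known:                              # most probable neighbouring colour
                grid[r, c] = Counter(known).most_common(1)[0][0]
    return grid

print(inpaint(m))
```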

Keywords: noise, image, GIS, digital map, inpainting

Procedia PDF Downloads 331
482 Evolution under Length Constraints for Convolutional Neural Networks Architecture Design

Authors: Ousmane Youme, Jean Marie Dembele, Eugene Ezin, Christophe Cambier

Abstract:

In recent years, convolutional neural network (CNN) architectures designed by evolutionary algorithms have proven to be competitive with handcrafted architectures designed by experts. However, these algorithms need a lot of computational power, which is beyond the capabilities of most researchers and engineers. To overcome this problem, we propose evolution of architectures under length constraints. It consists of two algorithms: a length search strategy to find an optimal search space and an architecture search strategy based on a genetic algorithm to find the best individual in that optimal space. Our algorithms drastically reduce resource costs while keeping good performance. On the CIFAR-10 dataset, our framework achieves outstanding performance, with an error rate of 5.12% and only 4.6 GPU-days to converge to the optimal individual, 22 GPU-days less than the lowest-cost automatic evolutionary algorithm in the peer competition.

Keywords: CNN architecture, genetic algorithm, evolution algorithm, length constraints

Procedia PDF Downloads 109
481 Automatic Facial Skin Segmentation Using Possibilistic C-Means Algorithm for Evaluation of Facial Surgeries

Authors: Elham Alaee, Mousa Shamsi, Hossein Ahmadi, Soroosh Nazem, Mohammad Hossein Sedaaghi

Abstract:

The human face plays a fundamental role in the appearance of individuals, so the importance of facial surgery is undeniable. Thus, there is a need for appropriate and accurate facial skin segmentation in order to extract different features. Since the Fuzzy C-Means (FCM) clustering algorithm does not work appropriately for noisy images and outliers, in this paper we exploit the Possibilistic C-Means (PCM) algorithm in order to segment the facial skin. For this purpose, we first convert the facial images from RGB to the YCbCr color space. To evaluate the performance of the proposed algorithm, the database of Sahand University of Technology, Tabriz, Iran was used. To provide a better comparison for the proposed algorithm, the FCM and Expectation-Maximization (EM) algorithms are also used for facial skin segmentation. The proposed method shows better results than the other segmentation methods, with a misclassification error of 0.032 and a region area error of 0.045.
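
For readers unfamiliar with PCM, the sketch below applies the standard possibilistic C-means updates, typicality t = 1 / (1 + (d²/η)^(1/(m-1))) and typicality-weighted centres, to a toy one-dimensional intensity sample with an outlier, showing why the method tolerates outliers better than FCM. The data, η values and iteration count are assumptions, and the facial pipeline (YCbCr conversion, etc.) is omitted.

```python
# Standard possibilistic C-means updates on a toy 1-D sample: typicality
# t_ij = 1 / (1 + (d_ij^2 / eta_i)^(1/(m-1))), centres as typicality-weighted
# means. Data and eta are illustrative; the facial-skin pipeline is omitted.
import numpy as np

rng = np.random.default_rng(3)
x = np.concatenate([rng.normal(60, 5, 100),      # e.g. background intensities
                    rng.normal(150, 8, 100),     # e.g. skin intensities
                    [255.0]])                    # one outlier
m = 2.0
centres = np.array([80.0, 120.0])                # rough initial guesses
eta = np.array([50.0, 50.0])                     # cluster "bandwidths" (assumed fixed)

for _ in range(50):
    d2 = (x[None, :] - centres[:, None]) ** 2                       # squared distances
    t = 1.0 / (1.0 + (d2 / eta[:, None]) ** (1.0 / (m - 1.0)))      # typicalities
    centres = (t ** m @ x) / (t ** m).sum(axis=1)                   # weighted centres

print("centres:", np.round(centres, 1))
print("outlier typicalities:", np.round(t[:, -1], 3))               # low for both clusters
```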

Keywords: facial image, segmentation, PCM, FCM, skin error, facial surgery

Procedia PDF Downloads 570
480 Curcumin-Loaded Pickering Emulsion Stabilized by pH-Induced Self-Aggregated Chitosan Particles for Encapsulating Bioactive Compounds for Food, Flavor/Fragrance, Cosmetics, and Medicine

Authors: Rizwan Ahmed Bhutto, Noor ul ain Hira Bhutto, Mingwei Wang, Shahid Iqbal, Jiang Yi

Abstract:

Curcumin, a natural polyphenolic compound, boasts numerous health benefits; however, its industrial applications are hindered by instabilities and poor solubility. Encapsulating curcumin in Pickering emulsion presents a promising strategy to enhance its bioavailability. Yet, the development of an efficient and straightforward method to fabricate a natural emulsifier for Pickering emulsion poses a significant challenge. Chitosan has garnered attention due to its non-toxicity and excellent emulsifying properties. This study aimed to prepare four distinct types of self-aggregated chitosan particles using a pH-responsive self-assembling approach. The properties of the aggregated particles were adjusted by pH, degree of deacetylation (DDA), and molecular weight (MW), thereby controlling surface charge, size (ranging from nano to micro and floc), and contact angle. Pickering emulsions were then formulated using these various aggregated particles. As MW and pH increased and DDA decreased, the networked structures of the aggregated particles formed, resulting in highly elastic gels that were more resistant to the breakdown of Pickering emulsion at ambient temperature. With elevated temperatures, the kinetic energy of the aggregated particles increased, disrupting hydrogen bonds and potentially transforming the systems from fluids to gels. The Pickering emulsion based on aggregated particles served as a carrier for curcumin encapsulation. It was observed that DDA and MW played crucial roles in regulating drug loading, encapsulation efficiency, and release profile. This research sheds light on selecting suitable chitosan for controlling the release of bioactive compounds in Pickering emulsions, considering factors such as adjustable rheological properties, microstructure, and macrostructure. Furthermore, this study introduces an environmentally friendly and cost-effective synthesis of pH-responsive aggregate particles without the need for high-pressure homogenizers. It underscores the potential of aggregate particles with various MWs and DDAs for encapsulating other bioactive compounds, offering valuable applications in industries including food, flavor/fragrance, cosmetics, and medicine.

Keywords: chitosan, molecular weight, rheological properties, curcumin encapsulation

Procedia PDF Downloads 50
479 A Performance Comparison between Conventional and Flexible Box Erecting Machines Using Dispatching Rules

Authors: Min Kyu Kim, Eun Young Lee, Dong Woo Son, Yoon Seok Chang

Abstract:

In this paper, we introduce a flexible box erecting machine (BEM) that swiftly and automatically transforms cardboard into a three-dimensional box. Recently, the parcel service and home-shopping industries have grown rapidly, and there is an increasing need for various box types to ship various products. However, workers cannot fold thousands of boxes manually in a day. As such, automatic BEMs are garnering greater attention. This study takes equipment operation into consideration as well as mechanical improvements in order to design a BEM that is able to outperform its conventional counterparts. We analyzed six dispatching rules – First In First Out (FIFO), Shortest Processing Time (SPT), Earliest Due Date (EDD), Setup Avoidance, EDD + SPT, and EDD + Setup Avoidance – to determine which one was most suitable for BEM operation. Consequently, SPT and Setup Avoidance were found to be the most critical rules, followed by EDD + Setup Avoidance, EDD + SPT, EDD, and FIFO. This hierarchy was valid for both our conventional BEM and our new flexible BEM from the viewpoint of processing time. We believe that this research can contribute to flexible BEM management, which has the potential to increase productivity and convenience.
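
To make the dispatching-rule comparison concrete, the sketch below sequences a small invented job set on a single machine under FIFO, SPT, EDD and a greedy Setup Avoidance rule and reports the total completion time for each. The job data, setup time and the particular greedy rule are assumptions, not the authors' simulation model.

```python
# Sequence a toy job set under four dispatching rules and compare the resulting
# total completion time. Job data, setup time and the greedy setup-avoidance
# rule are invented for illustration.
SETUP = 4  # time lost whenever the box type changes (assumed)

# (job id, processing time, due date, box type)
jobs = [(1, 6, 20, "A"), (2, 3, 9, "B"), (3, 8, 35, "A"), (4, 2, 12, "B"), (5, 5, 30, "C")]

def simulate(sequence):
    t, prev_type, total_completion = 0, None, 0
    for _, p, _, box in sequence:
        if prev_type is not None and box != prev_type:
            t += SETUP
        t += p
        total_completion += t
        prev_type = box
    return total_completion

def setup_avoidance(pending):
    seq, prev = [], None
    pending = pending[:]
    while pending:
        same = [j for j in pending if j[3] == prev]
        nxt = min(same or pending, key=lambda j: j[1])   # stay on the same box type if possible
        seq.append(nxt)
        pending.remove(nxt)
        prev = nxt[3]
    return seq

rules = {
    "FIFO": jobs,
    "SPT": sorted(jobs, key=lambda j: j[1]),
    "EDD": sorted(jobs, key=lambda j: j[2]),
    "Setup Avoidance": setup_avoidance(jobs),
}
for name, seq in rules.items():
    print(f"{name:16s} total completion time = {simulate(seq)}")
```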

Keywords: automation, box erecting machine, dispatching rule, setup time

Procedia PDF Downloads 344
478 Novel Low-cost Bubble CPAP as an Alternative Non-invasive Oxygen Therapy for Newborn Infants with Respiratory Distress Syndrome in a Tertiary Level Neonatal Intensive Care Unit in the Philippines: A Single Blind Randomized Controlled Trial

Authors: Navid P Roodaki, Rochelle Abila, Daisy Evangeline Garcia

Abstract:

Background and Objective: Respiratory Distress Syndrome (RDS) among premature infants is a major cause of neonatal death. The use of Continuous Positive Airway Pressure (CPAP) has become a standard of care for preterm newborns with RDS; hence, cost-effective innovations are needed. This study compared a novel low-cost bubble CPAP (bCPAP) device with ventilator-driven CPAP in the treatment of RDS. Methods: This is a single-blind, randomized controlled trial conducted from May 2022 to October 2022 in a Level III Neonatal Intensive Care Unit in the Philippines. Preterm newborns (<36 weeks) with RDS were randomized to receive the Vayu bCPAP device or ventilator-driven CPAP. Arterial blood gases, oxygen saturation, administration of surfactant, and CPAP failure rates were measured. Results: Seventy preterm newborns were included. No differences were observed between ventilator-driven CPAP and Vayu bCPAP in PaO2 (97.51 mmHg vs 97.37 mmHg), oxygen saturation (97.08% vs 95.60%), or the amount of surfactant administered between groups. There were no observed differences in CPAP failure rates between Vayu bCPAP (x̄ 3.23 days) and ventilator-driven CPAP (x̄ 2.98 days). However, a significant difference was noted in the CO2 level (40.32 mmHg vs 50.70 mmHg), which was higher among those hooked to ventilator-driven CPAP (p = 0.004). Conclusion: This study has shown that the novel low-cost bubble CPAP (Vayu bCPAP) can be used as an efficacious alternative non-invasive oxygen therapy for preterm neonates with RDS; although CO2 levels were higher among those hooked to ventilator-driven CPAP, the other outcome parameters measured showed that both devices are comparable. Recommendation: A multi-center or national study is recommended to account for geographic region, which may alter the outcomes of patients connected to different ventilatory support. A cost comparison between the devices is also suggested. A mixed-methods study assessing the experiences of health care professionals in assembling and utilizing the device is a second consideration.

Keywords: bubble CPAP, ventilator-derived CPAP; infant, premature, respiratory distress syndrome

Procedia PDF Downloads 65
477 Modeling of Crack Propagation Path in Concrete with Coarse Trapezoidal Aggregates by Boundary Element Method

Authors: Chong Wang, Alexandre Urbano Hoffmann

Abstract:

The interaction between a crack and a trapezoidal aggregate in a single-edge-notched concrete beam is simulated using the boundary element method with an automatic crack extension program. The stress intensity factors of the growing crack are obtained from the J-integral. Three crack extension paths are obtained, depending on the mismatch between the mechanical characteristics of the matrix and the particulate: deflecting around the particulate, growing along the interface, and penetrating into the particulate. The toughening is also given by the ratio of stress intensity factors. The results reveal that, as stress shielding occurs, toughening is obtained when the crack approaches a stiff and strong aggregate weakly bonded to a relatively soft matrix. The present work is intended to help with the design of aggregate-reinforced concretes.

Keywords: aggregate concrete, boundary element method, two-phase composite, crack extension path, crack/particulate interaction

Procedia PDF Downloads 411
476 Power Quality Issues: Power Supply Interruptions as Key Constraint to Development in Ekiti State, Nigeria

Authors: Oluwatosin S. Adeoye

Abstract:

Power quality issues in the world today are critical to the development of nations, and the prosperity of each nation depends on the availability of a constant power supply. A constant power supply is a major challenge in Africa, particularly in Nigeria, where the generated power is less than thirty percent of the required power. The metrics of power quality are voltage dips, flickers, spikes, harmonics and interruptions. The level of interruptions in Ekiti State was examined through an investigation of the causes of power interruptions in the State. The method used was the collection of data from the distribution company and its assessment through simple MATLAB 2015 scripts for plotting graphs depicting the behavioural pattern of the interruptions over a period of six months in 2016. The results show the interrelationship between interruptions and development. Recommendations are suggested with the objective of solving the problems caused by interruptions in the State; these include the installation of reactors, automatic voltage regulators and effective tap-changing systems on the lines, buses and transformer substations, respectively.

Keywords: development, frequency, interruption, power, quality

Procedia PDF Downloads 148
475 Robust and Real-Time Traffic Counting System

Authors: Hossam M. Moftah, Aboul Ella Hassanien

Abstract:

In recent years, the importance of automatic traffic control has increased due to the problem of traffic jams, especially in big cities, for signal control and efficient traffic management. Traffic counting, as a kind of traffic control, is important for knowing the road traffic density in real time. This paper presents a fast and robust traffic counting system using different image processing techniques. The proposed system is composed of four fundamental building phases: image acquisition, pre-processing, object detection, and finally counting the connected objects. The object detection phase comprises five steps: subtracting the background, converting the image to binary, closing gaps and connecting nearby blobs, smoothing the image to remove noise and very small objects, and detecting the connected objects. Experimental results show the great success of the proposed approach.
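
The pipeline listed above maps closely onto standard OpenCV calls; the sketch below shows one possible wiring for a video file. The file name, kernel sizes and minimum blob area are assumptions.

```python
# One possible wiring of the pipeline described in the abstract using OpenCV:
# background subtraction, thresholding, morphological closing, smoothing and
# connected-component counting. File name and tuning values are assumptions.
import cv2

cap = cv2.VideoCapture("traffic.mp4")                 # hypothetical input video
backsub = cv2.createBackgroundSubtractorMOG2(history=300, detectShadows=False)
kernel = cv2.getStructuringElement(cv2.MORPH_ELLIPSE, (7, 7))

while True:
    ok, frame = cap.read()
    if not ok:
        break
    fg = backsub.apply(frame)                                         # subtract the background
    _, binary = cv2.threshold(fg, 127, 255, cv2.THRESH_BINARY)        # convert to binary
    closed = cv2.morphologyEx(binary, cv2.MORPH_CLOSE, kernel)        # close gaps, connect blobs
    smooth = cv2.medianBlur(closed, 5)                                # remove noise and tiny objects
    n, _, stats, _ = cv2.connectedComponentsWithStats(smooth)
    vehicles = sum(1 for i in range(1, n) if stats[i, cv2.CC_STAT_AREA] > 500)
    print("vehicles in frame:", vehicles)

cap.release()
```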

Keywords: traffic counting, traffic management, image processing, object detection, computer vision

Procedia PDF Downloads 278