Search results for: water cycle algorithm
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 13765

10045 An Indoor Guidance System Combining Near Field Communication and Bluetooth Low Energy Beacon Technologies

Authors: Rung-Shiang Cheng, Wei-Jun Hong, Jheng-Syun Wang, Kawuu W. Lin

Abstract:

Users increasingly rely on Location-Based Services (LBS) and automated navigation/guidance systems. However, while such services are easily implemented in outdoor environments using Global Positioning System (GPS) technology, a requirement still exists for accurate localization and guidance schemes in indoor settings. Accordingly, the present study presents a methodology based on GPS, Bluetooth Low Energy (BLE) beacons, and Near Field Communication (NFC) technology. By establishing the graph information and designing the routing algorithm, this study develops a guidance system for both indoor and outdoor use on smartphones, with the aim of providing users a smarter everyday experience. The presented system is implemented on a smartphone and evaluated in a university campus environment. The experimental results confirm the ability of the presented app to switch automatically from an outdoor mode to an indoor mode and to guide the user to the requested destination via the shortest possible route.
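
The keywords below name the Dijkstra algorithm as the routing core. As a rough illustration (not the authors' implementation), a shortest-route query over a graph of campus waypoints could look like the following sketch; the waypoint names and distances are hypothetical.

```python
import heapq

def dijkstra(graph, source, target):
    """Shortest path over a weighted waypoint graph.
    graph: {node: [(neighbor, distance_m), ...]}"""
    dist, prev = {source: 0.0}, {}
    queue, visited = [(0.0, source)], set()
    while queue:
        d, node = heapq.heappop(queue)
        if node in visited:
            continue
        visited.add(node)
        if node == target:
            break
        for neighbor, weight in graph.get(node, []):
            nd = d + weight
            if nd < dist.get(neighbor, float("inf")):
                dist[neighbor] = nd
                prev[neighbor] = node
                heapq.heappush(queue, (nd, neighbor))
    # Reconstruct the route from target back to source.
    path, node = [], target
    while node != source:
        path.append(node)
        node = prev[node]
    path.append(source)
    return list(reversed(path)), dist[target]

# Hypothetical campus waypoints: BLE beacon IDs indoors, GPS nodes outdoors.
waypoints = {
    "entrance": [("corridor_A", 12.0), ("courtyard", 20.0)],
    "corridor_A": [("room_101", 8.0)],
    "courtyard": [("room_101", 35.0)],
    "room_101": [],
}
print(dijkstra(waypoints, "entrance", "room_101"))
```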

Keywords: beacon, indoor, BLE, Dijkstra algorithm

Procedia PDF Downloads 302
10044 River Offtake Management Using Mathematical Modelling Tool: A Case Study of the Gorai River, Bangladesh

Authors: Sarwat Jahan, Asker Rajin Rahman

Abstract:

Management of the offtake of any fluvial river is very sensitive in terms of long-term sustainability where water flow and sediment transport vary widely throughout a hydrological year. The Gorai River is a major distributary of the Ganges River in Bangladesh and is regarded as a primary source of fresh water for the South-West part of the country. Every year, significant siltation at the Gorai offtake disconnects it from the Ganges during the dry season. As a result, the socio-economic and environmental condition of the downstream areas has been deteriorating for a few decades. To improve the overall situation of the Gorai offtake and its dependent areas, a study was conducted by the Institute of Water Modelling, Bangladesh, in 2022. Simulations with the mathematical morphological modelling tool MIKE 21C of DHI Water & Environment, Denmark, revealed the need for dredging and river training structures at the Gorai offtake to ensure significant dry-season flow towards the downstream reaches. The dry-season flow is found to increase significantly with the proposed river interventions, which also improves the environmental conditions in terms of salinity in the South-West zone of the country. This paper summarizes the primary findings of the analyzed results of the developed mathematical model for improving the existing condition of the Gorai River.

Keywords: Gorai river, mathematical modelling, offtake, siltation, salinity

Procedia PDF Downloads 98
10043 Improved Processing Speed for Text Watermarking Algorithm in Color Images

Authors: Hamza A. Al-Sewadi, Akram N. A. Aldakari

Abstract:

Copyright protection and ownership proof of digital multimedia are achieved nowadays by digital watermarking techniques. A text watermarking algorithm for protecting the property rights and ownership judgment of color images is proposed in this paper. Embedding is achieved by inserting text elements randomly into the color image as noise. The YIQ image processing model is found to be faster than other image processing methods, and hence it is adopted for the embedding process. An optional choice of encrypting the text watermark before embedding is also suggested (in case required by some applications), where the text can be encrypted using any enciphering technique, adding more difficulty for attackers. Experiments resulted in an embedding speed improvement of more than double the speed of other considered systems (such as the least significant bit method and separate color code methods), and a fairly acceptable level of peak signal-to-noise ratio (PSNR) with low mean square error values for watermarking purposes.
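
As a hedged sketch of the color-space step only (the authors' exact embedding rule is not reproduced), the standard NTSC conversion between RGB and YIQ, plus an illustrative Q-channel nudge, might look like this; the embedding strength and positions are hypothetical.

```python
import numpy as np

# Standard NTSC RGB<->YIQ conversion matrices (approximate coefficients).
RGB_TO_YIQ = np.array([[0.299, 0.587, 0.114],
                       [0.596, -0.274, -0.322],
                       [0.211, -0.523, 0.312]])
YIQ_TO_RGB = np.linalg.inv(RGB_TO_YIQ)

def rgb_to_yiq(image_rgb):
    """image_rgb: H x W x 3 array with values in [0, 1]."""
    return image_rgb @ RGB_TO_YIQ.T

def yiq_to_rgb(image_yiq):
    return np.clip(image_yiq @ YIQ_TO_RGB.T, 0.0, 1.0)

def embed_text_bits(image_rgb, bits, positions, strength=0.01):
    """Illustrative embedding: nudge the Q channel at chosen pixel positions
    according to the watermark bits (not the authors' exact scheme)."""
    yiq = rgb_to_yiq(image_rgb)
    for bit, (row, col) in zip(bits, positions):
        yiq[row, col, 2] += strength if bit else -strength
    return yiq_to_rgb(yiq)
```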

Keywords: steganography, watermarking, time complexity measurements, private keys

Procedia PDF Downloads 143
10042 Bisphenol-A Concentrations in Urine and Drinking Water Samples of Adults Living in Ankara

Authors: Hasan Atakan Sengul, Nergis Canturk, Bahar Erbas

Abstract:

Drinking water is indispensable for life. With increasing public awareness, the content of drinking water and tap water has become a matter of curiosity, and the presence of Bisphenol-A tops that curiosity. Bisphenol-A is the chemical most used worldwide for the production of polycarbonate plastics and epoxy resins. People are exposed to Bisphenol-A, a chemical which disrupts the endocrine system, almost every day. An average of 5.4 billion kilograms of Bisphenol-A is manufactured each year. The linear formula of Bisphenol-A is (CH₃)₂C(C₆H₄OH)₂, its molecular weight is 228.29, and its CAS number is 80-05-7. Bisphenol-A is known to be used in the manufacturing of plastics, along with various chemicals. Bisphenol-A, an industrial chemical, is used as a raw material of packaging materials in the monomers of polycarbonate and epoxy resins. Bisphenol-A passes into nutrients through packaging; it contaminates food and enters the body through consumption. International research shows that BPA is transported through body fluids, leading to hormonal disorders in animals. Experimental studies on animals report that BPA exposure also affects the gender of the newborn and its time to reach adolescence. The extent to which similar endocrine-disrupting effects occur in humans is a topic of debate in many studies. In our country, detailed studies on BPA have not been done. However, 'BPA-free' phrases are beginning to appear on plastic packaging such as baby products and water carboys. Accordingly, this situation increases the interest of society in the subject, yet it also causes information pollution. All national and international studies on exposure to BPA in our country have been examined, and Ankara province was designated as the testing region. To assess the effects of plastic use in people's daily habits and the amounts of plastic-derived BPA removed from the body, samples from volunteers living in Ankara were analyzed by LC-MS/MS on a Sciex instrument in the laboratory, and the amounts of BPA exposure and removal were detected by comparing the results with those elicited before. The results were compared with similar studies done internationally, and the relation between them has been exhibited. When the amount of BPA in drinking water is considered, a minimum of 0.028 µg/L, maximum of 1.136 µg/L, mean of 0.29194 µg/L and SD (standard deviation) of 0.199 were detected. When the amount of BPA in urine is considered, a minimum of 0.028 µg/L, maximum of 0.48 µg/L, mean of 0.19181 µg/L and SD of 0.099 were detected. In conclusion, no linear correlation was found between the amount of BPA in drinking water and the amount of BPA in urine (r = -0.151). The p value of the comparison between the BPA amounts in drinking water and urine is 0.004, which indicates a significant relationship, i.e., the amounts of BPA in urine depend on the amounts in drinking water (p < 0.05). This reveals that environmental exposure and daily plastic-use habits also have direct effects on the human body.

Keywords: analyze of bisphenol-A, BPA, BPA in drinking water, BPA in urine

Procedia PDF Downloads 128
10041 New Environmentally Friendly Material for the Purification of the Fresh Water from Oil Pollution

Authors: M. A. Ashour

Abstract:

As is known, Egypt has one of the oldest sugarcane industries, going back to the year 710 AD. Cane plantations are the main agricultural product in five governorates in Upper Egypt (El-Menia, Sohag, Qena, Luxor, and Aswan), producing not less than 16 million tons a year. Eight factories (Abou-korkas, Gena, Nagaa-Hamadi, Deshna, Kous, Armant, Edfuo, and Komombo) located in these Upper Egypt governorates generate a huge amount of waste during the manufacturing stage: the so-called bagasse, the fibrous, cellulosic material remaining after sugarcane crushing and juice extraction, which represents about 30% of such waste. The amount of bagasse generated yearly through the manufacturing stage of the above-mentioned eight factories is approximately 2.8 million tons, and getting rid of such a huge amount safely presents a serious environmental problem. Storing the material in the open in the hot climate of Upper Egypt may cause its self-ignition, as air temperature reaches 50 degrees centigrade in summer, due to the residual sugar content. At the same time, preparing places for safe storage of such an amount is very expensive relative to its low value. So the best way of disposing of bagasse is converting it into a value-added, environmentally friendly material, especially since its utilization has so far been limited. Since oil pollution became a serious concern, the issue of environmental cleaning has arisen. With its structure containing fiber and a high content of carbon, sugarcane bagasse can act as an adsorbent to remove oil contamination from water. The present study is a trial to introduce a new material for the purification of water systems that scores two goals at once: the first is getting rid of that harmful waste safely, and the second is converting it into a commercially valuable material for cleaning and purifying water from oil spills and petroleum pollution. The introduced new material showed very good performance and higher efficiency than other similar materials available in the local market, in both closed and open systems. The introduced modified material can absorb 10 times its weight of oil, while absorbing virtually no water.

Keywords: environment, water resources, agricultural wastes, oil pollution control, sugarcane

Procedia PDF Downloads 189
10040 Improving Human Hand Localization in Indoor Environment by Using Frequency Domain Analysis

Authors: Wipassorn Vinicchayakul, Pichaya Supanakoon, Sathaporn Promwong

Abstract:

Human hand localization is revisited by using radar cross section (RCS) measurements with a minimum root mean square (RMS) error matching algorithm on a touchless keypad mock-up model. RCS and frequency transfer function measurements are carried out in an indoor environment over the frequency range from 3.0 to 11.0 GHz to cover Federal Communications Commission (FCC) standards. The touchless keypad model is tested at two different distances between the hand and the keypad. The initial distance of 19.50 cm is identical to the heights of the transmitting (Tx) and receiving (Rx) antennas, while the second distance is 29.50 cm from the keypad. Moreover, the effects of the Rx angles relative to the human hand are considered. The RCS input parameters are compared with power loss parameters at each frequency. From the results, the performance of the RCS input parameters at the second distance, 29.50 cm, at 3 GHz is better than the others.
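
The matching step described above can be sketched as follows: pick the stored fingerprint whose RCS-versus-frequency vector minimizes the RMS error against the measurement. The fingerprint keys and values below are hypothetical stand-ins, not the paper's data.

```python
import numpy as np

def rms_error(measured, reference):
    return np.sqrt(np.mean((measured - reference) ** 2))

def locate_hand(measured_rcs, fingerprint_db):
    """Return the fingerprint key (e.g. keypad position) with minimum RMS error."""
    return min(fingerprint_db, key=lambda key: rms_error(measured_rcs, fingerprint_db[key]))

# Hypothetical fingerprints: RCS samples across the 3-11 GHz band per keypad position.
freqs = np.linspace(3e9, 11e9, 81)
fingerprints = {"key_1": np.random.rand(81), "key_2": np.random.rand(81)}
measurement = fingerprints["key_2"] + 0.01 * np.random.randn(81)
print(locate_hand(measurement, fingerprints))
```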

Keywords: radar cross section, fingerprint-based localization, minimum root mean square (RMS) error matching algorithm, touchless keypad model

Procedia PDF Downloads 342
10039 Pattern Synthesis of Nonuniform Linear Arrays Including Mutual Coupling Effects Based on Gaussian Process Regression and Genetic Algorithm

Authors: Ming Su, Ziqiang Mu

Abstract:

This paper proposes a synthesis method for nonuniform linear antenna arrays that combines Gaussian process regression (GPR) and a genetic algorithm (GA). In this method, the GPR model is used to calculate the array radiation pattern in the presence of mutual coupling effects, and the GA is then used to optimize the excitations and locations of the elements so as to generate the desired radiation pattern. Taking a 9-element nonuniform linear array as an example, with the desired radiation pattern corresponding to a Chebyshev distribution as the optimization objective, the excitations and locations of the elements are optimized. Finally, the optimization results are verified with the electromagnetic simulation software CST, which shows that the method is effective.
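
A minimal sketch of the surrogate-plus-GA workflow, under simplifying assumptions: the "simulator" here is an idealized array factor (no mutual coupling) and only the nine excitations are varied, whereas in the paper a GPR model trained on full-wave data plays this role and element locations are optimized as well; the target pattern below is a placeholder, not a Chebyshev synthesis.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

n_elem, n_angles = 9, 181
angles = np.linspace(0.0, np.pi, n_angles)
positions = 0.5 * np.arange(n_elem)                 # element positions in wavelengths
k = 2.0 * np.pi                                     # wavenumber for unit wavelength

def pattern_db(excitations):
    # Idealized array factor in dB; the GPR surrogate would replace this call.
    af = np.abs(np.exp(1j * k * np.outer(np.cos(angles), positions)) @ excitations)
    return 20.0 * np.log10(af / af.max() + 1e-9)

# Train the GPR surrogate on sampled designs, then let a small GA search it.
X_train = np.random.rand(200, n_elem)
Y_train = np.array([pattern_db(x) for x in X_train])
surrogate = GaussianProcessRegressor(kernel=RBF(length_scale=1.0)).fit(X_train, Y_train)

target = pattern_db(np.ones(n_elem))                # placeholder desired pattern

def fitness(pop):
    return -np.mean((surrogate.predict(pop) - target) ** 2, axis=1)

pop = np.random.rand(40, n_elem)
for _ in range(100):
    parents = pop[np.argsort(fitness(pop))[-20:]]                                     # selection
    children = parents[np.random.randint(20, size=(40, n_elem)), np.arange(n_elem)]   # crossover
    pop = np.clip(children + 0.05 * np.random.randn(40, n_elem), 0.0, 1.0)            # mutation
best = pop[np.argmax(fitness(pop))]
```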

Keywords: nonuniform linear antenna arrays, GPR, GA, mutual coupling effects, active element pattern

Procedia PDF Downloads 110
10038 A Laboratory Study into the Effects of Surface Waves on Freestyle Swimming

Authors: Scott Draper, Nat Benjanuvatra, Grant Landers, Terry Griffiths, Justin Geldard

Abstract:

Open water swimming has been an Olympic sport since 2008 and is growing in popularity world-wide as a low impact form of exercise. Unlike pool swimming, open water swimmers experience a range of different environmental conditions, including surface waves, variable water temperature, aquatic life, and ocean currents. This presentation will describe experimental research to investigate how freestyle swimming behaviour and performance is influenced by surface waves. A group of 12 swimmers were instructed to swim freestyle in the 54 m long wave flume located at The University of Western Australia’s Coastal and Offshore Engineering Laboratory. A variety of different regular waves were simulated, varying in height (up to 0.3 m), period (1.25 – 4s), and direction (with or against the swimmer). Swimmer’s velocity and acceleration, respectively, were determined from video recording and inertial sensors attached to five different parts of the swimmer’s body. The results illustrate how the swimmers stroke rate and the wave encounter frequency influence their forward speed and how particular wave conditions can benefit or hinder performance. Comparisons to simplified mathematical models provide insight into several aspects of performance, including: (i) how much faster swimmers can travel when swimming with as opposed to against the waves, and (ii) why swimmers of lesser ability are expected to be affected proportionally more by waves than elite swimmers. These findings have implications across the spectrum from elite to ‘weekend’ swimmers, including how they are coached and their ability to win (or just successfully complete) iconic open water events such as the Rottnest Channel Swim held annually in Western Australia.

Keywords: open water, surface waves, wave height/length, wave flume, stroke rate

Procedia PDF Downloads 112
10037 Applying Hybrid Graph Drawing and Clustering Methods on Stock Investment Analysis

Authors: Mouataz Zreika, Maria Estela Varua

Abstract:

Stock investment decisions are often made based on current events in the global economy and the analysis of historical data. In contrast, visual representation could help investors gain a deeper understanding and better insight into stock market trends more efficiently. The trend analysis is based on long-term data collection. The study adopts a hybrid method that combines a clustering algorithm and a force-directed algorithm to overcome the scalability problem when visualizing large data sets. This method exemplifies the potential relationships between stocks, as well as determining the degree of strength and connectivity, which provides investors with another view of stock relationships for reference. Information derived from the visualization will also help them make informed decisions. The results of the experiments show that the proposed method is able to produce aesthetically pleasing visualizations, providing clearer views of connectivity and edge weights.
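
As a hedged illustration of the hybrid idea (not the authors' pipeline), one can build a graph whose edge weights are return correlations, cluster it, and lay it out with a force-directed algorithm; tickers, returns and the correlation threshold below are stand-ins.

```python
import numpy as np
import networkx as nx

rng = np.random.default_rng(0)
tickers = ["AAA", "BBB", "CCC", "DDD", "EEE"]          # hypothetical tickers
returns = rng.normal(size=(250, len(tickers)))          # stand-in daily returns
corr = np.corrcoef(returns, rowvar=False)

graph = nx.Graph()
for i, a in enumerate(tickers):
    for j in range(i + 1, len(tickers)):
        if abs(corr[i, j]) > 0.02:                       # keep only stronger links
            graph.add_edge(a, tickers[j], weight=abs(corr[i, j]))

# Clustering plus force-directed layout of the stock relationship graph.
clusters = nx.algorithms.community.greedy_modularity_communities(graph, weight="weight")
layout = nx.spring_layout(graph, weight="weight", seed=0)
print(clusters, layout)
```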

Keywords: clustering, force-directed, graph drawing, stock investment analysis

Procedia PDF Downloads 302
10036 Seismic Response Control of Multi-Span Bridge Using Magnetorheological Dampers

Authors: B. Neethu, Diptesh Das

Abstract:

The present study investigates the performance of a semi-active controller using magnetorheological (MR) dampers for seismic response reduction of a multi-span bridge. The application of structural control to structures during earthquake excitation involves numerous challenges, such as proper formulation and selection of the control strategy, mathematical modeling of the system, uncertainty in system parameters, and noisy measurements. These problems, however, need to be tackled in order to design and develop controllers which will perform efficiently in such complex systems. A control algorithm which can accommodate uncertainty and imprecision better than other available algorithms, owing to its inherent robustness and ability to cope with parameter uncertainties and imprecision, is the sliding mode algorithm. A sliding mode control algorithm is therefore adopted in the present study due to its inherent stability and distinguished robustness to system parameter variation and external disturbances. In general, a semi-active control scheme using an MR damper requires two nested controllers: (i) an overall system controller, which derives the control force required to be applied to the structure, and (ii) an MR damper voltage controller, which determines the voltage required to be supplied to the damper in order to generate the desired control force. In the present study, a sliding mode algorithm is used to determine the desired optimal force. The function of the voltage controller is to command the damper to produce the desired force. The clipped optimal algorithm is used to find the command voltage supplied to the MR damper, which is regulated by a semi-active control law based on the sliding mode algorithm. The main objective of the study is to propose a robust semi-active control scheme which can effectively control the responses of the bridge under real earthquake ground motions. A lumped mass model of the bridge is developed, and time history analysis is carried out by solving the governing equations of motion in state space form. The effectiveness of the MR dampers is studied through analytical simulations by subjecting the bridge to real earthquake records. In this regard, it may also be noted that the performance of controllers depends, to a great extent, on the characteristics of the input ground motions. Therefore, in order to study the robustness of the controller, its performance has been investigated for fourteen different earthquake ground motion records. The earthquakes are chosen in such a way that all possible characteristic variations are accommodated. Out of these fourteen earthquakes, seven are near-field and seven are far-field. These earthquakes are also divided into different frequency contents, viz. low-frequency, medium-frequency, and high-frequency earthquakes. The responses of the controlled bridge are compared with the responses of the corresponding uncontrolled bridge (i.e., the bridge without any control devices). The results of the numerical study show that the sliding mode based semi-active control strategy can substantially reduce the seismic responses of the bridge, showing a stable and robust performance for all the earthquakes.
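
The secondary (voltage) controller named above is the classical clipped-optimal law. A minimal sketch of that law alone is shown below, assuming the sliding-mode primary controller supplies the desired force; the maximum voltage value is illustrative.

```python
import numpy as np

V_MAX = 10.0  # assumed maximum command voltage of the MR damper (illustrative)

def clipped_optimal_voltage(desired_force, measured_force, v_max=V_MAX):
    """Clipped-optimal law: command maximum voltage only when the damper must
    grow its force toward the desired control force, otherwise command zero."""
    return v_max * np.heaviside((desired_force - measured_force) * measured_force, 0.0)

# Primary controller asks for -4.5 kN while the damper currently produces -3.0 kN.
print(clipped_optimal_voltage(-4.5e3, -3.0e3))   # 10.0 -> increase damping force
print(clipped_optimal_voltage(-1.0e3, -3.0e3))   # 0.0  -> let the damper force decay
```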

Keywords: bridge, semi active control, sliding mode control, MR damper

Procedia PDF Downloads 124
10035 A Local Tensor Clustering Algorithm to Annotate Uncharacterized Genes with Many Biological Networks

Authors: Paul Shize Li, Frank Alber

Abstract:

A fundamental task of clinical genomics is to unravel the functions of genes and their associations with disorders. Although experimental biology has made efforts to discover and elucidate the molecular mechanisms of individual genes in the past decades, still about 40% of human genes have unknown functions, not to mention the diseases they may be related to. For those biologists who are interested in a particular gene with unknown functions, a powerful computational method tailored for inferring the functions and disease relevance of uncharacterized genes is strongly needed. Studies have shown that genes strongly linked to each other in multiple biological networks are more likely to have similar functions. This indicates that the densely connected subgraphs in multiple biological networks are useful in the functional and phenotypic annotation of uncharacterized genes. Therefore, in this work, we have developed an integrative network approach to identify the frequent local clusters, which are defined as those densely connected subgraphs that frequently occur in multiple biological networks and consist of the query gene that has few or no disease or function annotations. This is a local clustering algorithm that models multiple biological networks sharing the same gene set as a three-dimensional matrix, the so-called tensor, and employs the tensor-based optimization method to efficiently find the frequent local clusters. Specifically, massive public gene expression data sets that comprehensively cover dynamic, physiological, and environmental conditions are used to generate hundreds of gene co-expression networks. By integrating these gene co-expression networks, for a given uncharacterized gene that is of biologist’s interest, the proposed method can be applied to identify the frequent local clusters that consist of this uncharacterized gene. Finally, those frequent local clusters are used for function and disease annotation of this uncharacterized gene. This local tensor clustering algorithm outperformed the competing tensor-based algorithm in both module discovery and running time. We also demonstrated the use of the proposed method on real data of hundreds of gene co-expression data and showed that it can comprehensively characterize the query gene. Therefore, this study provides a new tool for annotating the uncharacterized genes and has great potential to assist clinical genomic diagnostics.
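
As a hedged sketch of the data structure only (not the paper's tensor optimization), the shared-gene networks can be stacked into a genes x genes x networks tensor, and a candidate cluster around the query gene can be scored by how often it is densely connected across networks; sizes, thresholds and indices below are illustrative.

```python
import numpy as np

def build_tensor(adjacency_matrices):
    # Shape: (n_genes, n_genes, n_networks), all networks share the same gene order.
    return np.stack(adjacency_matrices, axis=2)

def cluster_frequency(tensor, member_indices, edge_threshold=0.5):
    """Fraction of networks in which the candidate cluster is densely connected
    (a simple illustrative criterion)."""
    sub = tensor[np.ix_(member_indices, member_indices)]   # cluster slice per network
    density = sub.mean(axis=(0, 1))                        # mean edge weight per network
    return float((density > edge_threshold).mean())

networks = [np.random.rand(100, 100) for _ in range(20)]    # stand-in co-expression nets
tensor = build_tensor(networks)
print(cluster_frequency(tensor, member_indices=[0, 5, 17, 42]))  # query gene assumed at index 0
```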

Keywords: local tensor clustering, query gene, gene co-expression network, gene annotation

Procedia PDF Downloads 168
10034 Elephant Herding Optimization for Service Selection in QoS-Aware Web Service Composition

Authors: Samia Sadouki Chibani, Abdelkamel Tari

Abstract:

Web service composition combines available services to provide new functionality. Given the number of available services with similar functionalities and different non-functional aspects (QoS), the problem of finding a QoS-optimal web service composition is considered an optimization problem belonging to the NP-hard class. Thus, an optimal solution cannot be found by exact algorithms within a reasonable time. In this paper, a bio-inspired meta-heuristic is presented to address QoS-aware web service composition; it is based on the Elephant Herding Optimization (EHO) algorithm, which is inspired by the herding behavior of elephant groups. EHO is characterized by a process of dividing and combining the population into sub-populations (clans); this process allows the exchange of information between local searches to move toward a global optimum. With other evolutionary algorithms, by contrast, the problem of early stagnation in a local optimum cannot be avoided. Compared with PSO, the results of the experimental evaluation show that our proposition significantly outperforms the existing algorithm, with better fitness values and fast convergence.
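
A compact, hedged sketch of the EHO clan-updating and separating operators on a generic objective; for QoS-aware composition, the fitness would be the aggregated QoS score of a candidate service selection (the parameter values below are illustrative defaults, not the paper's settings).

```python
import numpy as np

def eho(fitness, dim, n_clans=5, clan_size=10, iters=200, alpha=0.5, beta=0.1,
        lower=0.0, upper=1.0, rng=np.random.default_rng(0)):
    clans = rng.uniform(lower, upper, size=(n_clans, clan_size, dim))
    for _ in range(iters):
        for c in range(n_clans):
            scores = np.array([fitness(x) for x in clans[c]])
            best = clans[c, np.argmin(scores)].copy()     # clan matriarch
            center = clans[c].mean(axis=0)
            # Clan updating operator: members move toward the matriarch.
            r = rng.random(size=(clan_size, dim))
            clans[c] = clans[c] + alpha * r * (best - clans[c])
            clans[c, np.argmin(scores)] = beta * center   # matriarch moves toward clan center
            # Separating operator: the worst member is replaced by a random solution.
            clans[c, np.argmax(scores)] = rng.uniform(lower, upper, size=dim)
            clans[c] = np.clip(clans[c], lower, upper)
    flat = clans.reshape(-1, dim)
    return flat[np.argmin([fitness(x) for x in flat])]

# Toy usage: minimize the sphere function.
print(eho(lambda x: float(np.sum(x ** 2)), dim=8))
```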

Keywords: bio-inspired algorithms, elephant herding optimization, QoS optimization, web service composition

Procedia PDF Downloads 327
10033 Alphabet Recognition Using Pixel Probability Distribution

Authors: Vaidehi Murarka, Sneha Mehta, Dishant Upadhyay

Abstract:

Our project topic is 'Alphabet Recognition using Pixel Probability Distribution'. The project uses techniques of image processing and machine learning in computer vision. Alphabet recognition is the mechanical or electronic translation of scanned images of handwritten, typewritten or printed text into machine-encoded text. It is widely used to convert books and documents into electronic files. Alphabet-recognition-based OCR applications are sometimes used in signature recognition, which is used in banks and other high-security buildings. One popular mobile application reads a visiting card and directly stores it to the contacts. OCRs are also known to be used in radar systems for reading speeding vehicles' license plates, among many other uses. The implementation of our project has been done using Visual Studio and OpenCV (Open Source Computer Vision). Our algorithm is based on neural networks (machine learning). The project was implemented in three modules: (1) Training: this module aims at database generation. The database was generated using two methods: (a) run-time generation, in which the database is generated at compilation time using the inbuilt fonts of the OpenCV library; human intervention is not necessary for generating this database. (b) Contour detection, in which a 'jpeg' template containing different fonts of an alphabet is converted to the weighted matrix using specialized functions (contour detection and blob detection) of OpenCV. The main advantage of this type of database generation is that the algorithm becomes self-learning and the final database requires little memory to be stored (119 kB, precisely). (2) Preprocessing: the input image is pre-processed using image processing concepts such as adaptive thresholding, binarizing, dilating, etc., and is made ready for segmentation. Segmentation includes extraction of lines, words, and letters from the processed text image. (3) Testing and prediction: the extracted letters are classified and predicted using the neural network algorithm. The algorithm recognizes an alphabet based on certain mathematical parameters calculated using the database and the weight matrix of the segmented image.
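
A minimal sketch of the preprocessing and letter-extraction steps using OpenCV (the original project used the C++ bindings from Visual Studio; the file name and threshold parameters below are illustrative, not the authors' settings).

```python
import cv2

image = cv2.imread("scanned_page.jpg", cv2.IMREAD_GRAYSCALE)   # hypothetical input file
# Adaptive thresholding and binarization, followed by dilation.
binary = cv2.adaptiveThreshold(image, 255, cv2.ADAPTIVE_THRESH_GAUSSIAN_C,
                               cv2.THRESH_BINARY_INV, 31, 10)
dilated = cv2.dilate(binary, cv2.getStructuringElement(cv2.MORPH_RECT, (3, 3)))

# Letter extraction via contour detection; each bounding box becomes one
# candidate character for the neural-network classifier.
contours, _ = cv2.findContours(dilated, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
letters = []
for contour in contours:
    x, y, w, h = cv2.boundingRect(contour)
    if w * h > 50:                                   # ignore specks of noise
        letters.append(cv2.resize(binary[y:y + h, x:x + w], (20, 20)))
```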

Keywords: contour-detection, neural networks, pre-processing, recognition coefficient, runtime-template generation, segmentation, weight matrix

Procedia PDF Downloads 389
10032 Sustainability in Retaining Wall Construction with Geosynthetics

Authors: Sateesh Kumar Pisini, Swetha Priya Darshini, Sanjay Kumar Shukla

Abstract:

This paper seeks to present a research study on sustainability in construction of retaining wall using geosynthetics. Sustainable construction is a way for the building and infrastructure industry to move towards achieving sustainable development, taking into account environmental, socioeconomic and cultural issues. Geotechnical engineering, being very resource intensive, warrants an environmental sustainability study, but a quantitative framework for assessing the sustainability of geotechnical practices, particularly at the planning and design stages, does not exist. In geotechnical projects, major economic issues to be addressed are in the design and construction of stable slopes and retaining structures within space constraints. In this paper, quantitative indicators for assessing the environmental sustainability of retaining wall with geosynthetics are compared with conventional concrete retaining wall through life cycle assessment (LCA). Geosynthetics can make a real difference in sustainable construction techniques and contribute to development in developing countries in particular. Their imaginative application can result in considerable cost savings over the use of conventional designs and materials. The acceptance of geosynthetics in reinforced retaining wall construction has been triggered by a number of factors, including aesthetics, reliability, simple construction techniques, good seismic performance, and the ability to tolerate large deformations without structural distress. Reinforced retaining wall with geosynthetics is the best cost-effective and eco-friendly solution as compared with traditional concrete retaining wall construction. This paper presents an analysis of the theme of sustainability applied to the design and construction of traditional concrete retaining wall and presenting a cost-effective and environmental solution using geosynthetics.

Keywords: sustainability, retaining wall, geosynthetics, life cycle assessment

Procedia PDF Downloads 2060
10031 Detecting Tomato Flowers in Greenhouses Using Computer Vision

Authors: Dor Oppenheim, Yael Edan, Guy Shani

Abstract:

This paper presents an image analysis algorithm to detect and count yellow tomato flowers in a greenhouse with uneven illumination conditions, complex growth conditions and different flower sizes. The algorithm is designed to be employed on a drone that flies in greenhouses to accomplish several tasks such as pollination and yield estimation. Detecting the flowers can provide useful information for the farmer, such as the number of flowers in a row and the number of flowers that were pollinated since the last visit to the row. The developed algorithm is designed to handle the real-world difficulties in a greenhouse, which include varying lighting conditions, shadowing, and occlusion, while considering the computational limitations of the simple processor on the drone. The algorithm identifies flowers using an adaptive global threshold, segmentation over the HSV color space, and morphological cues. The adaptive threshold divides the images into darker and lighter images. Then, segmentation on the hue, saturation and value channels is performed accordingly, and classification is done according to the size and location of the flowers. 1069 images of greenhouse tomato flowers were acquired in a commercial greenhouse in Israel, using two different RGB cameras, an LG G4 smartphone and a Canon PowerShot A590. The images were acquired from multiple angles and distances and were sampled manually at various periods along the day to obtain varying lighting conditions. Ground truth was created by manually tagging approximately 25,000 individual flowers in the images. Sensitivity analyses on the acquisition angle of the images, periods throughout the day, different cameras and thresholding types were performed. Precision, recall and their derived F1 score were calculated. Results indicate better performance for the view angle facing the flowers than for any other angle. Acquiring images in the afternoon resulted in the best precision and recall results. Applying a global adaptive threshold improved the median F1 score by 3%. Results showed no difference between the two cameras used. Using hue values of 0.12-0.18 in the segmentation process provided the best results in precision and recall, and the best F1 score. The precision and recall average for all the images when using these values was 74% and 75%, respectively, with an F1 score of 0.73. Further analysis showed a 5% increase in precision and recall when analyzing images acquired in the afternoon and from the front viewpoint.
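
A hedged sketch of the HSV segmentation step: the abstract reports that hue values of 0.12-0.18 (on a 0-1 scale) worked best, which is rescaled here to OpenCV's 0-179 hue range; the saturation/value bounds, morphology kernel and area filter are illustrative, not the authors' values.

```python
import cv2
import numpy as np

image = cv2.imread("tomato_row.jpg")                      # hypothetical greenhouse image
hsv = cv2.cvtColor(image, cv2.COLOR_BGR2HSV)
lower = np.array([int(0.12 * 180), 80, 80], dtype=np.uint8)
upper = np.array([int(0.18 * 180), 255, 255], dtype=np.uint8)
mask = cv2.inRange(hsv, lower, upper)                     # keep yellow-hued pixels

# Morphological cleanup, then size-based filtering of candidate flower blobs.
mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, np.ones((5, 5), np.uint8))
contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
flowers = [c for c in contours if cv2.contourArea(c) > 100]
print(f"detected {len(flowers)} candidate flowers")
```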

Keywords: agricultural engineering, image processing, computer vision, flower detection

Procedia PDF Downloads 329
10030 Molecular Detection of E. coli in Treated Wastewater and Well Water Samples Collected from Al Riyadh Governorate, Saudi Arabia

Authors: Hanouf A. S. Al Nuwaysir, Nadine Moubayed, Abir Ben Bacha, Islem Abid

Abstract:

Consumption of waste water continues to cause significant problems for human health in both developed and developing countries. Many regulations have been implied by different world authorities controlling water quality for the presence of coliforms used as standard indicators of water quality deterioration and historically leading health protection concept. In this study, the European directive for the detection of Escherichia coli, ISO 9308-1, was applied to examine and monitor coliforms in water samples collected from Wadi Hanifa and neighboring wells, Riyadh governorate, kingdom of Saudi Arabia, which is used for irrigation and industrial purposes. Samples were taken from different locations for 8 months consecutively, chlorine concentration ranging from 0.1- 0.4 mg/l, was determined using the DPD FREE CHLORINE HACH kit. Water samples were then analyzed following the ISO protocol which relies on the membrane filtration technique (0.45µm, pore size membrane filter) and a chromogenic medium TTC, a lactose based medium used for the detection and enumeration of total coliforms and E.coli. Data showed that the number of bacterial isolates ranged from 60 to 300 colonies/100ml for well and surface water samples respectively; where higher numbers were attributed to the surface samples. Organisms which apparently ferment lactose on TTC agar plates, appearing as orange colonies, were selected and additionally cultured on EMB and MacConkey agar for a further differentiation among E.coli and coliform bacteria. Two additional biochemical tests (Cytochrome oxidase and indole from tryptophan) were also investigated to detect and differentiate the presence of E.coli from other coliforms, E. coli was identified in an average of 5 to 7colonies among 25 selected colonies.On the other hand, a more rapid, specific and sensitive analytical molecular detection namely single colony PCR was also performed targeting hha gene to sensitively detect E.coli, giving more accurate and time consuming identification of colonies considered presumptively as E.coli. Comparative methodologies, such as ultrafiltration and direct DNA extraction from membrane filters (MoBio, Grermany) were also applied; however, results were not as accurate as the membrane filtration, making it a technique of choice for the detection and enumeration of water coliforms, followed by sufficiently specific enzymatic confirmatory stage.

Keywords: coliform, cytochrome oxidase, hha primer, membrane filtration, single colony PCR

Procedia PDF Downloads 318
10029 Analysis of Financial Time Series by Using Ornstein-Uhlenbeck Type Models

Authors: Md Al Masum Bhuiyan, Maria C. Mariani, Osei K. Tweneboah

Abstract:

In the present work, we develop a technique for estimating the volatility of financial time series by using stochastic differential equations. Taking the daily closing prices from developed and emerging stock markets as the basis, we argue that the incorporation of stochastic volatility into the time-varying parameter estimation significantly improves the forecasting performance via Maximum Likelihood Estimation. Using the technique, we observe the long-memory behavior of the data sets and the one-step-ahead predicted log-volatility with ±2 standard errors, despite the observed noise following a Normal mixture distribution, because the financial data studied are not fully Gaussian. Also, the Ornstein-Uhlenbeck process followed in this work simulates the financial time series well, which makes our estimation algorithm suitable for large data sets owing to its good convergence properties.
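
A minimal sketch of the Ornstein-Uhlenbeck machinery only: simulate an OU process with its exact discretization and recover (theta, mu, sigma) by maximum likelihood. In the paper the observed series would be, e.g., log-volatility of market returns; the parameter values below are illustrative.

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(1)
theta, mu, sigma, dt, n = 2.0, 0.5, 0.3, 1.0 / 252, 5000
x = np.empty(n)
x[0] = mu
for t in range(1, n):   # exact discretization of dX = theta*(mu - X)dt + sigma*dW
    mean = x[t - 1] * np.exp(-theta * dt) + mu * (1 - np.exp(-theta * dt))
    var = sigma ** 2 * (1 - np.exp(-2 * theta * dt)) / (2 * theta)
    x[t] = mean + np.sqrt(var) * rng.standard_normal()

def neg_log_likelihood(params):
    th, m, s = params
    if th <= 0 or s <= 0:
        return np.inf
    mean = x[:-1] * np.exp(-th * dt) + m * (1 - np.exp(-th * dt))
    var = s ** 2 * (1 - np.exp(-2 * th * dt)) / (2 * th)
    return 0.5 * np.sum(np.log(2 * np.pi * var) + (x[1:] - mean) ** 2 / var)

fit = minimize(neg_log_likelihood, x0=[1.0, 0.0, 0.1], method="Nelder-Mead")
print(fit.x)   # estimated (theta, mu, sigma)
```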

Keywords: financial time series, maximum likelihood estimation, Ornstein-Uhlenbeck type models, stochastic volatility model

Procedia PDF Downloads 242
10028 Quantifying the Effects of Canopy Cover and Cover Crop Species on Water Use Partitioning in Micro-Sprinkler Irrigated Orchards in South Africa

Authors: Zanele Ntshidi, Sebinasi Dzikiti, Dominic Mazvimavi

Abstract:

South Africa is a dry country and yet it is ranked as the 8th largest exporter of fresh apples (Malus Domestica) globally. Prime apple producing regions are in the Eastern and Western Cape Provinces of the country where all the fruit is grown under irrigation. Climate change models predict increasingly drier future conditions in these regions and the frequency and severity of droughts is expected to increase. For the sustainability and growth of the fruit industry it is important to minimize non-beneficial water losses from the orchard floor. The aims of this study were firstly to compare the water use of cover crop species used in South African orchards for which there is currently no information. The second aim was to investigate how orchard water use (evapotranspiration) was partitioned into beneficial (tree transpiration) and non-beneficial (orchard floor evaporation) water uses for micro-sprinkler irrigated orchards with different canopy covers. This information is important in order to explore opportunities to minimize non-beneficial water losses. Six cover crop species (four exotic and two indigenous) were grown in 2 L pots in a greenhouse. Cover crop transpiration was measured using the gravimetric method on clear days. To establish how water use was partitioned in orchards, evapotranspiration (ET) was measured using an open path eddy covariance system, while tree transpiration was measured hourly throughout the season (October to June) on six trees per orchard using the heat ratio sap flow method. On selected clear days, soil evaporation was measured hourly from sunrise to sunset using six micro-lysimeters situated at different wet/dry and sun/shade positions on the orchard floor. Transpiration of cover crops was measured using miniature (2 mm Ø) stem heat balance sap flow gauges. The greenhouse study showed that exotic cover crops had significantly higher (p < 0.01) average transpiration rates (~3.7 L/m2/d) than the indigenous species (~ 2.2 L/m²/d). In young non-bearing orchards, orchard floor evaporative fluxes accounted for more than 60% of orchard ET while this ranged from 10 to 30% in mature orchards with a high canopy cover. While exotic cover crops are preferred by most farmers, this study shows that they use larger quantities of water than indigenous species. This in turn contributes to a larger orchard floor evaporation flux. In young orchards non-beneficial losses can be minimized by adopting drip or short range micro-sprinkler methods that reduce the wetted soil fraction thereby conserving water.

Keywords: evapotranspiration, sap flow, soil evaporation, transpiration

Procedia PDF Downloads 388
10027 Optimal Reactive Power Dispatch under Various Contingency Conditions Using Whale Optimization Algorithm

Authors: Khaled Ben Oualid Medani, Samir Sayah

Abstract:

The Optimal Reactive Power Dispatch (ORPD) problem has usually been solved and analysed under normal operating conditions. However, network collapses appear under contingency conditions. In this paper, ORPD under several contingencies is solved using the proposed Whale Optimization Algorithm (WOA). To ensure the viability of the power system in contingency conditions, several critical cases are simulated in order to prevent and prepare the power system to face such situations. The results are obtained on the IEEE 30-bus test system for the solution of the ORPD problem, in which the control of bus voltages, tap positions of transformers and reactive power sources is involved. Moreover, another method, namely Particle Swarm Optimization with Time Varying Acceleration Coefficient (PSO-TVAC), has been compared with the proposed technique. Simulation results indicate that the proposed WOA gives remarkable solutions in terms of effectiveness in the case of outages.
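
A compact, hedged sketch of the standard WOA update rules on a generic objective; for ORPD the objective would be the real power loss returned by a load-flow calculation, with decision variables such as generator voltages, transformer taps and shunt VAR sources (the toy sphere function and parameter values below are illustrative only).

```python
import numpy as np

def woa(objective, dim, n_whales=30, iters=300, lower=-1.0, upper=1.0,
        rng=np.random.default_rng(0)):
    whales = rng.uniform(lower, upper, size=(n_whales, dim))
    scores = np.array([objective(w) for w in whales])
    best, best_score = whales[np.argmin(scores)].copy(), scores.min()
    for it in range(iters):
        a = 2.0 * (1.0 - it / iters)                       # decreases linearly from 2 to 0
        for i in range(n_whales):
            A = 2.0 * a * rng.random() - a
            C = 2.0 * rng.random(dim)
            if rng.random() < 0.5:
                if abs(A) < 1.0:                            # exploit: encircle the best whale
                    whales[i] = best - A * np.abs(C * best - whales[i])
                else:                                       # explore: move around a random whale
                    rand = whales[rng.integers(n_whales)]
                    whales[i] = rand - A * np.abs(C * rand - whales[i])
            else:                                           # spiral bubble-net manoeuvre
                l = rng.uniform(-1.0, 1.0)
                whales[i] = np.abs(best - whales[i]) * np.exp(l) * np.cos(2 * np.pi * l) + best
            whales[i] = np.clip(whales[i], lower, upper)
            scores[i] = objective(whales[i])
            if scores[i] < best_score:
                best, best_score = whales[i].copy(), scores[i]
    return best

print(woa(lambda x: float(np.sum(x ** 2)), dim=10))        # toy usage: sphere function
```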

Keywords: optimal reactive power dispatch, power system analysis, real power loss minimization, contingency condition, metaheuristic technique, whale optimization algorithm

Procedia PDF Downloads 121
10026 Effect of Rhythmic Auditory Stimulation on Gait in Patients with Stroke

Authors: Mohamed Ahmed Fouad

Abstract:

Background: Stroke is a leading cause of functional disability and gait problems. Objectives: The purpose of this study was to determine the effect of rhythmic auditory stimulation combined with treadmill training on selected gait kinematics in stroke patients. Methods: Thirty male stroke patients participated in this study. The patients were assigned randomly into two equal groups (study and control). Patients in the study group received treadmill training combined with rhythmic auditory stimulation in addition to a selected physical therapy program for hemiparetic patients. Patients in the control group received treadmill training in addition to the same selected physical therapy program, including strengthening, stretching, weight bearing, balance exercises and gait training. The Biodex Gait Trainer 2™ was used to assess selected gait kinematics (step length, step cycle, walking speed, time on each foot and ambulation index) before and after the six-week training period (end of treatment) for both groups. Results: There was a statistically significant increase in walking speed, step cycle, step length, percentage of time on each foot and ambulation index in both groups post-treatment. The improvement in gait parameters post-treatment was significantly higher in the study group compared to the control group. Conclusion: Rhythmic auditory stimulation combined with treadmill training is effective in improving selected gait kinematics in stroke patients when added to a selected physical therapy program.

Keywords: stroke, rhythmic auditory stimulation, treadmill training, gait kinematics

Procedia PDF Downloads 245
10025 Architectural Adaptation for Road Humps Detection in Adverse Light Scenario

Authors: Padmini S. Navalgund, Manasi Naik, Ujwala Patil

Abstract:

A road hump is a semi-cylindrical elevation on the road surface placed at specific locations along the road. A vehicle needs to manoeuvre over the hump at reduced speed to avoid damage and pass over it safely. Road humps, if identified in advance, help maintain the safety and stability of vehicles, especially in adverse visibility conditions, viz. night scenarios. We propose a deep learning architecture adaptation that implements the Mish activation function and a new classification loss function called "Effective Focal Loss" for Indian road hump detection in adverse light scenarios. We captured images comprising marked and unmarked road humps from two different types of cameras across South India to build a heterogeneous dataset. The heterogeneous dataset enabled the algorithm to train on varied examples and improve detection accuracy. The images were pre-processed and annotated for two classes, viz. marked hump and unmarked hump. The dataset from these images was used to train a single-stage object detection algorithm. We used an algorithm to synthetically generate reduced-visibility road hump scenarios. We observed that our proposed framework effectively detected the marked and unmarked humps in the images in both clear and adverse light environments. This architectural adaptation sets up an option for early detection of Indian road humps in reduced visibility conditions, thereby enhancing autonomous driving technology to handle a wider range of real-world scenarios.
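
As a hedged sketch of the two ingredients named above: Mish is a standard activation, and the binary focal loss below follows the conventional formulation (Lin et al.), shown only as a stand-in since the paper's "Effective Focal Loss" variant is not reproduced here.

```python
import torch
import torch.nn.functional as F

def mish(x: torch.Tensor) -> torch.Tensor:
    return x * torch.tanh(F.softplus(x))        # x * tanh(ln(1 + e^x))

def focal_loss(logits, targets, alpha=0.25, gamma=2.0):
    """Binary focal loss, which down-weights easy examples to handle class
    imbalance (e.g. marked vs unmarked humps)."""
    prob = torch.sigmoid(logits)
    ce = F.binary_cross_entropy_with_logits(logits, targets, reduction="none")
    p_t = prob * targets + (1 - prob) * (1 - targets)
    alpha_t = alpha * targets + (1 - alpha) * (1 - targets)
    return (alpha_t * (1 - p_t) ** gamma * ce).mean()

logits = torch.tensor([2.0, -1.5, 0.3])
targets = torch.tensor([1.0, 0.0, 1.0])
print(mish(logits), focal_loss(logits, targets))
```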

Keywords: Indian road hump, reduced visibility condition, low light condition, adverse light condition, marked hump, unmarked hump, YOLOv9

Procedia PDF Downloads 26
10024 Deep Learning Based on Image Decomposition for Restoration of Intrinsic Representation

Authors: Hyohun Kim, Dongwha Shin, Yeonseok Kim, Ji-Su Ahn, Kensuke Nakamura, Dongeun Choi, Byung-Woo Hong

Abstract:

Artefacts are commonly encountered in the imaging process of clinical computed tomography (CT), where an artefact refers to any systematic discrepancy between the reconstructed observation and the true attenuation coefficient of the object. It is known that CT images are inherently more prone to artefacts due to the image formation process, in which a large number of independent detectors are involved and assumed to yield consistent measurements. There are a number of different artefact types, including noise, beam hardening, scatter, pseudo-enhancement, motion, helical, ring, and metal artefacts, which cause serious difficulties in reading images. Thus, it is desirable to remove nuisance factors from the degraded image, leaving the fundamental intrinsic information that can provide better interpretation of the anatomical and pathological characteristics. However, this is considered a difficult task due to the high dimensionality and variability of the data to be recovered, which naturally motivates the use of machine learning techniques. We propose an image restoration algorithm based on a deep neural network framework in which denoising auto-encoders are stacked to build multiple layers. The denoising auto-encoder is a variant of the classical auto-encoder that takes input data and maps it to a hidden representation through a deterministic mapping using a non-linear activation function. The latent representation is then mapped back into a reconstruction whose size is the same as the size of the input data. The reconstruction error can be measured by the traditional squared error, assuming the residual follows a normal distribution. In addition to the designed loss function, an effective regularization scheme is applied using residual-driven dropout determined based on the gradient at each layer. The optimal weights are computed by the classical stochastic gradient descent algorithm combined with the back-propagation algorithm. In our algorithm, we initially decompose an input image into its intrinsic representation and the nuisance factors, including artefacts, based on the classical Total Variation problem, which can be efficiently optimized by a convex optimization algorithm such as the primal-dual method. The intrinsic forms of the input images are provided to the deep denoising auto-encoders together with their original forms in the training phase. In the testing phase, a given image is first decomposed into its intrinsic form and then provided to the trained network to obtain its reconstruction. We apply our algorithm to the restoration of CT images corrupted by artefacts. It is shown that our algorithm improves readability and enhances the anatomical and pathological properties of the object. The quantitative evaluation is performed in terms of the PSNR, and the qualitative evaluation shows significant improvement in reading images despite degrading artefacts. The experimental results indicate the potential of our algorithm as a prior solution to image interpretation tasks in a variety of medical imaging applications. This work was supported by the MISP (Ministry of Science and ICT), Korea, under the National Program for Excellence in SW (20170001000011001) supervised by the IITP (Institute for Information and Communications Technology Promotion).
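
A minimal sketch of one denoising auto-encoder layer of the kind stacked above, trained with SGD and back-propagation on a squared reconstruction error; the residual-driven dropout and the Total Variation pre-decomposition described in the abstract are not reproduced, and the patch size and learning rate are illustrative.

```python
import torch
import torch.nn as nn

class DenoisingAutoEncoder(nn.Module):
    def __init__(self, n_in=64 * 64, n_hidden=1024):
        super().__init__()
        self.encoder = nn.Sequential(nn.Linear(n_in, n_hidden), nn.Sigmoid())
        self.decoder = nn.Linear(n_hidden, n_in)

    def forward(self, x, noise_std=0.1):
        corrupted = x + noise_std * torch.randn_like(x)   # corrupt, then reconstruct clean x
        return self.decoder(self.encoder(corrupted))

model = DenoisingAutoEncoder()
optimizer = torch.optim.SGD(model.parameters(), lr=1e-2)
patches = torch.rand(32, 64 * 64)                         # stand-in intrinsic-image patches
for _ in range(10):                                        # a few SGD + backprop steps
    recon = model(patches)
    loss = ((recon - patches) ** 2).mean()                 # squared reconstruction error
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
```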

Keywords: auto-encoder neural network, CT image artefact, deep learning, intrinsic image representation, noise reduction, total variation

Procedia PDF Downloads 190
10023 Image Compression on Region of Interest Based on SPIHT Algorithm

Authors: Sudeepti Dayal, Neelesh Gupta

Abstract:

Image compression is utilized to reduce the size of a file without degrading the quality of the image to an objectionable level. The reduction in file size permits more images to be stored in a given amount of space. It also minimizes the time necessary for images to be transferred. Storage of medical images is a much-researched area in the current scenario. To store a medical image, the image is divided according to two types of regions: regions of interest and non-regions of interest. The best way to store an image is to compress it in such a way that no important information is lost. Compression can be done in two ways, namely lossy and lossless compression, under which several compression algorithms are applied. In this paper, two algorithms are used: the discrete cosine transform, applied to the non-region of interest (lossy), and the discrete wavelet transform, applied to the regions of interest (lossless). The paper introduces the SPIHT (set partitioning in hierarchical trees) algorithm, which is applied to the wavelet transform to obtain a good compression ratio so that an image can be stored efficiently.
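
A hedged sketch of the wavelet side of the scheme: a 2-D DWT of the region of interest, whose coefficient hierarchy is what a SPIHT coder would then encode bit-plane by bit-plane (the SPIHT encoder itself is not reproduced; the wavelet, level and threshold are illustrative).

```python
import numpy as np
import pywt

roi = np.random.rand(128, 128)                      # stand-in region of interest
coeffs = pywt.wavedec2(roi, wavelet="bior4.4", level=3)
approx, details = coeffs[0], coeffs[1:]             # LL band + (LH, HL, HH) per level

# A crude proxy for compressibility: count coefficients above a significance
# threshold, which is the kind of information SPIHT exploits in its sorting passes.
threshold = 0.1
significant = sum(int(np.sum(np.abs(band) > threshold))
                  for level in details for band in level)
print(approx.shape, significant)
```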

Keywords: compression ratio, DWT, SPIHT, DCT

Procedia PDF Downloads 349
10022 Spectrum Assignment Algorithms in Optical Networks with Protection

Authors: Qusay Alghazali, Tibor Cinkler, Abdulhalim Fayad

Abstract:

In modern optical networks, flex-grid spectrum usage is most widespread, where higher bit rate streams get larger spectrum slices while lower bit rate traffic streams get smaller spectrum slices. In current practice, under ITU-T Recommendation G.694.1, spectrum slices of 50, 75, and 100 GHz are being used with a central frequency at 193.1 THz. However, when these spectrum slices are not sufficient, multiple spectrum slices can be used, either next to one another or anywhere in the optical spectrum. In this paper, we propose an analysis of the wavelength assignment problem. We compare different algorithms for spectrum assignment with and without protection. As a reference for comparisons, we concluded that Integer Linear Programming (ILP) provides the global optimum for all cases. The most scalable algorithm is the greedy one, which yields results within reasonable time even for larger network instances. The algorithms were benchmarked using the LEMON C++ optimization library, and the simulation runs were evaluated based on the minimum number of spectrum slices assigned to lightpaths and their execution time.
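
A minimal sketch of a greedy (first-fit) spectrum assignment of the kind compared above: each demand needs a number of contiguous slices that must be free on every link of its path. The slice count, link names and demands are illustrative, not the authors' instances (the original benchmark used the LEMON C++ library).

```python
N_SLICES = 320                                   # e.g. a 4-THz band in 12.5-GHz slices

def first_fit(demands, n_slices=N_SLICES):
    """demands: list of (path_links, needed_slices); returns {demand_index: start_slice}."""
    occupied = {}                                # link -> set of used slice indices
    assignment = {}
    for idx, (links, width) in enumerate(demands):
        for start in range(n_slices - width + 1):
            block = set(range(start, start + width))
            if all(block.isdisjoint(occupied.get(link, set())) for link in links):
                for link in links:
                    occupied.setdefault(link, set()).update(block)
                assignment[idx] = start
                break
    return assignment

# Two lightpaths sharing link "A-B": the second is shifted past the first.
print(first_fit([((("A", "B"), ("B", "C")), 4), ((("A", "B"),), 8)]))
```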

Keywords: spectrum assignment, integer linear programming, greedy algorithm, international telecommunication union, library for efficient modeling and optimization in networks

Procedia PDF Downloads 169
10021 Dynamic Store Procedures in Database

Authors: Muhammet Dursun Kaya, Hasan Asil

Abstract:

In recent years, different methods have been proposed to optimize query processing in databases. Although different methods have been proposed to optimize queries, the problem is that most of these methods destroy the query execution plan after executing the query. This research attempts to solve this problem by combining methods of communicating with the database (the queries present in the programming code and the use of stored procedures) with making query processing adaptive in the database, and proposes a new approach for optimizing query processing by introducing the idea of dynamic stored procedures. This research creates dynamic stored procedures in the database according to the proposed algorithm. The method has been tested on applied software, and the results show a significant improvement in reducing query processing time as well as reducing the workload of the DBMS. Other advantages of this algorithm include: making the programming environment a single environment, eliminating the parametric limitations of stored procedures in the database, making the stored procedures in the database dynamic, etc.
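
A hedged sketch of the general idea only (not the paper's algorithm): a query that currently lives in the application code is registered as a stored procedure at run time, so subsequent calls reuse the server-side routine. The connection settings, table, column and function names below are hypothetical, and PostgreSQL with psycopg2 is assumed purely for illustration.

```python
import psycopg2

QUERY = "SELECT id, total FROM orders WHERE customer_id = $1 AND total > $2"

def register_dynamic_procedure(conn, name, query):
    # Wrap the application query in a server-side SQL function created at run time.
    ddl = (f"CREATE OR REPLACE FUNCTION {name}(arg_customer integer, arg_min numeric) "
           f"RETURNS TABLE(id integer, total numeric) AS $${query}$$ LANGUAGE sql STABLE;")
    with conn.cursor() as cur:
        cur.execute(ddl)
    conn.commit()

conn = psycopg2.connect("dbname=shop user=app")          # hypothetical connection
register_dynamic_procedure(conn, "orders_over_threshold", QUERY)
with conn.cursor() as cur:
    cur.execute("SELECT * FROM orders_over_threshold(%s, %s)", (42, 100.0))
    print(cur.fetchall())
```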

Keywords: relational database, agent, query processing, adaptable, communication with the database

Procedia PDF Downloads 372
10020 Sustainable Separation of Nicotine from Its Aqueous Solutions

Authors: Zoran Visak, Joana Lopes, Vesna Najdanovic-Visak

Abstract:

Within this study, the separation of nicotine from its aqueous solutions, using inorganic salt sodium chloride or ionic liquid (molten salt) ECOENG212® as salting-out media, was carried out. Thus, liquid-liquid equilibria of the ternary solutions (nicotine+water+NaCl) and (nicotine+water+ECOENG212®) were determined at ambient pressure, 0.1 MPa, at three temperatures. The related phase diagrams were constructed in two manners: by adding the determined cloud-points and by the chemical analysis of phases in equilibrium (tie-line data). The latter were used to calculate two important separation parameters - partition coefficients of nicotine and separation factors. The impacts of the initial compositions of the mother solutions and of temperature on the liquid-liquid phase separation and partition coefficients were analyzed and discussed. The results obtained clearly showed that both investigated salts are good salting-out media for the efficient and sustainable separation of nicotine from its solutions with water. However, when compared, sodium chloride exhibited much better separation performance than the ionic liquid.

Keywords: nicotine, liquid-liquid separation, inorganic salt, ionic liquid

Procedia PDF Downloads 311
10019 Structural Damage Detection Using Modal Data Employing Teaching Learning Based Optimization

Authors: Subhajit Das, Nirjhar Dhang

Abstract:

Structural damage detection is a challenging task in the field of structural health monitoring (SHM). Damage detection methods mainly focus on determining the location and severity of the damage. Model updating is a well-known method to locate and quantify damage. In this method, an error function is defined in terms of the difference between the signal measured from the 'experiment' and the signal obtained from the undamaged finite element model. This error function is minimised with a proper algorithm, and the finite element model is updated accordingly to match the measured response. Thus, the damage location and severity can be identified from the updated model. In this paper, an error function is defined in terms of modal data, viz. frequencies and the modal assurance criterion (MAC), where the MAC is derived from eigenvectors. This error function is minimized by the teaching-learning-based optimization (TLBO) algorithm, and the finite element model is updated accordingly to locate and quantify the damage. Damage is introduced in the model by reducing the stiffness of the structural member. The 'experimental' data are simulated by finite element modelling. The error due to experimental measurement is introduced in the synthetic 'experimental' data by adding random noise, which follows a Gaussian distribution. The efficiency and robustness of this method are explained through three examples, viz. a truss, a beam and a frame problem. The results show that the TLBO algorithm is efficient in detecting the damage location as well as the severity of damage using modal data.
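
A compact, hedged sketch of the TLBO teacher and learner phases on a generic objective; for damage detection, the objective would be the error between measured and model-predicted frequencies/MAC values, and the decision vector a set of element stiffness reduction factors (the toy objective and parameter values below are illustrative only).

```python
import numpy as np

def tlbo(objective, dim, pop_size=30, iters=200, lower=0.0, upper=1.0,
         rng=np.random.default_rng(0)):
    pop = rng.uniform(lower, upper, size=(pop_size, dim))
    scores = np.array([objective(x) for x in pop])
    for _ in range(iters):
        teacher = pop[np.argmin(scores)]
        mean = pop.mean(axis=0)
        for i in range(pop_size):
            # Teacher phase: move toward the best solution relative to the class mean.
            tf = rng.integers(1, 3)                       # teaching factor, 1 or 2
            cand = np.clip(pop[i] + rng.random(dim) * (teacher - tf * mean), lower, upper)
            cand_score = objective(cand)
            if cand_score < scores[i]:
                pop[i], scores[i] = cand, cand_score
            # Learner phase: learn from a randomly chosen classmate.
            j = rng.integers(pop_size)
            direction = pop[i] - pop[j] if scores[i] < scores[j] else pop[j] - pop[i]
            cand = np.clip(pop[i] + rng.random(dim) * direction, lower, upper)
            cand_score = objective(cand)
            if cand_score < scores[i]:
                pop[i], scores[i] = cand, cand_score
    return pop[np.argmin(scores)]

print(tlbo(lambda x: float(np.sum((x - 0.3) ** 2)), dim=6))   # toy usage
```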

Keywords: damage detection, finite element model updating, modal assurance criteria, structural health monitoring, teaching learning based optimization

Procedia PDF Downloads 215
10018 Achievement of Livable and Healthy City through the Design of Green and Blue Infrastructure: A Case Study on City of Isfahan, Iran

Authors: Reihaneh Rafiemanzelat

Abstract:

Due to rapid urbanization, cities throughout the world have faced rapid growth dominated by gray infrastructure. Designing cities based on green and blue infrastructure can therefore offer the best solution to support a healthy urban environment. This configuration, with its wide range of ecosystem services, has a positive impact on the regulation of air temperature, noise reduction, and air quality, and also creates a pleasant environment for human activities. The research mainly focuses on the concept and principles of green and blue infrastructure in the city of Isfahan at the center of Iran in order to create a livable and healthy environment. Design principles for green and blue infrastructure are classified into two different but interconnected evaluations. Healthy green infrastructure is assessed based on volume, shape, location, dispersion, and maintenance. For blue infrastructure, there are three aspects of water and the ecosystem: the contribution of water to medical health, the contribution of water to mental health, and the creation of possibilities to exercise.

Keywords: healthy cities, livability, urban landscape, green and blue infrastructure

Procedia PDF Downloads 305
10017 An Assessment of Bathymetric Changes in the Lower Usuma Reservoir, Abuja, Nigera

Authors: Rayleigh Dada Abu, Halilu Ahmad Shaba

Abstract:

Siltation is a serious problem that affects public water supply infrastructures such as dams and reservoirs. It is a major problem which threatens the performance and sustainability of dams and reservoirs. It reduces the dam capacity for flood control, potable water supply, changes water stage, reduces water quality and recreational benefits. The focus of this study is the Lower Usuma reservoir. At completion the reservoir had a gross storage capacity of 100 × 106 m3 (100 million cubic metres), a maximum operational level of 587.440 m a.s.l., with a maximum depth of 49 m and a catchment area of 241 km2 at dam site with a daily designed production capacity of 10,000 cubic metres per hour. The reservoir is 1,300 m long and feeds the treatment plant mainly by gravity. The reservoir became operational in 1986 and no survey has been conducted to determine its current storage capacity and rate of siltation. Hydrographic survey of the reservoir by integrated acoustic echo-sounding technique was conducted in November 2012 to determine the level and rate of siltation. The result obtained shows that the reservoir has lost 12.0 meters depth to siltation in 26 years of its operation; indicating 24.5% loss in installed storage capacity. The present bathymetric survey provides baseline information for future work on siltation depth and annual rates of storage capacity loss for the Lower Usuma reservoir.

Keywords: sedimentation, lower Usuma reservoir, acoustic echo sounder, bathymetric survey

Procedia PDF Downloads 515
10016 Electrical Interactions and Patterning of Bio-Polymers and Nanoparticles in Water Suspensions

Authors: N. V. Klassen, A. A. Vasin, A. M. Likhter, K. A. Voronin, A. V. Mariasevskaya, I. M. Shmit’ko

Abstract:

Regular patterning in mixtures of bio-polymers (chitosan and collagen) and nanoparticles in water suspensions has been found by means of optical microscopy. The patterning was created either by an external electrical field of moderate amplitude (200-1000 V/cm) or spontaneously. Simultaneously with the patterning, expulsion of water drops mixed with nanoparticles toward the external regions was observed. These phenomena are explained by interactions of charged bio-polymers and nanoparticles with external and internal electrical fields, as well as with the regions of decreased dielectric permittivity surrounding nano-objects in water, which possesses an anomalously high dielectric permittivity. Electrical charges of opposite signs on the nano-objects induce their mutual attraction, whereas dipole moments created around these nano-objects by the electrical fields push the particles toward the regions with lower fields. For this reason, non-homogeneity of the dielectric permittivity around nano-objects immersed in a water suspension induces mutual repulsion of the objects. The decay of this repulsion with inter-particle distance is sharper than that of the Coulomb attraction. So, at longer distances the attraction is stronger, whereas at shorter distances the repulsion prevails. At a certain distance these two forces compensate each other, creating the equilibrium state of the mixture of nano-objects with opposite charges. When the groups of positive and negative nano-objects consist of identical particles, a quasi-periodical pattern of the suspension is observed, like a mesoscopic two-dimensional super-crystal. These results can clarify the mechanisms of healing of internal organs with direct or alternating electrical fields.

Keywords: bio-polymers, chitosan, collagen, nanoparticles, Coulomb attraction, polarization repulsion, periodical patterning, electrical low frequency resonances

Procedia PDF Downloads 444