Search results for: sensor-based methods
2895 Turbine Compressor Vibration Analysis and Rotor Movement Evaluation by Shaft Center Line Method (The Case History Related to Main Turbine Compressor of an Olefin Plant in Iran Oil Industries)
Authors: Omid A. Zargar
Abstract:
Vibration monitoring of critical equipment such as main turbines and compressors plays an important role in preventive maintenance and management in large industrial plants. Traditional approaches include monitoring overall vibration data from the Bently Nevada panel and examining the time wave form (TWF) or fast Fourier transform (FFT) spectra. In addition, the shaft centerline monitoring method has developed considerably in recent years, and there are arguments both for and against it among practitioners of preventive maintenance and condition monitoring (vibration analysts). In this paper, the basic principles of turbine compressor vibration analysis and rotor movement evaluation by the shaft centerline method are discussed in detail through a case history involving the main turbine compressor of an olefin plant in the Iranian oil industry. Common mistakes that vibration analysts may make during the process are also examined in detail; these mistakes may be among the reasons why the method sometimes appears ineffective. Furthermore, recent patents and innovations in shaft position and movement evaluation are discussed.
Keywords: Shaft centerline position, attitude angle, journal bearing, sleeve bearing, tilting pad, steam turbine, main compressor, multistage compressor, condition monitoring, non-contact probe
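A hedged illustration of the FFT monitoring mentioned above: the short Python sketch below turns a sampled time wave form into a single-sided amplitude spectrum and reports the dominant peaks. The sampling rate, signal content, and units are assumptions for illustration, not values from the case history.

```python
import numpy as np

# Assumed acquisition parameters (not from the paper's case history)
fs = 2048                        # sampling rate, Hz
t = np.arange(0, 1.0, 1.0 / fs)

# Synthetic proximity-probe signal: a 1X component at 50 Hz plus a
# smaller 2X component, a pattern often associated with misalignment
twf = 10.0 * np.sin(2 * np.pi * 50 * t) + 2.0 * np.sin(2 * np.pi * 100 * t)

# Single-sided amplitude spectrum of the time wave form (TWF)
spectrum = np.fft.rfft(twf)
freqs = np.fft.rfftfreq(len(twf), d=1.0 / fs)
amplitude = 2.0 * np.abs(spectrum) / len(twf)

# Report the two dominant peaks for trend monitoring
for i in np.argsort(amplitude)[-2:][::-1]:
    print(f"{freqs[i]:6.1f} Hz -> amplitude {amplitude[i]:5.2f} (assumed units)")
```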
2894 Modern Detection and Description Methods for Natural Plants Recognition
Authors: Masoud Fathi Kazerouni, Jens Schlemper, Klaus-Dieter Kuhnert
Abstract:
The Earth is often called the green planet, a name it owes to its plants. Plants are unevenly distributed around the world, and even within a single region species composition varies. Their importance extends beyond botany: they appear in literature and mythology, hold valuable historical records, produce most of the oxygen we breathe, form the basic food staples, and regulate the water cycle, thereby influencing environment and climate. As the main component of agricultural activity, they also shape the political and economic prospects of many countries. For these reasons, the study of plants is essential in many fields, and automatic plant recognition is a novel area that can support both current research and future studies. Plants survive in different places and regions by means of adaptations, and weather conditions are among the parameters that affect their life and presence in an area. Recognizing plants under different weather conditions is therefore a new window of research, and only natural images make it possible to treat weather as a factor, yielding a generalized and useful system. To keep the system general, the distance from the camera to the plant is considered as another factor, as is the change of light intensity in the environment over the day. Adding these factors makes the invention of an accurate and reliable system a substantial challenge, so developing an efficient plant recognition system is essential. The leaf is an important plant component that can be used to implement automatic recognition systems without any human interaction. Given the nature of the images used, the characteristics of the plants were investigated, and leaves were selected as the trusted parts; four plant species were specified for classification by an accurate system. The current paper is devoted to the principal directions of the proposed methods and implemented system, the image dataset, and the results. The algorithm and classification procedure are explained in detail. The first steps, feature detection and description of visual information, are performed using the Scale Invariant Feature Transform (SIFT), HARRIS-SIFT, and FAST-SIFT methods. The accuracy of the implemented methods is computed, and, in addition to the comparison, the robustness and efficiency of the results under different conditions are investigated and explained.
Keywords: SIFT combination, feature extraction, feature detection, natural images, natural plant recognition, HARRIS-SIFT, FAST-SIFT.
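For readers who want to reproduce the detection-and-description step, a minimal OpenCV sketch follows. It assumes an OpenCV build (4.4+) in which cv2.SIFT_create is available, and the image path is a placeholder rather than a file from the paper's dataset; the FAST+SIFT pairing stands in for the paper's HARRIS-SIFT/FAST-SIFT combinations.

```python
import cv2

# Placeholder path; the paper's natural-image dataset is not public here
img = cv2.imread("leaf_sample.jpg", cv2.IMREAD_GRAYSCALE)
assert img is not None, "replace the placeholder path with a real image"

# Plain SIFT: keypoint detection and description in one call
sift = cv2.SIFT_create()
kp_sift, desc_sift = sift.detectAndCompute(img, None)

# Detector/descriptor combination in the FAST-SIFT spirit: a fast
# corner detector supplies keypoints, SIFT computes their descriptors
fast = cv2.FastFeatureDetector_create()
kp_fast = fast.detect(img, None)
kp_fast, desc_fast = sift.compute(img, kp_fast)

print(len(kp_sift), "SIFT keypoints;", len(kp_fast), "FAST-SIFT keypoints")
```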
2893 Upgraded Rough Clustering and Outlier Detection Method on Yeast Dataset by Entropy Rough K-Means Method
Authors: P. Ashok, G. M. Kadhar Nawaz
Abstract:
Rough set theory handles uncertainty and incomplete information through two approximation sets, the lower approximation and the upper approximation. In this paper, rough clustering algorithms are improved by adopting similarity-, dissimilarity–similarity-, and entropy-based initial centroid selection in three clustering algorithms, namely Entropy-based Rough K-Means (ERKM), Similarity-based Rough K-Means (SRKM), and Dissimilarity–Similarity-based Rough K-Means (DSRKM), which were developed and executed on the yeast dataset. The rough clustering algorithms are validated with the Rand and Adjusted Rand cluster validity indexes. Experimental results show that the ERKM clustering algorithm performs effectively and delivers better results than the other clustering methods. Outlier detection, an important task in data mining, concerns objects that differ markedly from the rest of the objects in the clusters. The Entropy-based Rough Outlier Factor (EROF) method is suitable for detecting outliers effectively in the yeast dataset. In the rough K-Means method, tuning the epsilon (ε) value from 0.8 to 1.08 detects outliers in the boundary region, and the RKM algorithm delivers better results when epsilon is chosen within this range. Experimental results show that the EROF method performed very well on the clustering algorithms and detects outliers effectively for all datasets; further readings confirm that the ERKM clustering method outperformed the other methods.
Keywords: Clustering, Entropy, Outlier, Rough K-Means, validity index.
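The core of rough K-Means is splitting each cluster into a lower approximation (objects certainly in the cluster) and a boundary region (objects possibly in it), controlled by a threshold such as the epsilon the abstract tunes between 0.8 and 1.08. The sketch below is a minimal single assignment step in the common Lingras-style formulation, using a distance-ratio test; the authors' exact ERKM criterion and entropy-based seeding are not reproduced.

```python
import numpy as np

def rough_assign(X, centroids, eps=1.0):
    """One rough K-Means assignment step. Returns, per cluster, the
    lower approximation and the boundary region as index lists.
    The distance-ratio test is the common Lingras-style choice; the
    paper's exact ERKM criterion may differ."""
    k = len(centroids)
    lower = [[] for _ in range(k)]
    boundary = [[] for _ in range(k)]
    for i, x in enumerate(X):
        d = np.linalg.norm(x - centroids, axis=1)
        nearest = int(np.argmin(d))
        # Other clusters almost as close as the nearest one
        close = [j for j in range(k) if j != nearest
                 and d[j] / max(d[nearest], 1e-12) <= eps]
        if close:   # ambiguous object -> boundary of all close clusters
            for j in [nearest] + close:
                boundary[j].append(i)
        else:       # unambiguous object -> lower approximation
            lower[nearest].append(i)
    return lower, boundary

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 8))                  # stand-in for yeast features
centroids = X[rng.choice(100, 4, replace=False)]
lower, boundary = rough_assign(X, centroids, eps=1.08)
print([len(l) for l in lower], [len(b) for b in boundary])
```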
2892 Optimization of Lead Bioremediation by Marine Halomonas sp. ES015 Using Statistical Experimental Methods
Authors: Aliaa M. El-Borai, Ehab A. Beltagy, Eman E. Gadallah, Samy A. ElAssar
Abstract:
Bioremediation technology is now used for treatment in place of traditional metal removal methods. A strain isolated from Marsa Alam, Red Sea, Egypt showed high resistance to high lead concentrations and was identified by 16S rRNA gene sequencing as Halomonas sp. ES015. Medium optimization was carried out using a Plackett-Burman design; the most significant factors were yeast extract, casamino acid, and inoculum size. The optimized medium obtained from the statistical design raised the removal efficiency from 84% to 99% at an initial lead concentration of 250 ppm. Moreover, a Box-Behnken experimental design was applied to study the relationship between yeast extract concentration, casamino acid concentration, and inoculum size, and the resulting optimized medium increased removal efficiency to 97% at an initial lead concentration of 500 ppm. Halomonas sp. ES015 cells immobilized on sponge cubes, grown in the optimized medium in a loop bioremediation column, showed relatively constant lead removal efficiency when reused for six successive cycles over the time interval studied; removal efficiency was also unaffected by changes in flow rate. Overall, the results point to the feasibility of lead bioremediation by free or immobilized cells of Halomonas sp. ES015, in batch cultures as well as in semicontinuous cultures using column technology.
Keywords: Bioremediation, lead, Box–Behnken, Halomonas sp. ES015, loop bioremediation, Plackett-Burman.
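A hedged sketch of the three-factor Box-Behnken workflow follows: it builds the standard 12-run-plus-center-points coded design and fits a quadratic response surface by least squares (interaction terms omitted for brevity). The responses are invented placeholders, not the study's measurements.

```python
import itertools
import numpy as np

# Standard 3-factor Box-Behnken design in coded units (-1, 0, +1):
# all +/-1 combinations for each factor pair with the third factor
# held at 0, plus replicated center points (15 runs in total).
runs = []
for a, b in itertools.combinations(range(3), 2):
    for sa, sb in itertools.product((-1, 1), repeat=2):
        row = [0, 0, 0]
        row[a], row[b] = sa, sb
        runs.append(row)
runs += [[0, 0, 0]] * 3
X = np.array(runs, dtype=float)

# Hypothetical responses (lead removal %, NOT the paper's data)
rng = np.random.default_rng(0)
y = 90 + 3 * X[:, 0] + 2 * X[:, 1] - 4 * X[:, 0] ** 2 \
    + rng.normal(0, 0.5, len(X))

# Fit y = b0 + sum(bi*xi) + sum(bii*xi^2) by least squares
features = np.column_stack([np.ones(len(X)), X, X ** 2])
coef, *_ = np.linalg.lstsq(features, y, rcond=None)
print("fitted coefficients:", np.round(coef, 2))
```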
2891 A Robust and Efficient Segmentation Method Applied for Cardiac Left Ventricle with Abnormal Shapes
Authors: Peifei Zhu, Zisheng Li, Yasuki Kakishita, Mayumi Suzuki, Tomoaki Chono
Abstract:
Segmentation of the left ventricle (LV) in cardiac ultrasound images provides a quantitative functional analysis of the heart for diagnosing disease. The Active Shape Model (ASM) is widely used for LV segmentation, but it suffers from the drawback that the initialization of the shape model may not be sufficiently close to the target, especially for the abnormal shapes that occur in disease. In this work, a two-step framework is developed to achieve fast and efficient LV segmentation. First, a robust and efficient Hough forest detector localizes cardiac feature points, which are used to predict the initial fit of the LV shape model. Second, ASM is applied to further fit the shape model to the cardiac ultrasound image. With this robust initialization, ASM achieves more accurate segmentation. The performance of the proposed method is evaluated on a dataset of 810 cardiac ultrasound images, mostly of abnormal shapes, and compared with several combinations of ASM and existing initialization methods. Our experiments show that the accuracy of the proposed feature point detection for initialization is 40% higher than that of the existing methods. Moreover, the proposed method significantly reduces the number of ASM fitting loops and thus speeds up the whole segmentation process. It therefore achieves more accurate and efficient segmentation and is applicable to unusual heart shapes associated with cardiac diseases, such as left atrial enlargement.
Keywords: Hough forest, active shape model, segmentation, cardiac left ventricle.
2890 Effect of Three Drying Methods on Antioxidant Efficiency and Vitamin C Content of Moringa oleifera Leaf Extract
Authors: Kenia Martínez, Geniel Talavera, Juan Alonso
Abstract:
Moringa oleifera is a plant containing many nutrients that are mostly concentrated within the leaves. Commonly, the separation process of these nutrients involves solid-liquid extraction followed by evaporation and drying to obtain a concentrated extract, which is rich in proteins, vitamins, carbohydrates, and other essential nutrients that can be used in the food industry. In this work, three drying methods were used, which involved very different temperature and pressure conditions, to evaluate the effect of each method on the vitamin C content and the antioxidant efficiency of the extracts. Solid-liquid extractions of Moringa leaf (LE) were carried out by employing an ethanol solution (35% v/v) at 50 °C for 2 hours. The resulting extracts were then dried i) in a convective oven (CO) at 100 °C and at an atmospheric pressure of 750 mbar for 8 hours, ii) in a vacuum evaporator (VE) at 50 °C and at 300 mbar for 2 hours, and iii) in a freeze-drier (FD) at -40 °C and at 0.050 mbar for 36 hours. The antioxidant capacity (EC50, mg solids/g DPPH) of the dry solids was calculated by the free radical inhibition method employing DPPH˙ at 517 nm, resulting in a value of 2902.5 ± 14.8 for LE, 3433.1 ± 85.2 for FD, 3980.1 ± 37.2 for VE, and 8123.5 ± 263.3 for CO. The calculated antioxidant efficiency (AE, g DPPH/(mg solids·min)) was 2.920 × 10-5 for LE, 2.884 × 10-5 for FD, 2.512 × 10-5 for VE, and 1.009 × 10-5 for CO. Further, the content of vitamin C (mg/L) determined by HPLC was 59.0 ± 0.3 for LE, 49.7 ± 0.6 for FD, 45.0 ± 0.4 for VE, and 23.6 ± 0.7 for CO. The results indicate that the convective drying preserves vitamin C and antioxidant efficiency to 40% and 34% of the initial value, respectively, while vacuum drying to 76% and 86%, and freeze-drying to 84% and 98%, respectively.
Keywords: Antioxidant efficiency, convective drying, freeze-drying, Moringa oleifera, vacuum drying, vitamin C content.
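The retention percentages quoted above follow directly from the measured values; as a quick check, this short script reproduces them from the vitamin C and antioxidant-efficiency numbers in the abstract.

```python
# Values taken from the abstract: vitamin C (mg/L) and antioxidant
# efficiency AE (g DPPH/(mg solids*min)) for the liquid extract (LE)
# and the three dried products.
vit_c = {"LE": 59.0, "FD": 49.7, "VE": 45.0, "CO": 23.6}
ae = {"LE": 2.920e-5, "FD": 2.884e-5, "VE": 2.512e-5, "CO": 1.009e-5}

for m in ("FD", "VE", "CO"):
    vc_ret = 100 * vit_c[m] / vit_c["LE"]
    ae_ret = 100 * ae[m] / ae["LE"]
    print(f"{m}: vitamin C retained {vc_ret:.0f}%, AE retained {ae_ret:.0f}%")
# Output matches the abstract to within rounding:
# FD 84%/99%, VE 76%/86%, CO 40%/35%
```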
2889 An Embedded System for Artificial Intelligence Applications
Authors: Ioannis P. Panagopoulos, Christos C. Pavlatos, George K. Papakonstantinou
Abstract:
Conventional approaches to implementing logic programming applications on embedded systems are purely software-based. Consequently, a compiler is needed to transform the initial declarative logic program into an equivalent procedural one to be programmed on the microprocessor. This approach increases the complexity of the final implementation and reduces the overall system's performance. Conversely, hardware implementations capable of supporting only logic programs prevent their use in applications where logic programs must be intertwined with traditional procedural ones. We exploit HW/SW codesign methods to present a microprocessor capable of supporting hybrid applications that use both programming approaches. Taking advantage of the close relationship between attribute grammar (AG) evaluation and knowledge engineering methods, we present a programmable hardware parser that performs logic derivations, and combine it with an extension of a conventional RISC microprocessor that performs the unification process to report the success or failure of those derivations. The extended RISC microprocessor remains capable of executing conventional procedural programs, so hybrid applications can be implemented. The presented implementation is programmable, supports the execution of hybrid applications, increases the performance of logic derivations (experimental analysis yields an approximate 1000% increase in performance), and reduces the complexity of the final implemented code. The proposed hardware design is supported by a proposed extended C language called C-AG.
Keywords: Attribute Grammars, Logic Programming, RISC microprocessor.
2888 Human Trafficking: The Kosovar Perspective of Fighting the Phenomena through Police and Civil Society Cooperation
Authors: Samedin Mehmeti
Abstract:
The rationale behind this study is to consider combating and preventing trafficking in human beings from a multidisciplinary perspective that involves many layers of society. Trafficking in human beings is an abhorrent phenomenon that severely affects victims and their families, both personally and materially, sometimes causing irreversible damage. In countries with weak economic development and an extremely young and dynamic population, such as Kosovo, its longer-term effects, if not properly prevented and controlled, can cause tremendous damage to society. Given that complete eradication of the phenomenon is almost impossible, efforts should concentrate at least on prevention and control. Treating trafficking in human beings with traditional police tactics, methods, and procedures cannot bring satisfactory results; a multidisciplinary approach is an irreplaceable requirement, in other words, a combination of authentic and functional proactive and reactive methods, techniques, and tactics. Police must of course exercise their role in preventing and combating trafficking in human beings, a role sanctioned by law; however, their contribution cannot be considered complete unless all segments of society are included in these efforts. Naturally, civil society should have an important share in these collaborative and interactive efforts, especially in preventive activities such as raising awareness of trafficking risks and damages, proactive engagement in drafting appropriate legislation and strategies, monitoring law enforcement, and direct or indirect involvement in protective and supporting activities that benefit the victims of trafficking.
Keywords: Civil society, cooperation, police, trafficking in human beings.
2887 Stochastic Optimization of a Vendor-Managed Inventory Problem in a Two-Echelon Supply Chain
Authors: Bita Payami-Shabestari, Dariush Eslami
Abstract:
The purpose of this paper is to develop a multi-product economic production quantity (EPQ) model under a vendor-managed inventory policy with restrictions including limited warehouse space, budget, number of orders, average shortage time, and maximum permissible shortage. Since costs cannot be predicted with certainty, the data are assumed to behave under an uncertain environment. The problem is first formulated as a bi-objective multi-product EPQ model and then solved with three multi-objective decision-making (MODM) methods. The three methods are compared on the optimal values of the two objective functions and on central processing unit (CPU) time, using statistical analysis and multi-attribute decision-making (MADM). The results demonstrate that the augmented ε-constraint method outperforms global criteria and goal programming in terms of the optimal values of the two objective functions and CPU time. A sensitivity analysis illustrates the effect of parameter variations on the optimal solution. The contribution of this research is the use of random cost data in developing a multi-product EPQ model under a vendor-managed inventory policy with several constraints.
Keywords: Economic production quantity, random cost, supply chain management, vendor-managed inventory.
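To make the method comparison concrete, the sketch below illustrates the augmented ε-constraint idea on a toy bi-objective problem using SciPy; the objectives, constraints, and coefficients are placeholders in the spirit of an EPQ model, not the paper's formulation.

```python
import numpy as np
from scipy.optimize import minimize

# Toy bi-objective in the spirit of a two-product EPQ model:
# f1 = total cost (minimize), f2 = a shortage measure (minimize).
def f1(q):
    return 100 / q[0] + 2 * q[0] + 150 / q[1] + 3 * q[1]

def f2(q):
    return 50 / (q[0] + q[1])

def aug_eps(eps, delta=1e-3):
    """Augmented epsilon-constraint: minimize f1 subject to f2 <= eps,
    with a small slack reward so weakly efficient points are avoided."""
    cons = [{"type": "ineq", "fun": lambda q: eps - f2(q)}]
    res = minimize(lambda q: f1(q) - delta * (eps - f2(q)),
                   x0=np.array([5.0, 5.0]),
                   bounds=[(1.0, 50.0)] * 2,
                   constraints=cons)
    return res.x

for eps in (5.0, 3.0, 2.0):          # sweep eps to trace the Pareto front
    q = aug_eps(eps)
    print(f"eps={eps}: q={np.round(q, 2)}, f1={f1(q):.1f}, f2={f2(q):.2f}")
```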
2886 Conceptual Synthesis of Multi-Source Renewable Energy Based Microgrid
Authors: Bakari M. M. Mwinyiwiwa, Mighanda J. Manyahi, Nicodemu Gregory, Alex L. Kyaruzi
Abstract:
Microgrids are increasingly being considered to provide electricity for the expanding energy demand in the grid distribution network and in grid-isolated areas. However, the technical challenges associated with their operation and control are immense: managing dynamic power balances, power flow, and network voltage profiles poses unique problems, and stability during both grid-connected and islanded modes is considered the major operational challenge. Traditional control methods are based on the assumption of linear loads. Concepts such as PQ control and voltage and frequency control through decoupled PQ are very useful for linear loads, but they fall short for nonlinear loads. This deficiency of traditional microgrid control methods suggests that more research into the control of microgrids is needed. This research introduces the dq technique into decoupled PQ for dynamic load demand control in an inverter-interfaced DG system operating as an isolated LV microgrid. An exact mathematical formulation of decoupled PQ in the dq frame is expected to accommodate all variations of the line parameters (resistance and inductance), to relinquish the forced relationship between DG variables such as power, voltage, and frequency in LV microgrids, and to allow individual control of frequency and line voltages. This concept is expected to achieve accurate control and to improve microgrid stability and power quality under all load conditions.
Keywords: Decoupled PQ, microgrid, multisource, renewable energy, dq control.
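A minimal numeric sketch of the dq concept referenced above: the Park transform maps three-phase quantities into a rotating dq frame, where active and reactive power decouple into the d- and q-axis current components. The amplitudes and the amplitude-invariant convention are illustrative assumptions.

```python
import numpy as np

def park(a, b, c, theta):
    """Amplitude-invariant abc -> dq0 transform at angle theta."""
    T = (2.0 / 3.0) * np.array([
        [np.cos(theta), np.cos(theta - 2*np.pi/3), np.cos(theta + 2*np.pi/3)],
        [-np.sin(theta), -np.sin(theta - 2*np.pi/3), -np.sin(theta + 2*np.pi/3)],
        [0.5, 0.5, 0.5],
    ])
    return T @ np.array([a, b, c])

theta = 0.3                       # synchronous reference angle, rad
V, I, phi = 230.0, 10.0, 0.2      # assumed peak volts, peak amps, lag angle

v = [V * np.cos(theta + k) for k in (0.0, -2*np.pi/3, 2*np.pi/3)]
i = [I * np.cos(theta - phi + k) for k in (0.0, -2*np.pi/3, 2*np.pi/3)]

vd, vq, _ = park(*v, theta)
id_, iq, _ = park(*i, theta)

# With the voltage aligned to the d-axis (vq ~ 0), P tracks id and
# Q tracks iq, which is what permits decoupled PQ control.
P = 1.5 * (vd * id_ + vq * iq)
Q = 1.5 * (vq * id_ - vd * iq)
print(f"vd={vd:.1f}, vq={vq:.1f}, P={P:.0f} W, Q={Q:.0f} var")
```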
2885 Investigation of Rehabilitation Effects on Fire Damaged High Strength Concrete Beams
Authors: Eun Mi Ryu, Ah Young An, Ji Yeon Kang, Yeong Soo Shin, Hee Sun Kim
Abstract:
When high strength reinforced concrete is exposed to the high temperatures of a fire, deteriorations such as loss of strength and elastic modulus, cracking, and spalling of the concrete occur. It is therefore important to understand the risk to structural safety in buildings by studying the structural behavior and rehabilitation of fire-damaged high strength concrete structures. This paper investigates the rehabilitation effect on fire-damaged high strength concrete beams using experimental and analytical methods. In the experiments, flexural specimens of high strength concrete are exposed to high temperatures according to the ISO 834 standard time-temperature curve. Four-point loading tests show that the maximum loads of the rehabilitated beams are similar to or higher than those of the non-fire-damaged RC beam. In addition, structural analyses are performed using ABAQUS 6.10-3 under the same conditions as the experiments to provide accurate predictions of the structural and mechanical behavior of rehabilitated RC beams; the parameters are the fire cover thickness and the strength of the repair mortar. The analytical results show good rehabilitation effects when the predictions from the rehabilitated models are compared with the structural behavior of the non-damaged RC beams. In this study, fire-damaged high strength concrete beams are rehabilitated using polymeric cement mortar. The predictions from the finite element (FE) models agree well with the experimental results, and the modeling approach can be used to investigate the applicability of various rehabilitation methods in further studies.
Keywords: Fire, high strength concrete, rehabilitation, reinforced concrete beam.
2884 Qualitative and Quantitative Analyses of Phytochemicals and Antioxidant Activity of Ficus sagittifolia (Warburg Ex Mildbread and Burret)
Authors: Taiwo O. Margaret, Olaoluwa O. Olaoluwa
Abstract:
The Moraceae family has immense phytochemical constituents and significant pharmacological properties, and hence great medicinal value. The aim of this study was to screen and quantify the phytochemicals, and to measure the antioxidant activities, of the leaf and stem bark extracts and fractions (crude ethanol extracts, n-hexane, ethyl acetate, and aqueous ethanol fractions) of Ficus sagittifolia. Leaf and stem bark of F. sagittifolia were extracted by maceration in ethanol to give the crude ethanol extracts, which were partitioned with n-hexane and ethyl acetate to give the respective fractions. All extracts were screened for phytochemicals using standard methods. The total phenolic, flavonoid, tannin, and saponin contents and the antioxidant activity were determined spectrophotometrically, while the alkaloid content was evaluated titrimetrically. Total phenolics were estimated relative to gallic acid, whereas total flavonoids, tannins, and saponins were estimated relative to quercetin, tannic acid, and saponin, respectively. The 2,2-diphenyl-1-picrylhydrazyl (DPPH•) radical and phosphomolybdate methods were used to evaluate the antioxidant activities of the leaf and stem bark of F. sagittifolia. Phytochemical screening revealed the presence of flavonoids, saponins, terpenoids/steroids, and alkaloids in both leaf and stem bark extracts. Phenolic content was most abundant in the leaf ethanol crude extract, at 3.53 ± 0.03 mg/g gallic acid equivalent. Total flavonoid and tannin contents were highest in the stem bark aqueous ethanol fraction, at 3.41 ± 0.08 mg/g quercetin equivalent and 1.52 ± 0.05 mg/g tannic acid equivalent, respectively. The hexane leaf fraction had the highest saponin and alkaloid contents, at 5.10 ± 0.48 mg/g saponin equivalent and 0.171 ± 0.39 g of alkaloids. The leaf aqueous ethanol fraction showed the highest antioxidant activity by the DPPH method (IC50 of 63.092 µg/mL), and the stem ethanol crude extract by the phosphomolybdate method (227.43 ± 0.78 mg/g ascorbic acid equivalent), while the stem hexane fraction was least active by both methods (313.32 µg/mL; 16.21 ± 1.30 mg/g ascorbic acid equivalent). The presence of these phytochemicals in the leaf and stem bark of F. sagittifolia accounts for their therapeutic importance and their ability to scavenge free radicals in living systems.
Keywords: Antioxidant activity, Ficus sagittifolia, Moraceae, phytochemicals.
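Quantification against a standard, e.g., total phenolics as gallic acid equivalents, rests on a simple linear calibration: absorbances of standards are fitted to a line and sample absorbances are interpolated back to concentration. The sketch below shows that calculation with hypothetical readings, not the study's raw data.

```python
import numpy as np

# Hypothetical gallic acid calibration standards (mg/mL vs absorbance)
conc = np.array([0.00, 0.02, 0.04, 0.06, 0.08, 0.10])
absb = np.array([0.00, 0.11, 0.23, 0.34, 0.46, 0.57])

# Least-squares calibration line A = m*c + b
m, b = np.polyfit(conc, absb, 1)

# Interpolate a sample's absorbance back to concentration, then express
# it as mg gallic acid equivalent (GAE) per g of dry extract.
sample_abs = 0.29
dilution, extract_mass_g, volume_ml = 10.0, 0.5, 25.0
c_sample = (sample_abs - b) / m                 # mg/mL in the cuvette
gae_mg_per_g = c_sample * dilution * volume_ml / extract_mass_g
print(f"slope = {m:.3f}, GAE = {gae_mg_per_g:.2f} mg/g")
```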
2883 Considerations for Effectively Using Probability of Failure as a Means of Slope Design Appraisal for Homogeneous and Heterogeneous Rock Masses
Authors: Neil Bar, Andrew Heweston
Abstract:
Probability of failure (PF) often appears alongside factor of safety (FS) in design acceptance criteria for rock slope, underground excavation, and open pit mine designs. However, the design acceptance criteria generally provide no guidance on how PF should be calculated for homogeneous and heterogeneous rock masses, or on what qualifies as a 'reasonable' PF assessment for a given slope design. Observational and kinematic methods were widely used in the 1990s until advances in computing permitted the routine use of numerical modelling. In the 2000s and early 2010s, PF in numerical models was generally calculated using the point estimate method; more recently, some limit equilibrium analysis software offers statistical parameter inputs along with Monte-Carlo or Latin-Hypercube sampling to calculate PF automatically. Factors including rock type and density, weathering and alteration, intact rock strength, rock mass quality and shear strength, the location and orientation of geologic structure, the shear strength of geologic structure, and groundwater pore pressure influence the stability of rock slopes. Significant engineering and geological judgment, interpretation, and data interpolation are usually applied in determining these factors and amalgamating them into a geotechnical model that can then be analysed. Most factors are estimated 'approximately', or with allowances for some variability, rather than 'exactly'. In numerical modelling, some of these factors are then treated deterministically (i.e., as exact values), while others have probabilistic inputs based on the user's discretion and understanding of the problem being analysed. This paper discusses the importance of understanding the key aspects of slope design for homogeneous and heterogeneous rock masses and how they can be translated into reasonable PF assessments where the data permit. A case study from a large open pit gold mine in a complex geological setting in Western Australia is presented to illustrate how PF can be calculated with different methods and yield markedly different results. Ultimately, sound engineering judgement and logic are often required to decipher the true meaning and significance (if any) of some PF results.
Keywords: Probability of failure, point estimate method, Monte-Carlo simulations, sensitivity analysis, slope stability.
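As a hedged illustration of PF by Monte-Carlo sampling (as opposed to the point estimate method), the sketch below samples shear-strength parameters in a planar-failure factor-of-safety model. The geometry and distributions are invented, not taken from the Western Australian case study.

```python
import numpy as np

rng = np.random.default_rng(42)
n = 100_000

# Illustrative planar-failure slope model (not the case-study values):
# FS = (c*A + W*cos(psi)*tan(phi)) / (W*sin(psi))
W, A = 5000.0, 60.0               # block weight (kN), sliding plane area (m^2)
psi = np.radians(35.0)            # failure plane dip

# Probabilistic strength inputs: cohesion (kPa) and friction angle (deg)
c = rng.normal(25.0, 5.0, n)
phi = np.radians(rng.normal(30.0, 3.0, n))

fs = (c * A + W * np.cos(psi) * np.tan(phi)) / (W * np.sin(psi))
pf = np.mean(fs < 1.0)
print(f"mean FS = {fs.mean():.2f}, PF = {pf:.4f}")
# The point estimate method would instead evaluate FS at the 2^k
# combinations of (mean +/- sigma) for the k random variables.
```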
2882 Seismic Vulnerability Assessment of Masonry Buildings in Seismic Prone Regions: The Case of Annaba City, Algeria
Authors: Allaeddine Athmani, Abdelhacine Gouasmia, Tiago Ferreira, Romeu Vicente
Abstract:
Seismic vulnerability assessment of masonry buildings is a fundamental issue even in regions of moderate to low seismic hazard. This is all the more important when dealing with old structures such as those in Annaba city (Algeria), most of which date back to the French colonial era beginning in 1830. This category of buildings is at high risk due to their advanced state of degradation, heterogeneous materials, and intrusive modifications to structural and non-structural elements. Furthermore, they usually shelter a dense population that is exposed to such risk. In order to undertake suitable seismic risk mitigation strategies and reinforcement for such structures, it is essential to estimate their seismic resistance capacity at a large scale. In this sense, two seismic vulnerability index and damage estimation methods were adapted and applied to a pilot-scale building area in the moderate-seismic-hazard region of Annaba city: the first based on the EMS-98 building typologies, and the second derived from the Italian GNDT approach. To perform this task, the authors took advantage of an existing data survey previously performed for other purposes. The results obtained from the two methods were integrated and compared using a geographic information system (GIS) tool, with the ultimate goal of supporting the city council of Annaba in implementing risk mitigation and emergency planning strategies.
Keywords: Annaba city, EMS98 concept, GNDT method, old city center, seismic vulnerability index, unreinforced masonry buildings.
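For orientation only: vulnerability-index approaches of this family typically convert a vulnerability index V and a macroseismic intensity I into a mean damage grade through a closed form such as the one sketched below. The coefficients are commonly published literature values, not those calibrated for Annaba.

```python
import numpy as np

def mean_damage_grade(I, V, Q=2.3):
    """Mean damage grade (0-5) in the macroseismic method:
    mu_D = 2.5 * (1 + tanh((I + 6.25*V - 13.1) / Q)).
    Coefficients are common literature values, not Annaba-specific."""
    return 2.5 * (1.0 + np.tanh((I + 6.25 * V - 13.1) / Q))

# Example: old unreinforced masonry (higher V) under intensity VIII
for V in (0.74, 0.85):
    print(f"V = {V}: mu_D = {mean_damage_grade(8.0, V):.2f}")
```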
2881 An Approach to Image Extraction and Accurate Skin Detection from Web Pages
Authors: Moheb R. Girgis, Tarek M. Mahmoud, Tarek Abd-El-Hafeez
Abstract:
This paper proposes a system to extract images from web pages and then detect the skin color regions of those images. As part of the proposed system, we used the BandObject control to build a toolbar named 'Filter Tool Bar (FTB)' by modifying the Pavel Zolnikov implementation; the Yahoo! SDK API provided by the Yahoo! Team, which also supports image search, proved very useful. We introduce three new methods for extracting images from web pages (after loading the web page using the proposed FTB, before loading the web page physically from the localhost, and before loading the web page from any server). These methods overcome the drawback of the regular-expression method for extracting images suggested by Ilan Assayag. The second part of the proposed system concerns the detection of skin color regions in the extracted images. We studied two well-known skin color detection techniques: the first based on the RGB color space and the second based on the YUV and YIQ color spaces. We modified the second technique, using the saturation parameter, to overcome its failure to detect skin in images with complex backgrounds and thereby obtain accurate skin detection results. We evaluate the efficiency of the proposed system in extracting images before and after loading the web page from localhost or any server, in terms of the number of extracted images, and finally compare the two skin detection techniques in terms of the number of pixels detected.
Keywords: Browser Helper Object, Color spaces, Image and URL extraction, Skin detection, Web Browser events.
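The RGB-space technique compared in the abstract is usually a set of explicit channel rules; a widely used rule set is sketched below as a hedged example. The authors' exact thresholds, and their saturation-based modification of the YUV/YIQ technique, may differ.

```python
import numpy as np

def rgb_skin_mask(img):
    """Classic uniform-daylight RGB skin rule (Peer et al. style).
    img is an HxWx3 uint8 array in RGB order; returns a boolean mask."""
    r = img[..., 0].astype(int)
    g = img[..., 1].astype(int)
    b = img[..., 2].astype(int)
    spread = img.max(axis=-1).astype(int) - img.min(axis=-1)
    return ((r > 95) & (g > 40) & (b > 20) & (spread > 15)
            & (np.abs(r - g) > 15) & (r > g) & (r > b))

# Count skin-classified pixels in a small synthetic skin-toned patch
patch = np.full((4, 4, 3), (180, 120, 90), dtype=np.uint8)
print(rgb_skin_mask(patch).sum(), "of", patch[..., 0].size, "pixels")
```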
2880 Arriving at an Optimum Value of Tolerance Factor for Compressing Medical Images
Authors: Sumathi Poobal, G. Ravindran
Abstract:
Medical imaging takes advantage of digital technology in imaging and teleradiology. Teleradiology systems acquire, store, and transmit large amounts of data, and a major technology that may help solve the problems associated with massive data storage and transfer capacity is data compression and decompression. Many image compression methods are available, classified as lossless or lossy; in lossy methods the decompressed image contains some distortion. Fractal image compression (FIC) is a lossy method in which an image is coded as a set of contractive transformations in a complete metric space, and that set of contractive transformations is guaranteed to produce an approximation to the original image. In this paper, FIC is achieved by PIFS using quadtree partitioning, applied to different modalities: ultrasound, CT scan, angiogram, X-ray, and mammograms. Approximately twenty images of each modality are considered and the average compression ratio and PSNR values are computed. In this fractal encoding, the tolerance factor Tmax is varied from 1 to 10 while the other standard parameters are kept constant. For all modalities the compression ratio and peak signal-to-noise ratio (PSNR) are computed and studied, with the quality of the decompressed image assessed by its PSNR value. The results show that the compression ratio increases with the tolerance factor and that mammograms have the highest compression ratio. Owing to the properties of fractal compression, image quality is not degraded up to an optimum tolerance factor of Tmax = 8.
Keywords: Fractal image compression, IFS, PIFS, PSNR, quadtree partitioning.
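The encoder's inner loop, in which each range block searches the domain pool for an affine match whose error falls under the tolerance, can be sketched as follows. This is a bare-bones PIFS matching step with fixed 8x8 ranges rather than the paper's full quadtree encoder, and the tolerance semantics are an assumption.

```python
import numpy as np

def best_pifs_match(range_block, domains, t_max=8.0):
    """For one 8x8 range block, return (domain index, scale s, offset o,
    rms error) of the best affine match, or None if no domain block
    meets the tolerance t_max (RMS-error semantics assumed here)."""
    r = range_block.ravel().astype(float)
    best = None
    for idx, dom in enumerate(domains):
        # Spatially contract the 16x16 domain to 8x8 by 2x2 averaging
        d = dom.reshape(8, 2, 8, 2).mean(axis=(1, 3)).ravel()
        # Least-squares luminance transform r ~ s*d + o
        s, o = np.polyfit(d, r, 1)
        err = np.sqrt(np.mean((s * d + o - r) ** 2))
        if err <= t_max and (best is None or err < best[3]):
            best = (idx, s, o, err)
    return best

rng = np.random.default_rng(1)
domains = [rng.integers(0, 256, (16, 16)) for _ in range(32)]
# A range block built from domain 3 (contract, scale 0.5, offset 20)
block = 0.5 * domains[3].reshape(8, 2, 8, 2).mean(axis=(1, 3)) + 20.0
print(best_pifs_match(block, domains, t_max=8.0))
```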
2879 A Ground Structure Method to Minimize the Total Installed Cost of Steel Frame Structures
Authors: Filippo Ranalli, Forest Flager, Martin Fischer
Abstract:
This paper presents a ground structure method to optimize the topology and discrete member sizing of steel frame structures in order to minimize total installed cost, including material, fabrication, and erection components. The proposed method improves upon existing cost-based ground structure methods by incorporating constructability considerations as well as satisfying both strength and serviceability constraints. The architecture of the method is a bi-level Multidisciplinary Feasible (MDF) architecture in which the discrete member sizing optimization is nested within the topology optimization process. For each structural topology generated, the sizing optimization seeks a set of discrete member sizes that yields the lowest total installed cost while satisfying strength (member utilization) and serviceability (node deflection and story drift) criteria. To assess cost accurately, the connection details for the structure are generated automatically using site-specific cost information obtained directly from fabricators and erectors. Member continuity rules are also applied at each node in the structure to improve constructability. The proposed optimization method is benchmarked against conventional weight-based ground structure optimization methods, yielding average cost savings of up to 30% with comparable computational efficiency.
Keywords: Cost-based structural optimization, cost-based topology and sizing optimization, steel frame ground structure optimization, multidisciplinary optimization of steel structures.
2878 Using Seismic Base Isolation Systems in High-Rise Hospital Buildings and a Hybrid Proposal
Authors: E. Bakkaloğlu, N. Torunbalcı
Abstract:
Earthquakes are inevitable natural disasters in Turkey, so buildings must be prepared for this natural hazard. Earthquake resistance is essential in hospital buildings in particular, because hospitals are among the first places people go after an earthquake. Although hospital buildings are better suited to horizontal architecture, excessive urbanization, the difficulty of obtaining suitably sized land, and rising land values make it necessary to construct and expand multi-story hospital buildings. In Turkey, a general directive makes the use of seismic isolators obligatory in public hospitals that are located in the first-degree earthquake zone and have more than 100 beds. As a result of this decision, it may sometimes be necessary to construct seismically isolated multi-story hospital buildings in cities facing these constraints. Although seismic isolators are widely used in Japan, few multi-story buildings in Turkey use them. As is well known, base isolation systems are the most effective methods of earthquake resistance; however, as the number of floors increases, the center of gravity of a multi-story building moves away from the base, increasing the overturning effect and limiting the use of these systems. In this context, the study investigates the structural systems of multi-story buildings built with seismic isolation methods around the world, and proposes a working principle for spreading the use of seismic isolators in multi-story hospital buildings. The results will guide architects in the architectural design of multi-story hospital buildings, and engineers in structural system design.
Keywords: Earthquake, energy absorbing systems, hospital, seismic isolation systems.
2877 Time Series Forecasting Using Various Deep Learning Models
Authors: Jimeng Shi, Mahek Jain, Giri Narasimhan
Abstract:
Time Series Forecasting (TSF) predicts target variables at a future time point based on learning from previous time points. To keep the problem tractable, learning methods use data from a fixed-length window in the past as an explicit input. In this paper, we study how the performance of predictive models changes as a function of different look-back window sizes and different amounts of time to predict into the future. We also consider the recent attention-based Transformer models, which have had good success in the image processing and natural language processing domains. In all, we compare four deep learning methods (Recurrent Neural Network (RNN), Long Short-Term Memory (LSTM), Gated Recurrent Units (GRU), and Transformer) along with a baseline method. The dataset we used is the hourly Beijing Air Quality Dataset from the University of California, Irvine (UCI) repository, a multivariate time series of many factors measured on an hourly basis over a period of 5 years (2010-14). For each model, we also report the relationship between performance and both the look-back window size and the number of predicted time points into the future. Our experiments suggest that Transformer models perform best, with the lowest Mean Absolute Errors (MAE = 14.599, 23.273) and Root Mean Square Errors (RMSE = 23.573, 38.131) for most of our single-step and multi-step predictions. The best look-back window size for predicting 1 hour into the future appears to be one day, while 2 or 4 days perform best for predicting 3 hours into the future.
Keywords: Air quality prediction, deep learning algorithms, time series forecasting, look-back window.
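The fixed-length look-back window described above is implemented by slicing the series into (window, horizon) pairs; a minimal NumPy sketch follows, with a random stand-in for the Beijing data.

```python
import numpy as np

def make_windows(series, look_back, horizon):
    """Turn a (T, F) multivariate series into supervised pairs:
    X has shape (N, look_back, F); y is feature 0 at t + horizon."""
    X, y = [], []
    for t in range(len(series) - look_back - horizon + 1):
        X.append(series[t:t + look_back])
        y.append(series[t + look_back + horizon - 1, 0])
    return np.stack(X), np.array(y)

hourly = np.random.default_rng(0).normal(size=(1000, 5))  # stand-in data
X, y = make_windows(hourly, look_back=24, horizon=1)      # 1-day window
print(X.shape, y.shape)   # (976, 24, 5) (976,)
```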
2876 Environmental and Technical Modeling of Industrial Solid Waste Management Using Analytical Network Process; A Case Study: Gilan-IRAN
Authors: D. Nouri, M.R. Sabour, M. Ghanbarzadeh Lak
Abstract:
Proper management of residues from industrial activities is one of the serious challenges facing industrial societies because of their potential hazards to the environment. Common disposal methods for industrial solid wastes (ISWs) encompass various combinations of management options, i.e., recycling, incineration, composting, and sanitary landfilling. Indeed, the procedure used to evaluate and nominate the best practical methods should rest on environmental, technical, economical, and social assessments. In this paper an environmental-technical assessment model is developed using the analytical network process (ANP) to facilitate decision making for the ISWs generated in Gilan province, Iran. Using the results of surveys of industrial units located in Gilan, the various groups of solid wastes in the research area were characterized, and four different ISW management scenarios were studied. The evaluation was conducted with the above-mentioned model in the Super Decisions software (version 2.0.8). The results indicate that the best ISW management scenario for Gilan province consists of recycling the residues of the metal industries, composting the putrescible portion of the ISWs, combusting paper, wood, fabric, and polymeric wastes with energy recovery in the incineration plant, and finally landfilling the rest of the waste stream together with rejected materials from the recycling and compost production plants and ashes from the incineration unit.
Keywords: Analytical Network Process, disposal scenario, Gilan province, industrial waste.
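At the heart of ANP is the limit supermatrix: a column-stochastic matrix of priorities is raised to successive powers until its columns converge, yielding the final scenario weights. A hedged toy computation follows; the matrix entries are invented, not Gilan's actual judgments.

```python
import numpy as np

# Invented column-stochastic supermatrix over four disposal scenarios;
# each column holds one element's priorities with respect to the others.
W = np.array([
    [0.40, 0.25, 0.20, 0.30],
    [0.30, 0.35, 0.30, 0.20],
    [0.20, 0.25, 0.35, 0.25],
    [0.10, 0.15, 0.15, 0.25],
])
assert np.allclose(W.sum(axis=0), 1.0)

# Raise the supermatrix to successive powers until the columns
# stabilise: the "limit supermatrix" of ANP.
L = W.copy()
for _ in range(100):
    nxt = L @ W
    if np.allclose(nxt, L, atol=1e-12):
        break
    L = nxt
print("limit priorities:", np.round(L[:, 0], 4))
```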
2875 Self-Healing Phenomenon Evaluation in Cementitious Matrix with Different Water/Cement Ratios and Crack Opening Age
Authors: V. G. Cappellesso, D. M. G. da Silva, J. A. Arndt, N. dos Santos Petry, A. B. Masuero, D. C. C. Dal Molin
Abstract:
Concrete elements are subject to cracking, and cracks can be an access point for deleterious agents that trigger pathological manifestations and reduce the service life of these structures. Finding ways to minimize or eliminate the penetration of these aggressive agents, for instance by sealing the cracks, contributes to the durability of these structures. The cementitious self-healing phenomenon can be classified into two different processes: autogenous self-healing, a natural process in which the cracks seal without the stimulation of external agents, meaning without other materials being added to the mixture; and autonomous self-healing, which depends on the insertion of a specific engineered material into the cement matrix to promote its recovery. This work aims to evaluate the autogenous self-healing of concretes produced with different water/cement ratios and exposed to wet/dry cycles, considering two crack-opening ages, 3 days and 28 days. The self-healing phenomenon was evaluated using two techniques: crack-healing measurement using ultrasonic waves and image analysis performed with an optical microscope. Both methods reveal the self-healing of the cracks. For early crack-opening ages and lower water/cement ratios, the self-healing capacity is higher than for later crack-opening ages and higher water/cement ratios. Regardless of the crack-opening age, the self-healing processes in these concretes were found to stabilize after 80 to 90 days.
Keywords: Self-healing, autogenous, water/cement ratio, curing cycles, test methods.
2874 Opportunities for Precision Feed in Apiculture for Managing the Efficacy of Feed and Medicine
Authors: John Michael Russo
Abstract:
Honeybees are important to our food system and continue to suffer from high rates of colony loss. Precision feed has brought many benefits to livestock cultivation, and these should transfer to apiculture; apiculture, however, has unique challenges. The objective of this research is to understand how principles of precision agriculture, applied to apiculture and feed specifically, might effectively improve the state of the art in cultivation. The methodology surveys apicultural practice to build a model for assessment: first, a review of apicultural motivators is made; feed methods are then evaluated; finally, precision feed methods are examined as accelerants with the potential to advance the effectiveness of feed practice. Six important motivators emerge: colony loss, disease, climate change, site variance, operational costs, and competition. Feed practice itself is used to compensate for environmental variables. The research finds that the current state of the art in apiculture feed focuses on critical challenges in the management of feed schedules that satisfy the requirements of the bees, preserve potency, optimize environmental variables, and manage costs. Many of the challenges are most acute when feed is used to dispense medication; technologies such as RNA treatments have even more rigorous demands. Precision feed solutions focus on strategies that accommodate the specific needs of individual livestock. A major component is data: precise data are integrated with methods that respond to individual needs. There is enormous opportunity for precision feed to improve apiculture through the integration of precision data with policies that translate data into optimized action in the apiary, particularly through automation.
Keywords: Apiculture, precision apiculture, RNA varroa treatment, honeybee feed applications.
2873 Iterative Methods for Computing the Weighted Minkowski Inverses of Matrices in Minkowski Space
Authors: Xiaoji Liu, Yonghui Qin
Abstract:
In this note, we consider a family of iterative formulas for computing the weighted Minkowski inverse A_{M,N} in Minkowski space. We give two kinds of iterations and the necessary and sufficient conditions for their convergence.
Keywords: Iterative method, the weighted Minkowski inverse A_{M,N}.
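For flavour, the classical Newton-Schulz scheme X_{k+1} = X_k(2I - A X_k) is the prototype of such iterations and converges quadratically to the ordinary inverse from a suitable start; the weighted Minkowski inverse additionally involves the weight matrices M and N and the Minkowski metric, which this plain sketch does not implement.

```python
import numpy as np

def newton_schulz_inverse(A, iters=30):
    """Iterative inverse X_{k+1} = X_k (2I - A X_k); converges when
    ||I - A X_0|| < 1. X_0 = A^T / (||A||_1 ||A||_inf) is a standard
    safe starting value."""
    n = A.shape[0]
    X = A.T / (np.linalg.norm(A, 1) * np.linalg.norm(A, np.inf))
    I = np.eye(n)
    for _ in range(iters):
        X = X @ (2 * I - A @ X)
    return X

A = np.array([[4.0, 1.0], [2.0, 3.0]])
print(np.round(A @ newton_schulz_inverse(A), 6))   # ~ identity
```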
2872 Concept of a Pseudo-Lower Bound Solution for Reinforced Concrete Slabs
Authors: M. De Filippo, J. S. Kuang
Abstract:
In the construction industry, reinforced concrete (RC) slabs are fundamental elements of buildings and bridges, and different methods are available for analysing their structural behaviour. Early in the last century, the yield-line method was proposed for this problem, and simple geometries could easily be solved by traditional hand analyses incorporating plasticity theory. Nowadays, advanced finite element (FE) analyses have found their way into applications in many engineering fields thanks to the wide range of geometries to which they can be applied; in such cases, the choice between an elastic and a plastic constitutive model completely changes the approach of the analysis itself. Elastic methods are popular because they suit automated computation, but they are limited: they ignore material behaviour beyond the yield limit, which is an essential aspect of RC structural performance. Non-linear analyses that model plastic behaviour give very reliable results, but are computationally expensive and thus not well suited to everyday engineering problems. In recent years, many researchers have worked on filling this gap between easy-to-implement elastic methods and computationally complex plastic analyses. This paper proposes a numerical procedure through which a pseudo-lower-bound solution, not violating the yield criterion, is achieved. The advantages of moment redistribution are taken into account, so the increase in strength provided by plastic behaviour is considered. The lower-bound solution is improved by detecting over-yielded moments, which are used to artificially govern the moment distribution among the remaining non-yielded elements. The proposed technique obeys Nielsen's yield criterion. The outcome of this analysis is a simple, accurate, and fast tool for predicting the lower-bound collapse load of RC slabs, with which structural engineers can find the fracture patterns and ultimate load-bearing capacity; the collapse-triggering mechanism is found by detecting yield lines. An application to the simple case of a square clamped slab is shown, and a good match was found with the exact values of the collapse load.
Keywords: Computational mechanics, lower bound method, reinforced concrete slabs, yield-line.
2871 Non-Convex Multi Objective Economic Dispatch Using Ramp Rate Biogeography Based Optimization
Authors: Susanta Kumar Gachhayat, S. K. Dash
Abstract:
Multi-objective non-convex economic dispatch problems of a thermal power plant are of grave concern for determining the cost of generation and for reducing emission levels to mitigate the greenhouse effect and global warming. This paper introduces ramp rate constraints, yielding tighter inequality constraints, and incorporates valve-point loading in the generation cost of a thermal power plant through ramp rate biogeography-based optimization involving mutation and migration. In 50 out of 100 trials, the cost and emission objective functions were found to outperform those obtained with classical methods such as the lambda iteration method and quadratic programming, and with heuristic methods such as particle swarm optimization, weight-improved particle swarm optimization, constriction-factor-based particle swarm optimization, and moderate random particle swarm optimization. Ramp rate biogeography-based optimization proves quite advantageous in solving non-convex multi-objective economic dispatch problems subject to nonlinear loads that pollute the source, giving rise to third-harmonic distortions and other such disturbances.
Keywords: Economic load dispatch, Biogeography based optimization, Ramp rate biogeography based optimization, Valve Point loading, Moderate random particle swarm optimization method, Weight improved particle swarm optimization method
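The valve-point effect adds a rectified sinusoid to the smooth quadratic fuel cost, which is what makes the dispatch problem non-convex; a hedged evaluation sketch with illustrative coefficients (not the paper's unit data) follows.

```python
import numpy as np

def fuel_cost(p, a, b, c, e, f, p_min):
    """Non-convex generator cost with valve-point loading:
    C(P) = a + b*P + c*P^2 + |e * sin(f * (P_min - P))|."""
    return a + b * p + c * p ** 2 + np.abs(e * np.sin(f * (p_min - p)))

# Illustrative coefficients for one unit (not the paper's test system)
a, b, c, e, f = 150.0, 2.0, 0.0016, 50.0, 0.063
p_min, p_max = 50.0, 200.0

for p in np.linspace(p_min, p_max, 5):
    print(f"P = {p:6.1f} MW -> cost = {fuel_cost(p, a, b, c, e, f, p_min):8.1f} $/h")
# A ramp rate constraint simply clips each unit's feasible range:
# max(p_min, P_prev - DR) <= P <= min(p_max, P_prev + UR)
```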
2870 Detecting Fake News: A Natural Language Processing, Reinforcement Learning, and Blockchain Approach
Authors: Ashly Joseph, Jithu Paulose
Abstract:
In an era where misleading information may quickly circulate on digital news channels, it is crucial to have efficient and trustworthy methods to detect and reduce the impact of misinformation. This research proposes an innovative framework that combines Natural Language Processing (NLP), Reinforcement Learning (RL), and Blockchain technologies to precisely detect and minimize the spread of false information in news articles on social media. The framework starts by gathering a variety of news items from different social media sites and performing preprocessing on the data to ensure its quality and uniformity. NLP methods are utilized to extract complete linguistic and semantic characteristics, effectively capturing the subtleties and contextual aspects of the language used. These features are utilized as input for a RL model. This model acquires the most effective tactics for detecting and mitigating the impact of false material by modeling the intricate dynamics of user engagements and incentives on social media platforms. The integration of blockchain technology establishes a decentralized and transparent method for storing and verifying the accuracy of information. The Blockchain component guarantees the unchangeability and safety of verified news records, while encouraging user engagement for detecting and fighting false information through an incentive system based on tokens. The suggested framework seeks to provide a thorough and resilient solution to the problems presented by misinformation in social media articles.
Keywords: Natural Language Processing, Reinforcement Learning, Blockchain, fake news mitigation, misinformation detection.
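The NLP feature-extraction stage can be grounded with a small baseline: TF-IDF features feeding a linear classifier. This hedged scikit-learn sketch uses invented training snippets; the paper's RL policy and blockchain layer would sit on top of such a component and are not shown.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Tiny invented corpus: 1 = misinformation, 0 = reliable
texts = [
    "Miracle cure doctors don't want you to know about",
    "Central bank raises interest rates by 25 basis points",
    "Scientists confirm vaccine microchips control minds",
    "City council approves new public transport budget",
]
labels = [1, 0, 1, 0]

# TF-IDF captures word and bigram salience; a linear model scores it
model = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)),
                      LogisticRegression())
model.fit(texts, labels)
print(model.predict_proba(["Secret cure they refuse to report"])[:, 1])
```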
2869 A Spatial Hypergraph Based Semi-Supervised Band Selection Method for Hyperspectral Imagery Semantic Interpretation
Authors: Akrem Sellami, Imed Riadh Farah
Abstract:
Hyperspectral imagery (HSI) typically provides a wealth of information captured over a wide range of the electromagnetic spectrum for each pixel in the image; a pixel in HSI is therefore a high-dimensional vector of intensities with a large spectral range and high spectral resolution, which makes semantic interpretation a challenging task of HSI analysis. In this paper we focus on object classification as the semantic interpretation of HSI. However, HSI classification still faces several issues, among them the spatial variability of spectral signatures, the high number of spectral bands, and the high cost of true sample labeling. The high number of spectral bands together with the low number of training samples poses the problem of the curse of dimensionality. To resolve it, we introduce a dimensionality reduction process to improve HSI classification. The presented approach is a semi-supervised band selection method based on a spatial hypergraph embedding model that represents higher-order relationships, with different weights for the spatial neighbors corresponding to the centroid of each pixel. This semi-supervised band selection was developed to select bands useful for object classification. The approach is evaluated on AVIRIS and ROSIS HSIs and compared with other dimensionality reduction methods, and the experimental results demonstrate its efficacy for HSI classification relative to many existing dimensionality reduction methods.
Keywords: Hyperspectral image, spatial hypergraph, dimensionality reduction, semantic interpretation, band selection, feature extraction.
2868 Simplified Stress Gradient Method for Stress-Intensity Factor Determination
Authors: Jeries J. Abou-Hanna
Abstract:
Several techniques exist for determining stress-intensity factors in linear elastic fracture mechanics analysis. They are based on analytical, numerical, and empirical approaches that are well documented in the literature and in engineering handbooks, but not all share the same merit: overly conservative results, numerical methods that require extensive computational effort, and methods requiring copious user parameters all hinder practicing engineers from efficiently evaluating stress-intensity factors. This paper investigates the prospects of reducing the complexity and the number of required variables in determining stress-intensity factors through the use of the stress gradient and a weighting function. The heart of this work resides in the understanding that fracture emanating from stress concentration locations cannot be explained by a single maximum-stress approach, but requires a critical volume in which the crack exists. To assess the effectiveness of this technique, the study investigated components with different notch geometries and varying levels of stress gradient. Two forms of weighting function were employed to determine stress-intensity factors, and the results were compared with exact analytical methods. The results indicated that the 'exponential' weighting function was superior to the 'absolute' weighting function: an error band of ±10% was met for cases ranging from the steep stress gradient of a sharp V-notch to the milder stress transitions of a large circular notch. The incorporation of the proposed method has proven a worthwhile consideration.
Keywords: Fracture mechanics, finite element method, stress intensity factor, stress gradient.
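The weighting-function idea can be made concrete with the exact Green's function for a centre crack in an infinite plate, K_I = 2*sqrt(a/pi) * integral_0^a sigma(x)/sqrt(a^2 - x^2) dx; the sketch below integrates it numerically for a decaying stress field. The paper's 'exponential' and 'absolute' weighting functions are its own constructions and are not reproduced here.

```python
import numpy as np

def k1_center_crack(sigma, a, n=4000):
    """K_I = 2*sqrt(a/pi) * integral_0^a sigma(x)/sqrt(a^2 - x^2) dx,
    evaluated with the substitution x = a*sin(u), which removes the
    endpoint singularity: the integral becomes the mean of
    sigma(a*sin(u)) over u in [0, pi/2], times pi/2."""
    u = np.linspace(0.0, np.pi / 2, n)
    return 2.0 * np.sqrt(a / np.pi) * np.mean(sigma(a * np.sin(u))) * (np.pi / 2)

a = 0.005                                  # crack half-length, m
sigma0, L = 200e6, 0.002                   # surface stress (Pa), decay length

# Sanity check: uniform stress must give sigma0 * sqrt(pi * a)
uniform = k1_center_crack(lambda x: sigma0 + 0.0 * x, a)
print(uniform / (sigma0 * np.sqrt(np.pi * a)))          # ~ 1.0

# A steep, notch-like stress gradient lowers K_I versus the uniform case
gradient = k1_center_crack(lambda x: sigma0 * np.exp(-x / L), a)
print(gradient / uniform)                               # < 1
```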
2867 Atmosphere Water Vapour As Main Sweet Water Resource in the Arid Zones of Central Asia
Authors: S. I. Nikolaeva, Yu. V. Petrov, L. Ye. Skipnikova
Abstract:
It has been shown that solving the water shortage problem in Central Asia is closely connected with including atmospheric water vapour in the system of water-resource response and management. Some methods of extracting water from the atmosphere are discussed.
Keywords: potable water, water resources, water problems, water scarcity.
2866 Evaluation of Electro-Flocculation for Biomass Production of Marine Microalgae Phaeodactylum tricornutum
Authors: Luciana C. Ramos, Leandro J. Sousa, Antônio Ferreira da Silva, Valéria Gomes Oliveira Falcão, Suzana T. Cunha Lima
Abstract:
The commercial production of biodiesel using microalgae demands a high energy input for harvesting biomass, making production economically unfeasible. Currently used methods involve mechanical, chemical, and biological procedures. In this work, a flocculation system is presented as a cost- and energy-effective process to increase the biomass production of Phaeodactylum tricornutum. This diatom is the only species of the genus that presents the fast growth and lipid accumulation ability that are of great interest for biofuel production. The alga, selected from the Bank of Microalgae, Institute of Biology, Federal University of Bahia (Brazil), was grown in a tubular reactor with a 12 h photoperiod (light/dark), an irradiance of about 35 μmol photons m⁻² s⁻¹, and a temperature of 22 °C. The growth medium was Conway medium with added silica. The growth curve was followed by cell counts in a Neubauer chamber and by optical density at 680 nm in a spectrophotometer. Harvesting took place at the end of the stationary growth phase, 21 days after inoculation, using two methods: centrifugation at 5000 rpm for 5 min, and electro-flocculation at 19 EPD and 95 W. After harvesting, cells were frozen at -20 °C and subsequently lyophilized. The biomass obtained by electro-flocculation was approximately four times greater than that achieved by centrifugation. The benefits of this method are that no chemical flocculants need be added and that similar cultivation conditions can be used for biodiesel production and for pharmacological purposes. The results may contribute to reducing the cost of biodiesel production using marine microalgae.
Keywords: Biomass, diatom, flocculation, microalgae.