Search results for: input shaping

862 Preliminary Analysis on Land Use-Land Cover Assessment of Post-Earthquake Geohazard: A Case Study in Kundasang, Sabah

Authors: Nur Afiqah Mohd Kamal, Khamarrul Azahari Razak

Abstract:

The earthquake aftermath has become a major concern, especially in high-seismicity regions. In Kundasang, Sabah, the earthquake of 5 June 2015 resulted in several catastrophes: landslides, rockfalls, mudflows and damage to major slopes, in addition to the series of aftershocks. The consequences of an earthquake generate and induce episodic disasters that are not only life-threatening but also affect infrastructure and economic development. Investigating the change in land use and land cover (LULC) associated with post-earthquake geohazards is therefore essential for identifying the extent of the disastrous effects on development in Kundasang. With the advancement of remote sensing technology, post-earthquake geohazards (landslides, mudflows, rockfalls, debris flows) can be assessed by employing object-based image analysis to investigate LULC change across settlements, public infrastructure and vegetation cover. This paper discusses preliminary results on the distribution of post-earthquake geohazards in Kundasang and evaluates the effect of LULC classification on the occurrence of geohazard events. The results of this preliminary analysis provide an overview of the extent of geohazard impact on LULC. The research also provides beneficial input to the local authority in Kundasang regarding the risk of future structural development in geohazard areas.

Keywords: geohazard, land use land cover, object-based image analysis, remote sensing

Procedia PDF Downloads 248
861 Evaluation of the Performance of Solar Stills as an Alternative for Brine Treatment Applying the Monte Carlo Ray Tracing Method

Authors: B. E. Tarazona-Romero, J. G. Ascanio-Villabona, O. Lengerke-Perez, A. D. Rincon-Quintero, C. L. Sandoval-Rodriguez

Abstract:

Desalination offers solutions for the world's water shortage; however, the process of eliminating salts generates a by-product known as brine, generally discharged into the environment using techniques that mitigate its impact. Brine treatment techniques are therefore vital to developing an environmentally sustainable desalination process. Consequently, this paper evaluates three different geometric configurations of solar stills as an alternative for brine treatment to be integrated into a small-scale desalination process. The geometric scenarios studied were selected because their characteristics fit the concept of appropriate technology: low cost, reliance on local labor and material resources for manufacturing, modularity, and simplicity of construction. Additionally, the conceptual design of the collectors was carried out, and the ray tracing methodology was applied using the open-access software SolTrace and Tonatiuh. The simulation process used 600,000 rays and varied two input parameters: direct normal irradiance (DNI) and reflectance. In summary, for the scenarios evaluated, the ladder-type still presented higher efficiency values than the pyramid-type and single-slope collectors. Finally, the efficiency of the collectors studied was directly related to their geometry; that is, larger geometries receive a greater number of solar rays along various paths, which affects the efficiency of the device.
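
A minimal sketch of the ray-counting idea behind this kind of Monte Carlo analysis, written in Python. It is not the SolTrace/Tonatiuh model used in the study; the flat absorber geometry, aperture width and angular spread below are assumptions for illustration only.

```python
import numpy as np

def intercept_fraction(n_rays=600_000, absorber_half_width=0.5, height=1.0, seed=0):
    """Estimate the fraction of downward rays that land on a flat absorber strip.

    Rays start uniformly across a 2 m wide aperture at the given height and
    travel downward with a small random tilt (all values are assumptions).
    """
    rng = np.random.default_rng(seed)
    x0 = rng.uniform(-1.0, 1.0, n_rays)       # ray origins across the aperture
    tilt = rng.normal(0.0, 0.05, n_rays)      # small angular spread in radians
    x_ground = x0 + height * np.tan(tilt)     # landing position on the ground plane
    hits = np.abs(x_ground) <= absorber_half_width
    return hits.mean()

print(f"estimated intercept fraction: {intercept_fraction():.3f}")
```

Changing the absorber geometry and repeating the count is, in essence, how the ray-tracing comparison between the three still configurations works.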

Keywords: appropriate technology, brine treatment techniques, desalination, monte carlo ray tracing

Procedia PDF Downloads 74
860 Reduction of False Positives in Head-Shoulder Detection Based on Multi-Part Color Segmentation

Authors: Lae-Jeong Park

Abstract:

The paper presents a method that utilizes figure-ground color segmentation to extract an effective global feature for reducing false positives in head-shoulder detection. Conventional detectors that rely on local features such as HOG, chosen for real-time operation, suffer from false positives. The color cue in an input image provides salient information about a global characteristic, which is necessary to alleviate the false positives of local-feature-based detectors. An effective approach that uses figure-ground color segmentation has previously been presented in an effort to reduce false positives in object detection. In this paper, an extended version of that approach is presented which adopts separate multi-part foregrounds instead of a single prior foreground and performs figure-ground color segmentation with each of the foregrounds. The multi-part foregrounds include the parts of the head-shoulder shape as well as additional auxiliary foregrounds optimized by a search algorithm. A classifier is constructed with a feature consisting of the set of resulting segmentations. Experimental results show that the presented method can reject more false positives than the single-prior-shape-based classifier as well as detectors based on local features. The improvement is possible because the presented approach can reduce false positives that have the same colors in the head and shoulder foregrounds.

Keywords: pedestrian detection, color segmentation, false positive, feature extraction

Procedia PDF Downloads 283
859 Artificial Neural Network in Ultra-High Precision Grinding of Borosilicate-Crown Glass

Authors: Goodness Onwuka, Khaled Abou-El-Hossein

Abstract:

Borosilicate-crown (BK7) glass has found broad application in the optics and automotive industries, and the growing demand for nanometric surface finishes is becoming a necessity in such applications. Thus, it has become paramount to optimize the parameters influencing the surface roughness of this precision lens material. The research was carried out on a 4-axis Nanoform 250 precision lathe equipped with an ultra-high precision grinding spindle. The experiment varied the machining parameters of feed rate, wheel speed and depth of cut at three levels in different combinations using a Box-Behnken design of experiments, and the resulting surface roughness values were measured using a Taylor Hobson Dimension XL optical profiler. An acoustic emission monitoring technique was applied at a high sampling rate to monitor the machining process, while further signal processing and feature extraction methods were implemented to generate the input to a neural network algorithm. This paper highlights the training and development of a back-propagation neural network prediction algorithm through careful selection of parameters, and the results show better prediction accuracy when compared to a previously developed response surface model with very similar machining parameters. Hence, artificial neural network algorithms provide better surface roughness prediction accuracy in the ultra-high precision grinding of BK7 glass.
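
A minimal sketch of the kind of back-propagation network described above, using scikit-learn. The three machining inputs and the roughness target mirror the abstract, but the training values, network size and units are assumptions, not the authors' model or data.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

# Columns: feed rate, wheel speed, depth of cut (synthetic placeholder values);
# target: measured surface roughness Ra.
rng = np.random.default_rng(0)
X = rng.uniform([5, 1000, 1], [25, 3000, 10], size=(45, 3))
y = 0.002 * X[:, 0] - 1e-5 * X[:, 1] + 0.004 * X[:, 2] + rng.normal(0, 0.003, 45)

model = make_pipeline(
    StandardScaler(),
    MLPRegressor(hidden_layer_sizes=(8, 8), max_iter=5000, random_state=0),
)
model.fit(X, y)
print("predicted Ra:", model.predict([[15, 2000, 5]]))
```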

Keywords: acoustic emission technique, artificial neural network, surface roughness, ultra-high precision grinding

Procedia PDF Downloads 305
858 The Development of an Automated Computational Workflow to Prioritize Potential Resistance Variants in HIV Integrase Subtype C

Authors: Keaghan Brown

Abstract:

The prioritization of drug resistance mutations that impact protein folding or protein-drug and protein-DNA interactions within macromolecular systems is critical to the success of treatment regimens. With a continual increase in the number of computational tools available to assess these impacts, scalability and reproducibility have become essential components of computational analysis and experimental research. This work introduces a bioinformatics pipeline that combines several structural analysis tools in a simplified workflow, optimizing the available computational hardware and software to automate the flow of data transformations. Utilizing pre-established software tools, a pipeline was developed with a set of pre-defined functions that automate mutation introduction into the HIV-1 integrase protein structure, calculate the gain and loss of polar interactions, and calculate the change in protein folding energy. Additionally, an automated molecular dynamics analysis was implemented, which reduces the constant need for user input and output management. The resulting pipeline, Automated Mutation Introduction and Analysis (AMIA), is an open-source set of scripts designed to introduce mutations and analyse their effects on the static protein structure as well as on the multi-conformational states obtained from molecular dynamics simulations. The workflow allows the user to visualize all outputs in a user-friendly manner, thereby enabling the prioritization of variant systems for experimental validation.
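
A minimal sketch of how such a per-variant workflow can be chained in Python. The step functions below (introduce_mutation, polar_contacts, fold_energy) are hypothetical placeholders standing in for the structural tools that AMIA wraps, not its actual interfaces, and the example mutations are well-known integrase resistance substitutions used purely for illustration.

```python
from dataclasses import dataclass

@dataclass
class VariantResult:
    mutation: str
    polar_gain: int
    polar_loss: int
    ddg_fold: float          # change in folding energy; sign convention assumed

def introduce_mutation(structure_path: str, mutation: str) -> str:
    """Hypothetical placeholder: returns the path of a mutated structure."""
    return f"{structure_path}.{mutation}.pdb"

def polar_contacts(structure_path: str) -> set:
    """Hypothetical placeholder: returns the set of polar interactions."""
    return set()

def fold_energy(structure_path: str) -> float:
    """Hypothetical placeholder: returns a folding-energy estimate."""
    return 0.0

def analyse_variant(wild_type: str, mutation: str) -> VariantResult:
    mutant = introduce_mutation(wild_type, mutation)
    wt_contacts, mut_contacts = polar_contacts(wild_type), polar_contacts(mutant)
    return VariantResult(
        mutation=mutation,
        polar_gain=len(mut_contacts - wt_contacts),
        polar_loss=len(wt_contacts - mut_contacts),
        ddg_fold=fold_energy(mutant) - fold_energy(wild_type),
    )

# Rank variants by predicted destabilisation for experimental follow-up, e.g.:
# results = [analyse_variant("integrase.pdb", m) for m in ["G140S", "Q148H"]]
# results.sort(key=lambda r: r.ddg_fold, reverse=True)
```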

Keywords: automated workflow, variant prioritization, drug resistance, HIV Integrase

Procedia PDF Downloads 80
857 An Automated Approach to the Nozzle Configuration of Polycrystalline Diamond Compact Drill Bits for Effective Cuttings Removal

Authors: R. Suresh, Pavan Kumar Nimmagadda, Ming Zo Tan, Shane Hart, Sharp Ugwuocha

Abstract:

Polycrystalline diamond compact (PDC) drill bits are extensively used in the oil and gas industry as well as in the mining industry. Industry engineers continually improve PDC drill bit designs and hydraulic conditions, and optimized injection nozzles play a key role in improving the drilling performance and efficiency of these ever-changing bits. In the first part of this study, computational fluid dynamics (CFD) modelling is performed to investigate the hydrodynamic characteristics of drilling fluid flow around the PDC drill bit. The open-source CFD software OpenFOAM simulates the flow around the drill bit based on field input data. A specifically developed console application integrates the entire CFD process, including domain extraction, meshing, solving the governing equations and post-processing. The results from the OpenFOAM solver are then compared with those of the ANSYS Fluent software, and the data from both programs agree. The second part of the paper describes a parametric study of the PDC drill bit nozzles to determine the effect of parameters such as the number of nozzles, nozzle velocity, nozzle radial position and orientation on the flow field characteristics and bit washing patterns. After analyzing a series of nozzle configurations, the best configuration is identified and recommendations are made for modifying the PDC bit design.
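
A minimal sketch of the kind of console-driven automation described above, chaining standard OpenFOAM utilities from Python with subprocess. The case directory layout and the choice of utilities (blockMesh, checkMesh, simpleFoam) are assumptions about a typical workflow, not the authors' application.

```python
import subprocess
from pathlib import Path

def run_case(case_dir: str, solver: str = "simpleFoam") -> None:
    """Mesh, check, and solve one pre-configured OpenFOAM case directory.

    Assumes a standard case layout (0/, constant/, system/) already prepared
    for the flow domain extracted around the bit.
    """
    case = Path(case_dir)
    for cmd in (["blockMesh"], ["checkMesh"], [solver]):
        # Each OpenFOAM utility reads its dictionaries from the case directory.
        subprocess.run(cmd, cwd=case, check=True)

# Sweep a set of nozzle configurations, one prepared case per configuration:
# for config in ["nozzles_3", "nozzles_4", "nozzles_5"]:
#     run_case(f"cases/{config}")
```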

Keywords: ANSYS Fluent, computational fluid dynamics, nozzle configuration, OpenFOAM, PDC drill bit

Procedia PDF Downloads 426
856 Vehicle Gearbox Fault Diagnosis Based on Cepstrum Analysis

Authors: Mohamed El Morsy, Gabriela Achtenová

Abstract:

Research on damage to gears and gear pairs using vibration signals remains very attractive because vibration signals from a gear pair are complex in nature and not easy to interpret. Predicting gear pair defects by analyzing changes in the vibration signal of gear pairs in operation is a very reliable method. Therefore, a suitable vibration signal processing technique is necessary to extract defect information that is generally obscured by noise from the dynamic factors of other gear pairs. This article presents the value of cepstrum analysis in vehicle gearbox fault diagnosis. The cepstrum represents the overall power content of a whole family of harmonics and sidebands when more than one family of sidebands is present at the same time. The concepts behind the measurement and analysis involved in using the technique are briefly outlined. Cepstrum analysis is used to detect an artificial pitting defect in a vehicle gearbox loaded at different speeds and torques. The test stand is equipped with three dynamometers; the input dynamometer serves as the internal combustion engine, while the output dynamometers introduce the load on the flanges of the output joint shafts. The pitting defect is manufactured on the tooth side of the fifth-speed gear on the secondary shaft. A method for the diagnosis of gear faults based on the order cepstrum is also presented, and the procedure is illustrated with experimental vibration data from the vehicle gearbox. The results show the effectiveness of cepstrum analysis in detecting and diagnosing the gear condition.
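
For reference, a minimal sketch of the real cepstrum computation that this kind of analysis relies on, using NumPy. The synthetic signal, sampling rate and 25 Hz modulation below are assumptions chosen only to show how a sideband family appears as a cepstral peak; they are not the gearbox data.

```python
import numpy as np

def real_cepstrum(x: np.ndarray) -> np.ndarray:
    """Real cepstrum: inverse FFT of the log-magnitude spectrum."""
    log_mag = np.log(np.abs(np.fft.rfft(x)) + 1e-12)   # small offset avoids log(0)
    return np.fft.irfft(log_mag, n=len(x))

# Synthetic gear-mesh-like signal: several mesh harmonics, all amplitude-
# modulated at 25 Hz (one modulation per revolution of a damaged gear).
fs = 10_000                                   # sampling rate in Hz (assumed)
t = np.arange(0, 1.0, 1 / fs)
modulation = 1 + 0.5 * np.sin(2 * np.pi * 25 * t)
signal = sum(np.sin(2 * np.pi * 500 * k * t) / k for k in range(1, 6)) * modulation

cep = real_cepstrum(signal)
quefrency = np.arange(len(cep)) / fs
# The 25 Hz sideband family shows up as a cepstral peak near 1/25 = 0.04 s.
peak = np.argmax(cep[50:600]) + 50
print(f"dominant quefrency: {quefrency[peak]:.3f} s")
```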

Keywords: cepstrum analysis, fault diagnosis, gearbox, vibration signals

Procedia PDF Downloads 382
855 The Power-Knowledge Relationship in the Italian Education System between the 19th and 20th Century

Authors: G. Iacoviello, A. Lazzini

Abstract:

This paper focuses on the development of the study of accounting in the Italian education system between the 19th and 20th centuries. It also focuses on the subsequent formation of a scientific and experimental forma mentis that would prepare students for administrative and managerial activities in industry, commerce and public administration. From a political perspective, the period was characterized by two dominant movements, liberalism (1861-1922) and fascism (1922-1945), that deeply influenced accounting practices and the entire Italian education system. The materials used in the study include both primary and secondary sources. The primary sources are numerous original documents issued from 1890-1935 by the government and maintained in the Historical Archive of the State in Rome. The secondary sources have supported both the development of the theoretical framework and the definition of the historical context. This paper assigns to the educational system the role of cultural producer. Foucauldian analysis identifies the problem confronted by the critical intellectual in finding a way to deploy knowledge through a 'patient labour of investigation' that highlights the contingency and fragility of the circumstances that have shaped current practices and theories. Education can be considered a powerful and political process providing students with values, ideas, and models that they will subsequently use to discipline themselves, remaining as close to them as possible. It is impossible for power to be exercised without knowledge, just as it is impossible for knowledge not to engender power. The power-knowledge relationship can be usefully employed to explain how power operates within society and how mechanisms of power affect everyday lives. Power is exercised at all levels and through many dimensions, including government. Schools exercise an 'epistemological power', a power to extract a knowledge of individuals from individuals. Because knowledge is a key element in the operation of power, the procedures applied to the formation and accumulation of knowledge cannot be considered neutral instruments for the presentation of the real. Consequently, the same institutions that produce and spread knowledge can be considered part of the 'power-knowledge' interrelation. Individuals have become both objects and subjects in the development of knowledge. Just as education plays a fundamental role in shaping all aspects of communities, the structural changes resulting from economic, social and cultural development affect educational systems. Analogously, the important changes related to social and economic development required legislative intervention to regulate the functioning of different areas of society. Knowledge can become a means of social control used by the government to manage populations. It can be argued that the evolution of Italy's education system is coherent with the idea that power and knowledge do not exist independently but instead are coterminous. This research aims to reduce this gap in the literature by analysing the role of the state in the development of accounting education in Italy.

Keywords: education system, government, knowledge, power

Procedia PDF Downloads 143
854 ISMARA: Completely Automated Inference of Gene Regulatory Networks from High-Throughput Data

Authors: Piotr J. Balwierz, Mikhail Pachkov, Phil Arnold, Andreas J. Gruber, Mihaela Zavolan, Erik van Nimwegen

Abstract:

Understanding the key players and interactions in the regulatory networks that control gene expression and chromatin state across different cell types and tissues in metazoans remains one of the central challenges in systems biology. Our laboratory has pioneered a number of methods for automatically inferring core gene regulatory networks directly from high-throughput data by modeling gene expression (RNA-seq) and chromatin state (ChIP-seq) measurements in terms of genome-wide computational predictions of regulatory sites for hundreds of transcription factors and micro-RNAs. These methods have now been completely automated in an integrated web server called ISMARA that allows researchers to analyze their own data by simply uploading RNA-seq or ChIP-seq data sets, and that provides results in an integrated web interface as well as in downloadable flat-file form. For any data set, ISMARA infers the key regulators in the system, their activities across the input samples, the genes and pathways they target, and the core interactions between the regulators. We believe that by empowering experimental researchers to apply cutting-edge computational systems biology tools to their data in a completely automated manner, ISMARA can play an important role in developing our understanding of regulatory networks across metazoans.

Keywords: gene expression analysis, high-throughput sequencing analysis, transcription factor activity, transcription regulation

Procedia PDF Downloads 67
853 GA3C for Anomalous Radiation Source Detection

Authors: Chia-Yi Liu, Bo-Bin Xiao, Wen-Bin Lin, Hsiang-Ning Wu, Liang-Hsun Huang

Abstract:

In order to reduce the risk of radiation damage that personnel may suffer during operations in the radiation environment, the use of automated guided vehicles to assist or replace on-site personnel in the radiation environment has become a key technology and has become an important trend. In this paper, we demonstrate our proof of concept for autonomous self-learning radiation source searcher in an unknown environment without a map. The research uses GPU version of Asynchronous Advantage Actor-Critic network (GA3C) of deep reinforcement learning to search for radiation sources. The searcher network, based on GA3C architecture, has self-directed learned and improved how search the anomalous radiation source by training 1 million episodes under three simulation environments. In each episode of training, the radiation source position, the radiation source intensity, starting position, are all set randomly in one simulation environment. The input for searcher network is the fused data from a 2D laser scanner and a RGB-D camera as well as the value of the radiation detector. The output actions are the linear and angular velocities. The searcher network is trained in a simulation environment to accelerate the learning process. The well-performance searcher network is deployed to the real unmanned vehicle, Dashgo E2, which mounts LIDAR of YDLIDAR G4, RGB-D camera of Intel D455, and radiation detector made by Institute of Nuclear Energy Research. In the field experiment, the unmanned vehicle is enable to search out the radiation source of the 18.5MBq Na-22 by itself and avoid obstacles simultaneously without human interference.
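
A minimal sketch of an actor-critic network with the input/output interface described above (fused scan, image and detector features in; linear and angular velocity out), written in PyTorch. The layer sizes and the flattened feature layout are assumptions; this is not the authors' GA3C implementation.

```python
import torch
import torch.nn as nn

class SearcherActorCritic(nn.Module):
    """Shared trunk with a policy head (2 velocity commands) and a value head."""

    def __init__(self, fused_dim: int = 256):
        super().__init__()
        # fused_dim stands for the flattened laser-scan + RGB-D + detector features.
        self.trunk = nn.Sequential(
            nn.Linear(fused_dim, 128), nn.ReLU(),
            nn.Linear(128, 64), nn.ReLU(),
        )
        self.policy_mean = nn.Linear(64, 2)   # linear and angular velocity
        self.value = nn.Linear(64, 1)         # state-value estimate for the critic

    def forward(self, obs: torch.Tensor):
        h = self.trunk(obs)
        return torch.tanh(self.policy_mean(h)), self.value(h)

# Example forward pass on a batch of fused observations:
obs = torch.randn(8, 256)
action_mean, state_value = SearcherActorCritic()(obs)
print(action_mean.shape, state_value.shape)   # torch.Size([8, 2]) torch.Size([8, 1])
```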

Keywords: deep reinforcement learning, GA3C, source searching, source detection

Procedia PDF Downloads 115
852 An Intelligent Scheme Switching for MIMO Systems Using Fuzzy Logic Technique

Authors: Robert O. Abolade, Olumide O. Ajayi, Zacheaus K. Adeyemo, Solomon A. Adeniran

Abstract:

Link adaptation is an important strategy for achieving robust wireless multimedia communications based on quality of service (QoS) demand. Scheme switching in multiple-input multiple-output (MIMO) systems is an aspect of link adaptation, and it involves selecting among different MIMO transmission schemes or modes so as to adapt to varying radio channel conditions for the purpose of achieving QoS delivery. However, finding the most appropriate switching method in MIMO links is still a challenge, as existing methods are either computationally complex or not always accurate. This paper presents an intelligent switching method, based on a fuzzy logic technique, for a MIMO system with two schemes: transmit diversity (TD) and spatial multiplexing (SM). In this method, two channel quality indicators (CQI), namely the average received signal-to-noise ratio (RSNR) and the received signal strength indicator (RSSI), are measured and passed as inputs to the fuzzy logic system, which then gives a decision, i.e., an inference. The switching decision of the fuzzy logic system is fed back to the transmitter to switch between the TD and SM schemes. Simulation results show that the proposed fuzzy logic-based switching technique outperforms the conventional static switching technique in terms of bit error rate and spectral efficiency.
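
A minimal sketch of a two-input fuzzy switching rule of the kind described above, written from scratch in Python. The membership breakpoints and the two rules are illustrative placeholders, not the paper's tuned fuzzy system.

```python
import numpy as np

def tri(x, a, b, c):
    """Triangular membership function with feet at a, c and peak at b."""
    return np.clip(np.minimum((x - a) / (b - a), (c - x) / (c - b)), 0.0, 1.0)

def switch_decision(rsnr_db: float, rssi_dbm: float) -> str:
    """Return 'SM' when both indicators are high, otherwise 'TD'.

    Breakpoints below are assumptions chosen only for illustration.
    """
    snr_high = tri(rsnr_db, 10, 25, 40)
    snr_low = tri(rsnr_db, -5, 5, 15)
    rssi_high = tri(rssi_dbm, -80, -60, -40)
    rssi_low = tri(rssi_dbm, -110, -95, -75)

    # Two Mamdani-style rules: min for AND, then compare rule strengths.
    fire_sm = min(snr_high, rssi_high)     # good channel -> spatial multiplexing
    fire_td = max(snr_low, rssi_low)       # weak channel -> transmit diversity
    return "SM" if fire_sm > fire_td else "TD"

print(switch_decision(rsnr_db=28, rssi_dbm=-55))   # expected: SM
print(switch_decision(rsnr_db=6, rssi_dbm=-100))   # expected: TD
```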

Keywords: channel quality indicator, fuzzy logic, link adaptation, MIMO, spatial multiplexing, transmit diversity

Procedia PDF Downloads 157
851 Control of Base Isolated Benchmark using Combined Control Strategy with Fuzzy Algorithm Subjected to Near-Field Earthquakes

Authors: Hashem Shariatmadar, Mozhgansadat Momtazdargahi

Abstract:

The purpose of structural control against earthquakes is to dissipate the earthquake input energy to the structure and reduce the plastic deformation of structural members. There are different methods of structural control for reducing the structural response to earthquakes: active, semi-active, passive and hybrid. In this paper, two different combined control systems are used: the first comprises a base isolator and multiple tuned mass dampers (BI & MTMD), and the second combines a hybrid base isolator and multiple tuned mass dampers (HBI & MTMD), for controlling an eight-story isolated benchmark steel structure. The active control force of the hybrid isolator is estimated by fuzzy logic algorithms. The influences of the combined systems on the responses of the benchmark structure under two near-field earthquakes (Newhall and El Centro) are evaluated by nonlinear dynamic time history analysis. Combined control systems consisting of passive or active systems installed in parallel with base-isolation bearings are capable of significantly reducing the response quantities (relative and absolute displacement) of base-isolated structures. Therefore, in the design and control of irregular isolated structures using the proposed control systems, structural demands (relative and absolute displacement, etc.) in each direction must be considered separately.

Keywords: base-isolated benchmark structure, multi-tuned mass dampers, hybrid isolators, near-field earthquake, fuzzy algorithm

Procedia PDF Downloads 306
850 Design of Demand Pacemaker Using an Embedded Controller

Authors: C. Bala Prashanth Reddy, B. Abhinay, C. Sreekar, D. V. Shobhana Priscilla

Abstract:

The project aims at designing an emergency pacemaker that is capable of giving shocks to a human heart which has suddenly stopped working. A pacemaker is a machine commonly used by cardiologists to shock a human heart back into action. The way the heart works is that small cells called pacemakers send electrical pulses to the cardiac muscles that tell the heart when to pump blood; when these electrical pulses stop, the heart stops beating. When this happens, a pacemaker is used to shock the heart muscles and the pacemaker cells back into action. This is achieved by rubbing the two panels of the pacemaker together to create an adequate electrical current, after which the heart returns to its normal state. The project also aims at designing a system that continuously displays a person's heart beat and blood pressure on an LCD. The concerned doctor receives the heart beat and blood pressure details continuously through a GSM modem in the form of SMS alerts. In case of an abnormal condition, the doctor sends a message specifying the amount of electric shock needed; the microcontroller then automatically gives the corresponding input to the pacemaker, which in turn delivers the shock to the patient. The heart beat monitoring and display system is portable and a good replacement for the older, less efficient stethoscope-based approach. When the heart rate is counted manually using a stethoscope, the probability of error is high because the heart rate lies in the range of 70 to 90 beats per minute, i.e., less than one second per beat, so this device can be considered a very good alternative to the stethoscope.
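
A small illustration of the monitoring logic described above, as a sketch only: the beat-interval source, the alert thresholds and the alert text are assumptions, not the device firmware.

```python
def bpm_from_rr_intervals(rr_seconds: list[float]) -> float:
    """Average heart rate in beats per minute from R-R intervals in seconds."""
    return 60.0 * len(rr_seconds) / sum(rr_seconds)

def check_patient(rr_seconds: list[float], low: float = 60.0, high: float = 100.0) -> str:
    """Flag readings outside an assumed normal range for an SMS-style alert."""
    bpm = bpm_from_rr_intervals(rr_seconds)
    if bpm < low:
        return f"ALERT: low heart rate ({bpm:.0f} bpm)"
    if bpm > high:
        return f"ALERT: high heart rate ({bpm:.0f} bpm)"
    return f"OK: {bpm:.0f} bpm"

# Roughly 0.8 s between beats corresponds to about 75 bpm, inside the normal range.
print(check_patient([0.82, 0.79, 0.80, 0.81]))
```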

Keywords: missing R wave, PWM, demand pacemaker, heart

Procedia PDF Downloads 483
849 Local Procurement in Ghana's Hotel Industry: A Study of the Driving Forces, Perceptions and Procurement Patterns

Authors: Adu-Ampomah Yaw Junior

Abstract:

Local procurement has become one of the latest trends in the discourse of sustainable tourism due to the economic benefits it generates for tourist destinations in developing countries. Local procurement helps create jobs, which consequently helps alleviate poverty. However, there have been limited studies on local procurement patterns in developing countries. Research on hotel procurement practices has mainly emphasized the challenges that hoteliers face when procuring locally, leaving questions regarding their motivations to engage in local procurement unanswered. Institutional theory provides a suitable framework to better understand these motivations, as it underlines the importance of individual cognitive perceptions of issues in shaping organizational response strategies: the extent to which an issue is perceived to fall within the organization's responsibility, the organizational actors' belief about the losses or gains resulting from acting or not acting on an issue (degree of importance), and the organizational actors' belief about the probability of resolving an issue (degree of feasibility). These factors influence how an organization will act on the issue. Hence, this paper adopts an institutional perspective to examine local procurement patterns of food by hoteliers in Ghana. Qualitative interviews with 20 procurement managers about their procurement practices and motivations, as well as interviews with different stakeholders for data triangulation purposes, indicated that most hotels sourced their food from middlemen who imported most of their products. Direct importation, however, was more prevalent in foreign-owned hotels than in locally owned ones. The importation and use of foreign foods as opposed to local ones can be explained by the lack of pressure from NGOs and trade associations on hotels to act responsibly. Though guests' menu preferences were perceived as important to hoteliers' business operations, western tourists primarily demand foreign food, particularly in the foreign-owned hotels, which makes it less important to procure local produce. Lastly, hoteliers, particularly those in foreign-owned hotels, perceive local procurement to be less feasible, raising concerns about the quality and variety of local produce. The paper outlines strategies to improve the perception, importance and feasibility of local procurement. Firstly, there is a need for stakeholder engagement in order to make hoteliers feel responsible for acting on the issue. Secondly, it is crucial for the Ghanaian government to promote and encourage hotels to buy local produce, and to make funds and storage facilities available for farmers so as to improve the quality and quantity of local produce. Sites also need to be secured for farmers to engage in sustained farming. Furthermore, there is a need for collaboration between the various stakeholders to organize training programs for farmers. Hotels, in turn, need to market local produce to their guests. Finally, the Ghana hotels association has to encourage hotels to engage in local procurement.

Keywords: sustainable tourism, feasible, important, local procurement

Procedia PDF Downloads 202
848 The Confiscation of Ill-Gotten Gains in Pollution: The Taiwan Experience and the Interaction between Economic Analysis of Law and Environmental Economics Perspectives

Authors: Chiang-Lead Woo

Abstract:

In response to serious environmental problems, the Taiwanese government recently adjusted several articles to suit the needs of environmental protection, such as the amendment to Article 190-1 of the Taiwan Criminal Code. This legislative change is an improvement in that it removed the limitation of 'endangering public safety'. At the same time, Article 190-1 moves from a concrete offense of endangerment to an abstract crime of danger. The public therefore looks forward to seeing whether environmental crime, followed by the imposition of fines or penalties, works efficiently against pollution through its deterrent effects. However, the confiscation system introduced by the addition of Article 38-2 of the Taiwan Criminal Code seems to be controversial legislation for restraining ill-gotten gains. Most prior studies have focused on comparing the Administrative Penalty Law and the Criminal Code on environmental issues in Taiwan; recently, more and more studies emphasize the calculation of ill-gotten gains. Hence, this paper examines the deterrent effect in environmental crime from the perspectives of the economic analysis of law and environmental economics. The analysis shows that the deterrent effect will only work if there is an extremely high probability (equal to 100 percent) of an environmental crime case being prosecuted criminally by the Taiwan Environmental Protection Agency. Therefore, this paper suggests complementing the confiscation system with the System of Environmental and Economic Accounting, reasonable deterrent fines, input management, a real-time system for the detection of pollution, a whistleblower system, environmental education, and the modernization of law.
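
The deterrence argument above follows the standard expected-sanction condition from the economic analysis of law. Written out (a textbook formulation, not a formula taken from the paper), an offender is deterred only when

```latex
p \cdot S \;\ge\; G
\qquad\Longleftrightarrow\qquad
S \;\ge\; \frac{G}{p},
```

where p is the probability of detection and successful prosecution, S the total sanction (fine plus confiscated gains), and G the ill-gotten gain. When p is far below 1, the sanction must be scaled up by roughly 1/p, which is why a low prosecution probability undermines the deterrent effect of confiscation alone.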

Keywords: confiscation, ecosystem services, environmental crime, ill-gotten gains, the deterrent effect, the system of environmental and economic accounting

Procedia PDF Downloads 174
847 Enhancing Temporal Extrapolation of Wind Speed Using a Hybrid Technique: A Case Study in West Coast of Denmark

Authors: B. Elshafei, X. Mao

Abstract:

The demand for renewable energy is increasing significantly, and major investments are being directed to the wind power generation industry as a leading source of clean energy. The wind energy sector is entirely dependent on, and driven by, the prediction of wind speed, which by the nature of wind is highly stochastic and random. This study employs deep multi-fidelity Gaussian process regression to predict wind speeds over medium-term time horizons. Data from the RUNE experiment on the west coast of Denmark were provided by the Technical University of Denmark and represent the wind speed across the study area for the period between December 2015 and March 2016. The study investigates the effect of pre-processing the data by denoising the signal using the empirical wavelet transform (EWT) and of including the vector components of wind speed to increase the number of input data layers for data fusion using deep multi-fidelity Gaussian process regression (GPR). The outcomes were compared using the root mean square error (RMSE), and the results demonstrated a significant increase in prediction accuracy, showing that using the vector components of the wind speed as additional predictors yields more accurate predictions than strategies that ignore them, and reflecting the importance of including all sub-data and of pre-processing the signals for wind speed forecasting models.
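
A minimal sketch of using wind-speed vector components as additional GPR predictors, with scikit-learn. This is plain single-fidelity GPR on a synthetic series, not the deep multi-fidelity model or the EWT denoising used in the study; the data, kernel and length scales are assumptions.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

# Synthetic stand-in for a denoised wind-speed series sampled every 30 minutes.
rng = np.random.default_rng(1)
t = np.arange(0, 48, 0.5)                                              # hours
u = 8 + 2 * np.sin(2 * np.pi * t / 24) + rng.normal(0, 0.4, t.size)    # zonal component
v = 3 + 1.5 * np.cos(2 * np.pi * t / 24) + rng.normal(0, 0.4, t.size)  # meridional component
speed = np.hypot(u, v)

# Use time plus both vector components as predictors for the speed one step ahead.
X = np.column_stack([t[:-1], u[:-1], v[:-1]])
y = speed[1:]

kernel = 1.0 * RBF(length_scale=[5.0, 2.0, 2.0]) + WhiteKernel(noise_level=0.1)
gpr = GaussianProcessRegressor(kernel=kernel, normalize_y=True).fit(X, y)
pred, std = gpr.predict(X[-5:], return_std=True)
print(np.round(pred, 2), np.round(std, 2))
```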

Keywords: data fusion, Gaussian process regression, signal denoise, temporal extrapolation

Procedia PDF Downloads 137
846 Evaluating Emission Reduction Due to a Proposed Light Rail Service: A Micro-Level Analysis

Authors: Saeid Eshghi, Neeraj Saxena, Abdulmajeed Alsultan

Abstract:

Carbon dioxide (CO2), alongside other gas emissions in the atmosphere, causes a greenhouse effect, resulting in an increase in the average temperature of the planet. Transport vehicles are among the main contributors to CO2 emissions, and stationary vehicles with their engines running produce more emissions than moving ones. Intersections with traffic lights that force vehicles to remain stationary for a period of time therefore produce more CO2 pollution than other parts of the road. This paper focuses on analyzing the CO2 produced by the traffic flow at the Anzac Parade Road - Barker Street intersection in Sydney, Australia, before and after the implementation of light rail transit (LRT). The data were gathered during the construction phase of the LRT by counting the number of vehicles on each path of the intersection for 15 minutes during the evening rush hour over one week (6-7 pm, July 04-31, 2018) and then multiplying by 4 to obtain the hourly flow of vehicles. The data were analyzed with the microscopic simulation software VISSIM, and the traffic flow was processed for three stages: before the implementation of the light rail, during the construction phase, and after implementation. Finally, the traffic results were input into another software package, EnViVer, to calculate the amount of CO2 emitted during one hour. The results showed that after the implementation of the light rail, CO2 will drop by a minimum of 13%. This finding provides evidence that light rail is a sustainable mode of transport.
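
A small worked sketch of the count scaling and the reported reduction. Only the multiply-by-4 scaling and the 13% figure come from the abstract; the 15-minute counts and the hourly emission value are invented placeholders.

```python
# 15-minute turning-movement counts for one approach (placeholder values).
counts_15min = {"through": 210, "left": 45, "right": 60}

# Scale to hourly flows, as described in the abstract.
hourly_flow = {movement: 4 * n for movement, n in counts_15min.items()}
print(hourly_flow)                      # {'through': 840, 'left': 180, 'right': 240}

# Reported outcome: at least a 13% drop in CO2 after the light rail opens.
co2_before_kg = 1000.0                  # placeholder hourly emission
co2_after_kg = co2_before_kg * (1 - 0.13)
print(f"CO2 after LRT: {co2_after_kg:.0f} kg/h")
```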

Keywords: carbon dioxide, emission modeling, light rail, microscopic model, traffic flow

Procedia PDF Downloads 146
845 Waste-Based Surface Modification to Enhance Corrosion Resistance of Aluminium Bronze Alloy

Authors: Wilson Handoko, Farshid Pahlevani, Isha Singla, Himanish Kumar, Veena Sahajwalla

Abstract:

Aluminium bronze alloys are well known for their superior abrasion resistance, tensile strength and non-magnetic properties, owing to the co-presence of iron (Fe) and aluminium (Al) as alloying elements, and have been commonly used in many industrial applications. However, continuous exposure to the marine environment increases the risk of failure of aluminium bronze alloy parts. Although a higher level of corrosion resistance can be achieved by modifying the elemental composition of the alloy, this comes at a price through a complex manufacturing process and increases the risk of reducing the ductility of the alloy. In this research, ironmaking slag and waste plastic were used as the input source for the surface modification of an aluminium bronze alloy. Microstructural analysis was conducted using polarised light microscopy and scanning electron microscopy (SEM) equipped with energy dispersive spectroscopy (EDS). An electrochemical corrosion test was carried out using the Tafel polarisation method, and the protection efficiency with respect to the base material was calculated. The results indicate that the uniform modified surface, which is the result of a selective diffusion process, enhances the corrosion resistance by up to 12.67%. This approach opens a new opportunity for various industrial applications at commercial scale, minimising the dependency on natural resources by transforming waste sources into protective coatings in an environmentally friendly and cost-effective way.
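
For reference, the protection efficiency in Tafel polarisation studies is commonly computed from the corrosion current densities of the untreated and treated samples. This is the standard definition, stated here as an assumption about the calculation used rather than taken from the paper:

```latex
\mathrm{PE}\,(\%) \;=\; \frac{i_{\mathrm{corr,base}} - i_{\mathrm{corr,modified}}}{i_{\mathrm{corr,base}}} \times 100
```

where the corrosion current densities are extrapolated from the anodic and cathodic Tafel branches of each polarisation curve.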

Keywords: aluminium bronze, waste-based surface modification, tafel polarisation, corrosion resistance

Procedia PDF Downloads 237
844 Optimization of the Mechanical Performance of Fused Filament Fabrication Parts

Authors: Iván Rivet, Narges Dialami, Miguel Cervera, Michele Chiumenti

Abstract:

Process parameters in Additive Manufacturing (AM) play a critical role in the mechanical performance of the final component. In order to find the input configuration that guarantees the optimal performance of the printed part, the process-performance relationship must be established. Fused Filament Fabrication (FFF) is the selected demonstrative AM technology due to its great popularity in the industrial manufacturing world. A material model that considers the different printing patterns present in an FFF part is used. A voxelized mesh is built from the manufacturing toolpaths described in the G-Code file. An Adaptive Mesh Refinement (AMR) based on the octree strategy is used in order to reduce the complexity of the mesh while maintaining its accuracy. High-fidelity and cost-efficient Finite Element (FE) simulations are performed, and the influence of key process parameters on the mechanical performance of the component is analyzed. A robust optimization process based on appropriate failure criteria is developed to find the printing direction that leads to the optimal mechanical performance of the component. The Tsai-Wu failure criterion is implemented due to the orthotropic and heterogeneous constitutive nature of FFF components and because of the differences between the strengths in tension and compression. The optimization loop implements a modified version of an Anomaly Detection (AD) algorithm and uses the computed metrics to obtain the optimal printing direction. The developed methodology is verified with a case study on an industrial demonstrator.
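
For reference, the plane-stress form of the Tsai-Wu criterion mentioned above can be written in the standard textbook form, with the usual strength-derived coefficients:

```latex
F_{1}\sigma_{1} + F_{2}\sigma_{2} + F_{11}\sigma_{1}^{2} + F_{22}\sigma_{2}^{2}
  + F_{66}\tau_{12}^{2} + 2F_{12}\sigma_{1}\sigma_{2} \;\ge\; 1,
\qquad
F_{1} = \tfrac{1}{X_t}-\tfrac{1}{X_c},\;
F_{11} = \tfrac{1}{X_t X_c},\;
F_{2} = \tfrac{1}{Y_t}-\tfrac{1}{Y_c},\;
F_{22} = \tfrac{1}{Y_t Y_c},\;
F_{66} = \tfrac{1}{S^{2}}
```

Here X and Y are the longitudinal and transverse strengths (subscripts t and c for tension and compression, both taken as positive magnitudes), S is the in-plane shear strength, and failure is predicted when the left-hand side reaches 1. The interaction coefficient F12 is usually estimated, commonly as -0.5 times the square root of F11 F22, so its value should be treated as an assumption rather than a property read directly from strength tests.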

Keywords: additive manufacturing, optimization, printing direction, mechanical performance, voxelization

Procedia PDF Downloads 66
843 Convolutional Neural Networks-Optimized Text Recognition with Binary Embeddings for Arabic Expiry Date Recognition

Authors: Mohamed Lotfy, Ghada Soliman

Abstract:

Recognizing Arabic dot-matrix digits is a challenging problem due to the unique characteristics of dot-matrix fonts, such as irregular dot spacing and varying dot sizes. This paper presents an approach for recognizing Arabic digits printed in dot-matrix format. The proposed model is based on Convolutional Neural Networks (CNN) that take the dot-matrix image as input and generate embeddings that are rounded to produce binary representations of the digits. The binary embeddings are then used to perform Optical Character Recognition (OCR) on the digit images. To overcome the challenge of the limited availability of dotted Arabic expiration date images, we developed a TrueType Font (TTF) for generating synthetic images of Arabic dot-matrix characters. The model was trained on a synthetic dataset of 3287 images and tested on 658 synthetic images, representing realistic expiration dates from 2019 to 2027 in the format yyyy/mm/dd. Our model achieved an accuracy of 98.94% on expiry date recognition in the Arabic dot-matrix format using fewer parameters and less computational resources than traditional CNN-based models. By investigating and presenting our findings comprehensively, we aim to contribute substantially to the field of OCR and pave the way for advancements in Arabic dot-matrix character recognition. Our proposed approach is not limited to Arabic dot-matrix digit recognition but can also be extended to other text recognition tasks, such as text classification and sentiment analysis.
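
A minimal sketch of a CNN whose output embedding is squashed into [0, 1] and then rounded into a binary code, written in PyTorch. The input size (32x32 grayscale crops), the 16-bit code length and the layer sizes are assumptions, not the authors' architecture.

```python
import torch
import torch.nn as nn

class BinaryEmbeddingCNN(nn.Module):
    """Small CNN that maps a digit crop to an embedding in [0, 1];
    rounding the embedding gives the binary code used for recognition."""

    def __init__(self, embedding_bits: int = 16):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
        )
        self.head = nn.Sequential(
            nn.Flatten(),
            nn.Linear(32 * 8 * 8, embedding_bits),
            nn.Sigmoid(),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.head(self.features(x))

model = BinaryEmbeddingCNN()
crops = torch.rand(4, 1, 32, 32)          # four 32x32 grayscale digit crops
codes = (model(crops) > 0.5).int()        # rounded binary embeddings
print(codes.shape)                        # torch.Size([4, 16])
```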

Keywords: computer vision, pattern recognition, optical character recognition, deep learning

Procedia PDF Downloads 97
842 Effectiveness of Lowering the Water Table as a Mitigation Measure for Foundation Settlement in Liquefiable Soils Using 1-g Scale Shake Table Test

Authors: Kausar Alam, Mohammad Yazdi, Peiman Zogh, Ramin Motamed

Abstract:

An earthquake is an unpredictable natural disaster. It can induce liquefaction, which causes considerable damage to structures, lifelines and piping systems because of ground settlement; as a result, there is great concern about how to mitigate this problem. Previous researchers have adopted different ground improvement techniques to reduce the settlement of structures during earthquakes. This study evaluates the effectiveness of lowering the water table as a technique to mitigate foundation settlement in liquefiable soil. The performance is evaluated based on the foundation settlement and the reduction of excess pore water pressure. A scaled model was prepared based on a full-scale shake table experiment conducted at the University of California, San Diego (UCSD). The model ground consists of three soil layers with relative densities of 55%, 45%, and 90%, respectively, and a shallow foundation is seated on an unsaturated crust layer. After preparation of the model ground, the water table was set at 45, 40, and 35 cm (from the bottom). The input motions were then applied for 10 seconds, with a peak acceleration of 0.25 g and a constant frequency of 2.73 Hz. Based on the experimental results, the effectiveness of lowering the water table in reducing the foundation settlement and the excess pore water pressure was evident: the foundation settlement was reduced from 50 mm to 5 mm. In addition, lowering the water table as a mitigation measure is a cost-effective way to decrease liquefaction-induced building settlement.

Keywords: foundation settlement, ground water table, liquefaction, shake table test

Procedia PDF Downloads 116
841 Ecological Relationships Between Material, Colonizing Organisms, and Resulting Performances

Authors: Chris Thurlbourne

Abstract:

Due to the continual demand for material to build with, and the limited environmental credentials of 'normal' building materials, there is a need to look at new and reconditioned material types, both biogenic and non-biogenic, and a field of research that accompanies this. This research focuses on biogenic and non-biogenic material engineering and on the impact of our environment on new and reconditioned material types. In the building industry and all the industries involved in constructing our built environment, building materials can be broadly categorized into two types, those with biogenic and those with non-biogenic material properties, and both play significant roles in shaping our built environment. Regardless of their properties, all material types originate from the earth, although many are modified through processing to provide resistance to the forces of nature, be it rain, wind, sun, gravity, or whatever the local environmental conditions throw at us. Modifications are made to offer benefits in endurance, resistance, malleability in handling (building with), and ergonomic value, in all types of building material. We assume control of all building materials through rigorous quality control specifications and regulations to ensure materials perform under specific constraints. Yet materials confront an external environment that is not controlled, with live forces undetermined, to which materials naturally act and react through weathering, patination and discoloring, and through natural chemical reactions such as rusting. The purpose of the paper is to present recent research that explores the after-life of specific new and reconditioned biogenic and non-biogenic material types, and how an understanding of materials' natural processes of transformation when exposed to the external climate can inform initial design decisions. With qualities that are received in a transient and contingent manner, the ecological relationships between a material, the colonizing organisms and the resulting performances invite opportunities for new design explorations for the benefit of both the needs of human society and the needs of our natural environment. The research pursues design for the benefit of both, engaging in biogenic and non-biogenic material engineering while embracing the continual demand for colonization, human and environmental, and the aptitude of a material to be colonized by one or several groups of living organisms without necessarily undergoing any severe deterioration, instead embracing weathering, patination and discoloring while establishing new habitat. The research follows iterative prototyping processes in which knowledge has been accumulated via explorations of specific material performances, from the laboratory to construction mock-ups, focusing on the architectural qualities embedded in the control of production techniques and on facilitating longer-term patinas of material surfaces to extend the aesthetic beyond common judgments. Experiments therefore focus on how the inherent material qualities drive a design brief toward specific investigations exploring the aesthetics induced through production, patinas and colonization obtained over time during exposure to, and interaction with, external climate conditions.

Keywords: biogenic and non-biogenic, natural processes of transformation, colonization, patina

Procedia PDF Downloads 88
840 The Immediate Effects of Thrust Manipulation for Thoracic Hyperkyphosis

Authors: Betul Taspinar, Eda O. Okur, Ismail Saracoglu, Ismail Okur, Ferruh Taspinar

Abstract:

Thoracic hyperkyphosis, a well-known spinal phenomenon, refers to an excessive curvature (>40 degrees) of the thoracic spine. The aim of this study was to explore the effectiveness of thrust manipulation on thoracic spine alignment. Thirty-one young adults with hyperkyphosis diagnosed with the Spinal Mouse® device were randomly assigned to either the thrust manipulation group (n=16, 11 female, 5 male) or the sham manipulation group (n=15, 8 female, 7 male). Thrust and sham manipulations were performed by a blinded physiotherapist who is a certified expert in musculoskeletal physiotherapy. The thoracic kyphosis angle was measured after the interventions via the Spinal Mouse®. The Wilcoxon test was used to analyse the data obtained before and after the manipulation for each group, whereas the Mann-Whitney U test was used to compare the groups. The mean baseline thoracic kyphosis angles in the thrust and sham groups were 50.69° ± 7.73 and 48.27° ± 6.43, respectively, with no statistically significant difference between groups (p=0.51). After the interventions, the mean thoracic kyphosis angles in the thrust and sham groups were 44.06° ± 6.99 and 48.93° ± 6.57, respectively (p=0.03). There was no statistically significant difference between before and after the intervention in the sham group (p=0.33), while the mean thoracic kyphosis angle in the thrust group decreased significantly (p=0.00). Thrust manipulation can attenuate thoracic hyperkyphosis immediately in young adults, beyond a placebo effect. Manipulation might provide accurate proprioceptive (sensory) input to the spinal joints and reduce kyphosis by restoring normal segment mobility. Therefore, thoracic manipulation might be included in physiotherapy programs to treat hyperkyphosis.
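
A minimal sketch of the within-group and between-group tests described above, using SciPy. The kyphosis values below are invented placeholders with the same group sizes, not the study data.

```python
import numpy as np
from scipy.stats import mannwhitneyu, wilcoxon

rng = np.random.default_rng(3)
thrust_pre = rng.normal(50.7, 7.7, 16)
thrust_post = thrust_pre - rng.normal(6.5, 2.0, 16)   # assumed improvement
sham_pre = rng.normal(48.3, 6.4, 15)
sham_post = sham_pre + rng.normal(0.5, 2.0, 15)       # assumed no real change

# Within-group change (paired) and between-group comparison of post values.
print("thrust pre vs post:  ", wilcoxon(thrust_pre, thrust_post).pvalue)
print("sham pre vs post:    ", wilcoxon(sham_pre, sham_post).pvalue)
print("post, thrust vs sham:", mannwhitneyu(thrust_post, sham_post).pvalue)
```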

Keywords: hyperkyphosis, manual therapy, spinal mouse, physiotherapy

Procedia PDF Downloads 348
839 Fluorescence Effect of Carbon Dots Modified with Silver Nanoparticles

Authors: Anna Piasek, Anna Szymkiewicz, Gabriela Wiktor, Jolanta Pulit-Prociak, Marcin Banach

Abstract:

Carbon dots (CDs) have great potential for application in many fields of science. They are characterized by fluorescent properties that can be manipulated, and the nanomaterial has many advantages in addition to its unique properties: CDs can be obtained easily, they undergo surface functionalization in a simple way, and a wide range of raw materials can be used for their synthesis. An interesting possibility is the use of numerous waste materials of natural origin. In the research presented here, the synthesis of CDs was carried out according to the principles of green chemistry. Beet molasses was used as the natural raw material; its high sugar content makes it an excellent high-carbon precursor for obtaining CDs. To increase the fluorescence effect, we modified the surface of the CDs with silver nanoparticles (Ag-CDs). The process of obtaining the CDs was based on the hydrothermal method with the application of microwave radiation, while the silver nanoparticles were formed via the chemical reduction method. The synthesis plans were designed using the Design of Experiments (DoE) method. Variable process parameters such as the concentration of beet molasses, the temperature and the concentration of nanosilver were used in these syntheses; they affected the obtained properties and particle parameters. The Ag-CDs were analyzed by UV-Vis spectroscopy, the fluorescence properties and the selection of the appropriate excitation wavelength were assessed by spectrofluorimetry, and particle sizes were checked using the DLS method. The influence of the input parameters on the obtained results was also studied.

Keywords: fluorescence, modification, nanosilver, molasses, Green chemistry, carbon dots

Procedia PDF Downloads 86
838 A Study on Green Building Certification Systems within the Context of Anticipatory Systems

Authors: Taner Izzet Acarer, Ece Ceylan Baba

Abstract:

This paper examines green building certification systems and their current processes in comparison with anticipatory systems. The rapid growth of the human population and the depletion of natural resources are causing irreparable damage to urban and natural environments. In this context, the concept of 'sustainable architecture' emerged in the 20th century so as to establish and maintain standards for livable urban spaces, to improve the quality of urban life, and to preserve natural resources for future generations. The construction industry is responsible for a large part of this resource consumption, and it is believed that the 'green building' designs emerging in the construction industry can reduce environmental problems and contribute to sustainable development around the world. A building must meet a specific set of criteria, set forth through various certification systems, in order to be eligible for designation as a green building. It is disputable whether the methods used by green building certification systems today truly serve the purpose of creating a sustainable world. Accordingly, this study investigates the rating systems used by the most popular green building certification programs, including LEED (Leadership in Energy and Environmental Design), BREEAM (Building Research Establishment Environmental Assessment Method) and DGNB (Deutsche Gesellschaft für Nachhaltiges Bauen System), in terms of 'anticipatory systems', in accordance with the certification processes and their goals, while discussing their contribution to architecture. The basic methodology of the study is as follows. First, a brief historical and literature review of green buildings and certification systems is presented. Second, the processes of green building certification systems are examined with the help of anticipatory systems. Anticipatory systems are systems designed to generate action-oriented projections and to forecast potential side effects using the most current data; they pull the future into the present and take action based on future predictions. Although they do not claim to see into the future, they can provide foresight data. When shaping the foresight data, anticipatory systems use feedforward instead of feedback, enabling them to forecast the system's behavior and potential side effects by establishing a correlation between the system's present and past behavior and projected results. In this study, the goals and current status of the LEED, BREEAM and DGNB rating systems, which were created using the feedback technique, are examined and presented in a chart. In addition, by examining these rating systems with an anticipatory system using the feedforward method, the negative influences of the potential side effects on the purpose and current status of the rating systems are shown in another chart. By comparing the two sets of data, the findings show that the rating systems are used for goals different from the purposes they aim for. In conclusion, the side effects of green building certification systems are identified using anticipatory system models.

Keywords: anticipatory systems, BREEAM, certificate systems, DGNB, green buildings, LEED

Procedia PDF Downloads 221
837 Accelerating Quantum Chemistry Calculations: Machine Learning for Efficient Evaluation of Electron-Repulsion Integrals

Authors: Nishant Rodrigues, Nicole Spanedda, Chilukuri K. Mohan, Arindam Chakraborty

Abstract:

A crucial objective in quantum chemistry is the computation of the energy levels of chemical systems. This task requires electron-repulsion integrals as inputs, and the steep computational cost of evaluating these integrals poses a major numerical challenge for the efficient implementation of quantum chemical software. This work presents a moment-based machine-learning approach for the efficient evaluation of electron-repulsion integrals. These integrals were approximated using linear combinations of a small number of moments, and machine learning algorithms were applied to estimate the coefficients in the linear combination. A random forest approach was used to identify promising features using recursive feature elimination, which performed best for learning the sign of each coefficient but not the magnitude. A neural network with two hidden layers was then used to learn the coefficient magnitudes, along with an iterative feature masking approach to perform input vector compression, identifying a small subset of orbitals whose coefficients are sufficient for the quantum state energy computation. Finally, a small ensemble of neural networks (with a median rule for decision fusion) was shown to improve results when compared to a single network.
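
A minimal sketch of the two learning stages described above (recursive feature elimination with a random forest, then an ensemble of two-hidden-layer networks fused by a median rule), using scikit-learn on synthetic data. The feature counts, targets and network sizes are assumptions, not the authors' setup.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.feature_selection import RFE
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(7)
X = rng.normal(size=(500, 40))                               # stand-in orbital features
signs = (X[:, 0] + 0.5 * X[:, 3] > 0).astype(int)            # synthetic sign target
magnitudes = np.abs(X[:, 0] + 0.5 * X[:, 3]) + rng.normal(0, 0.05, 500)

# Recursive feature elimination with a random forest to find promising features.
selector = RFE(RandomForestClassifier(n_estimators=100, random_state=0),
               n_features_to_select=8).fit(X, signs)
X_sel = selector.transform(X)

# Small ensemble of two-hidden-layer networks; the median fuses their predictions.
ensemble = [MLPRegressor(hidden_layer_sizes=(32, 16), max_iter=2000,
                         random_state=seed).fit(X_sel, magnitudes)
            for seed in range(5)]
pred = np.median([net.predict(X_sel[:3]) for net in ensemble], axis=0)
print(np.round(pred, 3))
```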

Keywords: quantum energy calculations, atomic orbitals, electron-repulsion integrals, ensemble machine learning, random forests, neural networks, feature extraction

Procedia PDF Downloads 119
836 Conceptual Solution and Thermal Analysis of the Final Cooling Process of Biscuits in One Confectionary Factory in Serbia

Authors: Duško Salemović, Aleksandar Dedić, Matilda Lazić, Dragan Halas

Abstract:

The paper presents a conceptual solution for the final cooling of the chocolate dressing of biscuits in a confectionery factory in Serbia. The proposed conceptual solution was derived from the desired technological process of the final cooling of the biscuits and from the required process parameters to be achieved, which were an integral part of the project task. The desired process parameters for achieving proper hardening and coating formation are the amount of heat exchanged per unit time between the two media (air and chocolate dressing), the air speed inside the tunnel cooler, and the surface area of all biscuits in contact with the air; these parameters were calculated in the paper. The final cooling of the chocolate dressing on the biscuits can be optimized by changing the process parameters and the dimensions of the tunnel cooler and looking for appropriate values for them. Accurate temperature predictions and fluid flow analysis can be conducted using heat balance and flow balance equations, bearing in mind the theory of similarity. Furthermore, some parameters were adopted from previous technological processes, such as the inlet temperature of the biscuits and the inlet air temperature. A thermal calculation was carried out, and it demonstrated that the percentage error between the air-chocolate contact surface obtained from the heat balance and that obtained geometrically from the proposed conceptual solution does not exceed 0.67%, which is very good agreement. This ensures the quality of the cooling process of the chocolate dressing applied to the biscuit and the hardness of its coating.
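
For reference, the kind of heat balance implied above equates the heat released by the chocolate dressing to the heat carried away by the cooling air and to the convective exchange over the contact surface. This is a generic sensible-heat formulation, neglecting the latent heat of solidification for simplicity, and is stated as an assumption about the calculation rather than taken from the paper:

```latex
\dot{Q} \;=\; \dot{m}_{c}\, c_{p,c}\, (T_{c,\mathrm{in}} - T_{c,\mathrm{out}})
        \;=\; \dot{m}_{a}\, c_{p,a}\, (T_{a,\mathrm{out}} - T_{a,\mathrm{in}})
        \;=\; h\, A\, \Delta T_{m},
```

where A is the biscuit surface in contact with the air and ΔT_m a mean temperature difference between the two media; solving the last equality for A gives the contact surface that can be compared with the value obtained geometrically from the tunnel cooler layout.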

Keywords: chocolate dressing, air, cooling, heat balance

Procedia PDF Downloads 84
835 Design of SAE J2716 Single Edge Nibble Transmission Digital Sensor Interface for Automotive Applications

Authors: Jongbae Lee, Seongsoo Lee

Abstract:

Modern sensors often embed a small digital controller for sensor control, value calibration, and signal processing. These sensors require digital data communication with host microprocessors, but conventional digital communication protocols are too heavy for price reduction. The SAE J2716 SENT (single edge nibble transmission) protocol transmits direct digital waveforms instead of complicated analog modulated signals. In this paper, a SENT interface is designed in Verilog HDL (hardware description language) and implemented on an FPGA (field-programmable gate array) evaluation board. The designed SENT interface consists of a frame encoder/decoder, a configuration register, a tick period generator, a CRC (cyclic redundancy code) generator/checker, and TX/RX (transmission/reception) buffers. The frame encoder/decoder is implemented as a finite state machine and controls the whole SENT interface. The configuration register contains various parameters such as the operation mode, tick length, CRC option, pause pulse option, and number of data nibbles. The tick period generator generates tick signals from the input clock, the CRC generator/checker generates or checks the CRC in the SENT data frame, and the TX/RX buffers store transmitted and received data. The designed SENT interface can send or receive digital data at 25~65 kbps with a 3 us tick. Synthesized in a 0.18 um fabrication technology, it is implemented in about 2,500 gates.
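
A minimal sketch of how SENT pulse lengths relate to the transmitted nibbles, in Python rather than Verilog: each nibble of value v is sent as a pulse of 12 + v ticks and the sync pulse is 56 ticks, which matches the SENT convention. The CRC value below is a placeholder rather than a computed checksum, and the frame layout (status, six data nibbles, CRC) is one common configuration.

```python
SYNC_TICKS = 56          # calibration/sync pulse length in ticks
NIBBLE_OFFSET = 12       # a nibble of value v is sent as a pulse of 12 + v ticks

def frame_ticks(status: int, data_nibbles: list[int], crc: int) -> list[int]:
    """Tick counts for one SENT frame: sync, status, data nibbles, CRC.

    CRC computation is omitted here; the caller supplies it (placeholder).
    """
    nibbles = [status, *data_nibbles, crc]
    assert all(0 <= n <= 0xF for n in nibbles), "nibbles are 4-bit values"
    return [SYNC_TICKS] + [NIBBLE_OFFSET + n for n in nibbles]

# Two 12-bit sensor values packed as six data nibbles, 3 us tick as in the paper.
ticks = frame_ticks(status=0x0, data_nibbles=[0xA, 0x3, 0x7, 0x1, 0xF, 0x2], crc=0x5)
print(ticks)                                    # [56, 12, 22, 15, 19, 13, 27, 14, 17]
print("frame duration:", sum(ticks) * 3, "us")  # at a 3 us tick
```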

Keywords: digital sensor interface, SAE J2716, SENT, verilog HDL

Procedia PDF Downloads 305
834 Use of Chlorophyll Meters to Assess In-Season Wheat Nitrogen Fertilizer Requirements in the Southern San Joaquin Valley

Authors: Brian Marsh

Abstract:

Nitrogen fertilizer is the most used and often the most mismanaged nutrient input. Nitrogen management has tremendous implications for crop productivity, quality and environmental stewardship, and sufficient nitrogen is needed for optimum yield and quality. Soil and in-season plant tissue testing for nitrogen status, however, are time-consuming and expensive, whereas real-time sensing of plant nitrogen status can be a useful tool in managing nitrogen inputs. The objectives of this project were to assess the reliability of remotely sensed, non-destructive plant nitrogen measurements compared to wet chemistry data from sampled plant tissue, to develop in-season nitrogen recommendations based on remotely sensed data for improved nitrogen use efficiency, and to assess the potential for determining yield and quality from remotely sensed data. Very good correlations were observed between early-season remotely sensed crop nitrogen status and plant nitrogen concentrations and the subsequent in-season fertilizer recommendations. The transmittance/absorbance type meters gave the most accurate readings. The early in-season fertilizer recommendation would be to apply 40 kg nitrogen per hectare plus 16 kg nitrogen per hectare for each unit of difference between the crop and the reference area measured with the SPAD meter, or 25 kg plus 13 kg per hectare for each unit of difference measured with the CCM 200. Once the crop was sufficiently fertilized, meter readings became inconclusive and were of no benefit for determining nitrogen status, silage yield and quality, or grain yield and protein.
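
The recommendation rule stated above can be written directly as a small helper. The base rates and slopes come from the abstract; treating the reference-minus-crop reading difference as the input, and clipping it at zero, are the only assumptions.

```python
def nitrogen_recommendation_kg_per_ha(reading_diff: float, meter: str = "SPAD") -> float:
    """In-season N rate from the difference between reference and crop readings.

    SPAD: 40 kg/ha base + 16 kg/ha per unit of difference.
    CCM 200: 25 kg/ha base + 13 kg/ha per unit of difference.
    """
    base, slope = (40.0, 16.0) if meter == "SPAD" else (25.0, 13.0)
    return base + slope * max(reading_diff, 0.0)

print(nitrogen_recommendation_kg_per_ha(2.5, "SPAD"))      # 80.0
print(nitrogen_recommendation_kg_per_ha(2.5, "CCM 200"))   # 57.5
```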

Keywords: wheat, nitrogen fertilization, chlorophyll meter

Procedia PDF Downloads 396
833 Developing a Framework to Aid Sustainable Assessment in Indian Buildings

Authors: P. Amarnath, Albert Thomas

Abstract:

Buildings are among the major consumers of energy and resources, urging designers, architects and policy makers to place a great deal of effort into achieving and implementing sustainable building strategies in construction. Green building rating systems help a great deal in this by measuring the effectiveness of these strategies, by tracking the improvement of building performance from social, environmental and economic perspectives, and by guiding the construction of new sustainable buildings. However, for a country like India, the enormous population and its rapid rate of growth impose an increasing burden on the country's limited and continuously degrading natural resource base, which also includes the land available for construction. In general, the number of sustainability-rated buildings in India is very small, primarily due to the complexity and rigid nature of the assessment systems and regulations, which restrict stakeholders and designers in the proper implementation and utilization of these rating systems. This paper aims to introduce a data-driven and user-friendly framework that cross-compares the prominent green building rating systems, such as LEED, BREEAM and GRIHA, and subsequently helps users rate their proposed building design as per the requirements of these assessment frameworks. The framework is validated using input data collected from green buildings constructed globally. The proposed system has the potential to encourage users to test the efficiency of various sustainable construction practices and thereby promote more sustainable buildings in the country.

Keywords: BREEAM, GRIHA, green building rating systems, LEED, sustainable buildings

Procedia PDF Downloads 141