Search results for: tool validation
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 6154

4234 Development of GIS-Based Geotechnical Guidance Maps for Prediction of Soil Bearing Capacity

Authors: Q. Toufeeq, R. Kauser, U. R. Jamil, N. Sohaib

Abstract:

Foundation design of a structure requires soil investigation to avoid failures due to settlement. This soil investigation is expensive and time-consuming. Development of new residential societies involves extensive leveling of large sites, which is accompanied by heavy land filling. Poor landfill practices at deep depths cause differential settlement and consolidation of the underlying soil that sometimes result in the collapse of structures. The extent of filling remains unknown to the individual developer unless a soil investigation is carried out. Soil investigation cannot be performed on every available site due to the costs involved. However, a fair estimate of bearing capacity can be made if such tests have already been done in the surrounding areas. Geotechnical guidance maps can provide a fair assessment of soil properties. Previously, GIS-based approaches have been used to develop maps using extrapolation and interpolation techniques for bearing capacities, underground recharge, soil classification, geological hazards, landslide hazards, socio-economic factors, and soil liquefaction mapping. Standard penetration test (SPT) data of surrounding sites were already available. Google Earth was used for digitization of the collected data. A few points were reserved for data calibration and validation. The resultant geographic information system (GIS)-based guidance maps are helpful for anticipating bearing capacity in the real estate industry.
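As an illustration of the interpolation step (the keywords list inverse distance weighted and radial basis function methods), the following is a minimal sketch of IDW interpolation of SPT-derived bearing capacities onto query locations; the borehole coordinates, values, and power parameter are hypothetical.

```python
import numpy as np

def idw_interpolate(xy_known, values, xy_query, power=2.0, eps=1e-12):
    """Inverse distance weighted (IDW) interpolation of point data onto query locations."""
    xy_known = np.asarray(xy_known, dtype=float)
    values = np.asarray(values, dtype=float)
    estimates = np.empty(len(xy_query))
    for i, q in enumerate(np.asarray(xy_query, dtype=float)):
        d = np.linalg.norm(xy_known - q, axis=1)
        if np.any(d < eps):                      # query point coincides with a borehole
            estimates[i] = values[np.argmin(d)]
            continue
        w = 1.0 / d**power                       # closer boreholes get larger weights
        estimates[i] = np.sum(w * values) / np.sum(w)
    return estimates

# Hypothetical SPT-derived bearing capacities (kPa) at three borehole locations (x, y in metres)
boreholes = [(0.0, 0.0), (100.0, 0.0), (0.0, 100.0)]
capacities = [150.0, 180.0, 120.0]
print(idw_interpolate(boreholes, capacities, [(50.0, 50.0)]))
```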

Keywords: bearing capacity, soil classification, geographical information system, inverse distance weighted, radial basis function

Procedia PDF Downloads 135
4233 Constraining the Potential Nickel Laterite Area Using Geographic Information System-Based Multi-Criteria Rating in Surigao Del Sur

Authors: Reiner-Ace P. Mateo, Vince Paolo F. Obille

Abstract:

The traditional method of classifying potential mineral resources requires a significant amount of time and money. This paper presents an alternative way to classify potential mineral resources with a GIS application in Surigao del Sur. The three (3) analog map data inputs integrated into GIS are the geologic map, topographic map, and land cover/vegetation map. The indicators used in the classification of potential nickel laterite, integrated from the analog map data inputs, are a geologic indicator, namely the presence of ultramafic rock from the geologic map; a slope indicator and the presence of plateau edges from the topographic map; and areas of forest land, grassland, and shrubland from the land cover/vegetation map. The mineral potential of the area was classified from low to very high potential. The produced mineral potential classification map of Surigao del Sur has an estimated 4.63% low nickel laterite potential, 42.15% medium nickel laterite potential, 43.34% high nickel laterite potential, and 9.88% very high nickel laterite potential within its ultramafic terrains. For validation, the produced map was compared with known occurrences of nickel laterite in the area using a nickel mining tenement map together with the application of remote sensing. Three (3) prominent nickel mining companies were delineated in the study area. The generated potential classification map of nickel laterite in Surigao del Sur may aid mining companies currently in the exploration phase in the study area. Also, the currently operating nickel mines in the study area can help validate the reliability of the mineral classification map produced.
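As a simple illustration of the multi-criteria rating step, the sketch below combines indicator rasters (geology, slope, land cover) with weights and classifies the result into the four potential classes; the scores, weights, and class breaks are hypothetical, not the study's actual rating scheme.

```python
import numpy as np

# Hypothetical indicator rasters scored 0-3 (higher = more favourable for nickel laterite)
ultramafic = np.array([[3, 3, 0], [3, 0, 0], [0, 0, 0]])   # geologic indicator
slope      = np.array([[2, 3, 1], [3, 2, 1], [1, 1, 0]])   # topographic indicator
vegetation = np.array([[2, 2, 1], [3, 2, 0], [1, 0, 0]])   # land cover indicator

# Hypothetical weights for the multi-criteria rating (sum to 1)
weights = {"geology": 0.5, "slope": 0.3, "vegetation": 0.2}
rating = (weights["geology"] * ultramafic
          + weights["slope"] * slope
          + weights["vegetation"] * vegetation)

# Classify the continuous rating into the four potential classes used in the paper
classes = np.digitize(rating, bins=[0.75, 1.5, 2.25])  # 0=low, 1=medium, 2=high, 3=very high
print(classes)
```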

Keywords: mineral potential classification, nickel laterites, GIS, remote sensing, Surigao del Sur

Procedia PDF Downloads 123
4232 Experimental Set-up for the Thermo-Hydric Study of a Wood Chips Bed Crossed by an Air Flow

Authors: Dimitri Bigot, Bruno Malet-Damour, Jérôme Vigneron

Abstract:

Many studies have addressed the use of bio-based materials in buildings. The goal is to reduce a building's environmental footprint by analyzing its life cycle, which can lead to lower carbon emissions or energy consumption. A previous work proposed to numerically study the feasibility of using wood chips to regulate relative humidity inside a building. It showed the capability of a wood chips bed to regulate humidity inside the building and to improve thermal comfort, and thus potentially reduce building energy consumption. However, it also showed that some physical parameters of the wood chips must be identified to validate the proposed model and the associated results. This paper presents an experimental set-up able to study such a wood chips bed under different solicitations. It consists of a simple duct filled with wood chips and crossed by an air flow with variable temperature and relative humidity. Its main objective is to study the thermal behavior of the wood chips bed by controlling the temperature and relative humidity of the air that enters it and by observing the same parameters at the output. First, the experimental set-up is described in light of previous results. A focus is made on the particular properties that have to be characterized. Then, some case studies are presented in relation to the previous results in order to identify the key physical properties. Finally, the feasibility of the proposed technology is discussed, and some model validation paths are given.

Keywords: wood chips bed, experimental set-up, bio-based material, desiccant, relative humidity, water content, thermal behaviour, air treatment

Procedia PDF Downloads 122
4231 Acoustic Modeling of a Data Center with a Hot Aisle Containment System

Authors: Arshad Alfoqaha, Seth Bard, Dustin Demetriou

Abstract:

A new multi-physics acoustic modeling approach using ANSYS Mechanical FEA and FLUENT CFD methods is developed for modeling servers mounted to racks, such as IBM Z and IBM Power Systems, in data centers. This new approach allows users to determine the thermal and acoustic conditions that people are exposed to within the data center. The sound pressure level (SPL) exposure for a human working inside a hot aisle containment system inside the data center is studied. The SPL is analyzed at the noise source, at the human body, on the rack walls, on the containment walls, and on the ceiling and flooring plenum walls. In the acoustic CFD simulation, it is assumed that a four-inch diameter sphere with monopole acoustic radiation, placed in the middle of each rack, provides a single-source representation of all noise sources within the rack. The Ffowcs Williams and Hawkings (FWH) acoustic model is employed. The target frequency is 1000 Hz, and the total simulation time for the transient analysis is 1.4 seconds, with a very small time step of 3e-5 seconds and 10 iterations to ensure convergence and accuracy. A User Defined Function (UDF) is developed to accurately simulate the acoustic noise source, and a dynamic mesh is applied to capture acoustic wave propagation. Initial validation of the acoustic CFD simulation using a closed-form solution for the spherical propagation of an acoustic point source is performed.
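The closed-form check mentioned above is essentially spherical spreading of a point monopole: pressure amplitude decays as 1/r, so SPL drops by 20·log10(r2/r1) between two radii (about 6 dB per doubling of distance). A minimal sketch of that reference solution, with a hypothetical source power and observer distances:

```python
import numpy as np

P_REF = 2e-5  # reference pressure for SPL in air, Pa

def spl_point_source(acoustic_power_w, r_m, rho=1.2, c=343.0):
    """SPL of an ideal monopole radiating acoustic_power_w watts, at distance r_m (free field)."""
    intensity = acoustic_power_w / (4.0 * np.pi * r_m**2)   # spherical spreading of acoustic power
    p_rms = np.sqrt(intensity * rho * c)                    # far-field relation between intensity and pressure
    return 20.0 * np.log10(p_rms / P_REF)

# Hypothetical source power and observer distances
for r in (0.5, 1.0, 2.0):
    print(f"r = {r:3.1f} m : SPL = {spl_point_source(1e-3, r):5.1f} dB")
# Each doubling of distance lowers SPL by about 6 dB, the expected free-field behaviour.
```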

Keywords: data centers, FLUENT, acoustics, sound pressure level, SPL, hot aisle containment, IBM

Procedia PDF Downloads 176
4230 Simulation of Glass Breakage Using Voronoi Random Field Tessellations

Authors: Michael A. Kraus, Navid Pourmoghaddam, Martin Botz, Jens Schneider, Geralt Siebert

Abstract:

Fragmentation analysis of tempered glass gives insight into the quality of the tempering process and also defines a certain degree of safety. Different standards, such as the European EN 12150-1 or the American ASTM C 1048/CPSC 16 CFR 1201, define a minimum number of fragments required for soda-lime safety glass on the basis of fragmentation test results for classification. This work presents an approach for glass breakage pattern prediction using a Voronoi tessellation over random fields. The random Voronoi tessellation is trained with and validated against data from several breakage patterns. The fragments in observation areas of 50 mm x 50 mm were used for training and validation. All glass specimens used in this study were commercially available soda-lime glasses at three different thickness levels of 4 mm, 8 mm and 12 mm. The results of this work form a Bayesian framework for the training and prediction of breakage patterns of tempered soda-lime glass using a Voronoi random field tessellation. Uncertainties occurring in this process can be well quantified, and several statistical measures of the pattern can be preserved with this method. Within this work, it was found that different random fields as the basis for the Voronoi tessellation lead to differently well-fitted statistical properties of the glass breakage patterns. As the methodology is derived and kept general, the framework could also be applied to other random tessellations and crack pattern modelling purposes.
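As a minimal illustration of the tessellation idea, the sketch below generates a Voronoi tessellation from random fragmentation nuclei in a 50 mm x 50 mm window; the seed density is hypothetical and would in practice be controlled by the trained random field rather than a uniform distribution.

```python
import numpy as np
from scipy.spatial import Voronoi

rng = np.random.default_rng(seed=1)

# Hypothetical fragmentation nuclei in a 50 mm x 50 mm observation window
n_fragments = 60
seeds = rng.uniform(0.0, 50.0, size=(n_fragments, 2))

vor = Voronoi(seeds)  # each Voronoi cell represents one glass fragment

# A simple statistical measure of the pattern: number of finite (bounded) cells
finite_cells = [r for r in vor.regions if r and -1 not in r]
print(f"{len(finite_cells)} finite fragments out of {n_fragments} seeds")
```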

Keywords: glass breakage prediction, Voronoi Random Field Tessellation, fragmentation analysis, Bayesian parameter identification

Procedia PDF Downloads 160
4229 A Robotic Cube for Preschool Children to Acquire Mathematical and Colour Concepts

Authors: Ahmed Amin Mousa, Tamer M. Ismail, M. Abd El Salam

Abstract:

This work presents a robot called the Conceptual Robotic Cube, CR-Cube. The robot can be used as an educational tool for children from the age of three. It has a cube shape fitted with a camera-based colour sensor. In addition, it contains four wheels to move smoothly. The researchers prepared a questionnaire to measure the efficiency of the robot. The design and the questionnaire were presented to 11 experts, who agreed that the robot is appropriate for preschool children to learn numbers and colours.

Keywords: CR-Cube, robotic cube, conceptual robot, conceptual cube, colour concept, early childhood education

Procedia PDF Downloads 410
4228 Aeroelastic Stability Analysis in Turbomachinery Using Reduced Order Aeroelastic Model Tool

Authors: Chandra Shekhar Prasad, Ludek Pesek Prasad

Abstract:

In present-day aero engines, turboprop propellers, gas turbines and steam turbines, fan blades and low-pressure blades are getting bigger, lighter and thus more flexible. Therefore, flutter, forced blade response and vibration-related failure of high aspect ratio blades are of main concern for designers and need to be addressed properly in order to achieve a successful component design. At the preliminary design stage, a large number of design iterations is needed to achieve a flutter-free, safe design. Most of the numerical methods used for aeroelastic analysis are field-based methods such as the finite difference method, the finite element method, the finite volume method, or coupled schemes. These numerical schemes solve the coupled fluid-flow and structural equations based on the full Navier-Stokes (NS) equations along with structural mechanics equations. Such schemes provide very accurate results if modeled properly; however, they are computationally very expensive and need large computing resources along with considerable personal expertise. Therefore, they are not the first choice for aeroelastic analysis during the preliminary design phase. A reduced order aeroelastic model (ROAM) with acceptable accuracy and fast execution is more in demand at this stage. Similar ROAMs are being used by other researchers for aeroelastic and forced response analysis of turbomachinery. In the present paper, a new medium-fidelity ROAM is successfully developed and implemented in a numerical tool to simulate aeroelastic stability phenomena in turbomachinery as well as in flexible wings. In the present work, a hybrid flow solver based on a viscous-inviscid coupled 3D panel method (PM) and a 3D discrete vortex particle method (DVM) is developed; viscous parameters are estimated using a boundary layer (BL) approach. This method can simulate flow separation and is a good compromise between accuracy and speed compared to CFD. In the second phase of the research work, the flow solver (PM) will be coupled with a reduced-order, non-linear beam element method (BEM) based FEM structural solver (with multibody capabilities) to perform complete aeroelastic simulations of steam turbine bladed disks, propellers, fan blades, aircraft wings, etc. A partitioned coupling approach is used for the fluid-structure interaction (FSI). The numerical results are compared with experimental data for different test cases; for the blade cascade test case, the experimental data are obtained from in-house lab experiments at IT CAS. Furthermore, the results from the new aeroelastic model will be compared with classical CFD-CSD based aeroelastic models. The proposed methodology for the aeroelastic stability analysis of gas turbine or steam turbine blades, propellers or fan blades will provide researchers and engineers a fast, cost-effective and efficient tool for aeroelastic (classical flutter) analysis of different designs at the preliminary design stage, where large numbers of design iterations are required in a short time frame.
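As a generic illustration of the partitioned coupling approach mentioned above, the sketch below runs a fixed-point (Gauss-Seidel style) fluid-structure iteration for one step; the solver interfaces, coefficients and relaxation factor are hypothetical placeholders, not the authors' actual PM/BEM implementation.

```python
def fluid_solver(displacement):
    """Placeholder: aerodynamic load on the interface for a given blade deformation."""
    return -50.0 * displacement + 1.0          # hypothetical linearised aerodynamic response

def structure_solver(load):
    """Placeholder: interface displacement of the blade under the given load."""
    return load / 500.0                        # hypothetical static stiffness

def partitioned_fsi_step(u0=0.0, relax=0.5, tol=1e-8, max_iter=50):
    """One time step of partitioned FSI using under-relaxed fixed-point iteration."""
    u = u0
    for k in range(max_iter):
        f = fluid_solver(u)                     # 1) fluid solve with current interface shape
        u_new = structure_solver(f)             # 2) structural solve with updated loads
        if abs(u_new - u) < tol:                # 3) check interface convergence
            return u_new, k + 1
        u = relax * u_new + (1.0 - relax) * u   # 4) under-relax to stabilise the coupling
    raise RuntimeError("FSI coupling did not converge")

print(partitioned_fsi_step())
```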

Keywords: aeroelasticity, beam element method (BEM), discrete vortex particle method (DVM), classical flutter, fluid-structure interaction (FSI), panel method, reduced order aeroelastic model (ROAM), turbomachinery, viscous-inviscid coupling

Procedia PDF Downloads 266
4227 Multi-Criteria Decision Making Tool for Assessment of Biorefinery Strategies

Authors: Marzouk Benali, Jawad Jeaidi, Behrang Mansoornejad, Olumoye Ajao, Banafsheh Gilani, Nima Ghavidel Mehr

Abstract:

The Canadian forest industry is seeking to identify and implement transformational strategies for enhanced financial performance through the emerging bioeconomy or, more specifically, through the concept of the biorefinery. For example, processing forest residues or the surplus of biomass available on mill sites for the production of biofuels, biochemicals and/or biomaterials is one of the attractive strategies, along with traditional wood and paper products and cogenerated energy. There are many possible process-product biorefinery pathways, each associated with specific product portfolios with different levels of risk. Thus, it is not obvious which unique strategy the forest industry should select and implement. Therefore, there is a need for analytical and design tools that enable evaluating biorefinery strategies based on a set of criteria considering a perspective of sustainability over the short and long terms, while selecting the existing core products as well as selecting the new product portfolio. In addition, it is critical to assess the manufacturing flexibility to internalize the risk from market price volatility of each targeted bio-based product in the product portfolio, prior to investing heavily in any biorefinery strategy. The proposed paper will focus on introducing a systematic methodology for designing integrated biorefineries using process systems engineering tools as well as a multi-criteria decision making framework to put forward the most effective biorefinery strategies that fulfill the needs of the forest industry. Topics to be covered will include market analysis, techno-economic assessment, cost accounting, energy integration analysis, life cycle assessment and supply chain analysis. This will be followed by a description of the vision as well as the key features and functionalities of the I-BIOREF software platform, developed by CanmetENERGY of Natural Resources Canada. Two industrial case studies will be presented to demonstrate the robustness and flexibility of the I-BIOREF software platform: i) an integrated Canadian Kraft pulp mill with a lignin recovery process (namely, LignoBoost™); ii) a standalone biorefinery based on an ethanol-organosolv process.
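As a simple illustration of the multi-criteria decision making step, the sketch below scores two hypothetical biorefinery strategies against weighted criteria; the criteria, weights and scores are illustrative only, not the I-BIOREF methodology itself.

```python
import numpy as np

criteria = ["IRR", "GHG reduction", "market risk (inverted)", "capital intensity (inverted)"]
weights = np.array([0.4, 0.2, 0.25, 0.15])        # hypothetical importance weights, sum to 1

# Normalised scores (0-1) of each strategy against each criterion (hypothetical values)
strategies = {
    "lignin recovery (LignoBoost-type)": np.array([0.7, 0.6, 0.8, 0.5]),
    "ethanol-organosolv biorefinery":    np.array([0.8, 0.7, 0.4, 0.3]),
}

for name, scores in strategies.items():
    print(f"{name}: weighted score = {weights @ scores:.3f}")
# The higher-scoring strategy would be carried forward for deeper techno-economic assessment.
```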

Keywords: biorefinery strategies, bioproducts, co-production, multi-criteria decision making, tool

Procedia PDF Downloads 232
4226 The Development of an Automated Computational Workflow to Prioritize Potential Resistance Variants in HIV Integrase Subtype C

Authors: Keaghan Brown

Abstract:

The prioritization of drug resistance mutations impacting protein folding or protein-drug and protein-DNA interactions within macromolecular systems is critical to the success of treatment regimens. With a continual increase in computational tools to assess these impacts, scalability and reproducibility have become essential components of computational analysis and experimental research. Here, a bioinformatics pipeline is introduced that combines several structural analysis tools in a simplified workflow, optimizing the available computational hardware and software to automatically ease the flow of data transformations. Utilizing pre-established software tools, it was possible to develop a pipeline with a set of pre-defined functions that automate mutation introduction into the HIV-1 Integrase protein structure, calculate the gain and loss of polar interactions, and calculate the change in protein folding energy. Additionally, an automated molecular dynamics analysis was implemented, which reduces the constant need for user input and output management. The resulting pipeline, Automated Mutation Introduction and Analysis (AMIA), is an open source set of scripts designed to introduce and analyse the effects of mutations on the static protein structure as well as on the multi-conformational states obtained from molecular dynamics simulations. The workflow allows the user to visualize all outputs in a user-friendly manner, thereby enabling the prioritization of variant systems for experimental validation.
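A minimal sketch of what such an automated prioritization workflow might look like is given below; the step functions, scores and names are hypothetical placeholders standing in for the structural tools the pipeline wraps, not the actual AMIA scripts.

```python
from dataclasses import dataclass

@dataclass
class VariantResult:
    mutation: str
    ddg_fold: float            # predicted change in folding energy (hypothetical units)
    polar_contacts_lost: int   # predicted lost polar interactions

def introduce_mutation(structure: str, mutation: str) -> str:
    # Placeholder for the mutagenesis tool that builds the mutant structure
    return f"{structure}+{mutation}"

def score_variant(mutant_structure: str) -> tuple:
    # Placeholder values standing in for the folding-energy and polar-contact calculations
    illustrative_scores = {"G140S": (1.2, 1), "Q148H": (2.4, 2), "N155H": (0.8, 0)}
    mutation = mutant_structure.split("+")[-1]
    return illustrative_scores.get(mutation, (0.0, 0))

def prioritize(structure: str, mutations: list) -> list:
    results = []
    for m in mutations:
        ddg, lost = score_variant(introduce_mutation(structure, m))
        results.append(VariantResult(m, ddg, lost))
    # Rank variants by predicted destabilisation for experimental follow-up
    return sorted(results, key=lambda r: r.ddg_fold, reverse=True)

for r in prioritize("HIV1_integrase_subtypeC", ["G140S", "Q148H", "N155H"]):
    print(r)
```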

Keywords: automated workflow, variant prioritization, drug resistance, HIV Integrase

Procedia PDF Downloads 77
4225 Realization of a Hybrid Beam Inertial Amplifier

Authors: Somya Ranjan Patro, Abhigna Bhatt, Arnab Banerjee

Abstract:

The inertial amplifier has recently gained increasing attention as a new mechanism for vibration control of structures. Currently, theoretical investigations are undertaken by researchers to reveal its fundamentals and to understand its underlying principles in altering the structural response of structures against dynamic loadings. This paper presents experimental and analytical studies on the dynamic characteristics of a hybrid beam inertial amplifier (HBIA). The analytical formulation of the HBIA has been derived by implementing the spectral element method and rigid body dynamics. This formulation gives the relation between the dynamic force and the response of the structure in the frequency domain. Further, for validation of the proposed HBIA, experiments have been performed. The experimental setup consists of a 3D printed HBIA of polylactic acid (PLA) material screwed to the base plate of the shaker system. Two accelerometers are used to study the response: one at the base plate of the shaker and a second placed at the top of the inertial amplifier. A force transducer is also placed between the base plate and the inertial amplifier to measure the total load transferred from the base plate to the inertial amplifier. The time domain responses obtained from the accelerometers have been converted into the frequency domain using the Fast Fourier Transform (FFT) algorithm. The experimental transmittance values are successfully validated against the analytical results, providing essential confidence in the proposed methodology.
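As an illustration of the final processing step, the sketch below estimates transmittance as the ratio of FFT magnitudes of two synthetic accelerometer signals; the signals, sampling rate and single-frequency evaluation are hypothetical (in practice, averaged spectra or H1-type estimators over a frequency sweep would be used).

```python
import numpy as np

fs = 2000.0                           # hypothetical sampling rate, Hz
t = np.arange(0, 5.0, 1.0 / fs)

# Hypothetical accelerometer signals: base excitation and attenuated, phase-shifted top response
a_base = np.sin(2 * np.pi * 40.0 * t)
a_top = 0.3 * np.sin(2 * np.pi * 40.0 * t + 0.4) \
        + 0.01 * np.random.default_rng(0).standard_normal(t.size)

# Transmittance = ratio of output to input spectral magnitudes at each frequency
A_base = np.fft.rfft(a_base)
A_top = np.fft.rfft(a_top)
freqs = np.fft.rfftfreq(t.size, d=1.0 / fs)
transmittance = np.abs(A_top) / np.maximum(np.abs(A_base), 1e-12)

idx = np.argmin(np.abs(freqs - 40.0))
print(f"Transmittance at 40 Hz ≈ {transmittance[idx]:.2f}")   # expected ≈ 0.3 for this synthetic case
```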

Keywords: inertial amplifier, fast fourier transform, natural frequencies, polylactic acid, transmittance, vibration absorbers

Procedia PDF Downloads 103
4224 LWD Acquisition of Caliper and Drilling Mechanics in a Geothermal Well, A Case Study in Sorik Marapi Field – Indonesia

Authors: Vinda B. Manurung, Laila Warkhaida, David Hutabarat, Sentanu Wisnuwardhana, Christovik Simatupang, Dhani Sanjaya, Ashadi, Redha B. Putra, Kiki Yustendi

Abstract:

The geothermal drilling environment presents many obstacles that have limited the use of directional drilling and logging-while-drilling (LWD) technologies, such as borehole washout, mud losses, severe vibration, and high temperature. The case study presented in this paper demonstrates a practice to enhance data logging in geothermal drilling by deploying advanced telemetry and LWD technologies, aiming at continuous improvement in geothermal drilling operations. The case study covers the 12.25-in. hole section of well XX-05 in Pad XX of the Sorik Marapi Geothermal Field. The LWD string consists of electromagnetic (EM) telemetry, pressure while drilling (PWD), vibration (DDSr), and acoustic caliper (ACAL) tools. Through this tool configuration, the operator acquired drilling mechanics and caliper logs in real-time and recorded modes, enabling effective monitoring of wellbore stability. Throughout the real-time acquisition, EM-PPM telemetry provided a three times faster data rate to the surface unit. With the integration of caliper data and drilling mechanics data (vibration and equivalent circulating density, ECD), the borehole conditions were more visible to the directional driller, allowing for better control of drilling parameters to minimize vibration and achieve optimum hole cleaning in washed-out or tight formation sequences. After reaching well TD, the recorded data from the caliper sensor indicated an average of 8.6% washout for the entire 12.25-in. interval. Washout intervals were compared with loss occurrences, showing potential for the caliper to be used as an indirect indicator of fractured intervals and validating the fault trend prognosis. This LWD case study has given added value in geothermal borehole characterization for both drilling operations and the subsurface. Identified challenges while running LWD in this geothermal environment, such as the effect of tool eccentricity and the impact of vibration, need to be addressed for future improvements. A perusal of both real-time and recorded drilling mechanics and caliper data has opened various possibilities for maximizing sensor usage in future wells.
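As a simple illustration of how a washout figure like the 8.6% average quoted above can be derived from caliper data, the sketch below compares measured hole diameters against the bit size; the depth stations and readings are hypothetical.

```python
import numpy as np

bit_size_in = 12.25
# Hypothetical caliper readings (inches) at successive depth stations in the 12.25-in. section
caliper_in = np.array([12.3, 12.9, 13.6, 12.2, 14.1, 12.4, 13.0])

# Washout = enlargement of the hole beyond bit size, expressed as a percentage of bit size
washout_pct = np.maximum(caliper_in - bit_size_in, 0.0) / bit_size_in * 100.0
print(f"Average washout over the interval: {washout_pct.mean():.1f}%")
print("Stations exceeding 10% washout:", np.where(washout_pct > 10.0)[0])
```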

Keywords: geothermal drilling, geothermal formation, geothermal technologies, logging-while-drilling, vibration, caliper, case study

Procedia PDF Downloads 131
4223 Barriers and Challenges to a Healthy Lifestyle for Postpartum Women and the Possibilities in an Information Technology-Based Intervention: A Qualitative Study

Authors: Pernille K. Christiansen, Mette Maria Skjøth, Line Lorenzen, Eva Draborg, Christina Anne Vinter, Trine Kjær, Mette Juel Rothmann

Abstract:

Background and aims: Overweight and obesity are an increasing challenge on a global level. In Denmark, more than one-third of all pregnant women are overweight or obese, and many women exceed the gestational weight gain recommendations from the Institute of Medicine. Being overweight or obese is associated with a higher risk of adverse maternal and fetal outcomes, including gestational diabetes and childhood obesity. Thus, it is important to focus on women's lifestyles between their pregnancies to lower the risk of gestational weight retention in the long run. The objective of this study was to explore what barriers and challenges postpartum women experience with respect to healthy lifestyles during the postpartum period and to assess whether an Information Technology-based intervention might be a supportive tool to assist and motivate postpartum women towards a healthy lifestyle. Materials and methods: The method is inspired by participatory design. Systematic text condensation was applied to semi-structured focus groups. Five focus group interviews were carried out with a total of 17 postpartum women, and two interviews with a total of six health professionals. Participants were recruited through the municipality of Svendborg, Denmark, and at Odense University Hospital in Odense, Denmark, during a four-month period in early 2018. Results: From the women's perspective, better assistance is needed from the health professionals to obtain or maintain a healthy lifestyle. The women need tools that inform and help them understand and prioritise their own health-related risks, and that motivate them to plan and take care of their own health. As the women use Information Technology on a daily basis, the solution could be delivered through Information Technology. Finally, there is room for engaging the partner more in the communication related to the baby and the family's lifestyle. Conclusion: Postpartum women need tools that inform and motivate a healthy lifestyle postpartum. The tools should allow access to high-quality information from health care professionals when the information is needed, and should also allow engagement from the partner. Finally, Information Technology is a potential channel for delivering such tools.

Keywords: information technology, lifestyle, overweight, postpartum

Procedia PDF Downloads 147
4222 Actualizing Millennium Development Goals through a Refocused Basic Mathematics Curriculum

Authors: Ali Yaro Kankia

Abstract:

The Millennium Development Goals (MDGs) are eight goals set by the 189 United Nations member states, with 2015 as the target year of achievement. Since their signing in September 2000, individual nations have been finding ways and means of actualizing them. This paper considers how a refocused basic Mathematics curriculum could serve as an appropriate tool in achieving these goals. This was done by considering the theme under the following sub-headings: the basic Mathematics curriculum before now; the basic Mathematics curriculum and the Millennium Development Goals; and the challenges of a refocused basic Mathematics curriculum for the MDGs. An appropriate conclusion was reached.

Keywords: actualizing, curriculum, MDGs, refocused

Procedia PDF Downloads 389
4221 Stability-Indicating High-Performance Thin-Layer Chromatography Method for Estimation of Naftopidil

Authors: P. S. Jain, K. D. Bobade, S. J. Surana

Abstract:

A simple, selective, precise and stability-indicating high-performance thin-layer chromatographic method for the analysis of Naftopidil, both in bulk and in pharmaceutical formulation, has been developed and validated. The method employed HPTLC aluminium plates precoated with silica gel as the stationary phase. The solvent system consisted of hexane: ethyl acetate: glacial acetic acid (4:4:2 v/v). The system was found to give a compact spot for Naftopidil (Rf value of 0.43±0.02). Densitometric analysis of Naftopidil was carried out in the absorbance mode at 253 nm. The linear regression analysis data for the calibration plots showed a good linear relationship, with r² = 0.999±0.0001 with respect to peak area, in the concentration range of 200-1200 ng per spot. The method was validated for precision, recovery and robustness. The limits of detection and quantification were 20.35 and 61.68 ng per spot, respectively. Naftopidil was subjected to acid and alkali hydrolysis, oxidation and thermal degradation. The drug undergoes degradation under acidic, basic, oxidative and thermal conditions, indicating that it is susceptible to these stress conditions. The degraded product was well resolved from the pure drug, with a significantly different Rf value. Statistical analysis proves that the method is repeatable, selective and accurate for the estimation of the investigated drug. The proposed HPTLC method can be applied for the identification and quantitative determination of Naftopidil in bulk drug and pharmaceutical formulation.
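As a brief illustration of how calibration linearity and detection limits of this kind are typically computed (LOD = 3.3·σ/S and LOQ = 10·σ/S, where σ is the standard deviation of the regression residuals and S the calibration slope), here is a sketch with hypothetical peak-area data:

```python
import numpy as np

# Hypothetical calibration data: amount per spot (ng) vs densitometric peak area (AU)
amount = np.array([200, 400, 600, 800, 1000, 1200], dtype=float)
peak_area = np.array([1510, 3020, 4490, 6050, 7480, 9010], dtype=float)

slope, intercept = np.polyfit(amount, peak_area, deg=1)
residuals = peak_area - (slope * amount + intercept)
sigma = residuals.std(ddof=2)            # standard deviation of regression residuals
r = np.corrcoef(amount, peak_area)[0, 1]

lod = 3.3 * sigma / slope                # limit of detection, ng per spot
loq = 10.0 * sigma / slope               # limit of quantification, ng per spot
print(f"r^2 = {r**2:.4f}, LOD = {lod:.1f} ng/spot, LOQ = {loq:.1f} ng/spot")
```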

Keywords: naftopidil, HPTLC, validation, stability, degradation

Procedia PDF Downloads 400
4220 A Study of Student Satisfaction of the University TV Station

Authors: Prapoj Na Bangchang

Abstract:

This research aimed to study the satisfaction of university students with the Suan Sunandha Rajabhat University television station. The sample consisted of 250 undergraduate students from Year 1 to Year 4. The tool used to collect data was a questionnaire. The statistics used in the data analysis were percentage, mean and standard deviation. The results showed that satisfaction with the location of the university's television station received a high score, followed by the number of devices, while the content presented received the lowest score. Most students want the program content to be improved, especially the entertainment content, followed by the sports content.

Keywords: student satisfaction, university TV channel, media, broadcasting

Procedia PDF Downloads 385
4219 Digital Structural Monitoring Tools @ADaPT for Crack Initiation and Growth due to Mechanical Damage Mechanisms

Authors: Faizul Azly Abd Dzubir, Muhammad F. Othman

Abstract:

The conventional structural health monitoring approach for mechanical equipment uses inspection data from non-destructive testing (NDT) during plant shutdown windows, together with fitness-for-service evaluation, to estimate the integrity of equipment that is prone to crack damage. Yet, this forecast is fraught with uncertainty because it is often based on assumptions of future operational parameters, and the prediction is not continuous or online. Advanced Diagnostic and Prognostic Technology (ADaPT) uses Acoustic Emission (AE) technology and a stochastic prognostic model to provide real-time monitoring and prediction of mechanical defects or cracks. The forecast can help the plant authority handle their cracked equipment before it ruptures and causes an unscheduled shutdown of the facility. ADaPT employs process historical data trending, finite element analysis, fitness-for-service assessment, and probabilistic statistical analysis to develop a prediction model for crack initiation and growth due to mechanical damage. The prediction model is combined with live equipment operating data for real-time prediction of the remaining life span with respect to fracture. ADaPT was devised at a hot combined feed exchanger (HCFE) that had suffered creep crack damage. The ADaPT tool predicted the initiation of a crack in the top weldment area by April 2019. During the shutdown window in April 2019, a crack was discovered and repaired. Furthermore, ADaPT successfully advised the plant owner to run at full capacity and improve output by up to 7% by April 2019. ADaPT was also used on a coke drum that had extensive fatigue cracking. The initial cracks were declared safe with ADaPT, with remaining crack lifetimes extended another five (5) months, just in time for another planned facility downtime to execute repairs. The prediction model, when combined with plant information data, allows plant operators to continuously monitor crack propagation caused by mechanical damage for improved maintenance planning and to avoid costly unscheduled shutdowns for immediate repairs.
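As a generic illustration of the kind of crack-growth prognosis described above (not the proprietary ADaPT model), the sketch below integrates a Paris-law fatigue crack growth relation to estimate remaining cycles until a critical crack size; all material constants, geometry factors and loads are hypothetical.

```python
import math

# Hypothetical Paris-law constants and loading for an illustrative steel component
C, m = 3.0e-12, 3.0          # da/dN = C * (dK)^m, with dK in MPa*sqrt(m) and a in metres
delta_sigma = 80.0           # stress range, MPa
Y = 1.12                     # geometry factor
a = 2.0e-3                   # current crack depth, m (e.g. sized from AE/NDT data)
a_crit = 10.0e-3             # critical crack depth, m

cycles = 0
while a < a_crit:
    delta_K = Y * delta_sigma * math.sqrt(math.pi * a)   # stress intensity factor range
    a += C * delta_K**m                                  # crack growth during this cycle
    cycles += 1

print(f"Estimated remaining life: {cycles:,} cycles")
```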

Keywords: mechanical damage, cracks, continuous monitoring tool, remaining life, acoustic emission, prognostic model

Procedia PDF Downloads 77
4218 A Review on Artificial Neural Networks in Image Processing

Authors: B. Afsharipoor, E. Nazemi

Abstract:

Artificial neural networks (ANNs) are a powerful tool for prediction that can be trained on a set of examples and are thus useful for nonlinear image processing. The present paper reviews several papers regarding applications of ANNs in image processing to shed light on the advantages and disadvantages of ANNs in this field. Different steps in the image processing chain, including pre-processing, enhancement, segmentation, object recognition, image understanding and optimization using ANNs, are summarized. Furthermore, results on using multiple artificial neural networks (MANNs) are presented.

Keywords: neural networks, image processing, segmentation, object recognition, image understanding, optimization, MANN

Procedia PDF Downloads 407
4217 High-Resolution Spatiotemporal Retrievals of Aerosol Optical Depth from Geostationary Satellite Using the SARA Algorithm

Authors: Muhammad Bilal, Zhongfeng Qiu

Abstract:

Aerosols, suspended particles in the atmosphere, play an important role in the earth's energy budget, climate change, degradation of atmospheric visibility, urban air quality, and human health. To fully understand aerosol effects, retrieval of aerosol optical properties, such as aerosol optical depth (AOD), at high spatiotemporal resolution is required. Therefore, in the present study, hourly AOD observations at 500 m resolution were retrieved from the geostationary ocean color imager (GOCI) using the simplified aerosol retrieval algorithm (SARA) over the urban area of Beijing for the year 2016. The SARA requires top-of-the-atmosphere (TOA) reflectance, solar and sensor geometry information, and surface reflectance observations to retrieve an accurate AOD. For validation of the GOCI-retrieved AOD, AOD measurements were obtained from the aerosol robotic network (AERONET) version 3 level 2.0 (cloud-screened and quality-assured) data. The errors and uncertainties were reported using the root mean square error (RMSE), relative percent mean error (RPME), and the expected error (EE = ±(0.05 + 0.15·AOD)). Results showed that the high spatiotemporal resolution GOCI AOD observations were well correlated with the AERONET AOD measurements, with a correlation coefficient (R) of 0.92, an RMSE of 0.07, and an RPME of 5%, and 90% of the observations were within the EE. The results suggest that the SARA is robust and has the ability to retrieve high-resolution spatiotemporal AOD observations over an urban area using a geostationary satellite.
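As an illustration of the validation metrics reported above, the sketch below computes the correlation, RMSE and the fraction of retrievals falling within the expected error envelope EE = ±(0.05 + 0.15·AOD) for a handful of hypothetical collocated GOCI/AERONET pairs.

```python
import numpy as np

# Hypothetical collocated AOD pairs: GOCI retrievals vs AERONET reference measurements
aod_goci = np.array([0.31, 0.55, 0.18, 0.92, 0.40, 0.27])
aod_aeronet = np.array([0.28, 0.60, 0.20, 0.85, 0.37, 0.30])

rmse = np.sqrt(np.mean((aod_goci - aod_aeronet) ** 2))
r = np.corrcoef(aod_goci, aod_aeronet)[0, 1]

# Expected error envelope, evaluated against the AERONET reference
ee = 0.05 + 0.15 * aod_aeronet
within_ee = np.abs(aod_goci - aod_aeronet) <= ee

print(f"R = {r:.2f}, RMSE = {rmse:.3f}, within EE = {100 * within_ee.mean():.0f}%")
```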

Keywords: AEORNET, AOD, SARA, GOCI, Beijing

Procedia PDF Downloads 171
4216 Using a Mobile App to Foster Children Active Travel to School in Spain

Authors: P. Pérez-Martín, G. Pedrós, P. Martínez-Jiménez, M. Varo-Martínez

Abstract:

In recent decades, family habits related to children's travel to school have changed, with motorized travel increasing at the expense of active modes. This entails a major negative impact on the urban environment, road safety in cities, and the physical and psychological development of children. One of the more common actions used to reverse this trend is the Walking School Bus (WSB), which consists of a predefined, adult-escorted pedestrian route to school with several stops along the path where schoolchildren are collected. At Tirso de Molina School in Cordoba (Spain), a new ICT-based methodology to deploy a WSB has been tested. A mobile app that allows geopositioning of the group, arrival notifications, and real-time communication between the WSB participants was presented to the families in order to organize and register the daily participation. After an initial survey to determine the travel mode and spatial distribution of the interested families, three WSB routes were established and the families were trained in the use of the app. During nine weeks, 33 children joined the WSB and their parents accompanied the groups in turns. High recurring attendance was registered. Through a final survey, participants rated the tool and the methodology highly, emphasizing as the most useful features of the mobile app the notification system, the chat, and the real-time monitoring. It was also found that the tool had a major impact on the degree of confidence of parents regarding the autonomous on-foot travel of their children to school. Moreover, 37.9% of the participating families reported a total or partial modal shift from car to walking, and the benefits most reported were an increase in the parents' available time and fewer problems in organizing the daily trip to school. As a consequence, the effectiveness of this user-centric, innovative ICT-based methodology has been demonstrated in reducing the level of private car drop-offs and minimizing the barriers of time constraints, volunteer recruitment, and parents' safety concerns, while at the same time increasing convenience and time savings for families. This pilot study can offer guidance for community-coordinated actions and local authority interventions to support sustainable school travel outcomes.

Keywords: active travel, mobile app, sustainable mobility, urban transportation planning, walking school bus

Procedia PDF Downloads 336
4215 Global Developmental Delay and Its Association with Risk Factors: Validation by Structural Equation Modelling

Authors: Bavneet Kaur Sidhu, Manoj Tiwari

Abstract:

Global Developmental Delay (GDD) is a common pediatric condition. Etiologies of GDD might, however, differ in developing countries. In the last decade, sporadic families have been reported in various countries. To the authors' best knowledge, many risk factors and their correlation with the prevalence of GDD have been studied, but their statistical correlation has not been established. Thus, the present study targets the risk factors, the prevalence, and their statistical correlation with GDD. The FMR1 gene was studied to confirm the disease and its penetrance. A complete questionnaire-based assessment was designed for the statistical studies, covering personal, past and present medical history along with socio-economic status. Methods: We distributed the children's ages into 4 different age groups at 5-year intervals and applied structural equation modeling (SEM) techniques, Spearman's rank correlation coefficient, Karl Pearson's correlation coefficient, and the chi-square test. Results: A total of 1100 families were enrolled for this study; among them, 330 were clinically and biologically confirmed (radiological studies) for the disease; 204 were males (61.8%) and 126 were females (38.18%). We found that 27.87% of cases were genetic and 72.12% were sporadic; of the sporadic cases, 43.28% were from urban and 56.72% from rural localities; the mothers' literacy rate was 32.12% and 41.21% of the mothers were working women. Conclusions: There is a significant association between mothers' age and GDD prevalence, followed by mothers' literacy rate and mothers' occupation, whereas there was no association between fathers' age and GDD.
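As an illustration of the kind of association tests named in the Methods (chi-square and Spearman's rank correlation), the sketch below applies both with scipy; the contingency table and paired observations are hypothetical, not the study's data.

```python
import numpy as np
from scipy import stats

# Hypothetical contingency table: maternal age group (rows) vs GDD status (columns: GDD, no GDD)
table = np.array([[40, 260],
                  [95, 305],
                  [120, 280]])

chi2, p, dof, expected = stats.chi2_contingency(table)
print(f"chi-square = {chi2:.2f}, dof = {dof}, p = {p:.4f}")

# Hypothetical paired observations for a rank correlation (e.g. maternal age vs developmental score)
maternal_age = [22, 25, 28, 31, 34, 37, 40]
dev_score = [88, 85, 80, 76, 70, 69, 61]
rho, p_rho = stats.spearmanr(maternal_age, dev_score)
print(f"Spearman rho = {rho:.2f}, p = {p_rho:.4f}")
```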

Keywords: global developmental delay, FMR1 gene, Spearman's rank correlation coefficient, structural equation modeling

Procedia PDF Downloads 135
4214 Revisiting the Historical Narratives of the Old Churches in Albay, Bikol Region, Philippines

Authors: Ruby Ann L. Ayo

Abstract:

As cultural heritage reflects the historical origin of a certain group of people, it reveals the customs, traits, beliefs, practices and even values they have held for years. One of the tangible examples of cultural heritage is physical structures, including old churches. The study looked into the existing historical narratives of the century-old Catholic churches in the Province of Albay, Bikol Region, Philippines: Nuestra Señora de Salvacion in Joroan, Tiwi, Albay; Our Lady of the Gate in Daraga, Albay; San Juan de Bautista in Tabaco City; and St. John the Baptist in Camalig, Albay. The historical narratives were analysed in terms of the validity and reliability of the secondary documents with reference to the elements of history, revealing the consistency and adequacy of historical facts. The contents were examined using a modified Checklist of Historical Documents. The historical narratives were likewise submitted to a content expert for validation as regards historical authenticity and accuracy. The contents of the narratives were scrutinized according to the following codes: (1.1) the patron saints; (1.2) the factors that paved the way for their construction; (1.3) the people responsible for their construction; (1.4) the misconceptions about their construction; and (1.5) their contributions to Bikol heritage. Based on the codes, themes were identified as: (2.1) Marian devotees and Christ-centered patron saints; (2.2) geographical, socio-political and cultural factors; (2.3) church and government officials; (2.4) misconceptions on the dates of construction and original sites; and (2.5) popular pilgrim sites and well-admired architectural designs.

Keywords: historical narratives, old churches, cultural heritage, historical validity and reliability, elements of history

Procedia PDF Downloads 294
4213 KPI and Tool for the Evaluation of Competency in Warehouse Management for Furniture Business

Authors: Kritchakhris Na-Wattanaprasert

Abstract:

The objective of this research is to design and develop a prototype of a key performance indicator (KPI) system suitable for warehouse management, based on a case study and user requirements. In this study, a prototype KPI system was designed for a warehouse case study in the furniture business, using a methodology with the following steps: identifying the scope of the research and studying related papers, gathering the necessary data and user requirements, developing key performance indicators based on the balanced scorecard, designing the program and database for the key performance indicators, coding the program and setting the database relationships, and finally testing and debugging each module. This study uses the Balanced Scorecard (BSC) for selecting and grouping key performance indicators. Microsoft SQL Server 2010 is used to create the system database. In regard to the visual programming language, Microsoft Visual C# 2010 is chosen as the graphical user interface development tool. The system consists of six main menus: login, main data, financial perspective, customer perspective, internal perspective, and learning and growth perspective. Each menu consists of key performance indicator forms. Each form contains a data import section, a data input section, a data search and edit section, and a report section. The system generates outputs in five main reports: the KPI detail report, the KPI summary report, the KPI graph report, the benchmarking summary report, and the benchmarking graph report. The user selects the report conditions and the time period. As the system has been developed and tested, it was found to be one way of judging the extent to which warehouse objectives have been achieved. Moreover, it encourages the warehouse functions to proceed with greater efficiency. In order to be useful for other industries, the system can be adjusted appropriately. To increase the usefulness of the key performance indicator system, the recommendations for further development are as follows: the warehouse should periodically review the target values and set more suitable targets as conditions fluctuate in the future; and the warehouse should periodically review the key performance indicators and set more suitable indicators as conditions fluctuate, in order to increase competitiveness and take advantage of new opportunities.

Keywords: key performance indicator, warehouse management, warehouse operation, logistics management

Procedia PDF Downloads 431
4212 i2kit: A Tool for Immutable Infrastructure Deployments

Authors: Pablo Chico De Guzman, Cesar Sanchez

Abstract:

Microservice architectures are increasingly used in distributed cloud applications due to their advantages in software composition, development speed, release cycle frequency and business-logic time to market. On the other hand, these architectures also introduce some challenges in the testing and release phases of applications. Container technology solves some of these issues by providing reproducible environments, ease of software distribution and isolation of processes. However, there are other issues that remain unsolved in current container technology when dealing with multiple machines, such as networking for multi-host communication, service discovery, load balancing or data persistency (even though some of these challenges are already solved by traditional cloud vendors in a very mature and widespread manner). Container cluster management tools, such as Kubernetes, Mesos or Docker Swarm, attempt to solve these problems by introducing a new control layer where the unit of deployment is the container (or the pod, a set of strongly related containers that must be deployed on the same machine). These tools are complex to configure and manage, and they do not follow a pure immutable infrastructure approach since servers are reused between deployments. Indeed, these tools introduce dependencies at execution time for solving networking or service discovery problems. If an error occurs in the control layer, which would affect running applications, specific expertise is required to perform ad-hoc troubleshooting. As a consequence, it is not surprising that container cluster support is becoming a source of revenue for consulting services. This paper presents i2kit, a deployment tool based on the immutable infrastructure pattern, where the virtual machine is the unit of deployment. The input for i2kit is a declarative definition of a set of microservices, where each microservice is defined as a pod of containers. Microservices are built into machine images using linuxkit, a tool for creating minimal Linux distributions specialized in running containers. These machine images are then deployed to one or more virtual machines, which are exposed through a cloud vendor load balancer. Finally, the load balancer endpoint is set into other microservices using an environment variable, providing service discovery. The toolkit i2kit reuses the best ideas from container technology to solve problems like reproducible environments, process isolation, and software distribution, and at the same time relies on mature, proven cloud vendor technology for networking, load balancing and persistency. The result is a more robust system with no learning curve for troubleshooting running applications. We have implemented an open source prototype that transforms i2kit definitions into AWS CloudFormation templates, where each microservice AMI (Amazon Machine Image) is created on the fly using linuxkit. Even though container cluster management tools have more flexibility for resource allocation optimization, we argue that adding a new control layer entails more significant disadvantages. Resource allocation is greatly improved by using linuxkit, which introduces a very small footprint (around 35MB). Also, the system is more secure since linuxkit installs the minimum set of dependencies to run containers. The toolkit i2kit is currently under development at the IMDEA Software Institute.
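The sketch below is a conceptual Python illustration of the deployment flow described above (declarative definition of pods, per-service machine image, VMs behind a load balancer, endpoints injected into dependents via environment variables); the data structures and function names are illustrative only and do not reflect the actual i2kit definition format or API.

```python
# Conceptual sketch of the i2kit-style flow; structures and names are hypothetical.
services = {
    "api": {"containers": ["registry.example.com/api:1.0"], "replicas": 2, "depends_on": ["db"]},
    "db":  {"containers": ["registry.example.com/postgres:11"], "replicas": 1, "depends_on": []},
}

def build_machine_image(name, containers):
    # In i2kit this step uses linuxkit: a minimal image whose only job is to run these containers
    return f"ami-{name}"  # placeholder image identifier

def deploy(services):
    endpoints = {}
    # Deploy dependency-free services first so their endpoints exist for dependents
    # (sorting by dependency count is enough for this small illustrative graph)
    for name in sorted(services, key=lambda s: len(services[s]["depends_on"])):
        spec = services[name]
        image = build_machine_image(name, spec["containers"])
        env = {f"{dep.upper()}_ENDPOINT": endpoints[dep] for dep in spec["depends_on"]}
        # Placeholder for launching immutable VMs behind a cloud load balancer
        endpoints[name] = f"lb-{name}.example.com"
        print(f"deploy {name}: image={image}, replicas={spec['replicas']}, env={env}")
    return endpoints

deploy(services)
```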

Keywords: container, deployment, immutable infrastructure, microservice

Procedia PDF Downloads 179
4211 Response Surface Methodology for Optimum Hardness of TiN on Steel Substrate

Authors: R. Joseph Raviselvan, K. Ramanathan, P. Perumal, M. R. Thansekhar

Abstract:

Hard coatings are widely used in the cutting and forming tool industries. Titanium nitride (TiN) possesses good hardness, strength and corrosion resistance. The coating properties are influenced by many process parameters. The coatings were deposited on steel substrates while varying process parameters such as substrate temperature, nitrogen flow rate and target power in a D.C. planar magnetron sputtering system. The structure of the coatings was analysed using XRD. The hardness of the coatings was measured using a microhardness tester. From the experimental data, a regression model was developed and the optimum response was determined using Response Surface Methodology (RSM).
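As an illustration of the RSM step, the sketch below fits a full second-order (quadratic) response surface in the three coded factors and searches it for the maximum hardness; the design points and hardness values are hypothetical, not the study's measurements.

```python
import numpy as np

# Hypothetical coded design: substrate temperature, N2 flow rate, target power (coded -1..+1)
X = np.array([
    [-1, -1, -1], [ 1, -1, -1], [-1,  1, -1], [ 1,  1, -1],
    [-1, -1,  1], [ 1, -1,  1], [-1,  1,  1], [ 1,  1,  1],   # 2^3 factorial points
    [-1,  0,  0], [ 1,  0,  0], [ 0, -1,  0], [ 0,  1,  0],
    [ 0,  0, -1], [ 0,  0,  1], [ 0,  0,  0],                 # axial and centre points
], dtype=float)
hardness = np.array([1850, 1980, 1900, 2120, 1890, 2050, 1940, 2200,
                     1900, 2080, 1950, 2060, 1930, 2010, 2150], dtype=float)

def quad_terms(x):
    t, n, p = x.T
    # full second-order model: intercept, linear, interaction and squared terms
    return np.column_stack([np.ones(len(x)), t, n, p, t*n, t*p, n*p, t**2, n**2, p**2])

coef, *_ = np.linalg.lstsq(quad_terms(X), hardness, rcond=None)

# Evaluate the fitted response surface on a coarse grid to locate the optimum hardness
grid = np.array([[a, b, c] for a in np.linspace(-1, 1, 11)
                           for b in np.linspace(-1, 1, 11)
                           for c in np.linspace(-1, 1, 11)])
pred = quad_terms(grid) @ coef
best = grid[np.argmax(pred)]
print(f"Predicted optimum (coded factors): {best}, hardness ≈ {pred.max():.0f} HV")
```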

Keywords: hardness, RSM, sputtering, TiN, XRD

Procedia PDF Downloads 321
4210 An Integrated Real-Time Hydrodynamic and Coastal Risk Assessment Model

Authors: M. Reza Hashemi, Chris Small, Scott Hayward

Abstract:

The Northeast Coast of the US faces damaging effects of coastal flooding and winds due to Atlantic tropical and extratropical storms each year. Historically, several large storm events have produced substantial levels of damage to the region; the most notable of these were the Great Atlantic Hurricane of 1938, Hurricane Carol, Hurricane Bob, and recently Hurricane Sandy (2012). The objective of this study was to develop an integrated modeling system that could be used as a forecasting/hindcasting tool to evaluate and communicate the risk coastal communities face from these coastal storms. This modeling system utilizes the ADvanced CIRCulation (ADCIRC) model for storm surge predictions and the Simulating Waves Nearshore (SWAN) model for the wave environment. These models were coupled, passing information to each other and computing over the same unstructured domain, allowing for the most accurate representation of the physical storm processes. The coupled SWAN-ADCIRC model was validated and has been set up to perform real-time forecast simulations (as well as hindcasts). Modeled storm parameters were then passed to a coastal risk assessment tool. This tool, which is generic and universally applicable, generates spatial structural damage estimate maps on an individual-structure basis for an area of interest. The required inputs for the coastal risk model included detailed information about the individual structures, inundation levels, and wave heights for the selected region. Additionally, the calculation of wind damage to structures was incorporated. The integrated coastal risk assessment system was then tested and applied to Charlestown, a small vulnerable coastal town along the southern shore of Rhode Island. The modeling system was applied to Hurricane Sandy and a synthetic storm. In both storm cases, the effect of natural dunes on coastal risk was investigated. The resulting damage maps for the area (Charlestown) clearly showed that the dune-eroded scenarios affected more structures and increased the estimated damage. The system was also tested in forecast mode for a large Nor'easter: Stella (March 2017). The results showed a good performance of the coupled model in forecast mode when compared to observations. Finally, a nearshore model, XBeach, was then nested within this regional grid (ADCIRC-SWAN) to simulate nearshore sediment transport processes and coastal erosion. Hurricane Irene (2011) was used to validate XBeach, on the basis of a unique beach profile dataset for the region. XBeach showed relatively good performance, being able to estimate eroded volumes along the beach transects with a mean error of 16%. The validated model was then used to analyze the effectiveness of several erosion mitigation methods that were recommended in a recent study of coastal erosion in New England: beach nourishment, coastal banks (engineered core), and submerged breakwaters, as well as an artificial surfing reef. It was shown that beach nourishment and coastal banks perform better in mitigating shoreline retreat and coastal erosion.
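As a simple illustration of the per-structure damage estimation step, the sketch below applies a depth-damage curve to modeled inundation at each structure; the curve, elevations, values and surge level are hypothetical and do not represent the tool's actual damage functions.

```python
import numpy as np

# Hypothetical depth-damage curve: flood depth above first floor (m) -> fraction of structure value damaged
depth_points = np.array([0.0, 0.3, 0.6, 1.0, 1.5, 2.0, 3.0])
damage_fraction = np.array([0.0, 0.08, 0.20, 0.35, 0.55, 0.70, 0.90])

# Hypothetical structures: first-floor elevation (m above datum) and replacement value (USD)
first_floor_elev = np.array([2.1, 1.6, 2.8, 1.2])
structure_value = np.array([250_000, 310_000, 180_000, 400_000])

surge_elev = 2.4  # modeled water surface elevation at these structures (m above datum)

depth = np.clip(surge_elev - first_floor_elev, 0.0, None)
frac = np.interp(depth, depth_points, damage_fraction)
loss = frac * structure_value

for i, (d, l) in enumerate(zip(depth, loss)):
    print(f"structure {i}: depth {d:.2f} m, estimated loss ${l:,.0f}")
```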

Keywords: ADCIRC, coastal flooding, storm surge, coastal risk assessment, living shorelines

Procedia PDF Downloads 116
4209 Unlocking the Puzzle of Borrowing Adult Data for Designing Hybrid Pediatric Clinical Trials

Authors: Rajesh Kumar G

Abstract:

A challenging aspect of any clinical trial is to carefully plan the study design to meet the study objective in an optimum way and to validate the assumptions made during protocol design. And when it is a pediatric study, there is the added challenge of stringent guidelines and difficulty in recruiting the necessary subjects. Unlike adult trials, there is not much historical data available for pediatrics, which is required to validate assumptions for planning pediatric trials. Typically, pediatric studies are initiated as soon as approval is obtained for a drug to be marketed for adults, so with the historical information from the adult study and with the available pediatric pilot study data or simulated pediatric data, the pediatric study can be well planned. Generalizing a historical adult study for a new pediatric study is a tedious task; however, it is possible by integrating various statistical techniques and utilizing the advantages of a hybrid study design, which helps to achieve the study objective in a smoother way even in the presence of many constraints. This research paper explains how a hybrid study design can be planned, along with an integrated technique (SEV), to plan the pediatric study. In brief, the SEV technique (Simulation, Estimation using borrowed adult data and Bayesian methods, and Validation) incorporates simulating the planned study data and obtaining the desired estimates to validate the assumptions. This method of validation can be used to improve the accuracy of data analysis, ensuring that results are as valid and reliable as possible, which allows informed decisions to be made well ahead of study initiation. This technique, based on the collected data, allows insight to be gained into best practices when using data from a historical study and simulated data alike.
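As a minimal illustration of the "borrowing adult data" idea, the sketch below combines a down-weighted prior built from hypothetical adult results with one simulated pediatric trial under a conjugate normal model; the numbers and the simple power-prior-style discounting are illustrative, not the paper's actual SEV implementation.

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical adult study summary: estimated treatment effect and its standard error
adult_effect, adult_se = 1.8, 0.30
borrow_weight = 0.5                    # power-prior-style discount of the adult information

# Prior for the pediatric effect built from the discounted adult data (known-variance normal model)
prior_mean = adult_effect
prior_var = adult_se**2 / borrow_weight

# Simulate one hypothetical pediatric trial (Simulation step)
n_ped, true_ped_effect, sigma = 30, 1.5, 1.2
y = rng.normal(true_ped_effect, sigma, size=n_ped)

# Conjugate posterior update (Estimation step)
data_var = sigma**2 / n_ped
post_var = 1.0 / (1.0 / prior_var + 1.0 / data_var)
post_mean = post_var * (prior_mean / prior_var + y.mean() / data_var)

print(f"posterior mean = {post_mean:.2f}, 95% CrI = "
      f"({post_mean - 1.96*np.sqrt(post_var):.2f}, {post_mean + 1.96*np.sqrt(post_var):.2f})")
# Repeating this simulation many times would quantify how often the design meets its objective (Validation step).
```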

Keywords: adaptive design, simulation, borrowing data, bayesian model

Procedia PDF Downloads 77
4208 Application of Nonparametric Geographically Weighted Regression to Evaluate the Unemployment Rate in East Java

Authors: Sifriyani Sifriyani, I Nyoman Budiantara, Sri Haryatmi, Gunardi Gunardi

Abstract:

East Java Province ranks first among Indonesian provinces in the number of counties and cities and has the largest population. In 2015, the population reached 38,847,561, a figure that reflects very high population growth. High population growth is feared to increase the level of unemployment. In this study, the researchers mapped and modeled the unemployment rate with six variables that were assumed to influence it. Modeling was done using nonparametric geographically weighted regression with a truncated spline approach. This method was chosen because the spline is a flexible method; such models tend to find their own estimate of the underlying pattern. In this modeling, knot points were used, i.e., points at which the behavior of the data changes. The optimum knot points were selected by taking the minimum value of the Generalized Cross Validation (GCV). Based on the research, six variables were found to affect the level of unemployment in East Java. They were: the percentage of the population educated above high school level, the rate of economic growth, the population density, the ratio of investment to the total labor force, the regional minimum wage, and the ratio of large and medium-scale industry to the workforce. The nonparametric geographically weighted regression model with the truncated spline approach had a coefficient of determination of 98.95% and an MSE of 0.0047.
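As an illustration of the knot-selection step, the sketch below fits a one-predictor linear truncated spline and chooses the knot that minimizes the GCV criterion, GCV = (RSS/n) / (1 − tr(H)/n)²; the data are synthetic and the geographic weighting is omitted for brevity.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical one-predictor data with a change in trend at x = 4
x = np.sort(rng.uniform(0, 10, 80))
y = np.where(x < 4, 2 + 0.5 * x, 4 + 0.1 * (x - 4)) + rng.normal(0, 0.2, x.size)

def truncated_spline_basis(x, knot):
    # linear truncated spline basis: intercept, x, (x - knot)_+
    return np.column_stack([np.ones_like(x), x, np.maximum(x - knot, 0.0)])

def gcv(x, y, knot):
    B = truncated_spline_basis(x, knot)
    H = B @ np.linalg.pinv(B.T @ B) @ B.T          # hat matrix of the spline fit
    resid = y - H @ y
    n, tr = len(y), np.trace(H)
    return (resid @ resid / n) / (1.0 - tr / n) ** 2

candidate_knots = np.linspace(1, 9, 81)
scores = [gcv(x, y, k) for k in candidate_knots]
best = candidate_knots[int(np.argmin(scores))]
print(f"Optimum knot by minimum GCV: {best:.2f}")   # expected near the true change point at 4
```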

Keywords: East Java, nonparametric geographically weighted regression, spatial, spline approach, unemployed rate

Procedia PDF Downloads 321
4207 A Study of Laminar Natural Convection in Annular Spaces between Differentially Heated Horizontal Circular Cylinders Filled with Non-Newtonian Nano Fluids

Authors: Behzad Ahdiharab, Senol Baskaya, Tamer Calisir

Abstract:

Heat exchangers are among the most widely used systems in factories, refineries, etc. In this study, natural convection heat transfer using nano-fluids between two cylinders is numerically investigated. The inner and outer cylinders are kept at constant temperatures. One of the most important assumptions in the project is that the working fluid is non-Newtonian. In recent years, the use of nano-fluids in industrial applications has increased profoundly. In this study, non-Newtonian nano-fluids containing metal particles with high heat transfer coefficients have been used. All fluid properties have been calculated under the assumption of homogeneity. In the present study, solutions have been obtained under unsteady conditions, the base fluid was water, and the effects of various parameters on heat transfer have been investigated. These parameters are the Rayleigh number (10³ < Ra < 10⁶), the power-law index (0.6 < n < 1.4), the aspect ratio (0 < AR < 0.8), the nano-particle composition, the horizontal and vertical displacement of the inner cylinder, the rotation of the inner cylinder, and the volume fraction of nanoparticles. Results such as the inner cylinder average and local Nusselt number variations, temperature contours, and streamlines are presented. The results are also discussed in detail. From the validation study performed, it was found that very good agreement exists between the present results and those from the open literature. It was found that the heat transfer is always affected by the investigated parameters; however, the degree to which the heat transfer is affected varies over a wide range.

Keywords: heat transfer, circular space, non-Newtonian, nano fluid, computational fluid dynamics

Procedia PDF Downloads 415
4206 Design, Construction and Validation of a Simple, Low-Cost Phi Meter

Authors: Gabrielle Peck, Ryan Hayes

Abstract:

The use of a phi meter allows the equivalence ratio to be determined during a fire test. Previous phi meter designs have used expensive catalysts and had restricted portability due to the large furnace and the requirement for pure oxygen. The new design of the phi meter does not require the use of a catalyst. The furnace design was based on the existing micro-scale combustion calorimetry (MCC) furnace, and the operating conditions were based on the secondary oxidizer furnace used in the steady state tube furnace (SSTF). Preliminary tests were conducted to study the effects of varying furnace temperature on combustion efficiency. The SSTF was chosen to validate the phi meter measurements, as it can both pre-set and independently quantify the equivalence ratio during a test. The data were in agreement with the data obtained on the SSTF. The design was also validated by a comparison of CO2 yields obtained from the SSTF oxidizer and those obtained by the phi meter. The phi meter designed and constructed in this work was proven to work effectively at bench scale. The phi meter was then used to measure the equivalence ratio in a series of large-scale ISO 9705 tests for numerous fire conditions. The materials used were a range of non-homogeneous materials such as polyurethane. The measurements corresponded accurately to the data collected, showing that the novel design can be used from bench-scale to large-scale tests to measure the equivalence ratio. This cheaper, more portable, safer and easier-to-use phi meter design will enable more widespread use and the ability to quantify the fire conditions of tests, allowing for a better understanding of flammability and smoke toxicity.
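For reference, the quantity a phi meter reports is the equivalence ratio, the actual fuel-to-oxidizer ratio normalized by the stoichiometric one; below is a minimal sketch of that general definition with hypothetical flows (this is the definition of phi, not the instrument's specific data-reduction equation based on oxygen depletion).

```python
def equivalence_ratio(fuel_flow, air_flow, stoich_fuel_air_ratio):
    """phi = (fuel/air)_actual / (fuel/air)_stoichiometric; phi > 1 means fuel-rich (under-ventilated)."""
    return (fuel_flow / air_flow) / stoich_fuel_air_ratio

# Hypothetical mass flows (g/s) and a hypothetical stoichiometric fuel-to-air mass ratio
print(equivalence_ratio(fuel_flow=2.0, air_flow=20.0, stoich_fuel_air_ratio=1 / 15.0))  # ≈ 1.5, fuel-rich
```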

Keywords: phi meter, smoke toxicity, fire condition, ISO9705, novel equipment

Procedia PDF Downloads 103
4205 Feature Analysis of Predictive Maintenance Models

Authors: Zhaoan Wang

Abstract:

Research in predictive maintenance modeling has improved in recent years, predicting failures and needed maintenance with high accuracy, saving cost and improving manufacturing efficiency. However, classic prediction models provide little valuable insight into the most important features contributing to failure. By analyzing and quantifying feature importance in predictive maintenance models, cost saving can be optimized based on business goals. First, multiple classifiers are evaluated with cross-validation to predict the multiple classes of failures. Second, predictive performance with features provided by different feature selection algorithms is further analyzed. Third, features selected by different algorithms are ranked and combined based on their predictive power. Finally, the linear SHAP explainer (SHapley Additive exPlanations) is applied to interpret classifier behavior and provide further insight into the specific roles of features in both local predictions and global model behavior. The results of the experiments suggest that certain features play dominant roles in predictive models while others have significantly less impact on the overall performance. Moreover, for multi-class prediction of machine failures, the most important features vary with the type of machine failure. The results may lead to improved productivity and cost saving by prioritizing sensor deployment, data collection, and data processing of more important features over less important features.
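As a sketch of the evaluate-then-explain workflow described above, the example below cross-validates a classifier on a synthetic multi-class failure dataset and ranks features by permutation importance (SHAP values would be applied analogously for per-prediction attributions); the data and model choice are illustrative only.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.inspection import permutation_importance
from sklearn.model_selection import cross_val_score, train_test_split

# Synthetic multi-class "machine failure" dataset standing in for real sensor features
X, y = make_classification(n_samples=600, n_features=8, n_informative=4,
                           n_classes=3, n_clusters_per_class=1, random_state=0)

clf = RandomForestClassifier(n_estimators=200, random_state=0)
print("CV accuracy:", cross_val_score(clf, X, y, cv=5).mean().round(3))

# Fit once and rank features by permutation importance on held-out data
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
clf.fit(X_tr, y_tr)
imp = permutation_importance(clf, X_te, y_te, n_repeats=10, random_state=0)
ranking = np.argsort(imp.importances_mean)[::-1]
for i in ranking[:4]:
    print(f"feature {i}: importance = {imp.importances_mean[i]:.3f}")
```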

Keywords: automated supply chain, intelligent manufacturing, predictive maintenance, machine learning, feature engineering, model interpretation

Procedia PDF Downloads 133