Search results for: time series fractal analysis

39620 User Experience Measurement of User Interfaces

Authors: Mohammad Hashemi, John Herbert

Abstract:

Quantifying and measuring Quality of Experience (QoE) are important and difficult concerns in Human Computer Interaction (HCI). Quality of Service (QoS) and the actual User Interface (UI) of the application are both important contributors to a user's QoE. This paper describes a framework that accurately measures how a user uses the UI in order to model users' behaviours and profiles. It monitors mouse activity and the use of UI elements with precise time measurement, in real time, unobtrusively and efficiently, allowing the user to work with the application as normal. This accurate real-time measurement of the user's interaction provides valuable data and insight into the use of the UI, and it is also the basis for analysis of the user's QoE.

Keywords: user modelling, user interface experience, quality of experience, user experience, human and computer interaction

Procedia PDF Downloads 496
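
As an illustration of the kind of instrumentation this abstract describes, below is a minimal sketch of timestamped UI event logging; it is not the authors' framework, and the event names, the InteractionLog class and the dwell-time summary are hypothetical.

```python
import time
from dataclasses import dataclass, field
from typing import List, Tuple

@dataclass
class InteractionLog:
    """Collects timestamped UI events without blocking the application."""
    t0: float = field(default_factory=time.perf_counter)
    events: List[Tuple[float, str, str]] = field(default_factory=list)

    def log_event(self, kind: str, target: str) -> None:
        # perf_counter gives sub-millisecond resolution for accurate timing
        self.events.append((time.perf_counter() - self.t0, kind, target))

    def dwell_times(self) -> dict:
        """Time elapsed after each event, attributed to that event's UI element."""
        dwell = {}
        for (t_a, _, tgt), (t_b, _, _) in zip(self.events, self.events[1:]):
            dwell[tgt] = dwell.get(tgt, 0.0) + (t_b - t_a)
        return dwell

# Example: events as they might be emitted by mouse/UI hooks
log = InteractionLog()
log.log_event("mouse_click", "btn_submit")
log.log_event("mouse_move", "panel_results")
log.log_event("mouse_click", "btn_export")
print(log.dwell_times())
```
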
39619 Amyloid-β Fibrils Remodeling by an Organic Molecule: Insight from All-Atomic Molecular Dynamics Simulations

Authors: Nikhil Agrawal, Adam A. Skelton

Abstract:

Alzheimer’s disease (AD) is one of the most common forms of dementia, caused by the misfolding and aggregation of amyloid beta (Aβ) peptides into amyloid-β fibrils (Aβ fibrils). To disrupt the remodeling of Aβ fibrils, a number of candidate molecules have been proposed. To study the molecular mechanisms of Aβ fibril remodeling, we performed a series of all-atom molecular dynamics simulations, with a total simulation time of 3 µs, in explicit solvent. Several previously undiscovered binding modes between the candidate molecule and Aβ fibrils are unraveled, one of which shows a direct conformational change of the Aβ fibril. By understanding the physicochemical factors responsible for binding and the subsequent remodeling of Aβ fibrils by the candidate molecule, avenues can be opened into structure-based drug design for AD.

Keywords: alzheimer’s disease, amyloid, MD simulations, misfolded protein

Procedia PDF Downloads 340
39618 Application of Simulation of Discrete Events in Resource Management of Massive Concreting

Authors: Mohammad Amin Hamedirad, Seyed Javad Vaziri Kang Olyaei

Abstract:

Project planning and control are among the most critical issues in the management of construction projects. Traditional methods of project planning and control, such as the critical path method or the Gantt chart, are not widely used for planning projects with discrete and repetitive activities, and one of the problems facing project managers is planning the implementation process and optimally allocating its resources. Massive concreting is likewise a project type with discrete and repetitive activities. This study uses discrete-event simulation for resource management, which includes finding the optimal number of resources under various constraints, such as limits on machinery, equipment and human resources, as well as technical, time and implementation constraints, using analysis of the resource consumption rate, the project completion time and the critical points of the implementation process. For this purpose, discrete-event simulation has been used to model the different stages of implementation. After reviewing the various scenarios, the optimal allocation for each resource is determined so as to reach the maximum utilization rate and to reduce the project completion time or its cost under the existing constraints. The results showed that with the optimal allocation of resources, the project completion time could be reduced by 90%, and the resulting costs could be reduced by up to 49%. Thus, allocating the optimal number of project resources using this method will reduce its time and cost.

Keywords: simulation, massive concreting, discrete event simulation, resource management

Procedia PDF Downloads 141
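
For readers unfamiliar with discrete-event simulation, the following minimal sketch shows how resource-constrained, repetitive pours can be simulated and completion times compared across allocations. It assumes the SimPy library and invented durations and resource counts; it is not the model used in the study.

```python
import random
import simpy  # assumed discrete-event simulation library; the paper does not name its tool

POUR_TIME = 2.0   # nominal hours per pour (illustrative)
N_POURS = 20      # discrete, repetitive activities

def pour(env, mixers, crews):
    # Each pour needs one mixer and one crew; queuing reflects resource limits
    with mixers.request() as m:
        yield m
        with crews.request() as c:
            yield c
            yield env.timeout(random.triangular(0.8, 1.2, 1.0) * POUR_TIME)

def completion_time(n_mixers, n_crews, seed=1):
    random.seed(seed)
    env = simpy.Environment()
    mixers = simpy.Resource(env, capacity=n_mixers)
    crews = simpy.Resource(env, capacity=n_crews)
    for _ in range(N_POURS):
        env.process(pour(env, mixers, crews))
    env.run()           # run until all pours are finished
    return env.now      # project completion time for this allocation

# Compare candidate resource allocations, as in the scenario review described above
for n_mixers, n_crews in [(1, 2), (2, 2), (2, 4)]:
    t = completion_time(n_mixers, n_crews)
    print(f"{n_mixers} mixer(s), {n_crews} crew(s): completion in {t:.1f} h")
```
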
39617 Dynamic Two-Way FSI Simulation for a Blade of a Small Wind Turbine

Authors: Alberto Jiménez-Vargas, Manuel de Jesús Palacios-Gallegos, Miguel Ángel Hernández-López, Rafael Campos-Amezcua, Julio Cesar Solís-Sanchez

Abstract:

An optimal wind turbine blade design must be able to capture as much energy as possible from the wind source available at the area of interest. Often, an optimal design means the use of large quantities of material and complicated processes that make the wind turbine more expensive and, therefore, less cost-effective. In the construction and installation of a wind turbine, the blades may account for up to 20% of the overall price, and they are all the more important because they are part of the rotor system, which is in charge of transmitting the energy from the wind to the power train and where the static and dynamic design loads for the whole wind turbine are produced. The aim of this work is to develop a blade fluid-structure interaction (FSI) simulation that allows the identification of the zones most prone to damage during normal production conditions, so that better design and optimization decisions can be taken. The simulation is a dynamic case, since the inlet condition is a time-history wind velocity instead of a constant wind velocity. The process begins with the free software NuMAD (NREL), used to model the blade and assign its material properties; the 3D model is then exported to the ANSYS Workbench platform, where, before setting up the FSI system, a modal analysis is performed to identify natural frequencies and mode shapes. The FSI analysis is carried out with the two-way technique, which begins with a CFD simulation to obtain the pressure distribution on the blade surface; these results are then used as boundary conditions for the FEA simulation to obtain the deformation for the first time-step. For the second time-step, the CFD simulation is reconfigured automatically with the next time-step inlet wind velocity and the deformation results from the previous time-step. The analysis continues this iterative cycle, solving time-step by time-step until the entire load case is completed. This work is part of a set of projects managed by a national consortium called “CEMIE-Eólico” (Mexican Center in Wind Energy Research), created to strengthen technological and scientific capacities, promote the training of specialized human resources, and link academia with the private sector in the national territory. The analysis belongs to the design of a rotor system for a 5 kW wind turbine intended to be installed at the Isthmus of Tehuantepec, Oaxaca, Mexico.

Keywords: blade, dynamic, fsi, wind turbine

Procedia PDF Downloads 473
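
The two-way coupling cycle described above can be summarised in a short sketch. The CFD and FEA calls below are placeholder functions with invented formulas standing in for the ANSYS solvers, and the wind time history is illustrative; only the staggered per-time-step exchange of pressure and deformation reflects the described workflow.

```python
import numpy as np

def cfd_pressure(wind_speed, deformation):
    """Placeholder for the CFD solve: returns a representative surface pressure.
    Stands in for the CFD step of the two-way FSI loop."""
    return 0.5 * 1.225 * wind_speed**2 * (1.0 + 0.05 * deformation)

def fea_deformation(pressure):
    """Placeholder for the FEA solve: returns a scalar tip deflection (m)."""
    return 1e-4 * pressure

# Time-history inlet wind velocity (illustrative values, not measured data)
wind_history = np.array([6.0, 7.5, 9.0, 8.0, 10.0])

deformation = 0.0
for step, u_inlet in enumerate(wind_history):
    # 1) CFD with the current inlet velocity and the previously deformed shape
    p = cfd_pressure(u_inlet, deformation)
    # 2) Pressure mapped as a boundary condition onto the FEA model
    deformation = fea_deformation(p)
    # 3) Deformed geometry fed back to the next CFD time-step (two-way coupling)
    print(f"step {step}: U={u_inlet:4.1f} m/s  p={p:7.1f} Pa  tip defl={deformation*1000:.2f} mm")
```
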
39616 A Construction Management Tool: Determining Typical Project Schedule Behaviors Using Cluster Analysis

Authors: Natalia Rudeli, Elisabeth Viles, Adrian Santilli

Abstract:

Delays in the construction industry are a global phenomenon. Many construction projects experience extensive delays, exceeding the initially estimated completion time. The main purpose of this study is to identify the typical behaviors of construction projects in order to develop a prognosis and management tool. Knowing a construction project's schedule tendency enables evidence-based decision-making, allowing corrective action to be taken before delays occur. This study presents an innovative approach that uses the cluster analysis method to support predictions during Earned Value analyses. A clustering analysis was used to predict the future behavior of scheduling and of the principal Earned Value Management (EVM) and Earned Schedule (ES) indexes in construction projects. The analysis was made using a database of 90 different construction projects. It was validated with additional data extracted from the literature and with another 15 contrasting projects. For all projects, planned and executed schedules were collected and the principal EVM and ES indexes were calculated. A complete linkage classification method was used: the cluster analysis considers that the distance (or similarity) between two clusters must be measured by their most disparate elements, i.e. the distance is given by the maximum span among their components. Finally, through the use of the EVM and ES indexes and Tukey and Fisher pairwise comparisons, the statistical dissimilarity was verified and four clusters were obtained. It can be said that construction projects show an average delay of 35% of their planned completion time. Furthermore, four typical behaviors were found, and for each of the obtained clusters the interim milestones and the necessary rhythms of construction were identified. In general, the detected typical behaviors are: (1) projects that perform 5% of the work in the first two tenths and maintain a constant rhythm until completion (greater than 10% for each remaining tenth), being able to finish within the initially estimated time; (2) projects that start with an adequate construction rate but suffer minor delays, culminating in a total delay of almost 27% of the planned time; (3) projects which start with a performance below the planned rate and end up with an average delay of 64%; and (4) projects that begin with a poor performance, suffer great delays and end up with an average delay of 120% of the planned completion time. The obtained clusters compose a tool to identify the behavior of new construction projects by comparing their current work performance to the validated database, thus allowing the correction of initial estimations towards more accurate completion schedules.

Keywords: cluster analysis, construction management, earned value, schedule

Procedia PDF Downloads 254
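
A minimal sketch of complete-linkage clustering of schedule progress curves, as named in the abstract, using SciPy; the synthetic progress data and the cut into four clusters are illustrative, not the study's database.

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster

# Illustrative data: each row is a project's cumulative progress (% complete)
# sampled at each tenth of its planned duration (values are made up).
rng = np.random.default_rng(0)
projects = np.cumsum(rng.uniform(2, 15, size=(30, 10)), axis=1)
projects = 100 * projects / projects[:, [-1]]   # normalise so each ends at 100%

# Complete linkage: inter-cluster distance = distance between the most
# disparate members, as described in the abstract.
Z = linkage(projects, method="complete", metric="euclidean")

# Cut the dendrogram into four clusters (the number found in the study)
labels = fcluster(Z, t=4, criterion="maxclust")

for k in range(1, 5):
    mean_curve = projects[labels == k].mean(axis=0)
    print(f"cluster {k}: n={np.sum(labels == k):2d}, "
          f"progress at mid-duration ≈ {mean_curve[4]:.0f}%")
```
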
39615 A Study on Impact of Corporate Social Responsibility on Rural Development

Authors: N. Amruth Raj, Suja S. Nair

Abstract:

The last six decades have borne witness to a radical change in the private sector's relationship with both the state and civil society. Firms have been increasingly called upon to adopt strategies beyond the financial aspects of their operations and to consider the social and environmental impact of their business activities. In this context, many companies have modified their policies and activities and engaged in Corporate Social Responsibility (CSR), especially in rural development in India. At the firm level, CSR is implemented through various practices which aim to enhance the company's social and environmental performance and may cover various topics. Examples of CSR practices are abundant in the literature relevant to Andhra Pradesh. For instance, in Andhra Pradesh, companies like Amara Raaja require their suppliers to prohibit child labour, Nagarjuna Cements applies a series of programs for reducing its CO2 emissions, the LANCO group of industries addresses health and safety issues in the workplace, whereas GVK Works Limited has adopted a series of policies for addressing human rights and environmental abuses related to its operations.

Keywords: CSR, limitations, need, objectives, rural development

Procedia PDF Downloads 250
39614 Additive Weibull Model Using Warranty Claim and Finite Element Analysis Fatigue Analysis

Authors: Kanchan Mondal, Dasharath Koulage, Dattatray Manerikar, Asmita Ghate

Abstract:

This paper presents an additive reliability model using warranty data and Finite Element Analysis (FEA) data. Warranty data for any product gives insight into its underlying issues and is often used by reliability engineers to build prediction models to forecast the failure rate of parts. But there is one major limitation in using warranty data for prediction: warranty periods constitute only a small fraction of the total lifetime of a product, and most of the time they cover only the infant-mortality and useful-life zones of a bathtub curve. Predicting with warranty data alone in these cases does not generally provide results with the desired accuracy. The failure rate of a mechanical part is driven by random issues initially and by wear-out or usage-related issues at later stages of the lifetime. For better predictability of the failure rate, one needs to explore the failure-rate behavior in the wear-out zone of the bathtub curve. Due to cost and time constraints, it is not always possible to test samples to failure, but FEA fatigue analysis can provide the failure-rate behavior of a part well beyond the warranty period, more quickly and at lower cost. In this work, the authors propose an Additive Weibull Model which makes use of both warranty and FEA fatigue analysis data for predicting failure rates. It involves modeling two data sets of a part, one with existing warranty claims and the other with fatigue-life data. Hazard-rate-based Weibull estimation is used for modeling the warranty data, whereas S-N-curve-based Weibull parameter estimation is used for the FEA data. The parameters of the two separate Weibull models are estimated and combined to form the proposed Additive Weibull Model for prediction.

Keywords: bathtub curve, fatigue, FEA, reliability, warranty, Weibull

Procedia PDF Downloads 66
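
To make the additive construction concrete, here is a minimal sketch in which a warranty-based and an FEA/S-N-based Weibull hazard are summed; the shape and scale parameters are invented for illustration and are not the paper's estimates.

```python
import numpy as np

def weibull_hazard(t, beta, eta):
    """Weibull hazard rate h(t) = (beta/eta) * (t/eta)**(beta-1)."""
    return (beta / eta) * (t / eta) ** (beta - 1.0)

# Illustrative parameters (not from the paper):
# early-life behaviour estimated from warranty claims, wear-out from FEA fatigue.
beta_w, eta_w = 0.8, 5.0e4    # warranty-based: decreasing hazard (infant mortality)
beta_f, eta_f = 3.2, 2.0e5    # FEA / S-N-based: increasing hazard (wear-out)

t = np.linspace(1e3, 3e5, 6)  # usage, e.g. km or load cycles

# Additive model: hazards add, so cumulative hazards add as well
h_total = weibull_hazard(t, beta_w, eta_w) + weibull_hazard(t, beta_f, eta_f)
R = np.exp(-((t / eta_w) ** beta_w + (t / eta_f) ** beta_f))  # reliability R(t)

for ti, hi, Ri in zip(t, h_total, R):
    print(f"t={ti:9.0f}  hazard={hi:.3e}  reliability={Ri:.3f}")
```
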
39613 The Effects and Interactions of Synthesis Parameters on Properties of Mg Substituted Hydroxyapatite

Authors: S. Sharma, U. Batra, S. Kapoor, A. Dua

Abstract:

In this study, the effects and interactions of reaction time and capping-agent assistance during sol-gel synthesis of magnesium-substituted hydroxyapatite nanopowder (MgHA) on the hydroxyapatite (HA) to β-tricalcium phosphate (β-TCP) ratio, the Ca/P ratio and the mean crystallite size were examined experimentally as well as through statistical analysis. MgHA nanopowders were synthesized by the sol-gel technique at room temperature using aqueous solutions of calcium nitrate tetrahydrate, magnesium nitrate hexahydrate and potassium dihydrogen phosphate as starting materials. The reaction time for the sol-gel synthesis was varied between 15 and 60 minutes. Two process routes were followed, with and without the addition of triethanolamine (TEA) to the solutions. The elemental compositions of the as-synthesized powders were determined using X-ray fluorescence (XRF) spectroscopy. The functional groups present in the as-synthesized MgHA nanopowders were established through Fourier Transform Infrared Spectroscopy (FTIR). The amounts of the phases present, the Ca/P ratio and the mean crystallite sizes of the MgHA nanopowders were determined using X-ray diffraction (XRD). The HA content in the biphasic mixture of HA and β-TCP and the Ca/P ratio in the as-synthesized MgHA nanopowders increased with the reaction time of the sols (p < 0.0001, two-way ANOVA); however, both were independent of TEA addition (p > 0.15, two-way ANOVA). The MgHA nanopowders synthesized with TEA assistance exhibited a crystallite size 14 nm smaller (p < 0.018, two-sample t-test) than the powder synthesized without TEA assistance.

Keywords: capping agent, hydroxyapatite, regression analysis, sol-gel, 2-sample t-test, two-way analysis of variance (ANOVA)

Procedia PDF Downloads 365
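
A minimal sketch of the two-way ANOVA named in the abstract, using statsmodels on mock Ca/P data; the factor levels follow the description (reaction time, TEA assistance), but the numbers are synthetic.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm
from statsmodels.formula.api import ols

# Mock data: Ca/P ratio vs. reaction time (min) and TEA assistance.
# Values are illustrative, not the measurements reported in the abstract.
rng = np.random.default_rng(42)
rows = []
for t in (15, 30, 45, 60):
    for tea in ("with_TEA", "without_TEA"):
        for _ in range(3):  # replicate syntheses
            rows.append({"time": t, "tea": tea,
                         "ca_p": 1.55 + 0.002 * t + rng.normal(0, 0.01)})
df = pd.DataFrame(rows)

# Two-way ANOVA with interaction, mirroring the analysis named in the abstract
model = ols("ca_p ~ C(time) * C(tea)", data=df).fit()
print(sm.stats.anova_lm(model, typ=2))
```
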
39612 Effect of Nano-SiO2 Solution on the Strength Characteristics of Kaolinite

Authors: Reza Ziaie Moayed, Hamidreza Rahmani

Abstract:

Today, with developments in science and technology, there is great potential for the use of nanomaterials in various fields of geotechnical engineering, such as soil stabilization. This study investigates the effect of nano-SiO2 solution on the unconfined compression strength and Young's elastic modulus of kaolinite. For this purpose, nano-SiO2 was mixed with kaolinite at five different contents: 1, 2, 3, 4 and 5% by weight of the dry soil, and a series of unconfined compression tests with a curing time of one day was selected as the laboratory test. Analysis of the test results shows that stabilization of kaolinite with nano-SiO2 solution can effectively improve the unconfined compression strength of the modified soil, by up to 1.43 times compared to the pure soil.

Keywords: kaolinite, Nano-SiO2, stabilization, unconfined compression test, Young's modulus

Procedia PDF Downloads 385
39611 Aerodynamic Prediction and Performance Analysis for Mars Science Laboratory Entry Vehicle

Authors: Tang Wei, Yang Xiaofeng, Gui Yewei, Du Yanxia

Abstract:

A complex lifting entry was selected for precise landing performance during the Mars Science Laboratory entry. This study aims to develop a three-dimensional numerical method for precise computation and a surface panel method for rapid engineering prediction. Detailed flow-field analysis for the Mars exploration mission was performed by carrying out a series of fully three-dimensional Navier-Stokes computations. The static aerodynamic performance was then discussed, including the surface pressure, lift and drag coefficients, and lift-to-drag ratio obtained with the numerical and engineering methods. Computation results show that the shock layer is thin because of the lower effective specific heat ratio, that the calculated results from the two methods agree well with each other, and that they are consistent with the reference data. The aerodynamic performance analysis shows that the CG location determines the trim characteristics and pitch stability, and that certain radial and axial shifts of the CG location can alter the capsule's lifting entry performance, which is of vital significance for the aerodynamic configuration design and the inner instrument layout of the Mars entry capsule.

Keywords: Mars entry capsule, static aerodynamics, computational fluid dynamics, hypersonic

Procedia PDF Downloads 294
39610 Parametric Optimization of Wire Electric Discharge Machining (WEDM) for Aluminium Metal Matrix Composites

Authors: G. Rajyalakhmi, C. Karthik, Gerson Desouza, Rimmie Duraisamy

Abstract:

In the present work, metal matrix composites combining aluminium with SiC/Al2O3 were fabricated using the stir casting technique. The objective of the present work is to optimize the process parameters of Wire Electric Discharge Machining (WEDM) of these composites. Pulse-on time, pulse-off time, wire feed and sensitivity are considered as input process parameters, with Material Removal Rate (MRR) and Surface Roughness (SR) as the responses for optimization of the WEDM process. A Taguchi L18 Orthogonal Array (OA) is used for experimentation. Grey Relational Analysis (GRA) is coupled with the Taguchi technique for multiple process parameter optimization. ANOVA (Analysis of Variance) is used to find the impact of the process parameters individually. Finally, confirmation experiments were carried out to validate the predicted results.

Keywords: parametric optimization, particulate reinforced metal matrix composites, Taguchi-grey relational analysis, WEDM

Procedia PDF Downloads 570
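
A minimal sketch of the grey relational analysis step, assuming a distinguishing coefficient of 0.5 and equal response weights; the MRR and SR values are invented, not the paper's L18 measurements.

```python
import numpy as np

# Illustrative responses for 6 WEDM runs (not the paper's L18 data):
# MRR (mm^3/min, larger-is-better) and SR (µm, smaller-is-better)
mrr = np.array([4.2, 5.1, 3.8, 6.0, 5.5, 4.9])
sr  = np.array([2.8, 3.1, 2.4, 3.6, 2.9, 2.6])

def normalise(x, larger_is_better):
    # Grey relational generation: map each response to [0, 1]
    if larger_is_better:
        return (x - x.min()) / (x.max() - x.min())
    return (x.max() - x) / (x.max() - x.min())

Z = np.column_stack([normalise(mrr, True), normalise(sr, False)])

# Grey relational coefficient with distinguishing coefficient zeta = 0.5
delta = np.abs(1.0 - Z)                      # deviation from the ideal sequence
zeta = 0.5
gcoef = (delta.min() + zeta * delta.max()) / (delta + zeta * delta.max())

# Grey relational grade: equal weights over the two responses
grade = gcoef.mean(axis=1)
best = int(np.argmax(grade))
print("grades:", np.round(grade, 3), "-> best run:", best + 1)
```
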
39609 Density Based Traffic System Using PIC Microcontroller

Authors: Tatipamula Samiksha Goud, A. Naveena, M. Sresta

Abstract:

Traffic congestion is a major issue in many cities throughout the world, particularly in urban areas, and it is past time to switch from a fixed-timer mode to an automated system. The current traffic signalling system is a fixed-time system that is inefficient if one lane is busier than the others. A structure for an intelligent traffic control system is designed to address this issue. When traffic density is higher on one side of a junction, the signal's green time is extended in comparison to the regular time. This study suggests a technique in which the signal's time duration is assigned based on the amount of traffic present at the time, which can be detected using infrared sensors.

Keywords: infrared sensors, micro-controllers, LEDs, oscillators

Procedia PDF Downloads 131
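
A minimal sketch of density-based green-time allocation of the kind described above; the base time, per-vehicle extension and cap are assumptions, not values from the paper, and a real deployment would run on the PIC microcontroller rather than in Python.

```python
# Illustrative mapping from IR-sensor vehicle counts to green-signal duration.
BASE_GREEN_S = 15       # fixed-timer baseline (seconds), assumed
EXTRA_PER_VEHICLE = 2   # extension per queued vehicle detected, assumed
MAX_GREEN_S = 60        # cap so other lanes are not starved, assumed

def green_time(vehicle_count: int) -> int:
    """Green duration for one approach, based on the IR-sensor density count."""
    return min(BASE_GREEN_S + EXTRA_PER_VEHICLE * vehicle_count, MAX_GREEN_S)

# One signal cycle: serve approaches in order, longest queue first
queues = {"north": 12, "east": 3, "south": 7, "west": 0}
for approach, count in sorted(queues.items(), key=lambda kv: -kv[1]):
    print(f"{approach:5s}: {count:2d} vehicles -> green for {green_time(count)} s")
```
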
39608 Impact of Import Restriction on Rice Production in Nigeria

Authors: C. O. Igberi, M. U. Amadi

Abstract:

This research paper on the impact of import restriction on rice production in Nigeria is aimed at finding and proffering valid solutions to the age-long problem of rice self-sufficiency through a better understanding of policy measures used in the past, in this case the effectiveness of the rice import restriction of the early 1990s. It tries to answer two questions: whether import restriction boosts domestic rice production, and what the macroeconomic determinants of Gross Domestic Rice Product (GDRP) are. The research questions are investigated through literature and analytical frameworks, such that time series data on GDRP, Gross Fixed Capital Formation (GFCF), average foreign rice producers’ prices (PPF), domestic producers’ prices (PPN) and the labour force (LABF) are collated for analysis (with an import restriction dummy variable, POL1). The research objectives/hypotheses are analysed using cointegration, Vector Error Correction Model (VECM), Impulse Response Function (IRF) and Granger Causality Test (GCT) methodologies. Results show that in the short-run error correction specification for GDRP, a one percent (1%) deviation away from the long-run equilibrium in the current quarter is corrected by only 0.14% in the subsequent quarter. Also, the rice import restriction policy had no significant effect on GDRP over this period. Other findings show that the policy period did, in fact, have effects on PPN and LABF. The chosen variables are valid macroeconomic factors that explain the GDRP of Nigeria in the long run, as adduced from the IRF and GCT. Policy recommendations suggest that import restriction is not disqualified as a veritable tool for improving domestic rice production; rather, better enforcement procedures and strict adherence to the policy's dictates are needed. Furthermore, accompanying policies which drive public and private capital investment and accumulation must be introduced. Also, the employment rate and labour substitution in the agricultural sector should not be drastically changed; rather, their welfare and efficiency should be improved.

Keywords: import restriction, gross domestic rice production, cointegration, VECM, Granger causality, impulse response function

Procedia PDF Downloads 200
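
A minimal sketch of the cointegration/VECM workflow named in the abstract, using statsmodels on synthetic series that merely reuse the variable names (GDRP, GFCF, PPF, PPN, LABF); the printed adjustment coefficients are not the paper's estimates.

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.vector_ar.vecm import coint_johansen, VECM

# Synthetic quarterly data standing in for the study's variables.
rng = np.random.default_rng(0)
n = 80
common = np.cumsum(rng.normal(size=n))            # shared stochastic trend
df = pd.DataFrame({
    "GDRP": common + rng.normal(scale=0.3, size=n),
    "GFCF": 0.8 * common + rng.normal(scale=0.3, size=n),
    "PPF":  0.5 * common + rng.normal(scale=0.3, size=n),
    "PPN":  0.6 * common + rng.normal(scale=0.3, size=n),
    "LABF": 0.4 * common + rng.normal(scale=0.3, size=n),
})

# Johansen test suggests the cointegration rank
jres = coint_johansen(df, det_order=0, k_ar_diff=1)
print("trace statistics:", np.round(jres.lr1, 2))

# Fit the VECM; alpha holds the error-correction (speed-of-adjustment) terms,
# analogous to the 0.14 per-quarter correction reported in the abstract.
res = VECM(df, k_ar_diff=1, coint_rank=1, deterministic="ci").fit()
print("adjustment coefficients (alpha):")
print(pd.DataFrame(res.alpha, index=df.columns, columns=["ec_term"]))
```
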
39607 Multi-Factor Optimization Method through Machine Learning in Building Envelope Design: Focusing on Perforated Metal Façade

Authors: Jinwooung Kim, Jae-Hwan Jung, Seong-Jun Kim, Sung-Ah Kim

Abstract:

Because the building envelope has a significant impact on the operation and maintenance stage of a building, designing the façade with performance in mind can improve the building's performance and lower its maintenance cost. In general, however, optimizing two or more performance factors runs into the limits of time and computational tools. The optimization phase typically repeats indefinitely, generating alternatives and analyzing them, until the desired performance is achieved. In particular, as geometric complexity or precision increases, the computational resources and time required to find the required performance become prohibitive, so an optimization methodology is needed to deal with this. Instead of directly analyzing all the alternatives in the optimization process, applying heuristic techniques learned through experimentation and experience can reduce resource waste. This study proposes and verifies a method to optimize the double envelope of a building composed of perforated panels by applying machine learning to the design geometry and quantitative performance. The proposed method achieves the required performance with fewer resources by supplementing the existing method, which cannot calculate the complex shape of the perforated panel.

Keywords: building envelope, machine learning, perforated metal, multi-factor optimization, façade

Procedia PDF Downloads 216
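
A minimal sketch of the surrogate-modelling idea described above: learn performance from a limited set of analysed alternatives, then screen many candidates cheaply. The geometry parameters, the stand-in expensive_simulation function and the random-forest learner are assumptions, not the paper's model.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(1)

def expensive_simulation(x):
    """Placeholder for the full performance analysis of one façade alternative."""
    hole_d, pitch, offset = x          # perforation diameter, pitch, panel offset
    return np.sin(3 * hole_d) + 0.5 * pitch**2 - 0.3 * offset

X = rng.uniform(0, 1, size=(200, 3))                   # analysed alternatives
y = np.array([expensive_simulation(x) for x in X])     # their simulated performance

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
surrogate = RandomForestRegressor(n_estimators=200, random_state=0).fit(X_tr, y_tr)
print("held-out R^2 of surrogate:", round(surrogate.score(X_te, y_te), 3))

# Screen 10,000 new candidates with the surrogate instead of the full analysis
candidates = rng.uniform(0, 1, size=(10_000, 3))
best = candidates[np.argmax(surrogate.predict(candidates))]
print("promising candidate (to verify with the full simulation):", np.round(best, 3))
```
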
39604 QSAR Studies of Certain Novel Heterocycles Derived from Bis-1,2,4-Triazoles as Anti-Tumor Agents

Authors: Madhusudan Purohit, Stephen Philip, Bharathkumar Inturi

Abstract:

In this paper, we report the quantitative structure-activity relationship of novel bis-triazole derivatives for predicting the activity profile. The full model encompassed a dataset of 46 bis-triazoles. The Tripos Sybyl X 2.0 program was used to conduct the CoMSIA QSAR modeling. The Partial Least-Squares (PLS) analysis method was used to conduct the statistical analysis and to derive a QSAR model based on the field values of the CoMSIA descriptors. The compounds were divided into training and test sets and evaluated with various CoMSIA parameters to find the best QSAR model. An optimum number of components was first determined separately by cross-validated regression for the CoMSIA model and then applied in the final analysis. A series of parameters was used for the study, and the best-fit model was obtained using the donor, partition coefficient and steric parameters. The CoMSIA model demonstrated good statistical results, with a regression coefficient (r2) and a cross-validated coefficient (q2) of 0.575 and 0.830, respectively. The standard error for the predicted model was 0.16322. In the CoMSIA model, the steric descriptors make a marginally larger contribution than the electrostatic descriptors. The finding that the steric descriptor is the largest contributor to the CoMSIA QSAR model is consistent with the observation that more than half of the binding-site area is occupied by steric regions.

Keywords: 3D QSAR, CoMSIA, triazoles, novel heterocycles

Procedia PDF Downloads 438
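
A minimal sketch of PLS modelling with a fitted r2 and a leave-one-out cross-validated q2, as used in the abstract; the 46-by-120 descriptor matrix and the activities are synthetic stand-ins for the CoMSIA fields.

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import LeaveOneOut, cross_val_predict

# Mock CoMSIA-style field descriptors for 46 compounds (columns stand in for
# steric/electrostatic/H-bond-donor/partition-coefficient field values).
rng = np.random.default_rng(7)
X = rng.normal(size=(46, 120))
coef = rng.normal(size=120)
y = X @ coef * 0.05 + rng.normal(scale=0.3, size=46)   # pseudo activities

# Fit PLS and compute r2 (fitted) and q2 (leave-one-out cross-validated)
pls = PLSRegression(n_components=5).fit(X, y)
r2 = pls.score(X, y)
y_cv = cross_val_predict(PLSRegression(n_components=5), X, y, cv=LeaveOneOut())
q2 = 1 - np.sum((y - y_cv.ravel())**2) / np.sum((y - y.mean())**2)
print(f"r2 = {r2:.3f}, q2 (LOO) = {q2:.3f}")
```
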
39605 The Regulation on Human Exposure to Electromagnetic Fields for Brazilian Power System

Authors: Hugo Manoel Olivera Da Silva, Ricardo Silva Thé Pontes

Abstract:

This work presents an analysis of the Brazilian regulation on human exposure to electromagnetic fields, which provides limits for electric, magnetic and electromagnetic fields. Regulation of the electricity sector was the responsibility of the Agência Nacional de Energia Elétrica (ANEEL), the Brazilian Electricity Regulatory Agency, which implemented it through Normative Resolution Nº 398/2010, resulting in a series of obligations for agents of the electricity sector, especially in the areas of generation, transmission, and distribution.

Keywords: adverse effects, electric energy, electric and magnetic fields, human health, regulation

Procedia PDF Downloads 599
39603 Correlation between Fetal Umbilical Cord pH and the Day, the Time and the Team Handover Times: An Analysis of 6929 Deliveries of the Ulm University Hospital

Authors: Sabine Pau, Sophia Volz, Emanuel Bauer, Amelie De Gregorio, Frank Reister, Wolfgang Janni, Florian Ebner

Abstract:

Purpose: The umbilical cord pH is a well-evaluated contributor to the prediction of neonatal outcome. This study correlates neonatal umbilical cord pH with the weekday of delivery, the time of birth and the staff handover times (midwives and doctors). Material and Methods: This retrospective study included all deliveries over a 20-year period (1994-2014) at our primary obstetric center. All deliveries with a newborn cord pH under 7.20 were included in this analysis (6929 of 48974 deliveries, 14.4%). Further subgroups were formed according to the pH (< 7.05; 7.05-7.09; 7.10-7.14; 7.15-7.19). The data were first separated into daytime and nighttime deliveries (8 am-8 pm / 8 pm-8 am). Finally, handover times were defined as 6-6.30 am, 2-2.30 pm and 10-10.30 pm for the midwives, and for the doctors as 8-8.30 am and 4-4.30 pm (Monday-Thursday), 2-2.30 pm (Friday) and 9-9.30 am (weekend). Routinely, a shift consists of at least three doctors as well as three midwives. Results: During the 20 years, 6929 neonates were born with an umbilical cord pH < 7.20 (< 7.05: 7.1%; 7.05-7.09: 10.9%; 7.10-7.14: 30.2%; 7.15-7.19: 51.8%). There was no significant difference between night and day delivery (p = 0.408), delivery on different weekdays (p = 0.253), delivery on Monday to Thursday, Friday or the weekend (p = 0.496), or delivery during the handover times of the doctors or the midwives (p = 0.221). Even the standard deviation showed no differences between the groups. Conclusion: Despite an increased workload over the last 20 years, the standard of care remains high even during handover times and night shifts. This applies to midwives and doctors. As the neonatal outcome depends on various factors, further studies are necessary to take more factors influencing the fetal outcome into consideration. In order to maintain this high standard of care, an adaptation to workload and changing conditions is necessary.

Keywords: delivery, fetal umbilical cord pH, day time, hand over times

Procedia PDF Downloads 309
39603 Data Recording for Remote Monitoring of Autonomous Vehicles

Authors: Rong-Terng Juang

Abstract:

Autonomous vehicles offer the possibility of significant benefits to social welfare. However, fully automated cars might not arrive in the near future. To speed the adoption of self-driving technologies, many governments worldwide are passing laws requiring data recorders for the testing of autonomous vehicles. Currently, a self-driving vehicle (e.g., a shuttle bus) has to be monitored from a remote control center. When an autonomous vehicle encounters an unexpected driving environment, such as road construction or an obstruction, it should request assistance from a remote operator. Nevertheless, large amounts of data, including images, radar and lidar data, etc., have to be transmitted from the vehicle to the remote center. Therefore, this paper proposes a data compression method for in-vehicle networks for remote monitoring of autonomous vehicles. Firstly, the time-series data are rearranged into a multi-dimensional signal space. Upon arrival, for controller area network (CAN) data, the new data are mapped onto a two-dimensional time-data space associated with the specific CAN identity. Secondly, the data are sampled based on differential sampling. Finally, the whole set of data is encoded using existing algorithms such as Huffman, arithmetic and codebook encoding methods. To evaluate system performance, the proposed method was deployed on an in-house-built autonomous vehicle. The test results show that the amount of data can be reduced to as little as 1/7 of the raw data.

Keywords: autonomous vehicle, data compression, remote monitoring, controller area networks (CAN), Lidar

Procedia PDF Downloads 156
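
A minimal sketch of the differential-sampling plus Huffman-coding idea described above; the CAN-style signal, the 16-bit raw assumption and the helper functions are illustrative, not the paper's implementation.

```python
import heapq
import itertools
from collections import Counter

def delta_encode(samples):
    """Differential sampling: keep the first value, then only the changes."""
    return [samples[0]] + [b - a for a, b in zip(samples, samples[1:])]

def huffman_code(symbols):
    """Build a Huffman code book {symbol: bitstring} from symbol frequencies."""
    tiebreak = itertools.count()   # ensures heap never compares the code dicts
    heap = [[w, next(tiebreak), {s: ""}] for s, w in Counter(symbols).items()]
    heapq.heapify(heap)
    while len(heap) > 1:
        w1, _, c1 = heapq.heappop(heap)
        w2, _, c2 = heapq.heappop(heap)
        merged = {s: "0" + b for s, b in c1.items()}
        merged.update({s: "1" + b for s, b in c2.items()})
        heapq.heappush(heap, [w1 + w2, next(tiebreak), merged])
    return heap[0][2]

# Illustrative CAN-style signal: a slowly varying wheel-speed reading.
signal = [500, 500, 501, 501, 502, 502, 502, 503, 503, 503, 504, 504]
deltas = delta_encode(signal)
book = huffman_code(deltas)
encoded_bits = sum(len(book[d]) for d in deltas)
raw_bits = 16 * len(signal)  # assume 16-bit raw samples
print(f"raw: {raw_bits} bits, delta+Huffman: {encoded_bits} bits")
```
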
39602 Investigation into the Relationship between Spaced Repetitions and Problem-Solving Efficiency

Authors: Sidharth Talan, Rajlakshmi G. Majumdar

Abstract:

Problem-solving is one of the few skills that professionals and academics around the world constantly endeavor to improve in order to sustain themselves in an ever more competitive environment. This paper focuses on evaluating a hypothesized relationship between an individual's problem-solving efficiency and spaced repetitions, conducted at intervals of one day over a period of two weeks. Univariate regression analysis was used to assess the best-fit curve explaining the relationship between the two variables. Anagram solving was used as the testing task: since solving an anagram involves rearranging a jumbled word to form a correct word, it is an efficient way to observe an individual's attention span, visual-motor coordination and verbal ability. Based on the analysis of a sample of 30 participants, an individual's problem-solving efficiency, measured as the score in each test, was found to be significantly correlated with the time period, measured in days.

Keywords: Anagrams, histogram plot, moving average curve, spacing effect

Procedia PDF Downloads 155
39601 Can Exams Be Shortened? Using a New Empirical Approach to Test in Finance Courses

Authors: Eric S. Lee, Connie Bygrave, Jordan Mahar, Naina Garg, Suzanne Cottreau

Abstract:

Marking exams is universally detested by lecturers. Final exams in many higher education courses often last 3.0 hrs. Do exams really need to be so long? Can we justifiably reduce the number of questions on them? Surprisingly few have researched these questions, arguably because of the complexity and difficulty of using traditional methods. To answer these questions empirically, we used a new approach based on three key elements: Use of an unusual variation of a true experimental design, equivalence hypothesis testing, and an expanded set of six psychometric criteria to be met by any shortened exam if it is to replace a current 3.0-hr exam (reliability, validity, justifiability, number of exam questions, correspondence, and equivalence). We compared student performance on each official 3.0-hr exam with that on five shortened exams having proportionately fewer questions (2.5, 2.0, 1.5, 1.0, and 0.5 hours) in a series of four experiments conducted in two classes in each of two finance courses (224 students in total). We found strong evidence that, in these courses, shortening of final exams to 2.0 hrs was warranted on all six psychometric criteria. Shortening these exams by one hour should result in a substantial one-third reduction in lecturer time and effort spent marking, lower student stress, and more time for students to prepare for other exams. Our approach provides a relatively simple, easy-to-use methodology that lecturers can use to examine the effect of shortening their own exams.

Keywords: exam length, psychometric criteria, synthetic experimental designs, test length

Procedia PDF Downloads 267
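
A minimal sketch of equivalence testing by two one-sided t-tests (TOST), one common form of the equivalence hypothesis testing named above; the scores and the 5-point equivalence margin are invented, not the study's data or criteria.

```python
import numpy as np
from scipy import stats

def tost_ind(x, y, margin):
    """Two one-sided t-tests (TOST): are the two group means equivalent to
    within +/- margin? Returns the larger of the two one-sided p-values."""
    n1, n2 = len(x), len(y)
    diff = np.mean(x) - np.mean(y)
    se = np.sqrt(np.var(x, ddof=1) / n1 + np.var(y, ddof=1) / n2)
    dof = n1 + n2 - 2
    p_lower = 1 - stats.t.cdf((diff + margin) / se, dof)  # H0: diff <= -margin
    p_upper = stats.t.cdf((diff - margin) / se, dof)      # H0: diff >= +margin
    return max(p_lower, p_upper)

# Illustrative percentage scores on a 3.0-hr exam vs. a shortened 2.0-hr exam.
rng = np.random.default_rng(3)
full_exam = rng.normal(68, 10, size=112)
short_exam = rng.normal(67, 10, size=112)
p = tost_ind(full_exam, short_exam, margin=5.0)
print(f"TOST p = {p:.4f}" + (" -> equivalent at alpha = 0.05" if p < 0.05 else ""))
```
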
39600 Preliminary Analysis on Land Use-Land Cover Assessment of Post-Earthquake Geohazard: A Case Study in Kundasang, Sabah

Authors: Nur Afiqah Mohd Kamal, Khamarrul Azahari Razak

Abstract:

The earthquake aftermath has become a major concern, especially in high-seismicity regions. In Kundasang, Sabah, the earthquake on 5th June 2015 resulted in several catastrophes: landslides, rockfalls, mudflows and damage to major slopes, in addition to the series of aftershocks. The consequences of an earthquake generate and induce episodic disasters that are not only life-threatening but also affect infrastructure and economic development. Therefore, investigating the change in land use and land cover (LULC) caused by post-earthquake geohazards is essential for identifying the extent of the disastrous effects on development in Kundasang. With the advancement of remote sensing technology, post-earthquake geohazards (landslides, mudflows, rockfalls, debris flows) can be assessed by employing object-based image analysis to investigate LULC change across settlements, public infrastructure and vegetation cover. This paper therefore discusses preliminary results on the post-earthquake geohazard distribution in Kundasang and evaluates the effect of geohazard occurrences on LULC classification. The result of this preliminary analysis will provide an overview for determining the extent of geohazard impact on LULC. This research also provides beneficial input to the local authority in Kundasang about the risk of future structural development in geohazard areas.

Keywords: geohazard, land use land cover, object-based image analysis, remote sensing

Procedia PDF Downloads 239
39599 Flow Reproduction Using Vortex Particle Methods for Wake Buffeting Analysis of Bluff Structures

Authors: Samir Chawdhury, Guido Morgenthal

Abstract:

The paper presents a novel extension of Vortex Particle Methods (VPM) in which the study aims to reproduce a template simulation of the complex flow field generated by impulsively started flow past an upstream bluff body at a certain Reynolds number Re. Vibration of a structural system under upstream wake flow is often considered its governing design criterion; therefore, particular attention is given in this study to the reproduction of the wake flow simulation. The basic methodology for the flow reproduction requires downstream velocity sampling from the template flow simulation: at particular distances from the upstream section, the instantaneous velocity components are sampled using a series of square sampling cells arranged vertically, where each cell contains four velocity sampling points at its corners. Since the grid-free Lagrangian VPM algorithm discretises vorticity on particle elements, the method requires transformation of the velocity components into vortex circulation, and finally the template flow field is reproduced by seeding these vortex circulations, or particles, into a free-stream flow. It is noteworthy that the vortex particles have to be released into the free stream at exactly the same rate as the velocity sampling. Studies have been carried out specifically on different sampling rates and velocity sampling positions to find their effects on the flow reproduction quality. The quality assessments are mainly done, using a downstream flow-monitoring profile, by comparing the characteristic wind flow profiles using several statistical turbulence measures. Additionally, the comparisons are performed using velocity time histories, snapshots of the flow fields, and the vibration of a downstream bluff section, by performing wake buffeting analyses of the section under the original and reproduced wake flows. A convergence study is performed for the validation of the method. The study also describes possibilities for achieving flow reproduction with less computational effort.

Keywords: vortex particle method, wake flow, flow reproduction, wake buffeting analysis

Procedia PDF Downloads 307
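
A minimal sketch of the velocity-to-circulation step described above: the circulation of one square sampling cell is approximated as the line integral of the sampled corner velocities around the cell. The corner values and cell size are invented for illustration.

```python
def cell_circulation(u, v, h):
    """Circulation of a square sampling cell of side h from the velocity
    components (u, v) at its four corners, ordered counter-clockwise:
    bottom-left, bottom-right, top-right, top-left.
    Gamma = closed line integral of velocity, trapezoidal rule per edge."""
    (u_bl, v_bl), (u_br, v_br), (u_tr, v_tr), (u_tl, v_tl) = zip(u, v)
    gamma = 0.0
    gamma += 0.5 * (u_bl + u_br) * h      # bottom edge, +x direction
    gamma += 0.5 * (v_br + v_tr) * h      # right edge,  +y direction
    gamma -= 0.5 * (u_tr + u_tl) * h      # top edge,    -x direction
    gamma -= 0.5 * (v_tl + v_bl) * h      # left edge,   -y direction
    return gamma

# Illustrative sampled corner velocities (m/s) for one cell of side 0.05 m;
# in the method these come from the template simulation's sampling points.
u = [1.00, 1.10, 0.95, 0.90]   # x-components at bl, br, tr, tl
v = [0.02, 0.05, 0.04, -0.01]  # y-components at bl, br, tr, tl
print("cell circulation Gamma =", cell_circulation(u, v, 0.05), "m^2/s")
```
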
39598 Actual Fracture Length Determination Using a Technique for Shale Fracturing Data Analysis in Real Time

Authors: M. Wigwe, M. Y Soloman, E. Pirayesh, R. Eghorieta, N. Stegent

Abstract:

The moving reference point (MRP) technique has been used in the analyses of the first three stages of two fracturing jobs. The results obtained verify the proposition that a hydraulic fracture in shale grows in spurts rather than in a continuous pattern, as originally interpreted by the Nolte-Smith technique. Rather than a continuous Mode I fracture followed by Mode II, III or IV fractures, these fracture modes can alternate throughout the pumping period. It is also shown that the Nolte-Smith time-parameter plot can be very helpful in identifying the presence of natural fractures that have been intersected by the hydraulic fracture. In addition, with the aid of a fracture length-time plot generated from any fracture simulation that matches the data, the distance from the wellbore to the natural fractures, which also translates to the actual fracture length for the stage, can be determined. An algorithm for this technique is developed. This procedure was used for the first 9 minutes of the simulated frac job data. It was observed that after 7 minutes, the actual fracture length is about 150 ft, instead of the 250 ft predicted by the simulator output. This difference grows larger as the analysis proceeds.

Keywords: shale, fracturing, reservoir, simulation, frac-length, moving-reference-point

Procedia PDF Downloads 743
39597 Evaluation of Quick Covering Machine for Grain Drying Pavement

Authors: Fatima S. Rodriguez, Victorino T. Taylan, Manolito C. Bulaong, Helen F. Gavino, Vitaliana U. Malamug

Abstract:

In sun drying, grain quality is greatly reduced when paddy grains are caught by the rain unsacked and unstored, resulting in reduced profit. The objectives of this study were to design and fabricate a quick covering machine for a grain drying pavement; to test and evaluate the operating characteristics of the machine in terms of its deployment speed, recovery speed, deployment time, recovery time, power consumption and the aesthetics of the laminated sack; and to conduct partial budget and cost-curve analyses. The machine was able to cover the grains on a 12.8 m x 22.5 m grain drying pavement in an average time of 17.13 s. It consumed 0.53 W-hr for the deployment and recovery of the cover. The machine entailed an investment cost of $1,344.40 and an annual cost charge of $647.32. Moreover, the savings per year using the quick covering machine were $101.83.

Keywords: quick covering machine, grain drying pavement, laminated polypropylene, recovery time

Procedia PDF Downloads 313
39596 2D Hexagonal Cellular Automata: The Complexity of Forms

Authors: Vural Erdogan

Abstract:

We created two-dimensional hexagonal cellular automata to obtain complexity using simple rules similar to those of Conway's Game of Life. Considering the Game of Life rules, Wolfram's work on life-like structures and John von Neumann's self-replication, self-maintenance and self-reproduction problems, we developed 2-state and 3-state hexagonal growing algorithms that reach large populations from random initial states. Unlike the Game of Life, we used six-neighbourhood cellular automata instead of eight or four neighbourhoods. The first simulations examined whether we are able to obtain oscillators, blinkers and gliders of a sort. Inspired by the complexity of Wolfram's 1D cellular automata and by life-like structures, we simulated 2D synchronous, discrete, deterministic cellular automata to reach life-like forms with 2-state cells. We explain how the life-like formations and the oscillators contribute to initiating self-maintenance together with self-reproduction and self-replication. After comparing simulation results, we decided to take the algorithm one step further: appending a new state to the same algorithm, which we had used to reach life-like structures, led us to experiment with new branching and fractal forms. All these studies tried to demonstrate that complex life forms might come from uncomplicated rules.

Keywords: hexagonal cellular automata, self-replication, self-reproduction, self-maintenance

Procedia PDF Downloads 144
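
A minimal sketch of a 2-state, six-neighbour hexagonal cellular automaton on axial coordinates; the birth and survival thresholds are assumptions for illustration, not the rules developed in the paper.

```python
import random

# Six hexagonal neighbours in axial coordinates (q, r)
NEIGHBOURS = [(+1, 0), (-1, 0), (0, +1), (0, -1), (+1, -1), (-1, +1)]

def step(alive):
    """One synchronous update of the set of live cells."""
    counts = {}
    for (q, r) in alive:
        for dq, dr in NEIGHBOURS:
            key = (q + dq, r + dr)
            counts[key] = counts.get(key, 0) + 1
    new_alive = set()
    for cell, n in counts.items():
        if cell in alive and n in (2, 3):     # survival rule (assumed)
            new_alive.add(cell)
        elif cell not in alive and n == 2:    # birth rule (assumed)
            new_alive.add(cell)
    return new_alive

# Random initial state, then watch the population evolve
random.seed(0)
alive = {(random.randint(-5, 5), random.randint(-5, 5)) for _ in range(40)}
for gen in range(10):
    print(f"generation {gen}: {len(alive)} live cells")
    alive = step(alive)
```
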
39595 Manufacturing Process and Cost Estimation through Process Detection by Applying Image Processing Technique

Authors: Chalakorn Chitsaart, Suchada Rianmora, Noppawat Vongpiyasatit

Abstract:

In order to reduce the transportation time and cost of the direct interface between customer and manufacturer, an image processing technique has been introduced in this research, whereby designing the part and defining the manufacturing process can be performed quickly. A 3D virtual model is directly generated from a series of multi-view images of an object, and its structure or function can be modified, analyzed and improved for further implementations such as computer-aided manufacturing (CAM). To estimate and quote the production cost, a user-friendly platform has been developed in which the appropriate manufacturing parameters and process detections are identified and planned through CAM simulation.

Keywords: image processing technique, feature detections, surface registrations, capturing multi-view images, production costs, manufacturing processes

Procedia PDF Downloads 243
39594 Evaluation of Elemental Impurities in Drugs According to Pharmacopoeia Using the FESEM-EDS Technique

Authors: Rafid Doulab

Abstract:

Control of elemental impurities in the pharmaceutical industry is indispensable to ensure pharmaceutical safety with respect to 24 elements. Although atomic absorption and inductively coupled plasma methods are used in the U.S. Pharmacopeia and the European Pharmacopoeia, FESEM with energy dispersive spectrometry can be applied as an alternative analysis method, giving quantitative and qualitative results for a variety of elements without chemical pretreatment, unlike other techniques. This technique is characterized by short analysis time, less contamination, no reagent consumption, generation of minimal residue or waste, limited sample preparation time and minimal analysis error. Using simple dilution for powders or direct analysis for liquids, we analyzed the usefulness of the EDS method in testing with field emission scanning electron microscopy (FESEM, SUPRA 55, Carl Zeiss, Germany) with an X-ray energy dispersive spectrometer (XFlash6l10, Bruker, Germany). The samples were analyzed directly, without coating, by applying 5 µ of a diluted sample of known concentration onto a carbon stub, with the accelerating voltage set according to the sample thickness; the result for each spot is given in atomic percentage and, via an Avogadro-based conversion factor, the final result is expressed in micrograms. Conclusion and recommendation: the conclusion of this study is that FESEM-EDS can be applied, within the U.S. Pharmacopeia and the ICH Q3D guideline, as a high-precision, accurate method for elemental impurity analysis of drugs or bulk materials, to determine the permitted daily exposure (PDE) in liquid or solid specimens and to obtain better results than other techniques; moreover, it does not require complex methods or chemicals for digestion, which can interfere with the final results, and the sample can be kept for re-analysis at any time. The recommendation is to adopt this technique in the pharmacopoeias as a standard method alongside the inductively coupled plasma methods ICP-AES, ICP-OES and ICP-MS.

Keywords: pharmacopoeia, FESEM-EDS, element impurities, atomic concentration

Procedia PDF Downloads 112
39593 Behavior of Steel Moment Frames Subjected to Impact Load

Authors: Hyungoo Kang, Minsung Kim, Jinkoo Kim

Abstract:

This study investigates the performance of 2D and 3D steel moment frames subjected to vehicle collision at a first-story column using LS-DYNA. The finite element models of vehicles provided by the National Crash Analysis Center (NCAC) are used for the numerical analysis. Nonlinear dynamic time-history analysis of the 2D and 3D model structures is carried out based on the arbitrary column removal scenario, and the vertical displacement of the damaged structures is compared with that obtained from the collision analysis. The analysis results show that the model structure remains stable when the speed of the vehicle is 40 km/h. However, at speeds of 80 and 120 km/h, both the 2D and 3D structures fail by progressive collapse. The vertical displacement of the damaged joint obtained from the collision analysis is significantly larger than the displacement computed based on the arbitrary column removal scenario.

Keywords: vehicle collision, progressive collapse, FEM, LS-DYNA

Procedia PDF Downloads 330
39592 Driver Take-Over Time When Resuming Control from Highly Automated Driving in Truck Platooning Scenarios

Authors: Bo Zhang, Ellen S. Wilschut, Dehlia M. C. Willemsen, Marieke H. Martens

Abstract:

With the rapid development of intelligent transportation systems, automated platooning of trucks is drawing increasing interest for its beneficial effects on safety, energy consumption and traffic flow efficiency. Nevertheless, one major challenge lies in the safe transition of control from the automated system back to the human driver, especially when drivers have been inattentive after a long period of highly automated driving. In this study, we investigated driver take-over time after a system-initiated request to leave the platooning system Virtual Tow Bar in a non-critical scenario. 22 professional truck drivers participated in the truck driving simulator experiment, and each was instructed to drive under three experimental conditions before the presentation of the take-over request (TOR): driver ready (drivers were instructed to monitor the road constantly), driver not-ready (drivers were provided with a tablet) and eye-shut. The results showed significantly longer take-over times in both the driver not-ready and eye-shut conditions compared with the driver ready condition. Further analysis revealed hand movement time as the main factor causing the long response time in the driver not-ready condition, while in the eye-shut condition gaze reaction time also strongly influenced the total take-over time. In addition to the comparison of means, large individual differences were found, especially in the two non-attentive driver conditions. We conclude that a personalized driver-readiness predictor is important for a safe transition.

Keywords: driving simulation, highly automated driving, take-over time, transition of control, truck platooning

Procedia PDF Downloads 246
39591 A Non-Destructive Estimation Method for Internal Time in Perilla Leaf Using Hyperspectral Data

Authors: Shogo Nagano, Yusuke Tanigaki, Hirokazu Fukuda

Abstract:

Vegetables harvested early in the morning or late in the afternoon are valued in plant production, and so the time of harvest is important. The biological functions known as circadian clocks have a significant effect on this harvest timing. The purpose of this study was to non-destructively estimate the circadian clock and so construct a method for determining a suitable harvest time. We took eight samples of green busil (Perilla frutescens var. crispa) every 4 hours, six times for 1 day and analyzed all samples at the same time. A hyperspectral camera was used to collect spectrum intensities at 141 different wavelengths (350–1050 nm). Calculation of correlations between spectrum intensity of each wavelength and harvest time suggested the suitability of the hyperspectral camera for non-destructive estimation. However, even the highest correlated wavelength had a weak correlation, so we used machine learning to raise the accuracy of estimation and constructed a machine learning model to estimate the internal time of the circadian clock. Artificial neural networks (ANN) were used for machine learning because this is an effective analysis method for large amounts of data. Using the estimation model resulted in an error between estimated and real times of 3 min. The estimations were made in less than 2 hours. Thus, we successfully demonstrated this method of non-destructively estimating internal time.

Keywords: artificial neural network (ANN), circadian clock, green busil, hyperspectral camera, non-destructive evaluation

Procedia PDF Downloads 293
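
A minimal sketch of the ANN regression idea described above, with the circadian time encoded on the circle (sin/cos) so that times just before and after midnight stay close together; the hyperspectral data are synthetic, and the network size is an assumption, not the authors' model.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.model_selection import train_test_split

# Mock hyperspectral data: 48 leaf samples x 141 wavelength intensities,
# with reflectance drifting over the 24-h cycle. Values are synthetic.
rng = np.random.default_rng(0)
hours = rng.uniform(0, 24, size=48)
wavelength_effect = rng.normal(size=141)
X = (np.sin(2 * np.pi * hours / 24)[:, None] * wavelength_effect
     + rng.normal(scale=0.05, size=(48, 141)))

# Encode internal time on the circle so 23:59 and 00:01 are near each other
y = np.column_stack([np.sin(2 * np.pi * hours / 24),
                     np.cos(2 * np.pi * hours / 24)])

X_tr, X_te, y_tr, y_te, h_tr, h_te = train_test_split(X, y, hours, random_state=1)
ann = MLPRegressor(hidden_layer_sizes=(32, 16), max_iter=5000,
                   random_state=1).fit(X_tr, y_tr)

# Decode the predicted (sin, cos) pair back to hours and compute circular error
pred = ann.predict(X_te)
pred_hours = (np.arctan2(pred[:, 0], pred[:, 1]) % (2 * np.pi)) / (2 * np.pi) * 24
err = np.minimum(np.abs(pred_hours - h_te), 24 - np.abs(pred_hours - h_te))
print(f"mean absolute error: {err.mean() * 60:.1f} minutes")
```
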