Search results for: data mining technique
27312 Application of a Hybrid Modified Blade Element Momentum Theory/Computational Fluid Dynamics Approach for Wind Turbine Aerodynamic Performance Prediction
Authors: Samah Laalej, Abdelfattah Bouatem
Abstract:
In the field of wind turbine blades, it is difficult to evaluate aerodynamic performance through experimental measurements, as these require considerable time and resources. Therefore, in this paper, a hybrid BEM-CFD numerical technique is developed to predict the power and aerodynamic forces acting on the blades. A computational fluid dynamics (CFD) simulation was conducted in Ansys software using the k-ω turbulence model to calculate the drag and lift forces. An enhanced BEM code was then created to predict the power output generated by the wind turbine using the aerodynamic properties extracted from the CFD approach. The numerical approach was compared with and validated against experimental data. The power curves calculated from this hybrid method were in good agreement with experimental measurements over all velocity ranges.
Keywords: blade element momentum, aerodynamic forces, wind turbine blades, computational fluid dynamics approach
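To make the BEM half of the hybrid approach concrete, the sketch below iterates the classical momentum-balance equations for the axial and tangential induction factors at a single blade annulus. All numerical values (radius, chord, tip-speed ratio, and the lift/drag coefficients that the paper obtains from CFD) are illustrative assumptions, not the authors' data, and corrections such as Prandtl tip loss are omitted.

```python
import math

def bem_annulus(r, R, B, chord, tsr, Cl, Cd, tol=1e-6, max_iter=200):
    """Fixed-point iteration for the axial (a) and tangential (ap)
    induction factors at one blade annulus, classic BEM momentum balance.
    Cl and Cd would be supplied by the CFD step in the hybrid method."""
    sigma = B * chord / (2 * math.pi * r)       # local solidity
    lam_r = tsr * r / R                         # local speed ratio
    a, ap = 0.3, 0.0
    for _ in range(max_iter):
        phi = math.atan2(1 - a, lam_r * (1 + ap))          # inflow angle
        cn = Cl * math.cos(phi) + Cd * math.sin(phi)       # normal coeff.
        ct = Cl * math.sin(phi) - Cd * math.cos(phi)       # tangential
        a_new = 1.0 / (4 * math.sin(phi) ** 2 / (sigma * cn) + 1)
        ap_new = 1.0 / (4 * math.sin(phi) * math.cos(phi) / (sigma * ct) - 1)
        if abs(a_new - a) < tol and abs(ap_new - ap) < tol:
            return a_new, ap_new
        a, ap = a_new, ap_new
    return a, ap

# illustrative mid-span annulus of a hypothetical 3-bladed, 40 m rotor
a, ap = bem_annulus(r=20.0, R=40.0, B=3, chord=1.5, tsr=7.0, Cl=1.0, Cd=0.01)
print(a, ap)
```

Power is then obtained by integrating the tangential force contribution of each annulus over the blade span.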
Procedia PDF Downloads 67
27311 Frequency Domain Decomposition, Stochastic Subspace Identification and Continuous Wavelet Transform for Operational Modal Analysis of Three Story Steel Frame
Authors: Ardalan Sabamehr, Ashutosh Bagchi
Abstract:
Recently, Structural Health Monitoring (SHM) based on the vibration of structures has attracted the attention of researchers in different fields such as civil, aeronautical and mechanical engineering. Operational Modal Analysis (OMA) has been developed to identify the modal properties of infrastructure such as bridges and buildings. Frequency Domain Decomposition (FDD), Stochastic Subspace Identification (SSI) and Continuous Wavelet Transform (CWT) are the three most common methods in output-only modal identification. FDD, SSI and CWT operate in the frequency domain, the time domain and the time-frequency plane, respectively, so FDD and SSI are not able to display time and frequency at the same time. Moreover, FDD and SSI have difficulties in noisy environments and in finding closely spaced modes. The CWT technique, which operates on the time-frequency plane, shows reasonable performance under such conditions. A further advantage of the wavelet transform over the other current techniques is that it can also be applied to non-stationary signals. The aim of this paper is to compare the three most common modal identification techniques in finding the modal properties (natural frequency, mode shape and damping ratio) of a three-story steel frame, built in the Concordia University lab, by use of ambient vibration. The frame is made of galvanized steel, 60 cm long, 27 cm wide and 133 cm high, with no bracing along the long and short spans. Three uniaxial wired accelerometers (MicroStrain, 100 mV/g sensitivity) were attached to the middle of each floor; a gateway received the data and sent them to the PC using the Node Commander software. Real-time monitoring was performed for 20 seconds at a 512 Hz sampling rate. The test was repeated five times in each direction using hand shaking and an impact hammer. CWT is able to detect instantaneous frequency by use of a ridge detection method.
In this paper, a partial-derivative ridge detection technique has been applied to the local maxima of the time-frequency plane to detect the instantaneous frequency. The results extracted from all three methods have been compared, demonstrating that CWT has the better performance in terms of accuracy in a noisy environment. The modal parameters, namely natural frequency, damping ratio and mode shapes, are identified by all three methods.
Keywords: ambient vibration, frequency domain decomposition, stochastic subspace identification, continuous wavelet transform
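As a minimal sketch of the FDD step described above: the cross-spectral density matrix of the measured channels is assembled at each frequency line and decomposed by SVD; peaks of the first singular value mark natural frequencies, and the corresponding singular vector approximates the mode shape. The two-channel synthetic signal below is an illustration, not the paper's frame data.

```python
import numpy as np
from scipy import signal

def fdd_first_singular_value(acc, fs, nperseg=1024):
    """Frequency Domain Decomposition: SVD of the cross-spectral density
    (CSD) matrix at each frequency; peaks of the first singular value
    indicate natural frequencies."""
    n_ch = acc.shape[0]
    freqs, G = None, None
    for i in range(n_ch):
        for j in range(n_ch):
            f, Pij = signal.csd(acc[i], acc[j], fs=fs, nperseg=nperseg)
            if G is None:
                freqs = f
                G = np.zeros((len(f), n_ch, n_ch), dtype=complex)
            G[:, i, j] = Pij
    s1 = np.array([np.linalg.svd(Gf, compute_uv=False)[0] for Gf in G])
    return freqs, s1

# synthetic two-channel response dominated by a 12 Hz mode, 512 Hz sampling
fs = 512
t = np.arange(0, 20, 1 / fs)
rng = np.random.default_rng(0)
mode = np.sin(2 * np.pi * 12 * t)
acc = np.vstack([mode + 0.1 * rng.standard_normal(t.size),
                 0.6 * mode + 0.1 * rng.standard_normal(t.size)])
freqs, s1 = fdd_first_singular_value(acc, fs)
peak_freq = freqs[np.argmax(s1)]
print(peak_freq)   # peak of the first singular value near 12 Hz
```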
Procedia PDF Downloads 296
27310 Analyzing Test Data Generation Techniques Using Evolutionary Algorithms
Authors: Arslan Ellahi, Syed Amjad Hussain
Abstract:
Software testing is a vital process in the software development life cycle, and the quality of software can only be assured after it has passed through the testing phase. We have surveyed automatic test data generation techniques, a key research area of software testing that aims to achieve test automation and thereby decrease testing time. In this paper, we review approaches presented in the literature that use evolutionary search-based algorithms, such as the Genetic Algorithm and Particle Swarm Optimization (PSO), to validate the test data generation process. We also examine the quality of the generated test data, which increases or decreases the efficiency of testing. We propose test data generation techniques for model-based testing and work on the tuning and fitness function of the PSO algorithm.
Keywords: search based, evolutionary algorithm, particle swarm optimization, genetic algorithm, test data generation
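As a hedged illustration of search-based test data generation, the sketch below runs a generic PSO that minimizes a toy "branch distance" fitness: the further an input is from satisfying a hypothetical branch condition, the larger the fitness. The branch condition and all PSO parameter values are illustrative, not taken from the paper.

```python
import random

def branch_distance(x):
    """Toy fitness for test data generation: distance to satisfying the
    hypothetical branch condition  x*x - 4*x + 3 == 0  (roots at 1 and 3)."""
    return abs(x * x - 4 * x + 3)

def pso(fitness, n_particles=30, iters=200, lo=-100.0, hi=100.0,
        w=0.7, c1=1.5, c2=1.5):
    """Minimal 1-D particle swarm optimizer (inertia + cognitive + social)."""
    random.seed(1)
    xs = [random.uniform(lo, hi) for _ in range(n_particles)]
    vs = [0.0] * n_particles
    pbest = xs[:]                       # personal best positions
    gbest = min(xs, key=fitness)        # global best position
    for _ in range(iters):
        for i in range(n_particles):
            vs[i] = (w * vs[i]
                     + c1 * random.random() * (pbest[i] - xs[i])
                     + c2 * random.random() * (gbest - xs[i]))
            xs[i] += vs[i]
            if fitness(xs[i]) < fitness(pbest[i]):
                pbest[i] = xs[i]
            if fitness(xs[i]) < fitness(gbest):
                gbest = xs[i]
    return gbest

x = pso(branch_distance)
print(x, branch_distance(x))   # x near a root, distance near zero
```

In a real test data generator, the fitness would be computed by instrumenting the program under test and measuring how close each candidate input comes to taking the target branch.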
Procedia PDF Downloads 192
27309 Comparative Analysis of the Third Generation of Research Data for Evaluation of Solar Energy Potential
Authors: Claudineia Brazil, Elison Eduardo Jardim Bierhals, Luciane Teresa Salvi, Rafael Haag
Abstract:
Renewable energy sources are dependent on climatic variability, so adequate energy planning requires observations of the meteorological variables, preferably as long-period series. Despite the scientific and technological advances that meteorological measurement systems have undergone in recent decades, there is still a considerable lack of meteorological observations forming long-period series. Reanalysis is a data assimilation system built on general atmospheric circulation models, combining data collected at surface stations, ocean buoys, satellites and radiosondes, and allowing the production of long-period data for a wide range of variables. The third generation of reanalysis data emerged in 2010; among them is the Climate Forecast System Reanalysis (CFSR) developed by the National Centers for Environmental Prediction (NCEP), with a spatial resolution of 0.5° x 0.5°. To overcome the shortage of observations, this study evaluates the performance of solar radiation estimation from alternative databases, such as reanalysis and meteorological satellite data, that can satisfactorily compensate for the absence of solar radiation observations at the global and/or regional level. The analysis of the solar radiation data indicated that the CFSR reanalysis data performed well relative to the observed data, with a coefficient of determination around 0.90. It is therefore concluded that these data have the potential to be used as an alternative source in locations lacking stations or long series of solar radiation observations, which is important for the evaluation of solar energy potential.
Keywords: climate, reanalysis, renewable energy, solar radiation
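The coefficient of determination reported above can be sketched as follows; the sample values are made up for illustration and are not the study's CFSR or station data.

```python
import numpy as np

def r_squared(observed, estimated):
    """Coefficient of determination between station observations and
    reanalysis estimates."""
    observed = np.asarray(observed, dtype=float)
    estimated = np.asarray(estimated, dtype=float)
    ss_res = np.sum((observed - estimated) ** 2)          # residual sum
    ss_tot = np.sum((observed - observed.mean()) ** 2)    # total sum
    return 1.0 - ss_res / ss_tot

# illustrative daily solar radiation values (e.g. MJ/m2/day), made up
obs = [18.2, 20.1, 22.5, 24.0, 21.3, 19.8]
est = [17.9, 20.6, 22.1, 23.5, 21.9, 19.2]
r2 = r_squared(obs, est)
print(round(r2, 3))   # → 0.931
```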
Procedia PDF Downloads 210
27308 A Comparison Between Different Discretization Techniques for the Doyle-Fuller-Newman Li+ Battery Model
Authors: Davide Gotti, Milan Prodanovic, Sergio Pinilla, David Muñoz-Torrero
Abstract:
Since its proposal, the Doyle-Fuller-Newman (DFN) lithium-ion battery model has gained popularity in the electrochemical field. This model provides the user with theoretical support for designing lithium-ion battery parameters, such as the material particle size or the adjustment direction of the diffusion coefficient. However, the model is mathematically complex, as it is composed of several partial differential equations (PDEs), such as Fick’s law of diffusion and the MacInnes and Ohm’s equations, among other phenomena. Thus, to use the model efficiently in a time-domain simulation environment, the selection of the discretization technique is of pivotal importance. Several numerical methods available in the literature can be used to carry out this task. In this study, a comparison between the explicit Euler, Crank-Nicolson, and Chebyshev discretization methods is proposed. These three methods are compared in terms of accuracy, stability, and computational time. Firstly, the explicit Euler discretization technique is analyzed. This method is straightforward to implement and computationally fast. In this work, the accuracy of the method and its stability properties are shown for the electrolyte diffusion partial differential equation. Subsequently, the Crank-Nicolson method is considered. It represents a combination of the implicit and explicit Euler methods that has the advantage of being second-order in time and intrinsically stable, thus overcoming the disadvantages of the simpler explicit Euler method. As shown in the full paper, the Crank-Nicolson method provides accurate results when applied to the DFN model. Its stability does not depend on the integration time step, so it is feasible for both short- and long-term tests.
This last remark is particularly important, as this discretization technique would allow the user to implement parameter estimation and optimization techniques, such as system identification or genetic parameter identification methods, using this model. Finally, the Chebyshev discretization technique is implemented in the DFN model. This discretization method features swift convergence properties and, like other spectral methods used to solve differential equations, achieves the same accuracy with a smaller number of discretization nodes. However, as shown in the literature, these methods are not suitable for handling sharp gradients, which are common during the first instants of the charge and discharge phases of the battery. The numerical results obtained and presented in this study aim to provide guidelines on how to select the adequate discretization technique for the DFN model according to the type of application to be performed, highlighting the pros and cons of the three methods. Specifically, the non-eligibility of the simple Euler method for long-term tests is presented. Afterwards, the Crank-Nicolson and Chebyshev discretization methods are compared in terms of accuracy and computational time under a wide range of battery operating scenarios. These include long-term simulations for aging tests as well as short- and mid-term battery charge/discharge cycles, typically relevant in battery applications such as grid primary frequency and inertia control and electric vehicle braking and acceleration.
Keywords: Doyle-Fuller-Newman battery model, partial differential equations, discretization, numerical methods
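The Crank-Nicolson scheme discussed above can be sketched on a plain 1-D diffusion equation (a stand-in for the electrolyte diffusion PDE; the real DFN model adds source terms and coupled equations, and all numbers here are illustrative). Because the scheme averages the explicit and implicit Euler stencils, each step solves a linear system, and stability does not depend on the time step.

```python
import numpy as np

def crank_nicolson_diffusion(c0, D, dx, dt, steps):
    """Crank-Nicolson integration of  dc/dt = D d2c/dx2  with zero-flux
    (Neumann) boundaries.  Builds (I - r L) c_new = (I + r L) c_old with
    L the 1-D Laplacian stencil and r = D dt / (2 dx^2)."""
    n = len(c0)
    r = D * dt / (2 * dx ** 2)
    L = np.zeros((n, n))
    for i in range(n):
        if i > 0:
            L[i, i - 1] = 1.0
        if i < n - 1:
            L[i, i + 1] = 1.0
        L[i, i] = -(int(i > 0) + int(i < n - 1))   # Neumann ends
    A = np.eye(n) - r * L
    B = np.eye(n) + r * L
    c = np.array(c0, dtype=float)
    for _ in range(steps):
        c = np.linalg.solve(A, B @ c)
    return c

c0 = np.zeros(50)
c0[25] = 1.0                      # initial concentration spike
c = crank_nicolson_diffusion(c0, D=1e-2, dx=0.02, dt=0.1, steps=200)
print(c.sum())                    # zero-flux walls conserve mass: ≈ 1.0
```

Note that the chosen `dt` violates the explicit Euler stability bound (r > 0.5 per half-step), yet the Crank-Nicolson solution remains bounded, which is the property exploited for long-term tests.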
Procedia PDF Downloads 25
27307 Using Different Methods of Nanofabrication as a New Way to Activate Cement Replacement Materials in Concrete Industry
Authors: Azadeh Askarinejad, Parham Hayati, Reza Parchami, Parisa Hayati
Abstract:
The cement and concrete related industries are among the most important industries and building operations causing carbon dioxide emissions: cement production (including direct fuel for mining and transporting raw material) consumes approximately 6 million Btu per metric ton and releases about 1 metric ton of CO2. Reducing the consumption of cement by simultaneously utilizing waste materials as cement replacement is preferred for reasons of environmental protection. Blended cements consist of different supplementary cementitious materials (SCMs), such as fly ash, silica fume, ground granulated blast furnace slag (GGBFS), limestone, and natural pozzolans. These materials should be chemically activated to show effective cementitious properties. The present review article reports three different methods of nanofabrication that were used for the activation of two types of SCMs.
Keywords: nanofabrication, cement replacement materials, activation, concrete
Procedia PDF Downloads 614
27306 The Impact of Artificial Intelligence on Construction Projects
Authors: Muller Salah Zaky Toudry
Abstract:
The complexity arises in defining construction quality due to its notion being based on inherent market situations and their requirements, the diverse stakeholders themselves, and their desired outputs. A quantitative survey-based approach was adopted in this study. A questionnaire-based survey was conducted to assess construction quality perception and expectations in the context of quality improvement techniques. The survey feedback from experts of the leading construction firms/companies of the Pakistan construction industry was analyzed. The financial capability, organizational structure, and construction experience of the firms formed the basis for their selection. Quality perception was found to be project-scope-oriented and considered an extra cost for a construction project. Any quality improvement technique was expected to maximize profit for the company by enhancing productivity in a construction project. The study is useful for construction professionals in assessing the prevailing construction quality perception and the expectations from implementation of any quality improvement technique in construction projects.
Keywords: correlation analysis, lean construction tools, lean construction, logistic regression analysis, risk management, safety, construction quality, expectation, improvement, perception, client loyalty, NPS, pre-construction, schedule reduction
Procedia PDF Downloads 22
27305 Developing Manufacturing Process for the Graphene Sensors
Authors: Abdullah Faqihi, John Hedley
Abstract:
Biosensors play a significant role in the healthcare sector and in scientific and technological progress. Developing electrodes that are easy to manufacture and deliver better electrochemical performance is advantageous for diagnostics and biosensing. They can be implemented extensively in various analytical tasks such as drug discovery, food safety, medical diagnostics, process control, security and defence, in addition to environmental monitoring. The development of biosensors aims to create high-performance electrochemical electrodes for diagnostics and biosensing. A biosensor is a device that inspects the biological and chemical reactions generated by a biological sample: it carries out biological detection via a linked transducer and transmits the biological response as an electrical signal. Stability, selectivity, and sensitivity are the dynamic and static characteristics that dictate the quality and performance of biosensors. In this research, an experimental study of the laser scribing technique for processing graphene oxide inside a vacuum chamber is presented. The effect of laser scribing on the reduction of graphene oxide (GO) was investigated under two conditions: atmosphere and vacuum. The GO solvent was coated onto a LightScribe DVD, and the laser scribing technique was applied to reduce the GO layers and generate rGO. The micro-details of the morphological structures of rGO and GO were examined using scanning electron microscopy (SEM) and Raman spectroscopy. The first electrode was a traditional graphene-based electrode model made under normal atmospheric conditions, whereas the second was a graphene electrode fabricated under vacuum using a vacuum chamber. The purpose was to control the vacuum conditions, such as air pressure and temperature, during the fabrication process.
The parameters assessed include the layer thickness and the processing environment. The results presented show high accuracy and repeatability, achieving low-cost production.
Keywords: laser scribing, LightScribe DVD, graphene oxide, scanning electron microscopy
Procedia PDF Downloads 123
27304 Exposing Latent Fingermarks on Problematic Metal Surfaces Using Time of Flight Secondary Ion Mass Spectroscopy
Authors: Tshaiya Devi Thandauthapani, Adam J. Reeve, Adam S. Long, Ian J. Turner, James S. Sharp
Abstract:
Fingermarks are a crucial form of evidence for identifying a person at a crime scene. However, visualising latent (hidden) fingermarks can be difficult, and the correct choice of techniques is essential to develop and preserve any fingermarks that might be present. Knives, firearms and other metal weapons have proven to be challenging substrates (stainless steel in particular) from which to reliably obtain fingermarks. In this study, time of flight secondary ion mass spectroscopy (ToF-SIMS) was used to image fingermarks on metal surfaces. This technique was compared to a conventional superglue-based fuming technique accompanied by a series of contrast-enhancing dyes (basic yellow 40 (BY40), crystal violet (CV) and Sudan black (SB)) on three different metal surfaces. The conventional techniques showed little to no evidence of fingermarks being present on the metal surfaces after a few days. However, ToF-SIMS images revealed fingermarks on the same and similar substrates with an exceptional level of detail, demonstrating clear ridge definition as well as detail about sweat pore position and shape, which persisted for over 26 days after deposition when the samples were stored under ambient conditions.
Keywords: conventional techniques, latent fingermarks, metal substrates, time of flight secondary ion mass spectroscopy
Procedia PDF Downloads 164
27303 Magnetized Cellulose Nanofiber Extracted from Natural Resources for the Application of Hexavalent Chromium Removal Using the Adsorption Method
Authors: Kebede Gamo Sebehanie, Olu Emmanuel Femi, Alberto Velázquez Del Rosario, Abubeker Yimam Ali, Gudeta Jafo Muleta
Abstract:
Water pollution is one of the most serious worldwide issues today. Among water pollutants, heavy metals are of particular concern to the environment and human health due to their non-biodegradability and bioaccumulation. In this study, a magnetite-cellulose nanocomposite derived from renewable resources is employed for hexavalent chromium removal by adsorption. Magnetite nanoparticles (MNPs) were synthesized directly from iron ore using solvent extraction and a co-precipitation technique. Cellulose nanofiber (CNF) was extracted from sugarcane bagasse using an alkaline treatment and acid hydrolysis method. Before and after the adsorption process, the MNPs-CNF composites were characterized using X-ray diffraction (XRD), scanning electron microscopy (SEM), Fourier transform infrared spectroscopy (FTIR), vibrating sample magnetometry (VSM), and thermogravimetric analysis (TGA). The impacts of several parameters, such as pH, contact time, initial pollutant concentration, and adsorbent dose, on adsorption efficiency and capacity were examined. The kinetics and isotherms of Cr(VI) adsorption were also studied. The highest removal was obtained at pH 3, and it took 80 minutes to establish adsorption equilibrium. The Langmuir and Freundlich isotherm models were applied, and the experimental data fit well with the Langmuir model, which gives a maximum adsorption capacity of 8.27 mg/g. The kinetic study of the adsorption process using pseudo-first-order and pseudo-second-order equations revealed that the pseudo-second-order equation was better suited for representing the adsorption kinetic data. Based on the findings, pure MNPs and MNPs-CNF nanocomposites could be used as effective adsorbents for the removal of Cr(VI) from wastewater.
Keywords: magnetite-cellulose nanocomposite, hexavalent chromium, adsorption, sugarcane bagasse
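The Langmuir fitting step described above can be sketched as a nonlinear least-squares fit; the equilibrium data points below are made up for illustration (roughly consistent with a maximum capacity near the reported 8.27 mg/g) and are not the study's measurements.

```python
import numpy as np
from scipy.optimize import curve_fit

def langmuir(Ce, qmax, KL):
    """Langmuir isotherm: qe = qmax * KL * Ce / (1 + KL * Ce),
    where qmax is the monolayer capacity (mg/g) and KL the constant."""
    return qmax * KL * Ce / (1 + KL * Ce)

# illustrative equilibrium data (Ce in mg/L, qe in mg/g), not the paper's
Ce = np.array([2.0, 5.0, 10.0, 20.0, 40.0, 80.0])
qe = np.array([2.1, 4.0, 5.6, 6.9, 7.6, 8.0])

(qmax, KL), _ = curve_fit(langmuir, Ce, qe, p0=(8.0, 0.1))
print(qmax, KL)   # qmax comparable in magnitude to the reported 8.27 mg/g
```

The pseudo-second-order kinetic fit mentioned above follows the same pattern with t/qt regressed linearly on t.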
Procedia PDF Downloads 131
27302 Analysis and Prediction of Netflix Viewing History Using Netflixlatte as an Enriched Real Data Pool
Authors: Amir Mabhout, Toktam Ghafarian, Amirhossein Farzin, Zahra Makki, Sajjad Alizadeh, Amirhossein Ghavi
Abstract:
The high number of Netflix subscribers makes it attractive for data scientists to extract valuable knowledge from viewers' behavioural analyses. This paper presents a set of statistical insights into viewers' viewing history. A deep learning model is then used to predict the future watching behaviour of users based on their previous watching history within the Netflixlatte data pool. Netflixlatte is an aggregated and anonymized data pool of 320 Netflix viewers with a length of 250,000 data points recorded between 2008 and 2022. We observe insightful correlations between the distribution of viewing time and the COVID-19 pandemic outbreak. The presented deep learning model predicts future movie and TV series viewing habits with an average loss of 0.175.
Keywords: data analysis, deep learning, LSTM neural network, Netflix
Procedia PDF Downloads 257
27301 Analysis of User Data Usage Trends on Cellular and Wi-Fi Networks
Authors: Jayesh M. Patel, Bharat P. Modi
Abstract:
Measurements on mobile devices have demonstrated that the total data demand from users is far higher than previously articulated by measurements based solely on a cellular-centric view of smartphone usage. The ratio of Wi-Fi to cellular traffic varies significantly between countries. This paper presents a comparison between cellular data usage and Wi-Fi data usage by users. This strategy helps operators understand the growing importance and application of yield management strategies designed to squeeze maximum returns from their investments in the networks and devices that enable the mobile data ecosystem. The transition from unlimited data plans towards tiered pricing and, in the future, towards more value-centric pricing offers significant revenue upside potential for mobile operators; but without complete insight into all aspects of smartphone customer behavior, operators will be unlikely to capture the maximum return from this billion-dollar market opportunity.
Keywords: cellular, Wi-Fi, mobile, smart phone
Procedia PDF Downloads 367
27300 The Impact of Brand Loyalty on Product Performance
Authors: Tanzeel bin Abdul Rauf Patker, Saba Mateen
Abstract:
This research investigates the impact of brand loyalty on product performance and the factors that are considered most important in brand reputation. The variables selected for this research are brand quality, brand equity, and brand reputation, to explore their impact on product performance. For this purpose, primary research was conducted. The questionnaire survey for this study was administered among the population, mainly at shopping malls. A sample size of 250 respondents was taken into consideration; customers from the shopping malls and university students constitute the sample, selected through non-probabilistic random sampling as the sampling technique for the research survey. According to the results obtained from the collected data, product performance shares a direct relationship with brand quality, brand equity, and brand reputation. The results also showed that brand quality and brand equity have a significant effect on product performance, whereas brand reputation has an insignificant effect on product performance.
Keywords: product performance, brand quality, brand equity, brand reputation
Procedia PDF Downloads 316
27299 Total-Reflection X-Ray Spectroscopy as a Tool for Element Screening in Food Samples
Authors: Hagen Stosnach
Abstract:
The analytical demands on modern instruments for element analysis in food samples include the analysis of major, trace and ultra-trace essential elements as well as potentially toxic trace elements. In this study, total-reflection X-ray fluorescence analysis (TXRF) is presented as an analytical technique that meets the requirements defined by the Association of Official Agricultural Chemists (AOAC) regarding the limit of quantification, repeatability, reproducibility and recovery for most of the target elements. The advantages of TXRF are the small sample mass required, the broad linear range from µg/kg up to wt.-% values, no consumption of gases or cooling water, and flexible and easy sample preparation. Liquid samples like alcoholic or non-alcoholic beverages can be analyzed without any preparation. For solid food samples, the most common sample pre-treatment methods are mineralization and direct deposition of the sample onto the reflector without or with minimal treatment, mainly as solid suspensions or after extraction. The main disadvantages are possible peak overlaps, which may lower the accuracy of quantitative analysis and limit element identification. This analytical technique is presented through several application examples covering a broad range of liquid and solid food types.
Keywords: essential elements, toxic metals, XRF, spectroscopy
Procedia PDF Downloads 134
27298 Exploitation behind the Development of Home Batik Industry in Lawean, Solo, Central Java
Authors: Mukhammad Fatkhullah, Ayla Karina Budita, Cut Rizka Al Usrah, Kanita Khoirun Nisa, Muhammad Alhada Fuadilah Habib, Siti Muslihatul Mukaromah
Abstract:
The batik industry has become one of the leading industries in the economy of Indonesia. Since the recognition of batik as part of the cultural wealth and national identity of Indonesia by UNESCO, batik production keeps increasing as a result of growing demand for batik, both domestic and from abroad. One of the most rapidly developing batik industries in Indonesia is the batik industry in Lawean Village, Solo, Central Java, Indonesia. This batik industry generally uses a putting-out system, in which batik workers work in their own houses. With the implementation of this system, employers do not have to prepare an Environmental Impact Analysis (EIA), social security for workers, overtime payment, working space, or working equipment. The implementation of the putting-out system causes many problems, ranging from environmental pollution to the loss of workers' social rights and even the exploitation of workers by batik entrepreneurs. The data used to describe this reality are primary data from qualitative research with in-depth interviews as the data collection technique. Informants were determined purposively. The theory used for data interpretation is the phenomenology of Alfred Schutz. Both qualitative and phenomenological approaches are used in this study to describe the exploitation of batik workers under the putting-out system in the home batik industry in Lawean. The research results showed that workers in the batik industry sector in Lawean were exploited through the implementation of the putting-out system. The workers were strictly employed by the entrepreneurs, so their job can no longer be called a 'part-time' job. In terms of labor and time, the workers often work more than 12 hours per day and often work overtime without receiving any overtime payment.
In terms of work safety, the workers often have contact with the chemical substances contained in batik-making materials without using any protection, such as work clothes, worsened by the lack of work standards or procedures, which can cause physical damage such as burned and peeling skin. Moreover, exposure to and contamination by chemical materials make the workers and their families vulnerable to various diseases. Meanwhile, the batik entrepreneurs did not provide any social security (including health cost aid). Besides that, the researchers found that the batik industry in the home industry sector is not environmentally friendly and even damages the ecosystem, because industrial waste is disposed of without an EIA.
Keywords: exploitation, home batik industry, occupational health and safety, putting-out system
Procedia PDF Downloads 320
27297 Causal Estimation for the Left-Truncation Adjusted Time-Varying Covariates under the Semiparametric Transformation Models of a Survival Time
Authors: Yemane Hailu Fissuh, Zhongzhan Zhang
Abstract:
In biomedical research and randomized clinical trials, the most commonly studied outcomes are time-to-event, so-called survival data. The importance of robust models in this context lies in comparing the effects of randomly controlled experimental groups in a way that has a sense of causality. Causal estimation is the scientific concept of comparing the pragmatic effect of treatments conditional on the given covariates, rather than assessing the simple association of response and predictors. Hence, a causal-effect-based semiparametric transformation model was proposed to estimate the effect of treatment in the presence of possibly time-varying covariates. Due to its high flexibility and robustness, the semiparametric transformation model applied in this paper has received much attention for the estimation of causal effects in modeling left-truncated and right-censored survival data. Despite its wide applications and popularity in estimating unknown parameters, the maximum likelihood estimation technique is quite complex and burdensome for estimating the unknown parameters and the unspecified transformation function in the presence of possibly time-varying covariates. Thus, to ease the complexity, we proposed modified estimating equations. After intuitive estimation procedures, the consistency and asymptotic properties of the estimators were derived, and the finite-sample behavior of the estimators under the proposed model was illustrated via simulation studies and the Stanford heart transplant real data example. To sum up the study, the bias of the covariates was adjusted by estimating the density function of the truncation variable, which was also incorporated into the model as a covariate in order to relax the independence assumption between failure time and truncation time. Moreover, the expectation-maximization (EM) algorithm is described for the estimation of the iterative unknown parameters and the unspecified transformation function.
In addition, the causal effect was derived as the ratio of the cumulative hazard functions of the active and passive experiments, after adjusting for the bias introduced into the model by the truncation variable.
Keywords: causal estimation, EM algorithm, semiparametric transformation models, time-to-event outcomes, time-varying covariate
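For concreteness, the model class and the causal contrast referred to above can be sketched as follows. This is a common formulation of the semiparametric transformation model with time-varying covariates; the notation is ours and the authors' exact specification may differ.

```latex
% Cumulative hazard of the survival time T given (possibly time-varying)
% covariates Z(.), where G is a known increasing transformation
% (G(x) = x recovers the Cox proportional hazards model;
%  G(x) = log(1 + x) gives the proportional odds model) and
% \Lambda_0 is an unspecified baseline cumulative hazard:
\[
  \Lambda(t \mid Z) \;=\; G\!\left( \int_0^t e^{\beta^{\top} Z(s)}\,
  \mathrm{d}\Lambda_0(s) \right).
\]
% Causal contrast used in the abstract: the ratio of the cumulative
% hazards of the active (A = 1) and passive (A = 0) experiments at t:
\[
  \theta(t) \;=\;
  \frac{\Lambda(t \mid Z,\, A = 1)}{\Lambda(t \mid Z,\, A = 0)}.
\]
```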
Procedia PDF Downloads 128
27296 Fostering Fresh Graduate Students’ Confidence in Speaking English: An Action Research to Students of Muria Kudus University, Central Java, Indonesia
Authors: Farid Noor Romadlon
Abstract:
In welcoming the ASEAN Economic Community and globalization, people need good communication skills, and being able to speak English is an important qualification for this skill and for global citizenship. This study focused on fostering fresh graduate students' confidence in speaking English so that they perform well when speaking. Thirty (30) students from the first semester of the English Education Department who joined the Intensive Course class were the subjects. They had poor motivation to speak English, since English is a foreign language to which they are not exposed in their environment. This study used the Three Communicative Activities technique in twelve successive meetings. It was done in two cycles (six meetings each), since some activities from the first cycle needed to be improved. An oral test was administered to obtain the quantitative results, and observation was conducted to strengthen the findings. The results indicated that the Three Communicative Activities improved students' confidence in speaking English, and they made significant progress in their performance in class. The technique, which allowed students more space to explore and express their ideas to their friends, increased their confidence in their performance, and the group or cooperative activities stimulated students to think critically in discussion and promoted their confidence to talk more.
Keywords: students' confidence, three communicative activities, speaking, Muria Kudus University
Procedia PDF Downloads 214
27295 Data Driven Infrastructure Planning for Offshore Wind Farms
Authors: Isha Saxena, Behzad Kazemtabrizi, Matthias C. M. Troffaes, Christopher Crabtree
Abstract:
The reliability calculations done at the beginning of a wind farm’s life are rarely accurate, which makes it important to study the failure and repair rates of wind turbines under various conditions. The miscalculation arises because current models make the simplifying assumption that the failure/repair rate remains constant over time, which implies an exponential reliability function. This research aims to create a more accurate model using sensor data and a data-driven approach. Data cleaning and processing are done by comparing the power curve data of the wind turbines with SCADA data, which are then converted to times-to-repair and times-to-failure time series. Several mathematical functions are fitted to the times-to-failure and times-to-repair data of the wind turbine components using maximum likelihood estimation and the posterior expectation method for Bayesian parameter estimation. Initial results indicate that the two-parameter Weibull function and the exponential function produce almost identical fits. Further analysis is being done using complex system analysis, considering the failures of each electrical and mechanical component of the wind turbine. The aim of this project is a more accurate reliability analysis that helps engineers schedule maintenance and repairs to decrease turbine downtime. Keywords: reliability, bayesian parameter inference, maximum likelihood estimation, weibull function, SCADA data
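The maximum likelihood fits described above can be sketched as follows. This is an illustrative outline using `scipy.stats` on synthetic failure times, not the project's SCADA-derived data, and the Bayesian posterior expectation step is omitted.

```python
import numpy as np
from scipy import stats

def fit_failure_models(times_to_failure):
    """Fit two-parameter Weibull and exponential models to
    times-to-failure data via maximum likelihood."""
    # Two-parameter Weibull: fix location at 0, estimate shape and scale.
    shape, _, scale = stats.weibull_min.fit(times_to_failure, floc=0)
    # Exponential: fix location at 0, estimate the scale (mean time to failure).
    _, exp_scale = stats.expon.fit(times_to_failure, floc=0)
    return {"weibull": {"shape": shape, "scale": scale},
            "exponential": {"scale": exp_scale}}

rng = np.random.default_rng(42)
# Synthetic times to failure (hours), drawn with Weibull shape 1.5.
ttf = rng.weibull(1.5, size=500) * 1000.0
params = fit_failure_models(ttf)
```

A fitted shape near 1 would indicate that the constant-hazard (exponential) assumption is adequate; a shape well away from 1, as with these synthetic data, would justify the Weibull model.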
Procedia PDF Downloads 87
27294 Comparing Xbar Charts: Conventional versus Reweighted Robust Estimation Methods for Univariate Data Sets
Authors: Ece Cigdem Mutlu, Burak Alakent
Abstract:
Maintaining the quality of manufactured products at a desired level depends on the stability of the process dispersion and location parameters and on detecting perturbations in these parameters as promptly as possible. The Shewhart control chart is the most widely used technique in statistical process monitoring to monitor product quality and control the process mean and variability. In the application of Xbar control charts, the sample standard deviation and sample mean are known to be the most efficient conventional estimators of the process dispersion and location parameters, respectively, under the assumption of independent and normally distributed data. On the other hand, there is no guarantee that real-world data are normally distributed. When process parameters are estimated from Phase I data clouded with outliers, the efficiency of traditional estimators is significantly reduced and the performance of Xbar charts is undesirably low; e.g., occasional outliers in the rational subgroups of the Phase I data set may considerably affect the sample mean and standard deviation, resulting in a serious delay in detecting inferior products in Phase II. For a more efficient application of control charts, estimators robust against the contaminations that may exist in Phase I are required. In the current study, we present a simple approach to constructing robust Xbar control charts using the average distance to the median, the Qn estimator of scale, and the M-estimator of scale with logistic psi-function for the process dispersion parameter, and the Harrell-Davis qth quantile estimator, the Hodges-Lehmann estimator, and the M-estimator of location with Huber and logistic psi-functions for the process location parameter.
The Phase I efficiency of the proposed estimators and the Phase II performance of the Xbar charts constructed from them are compared with the conventional mean and standard deviation statistics, both under normality and under diffuse-localized and symmetric-asymmetric contaminations, using 50,000 Monte Carlo simulations in MATLAB. It is found that the robust estimators yield parameter estimates with higher efficiency against all types of contaminations, and that Xbar charts constructed from robust estimators have higher power in detecting disturbances than conventional methods. Additionally, utilizing individuals charts to screen outlier subgroups and employing different combinations of dispersion and location estimators on subgroups and individual observations are found to improve the performance of the Xbar charts. Keywords: average run length, M-estimators, quality control, robust estimators
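As a small illustration of why such robust location estimators resist contamination, the following sketch computes the Hodges-Lehmann estimate (the median of all pairwise Walsh averages) for a hypothetical rational subgroup containing one outlier; it is not the authors' full Monte Carlo study.

```python
import numpy as np
from itertools import combinations

def hodges_lehmann(x):
    """Hodges-Lehmann location estimate: the median of all Walsh
    averages (x_i + x_j) / 2 over pairs i <= j."""
    x = np.asarray(x, dtype=float)
    walsh = [(a + b) / 2.0 for a, b in combinations(x, 2)]
    walsh.extend(x)  # include the i == j pairs
    return float(np.median(walsh))

# A hypothetical rational subgroup with one outlier, as might occur
# in contaminated Phase I data.
subgroup = np.array([10.1, 9.9, 10.0, 10.2, 25.0])
hl = hodges_lehmann(subgroup)
mean = float(subgroup.mean())
```

The outlier pulls the sample mean to 13.04, while the Hodges-Lehmann estimate stays at 10.1 — a robust center yields Phase I control limits far less distorted by the contamination.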
Procedia PDF Downloads 192
27293 Development of Elementary Literacy in the Czech Republic
Authors: Iva Košek Bartošová
Abstract:
Great attention is being paid in the Czech Republic to the development of first reading, that is, early literacy skills. Yet the inconclusive results of PISA and PIRLS force us to reconsider the teacher's work, his or her roles in the education process, and the methods and forms used in lessons. It is also very important to monitor the family environment and the pupils themselves. This contribution focuses on methods of practicing reading technique and their results in the process of comprehension. The first part presents the goals of reading literacy development and the methods used in reading practice in some EU countries, followed by a comparison of research carried out in 2015 with the help of a modern eye tracker device and research conducted at the Institute of Education and Psychological Counselling of the Czech Republic in 2011/12. The latter comprises the results of a diagnostic reading test in the first classes of primary schools taught by the genetic method and by the analytic-synthetic method. The results show that in the first stage of practice there are no statistically significant differences between subjects taught by the different methods of reading practice (using several diagnostic texts focused on reading technique and its comprehension). Different results appear at the end of Grade One and during Grade Two of primary school. Keywords: elementary literacy, eye tracker device, diagnostic reading tests, reading teaching method
Procedia PDF Downloads 188
27292 Empirical Acceleration Functions and Fuzzy Information
Authors: Muhammad Shafiq
Abstract:
In accelerated life testing approaches, lifetime data are obtained under conditions considered more severe than usual operating conditions. Classical techniques are based on precise measurements and model only the variation among observations. In fact, there are two types of uncertainty in data: variation among the observations and fuzziness. Analysis techniques that ignore fuzziness and rely only on precise lifetime observations lead to pseudo results. This study examines the behavior of empirical acceleration functions using fuzzy lifetime data. The results showed increased fuzziness in the transformed lifetimes compared to the input data. Keywords: acceleration function, accelerated life testing, fuzzy number, non-precise data
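The effect reported above can be illustrated with a minimal sketch in which a non-precise lifetime is represented as a triangular fuzzy number and mapped to use conditions by a hypothetical linear acceleration factor; the factor value and the spread measure are illustrative assumptions, not the paper's empirical acceleration function.

```python
def scale_fuzzy_lifetime(tfn, af):
    """Scale a triangular fuzzy lifetime (l, m, u) by an acceleration
    factor af > 0, i.e. a linear acceleration function."""
    l, m, u = tfn
    return (l * af, m * af, u * af)

def spread(tfn):
    """Width of the support, a simple measure of fuzziness."""
    l, _, u = tfn
    return u - l

observed = (95.0, 100.0, 108.0)   # non-precise lifetime at stress level (hours)
af = 4.0                          # hypothetical acceleration factor
use_level = scale_fuzzy_lifetime(observed, af)
```

With these assumed values the support widens from 13 hours to 52 hours: transforming the fuzzy observations to use conditions magnifies their fuzziness, in line with the behavior the study reports.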
Procedia PDF Downloads 301
27291 Optimization of Process Parameters in Wire Electrical Discharge Machining of Inconel X-750 for Dimensional Deviation Using Taguchi Technique
Authors: Mandeep Kumar, Hari Singh
Abstract:
The effective optimization of machining process parameters dramatically affects the cost and production time of machined components as well as the quality of the final products. This paper presents the optimization of a wire electrical discharge machining operation with Inconel X-750 as the work material. The objective is minimization of the dimensional deviation. Six input process parameters of WEDM, namely spark gap voltage, pulse-on time, pulse-off time, wire feed rate, peak current, and wire tension, were chosen as variables to study the process performance. Taguchi's design of experiments methodology was used for planning and designing the experiments. The analysis of variance was carried out for the raw data as well as for the signal-to-noise ratio. Four input parameters and one two-factor interaction were found to be statistically significant for their effects on the response of interest. Confirmation experiments were also performed to validate the predicted results. Keywords: ANOVA, DOE, inconel, machining, optimization
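For a smaller-the-better response such as dimensional deviation, the Taguchi signal-to-noise ratio is S/N = -10 log10((1/n) Σ yᵢ²). A minimal sketch, using hypothetical deviation values rather than the study's WEDM measurements:

```python
import math

def sn_smaller_the_better(observations):
    """Taguchi signal-to-noise ratio for a smaller-the-better response
    such as dimensional deviation: S/N = -10 * log10(mean(y_i^2))."""
    n = len(observations)
    mean_sq = sum(y * y for y in observations) / n
    return -10.0 * math.log10(mean_sq)

# Hypothetical dimensional deviations (mm) from repeated runs of two
# parameter combinations; the setting with the higher S/N is better.
run_a = [0.02, 0.03, 0.025]
run_b = [0.05, 0.06, 0.055]
sn_a = sn_smaller_the_better(run_a)
sn_b = sn_smaller_the_better(run_b)
```

Here run A, with the smaller deviations, scores the higher S/N ratio, which is how the Taguchi analysis ranks parameter levels before the ANOVA step.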
Procedia PDF Downloads 206
27290 Evaluating Alternative Structures for Prefix Trees
Authors: Feras Hanandeh, Izzat Alsmadi, Muhammad M. Kwafha
Abstract:
Prefix trees, or tries, are data structures used to store data or indexes of data. The goal is to store and retrieve data by executing queries in a quick and reliable manner. In principle, the structure of the trie depends on having letters in nodes at the different levels that point to the actual words in the leaves. However, the exact structure of the trie may vary in several respects. In this paper, we evaluated different structures for building tries, using datasets of words of different sizes. Results showed that some design choices can significantly affect, positively or negatively, the size and the performance of the trie. Among the forms and structures investigated, using an array of pointers at each level to represent the alphabet letters proved to be the best choice. Keywords: data structures, indexing, tree structure, trie, information retrieval
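The array-of-pointers structure that performed best can be sketched as follows; this minimal version assumes a lowercase a-z alphabet and is illustrative, not the code evaluated in the study.

```python
class TrieNode:
    """Node holding a fixed array of 26 child pointers, one slot per
    lowercase letter, rather than a hash map of children."""
    __slots__ = ("children", "is_word")

    def __init__(self):
        self.children = [None] * 26   # array of pointers, 'a'..'z'
        self.is_word = False

class Trie:
    def __init__(self):
        self.root = TrieNode()

    def insert(self, word):
        node = self.root
        for ch in word:
            i = ord(ch) - ord("a")    # letter index into the pointer array
            if node.children[i] is None:
                node.children[i] = TrieNode()
            node = node.children[i]
        node.is_word = True

    def search(self, word):
        node = self.root
        for ch in word:
            node = node.children[ord(ch) - ord("a")]
            if node is None:
                return False
        return node.is_word

t = Trie()
for w in ("data", "date", "dart"):
    t.insert(w)
```

The array costs a fixed 26 slots per node but makes each child lookup a single index operation, which is consistent with the paper's finding that this layout gave the best performance.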
Procedia PDF Downloads 452
27289 Data Management System for Environmental Remediation
Authors: Elizaveta Petelina, Anton Sizo
Abstract:
Environmental remediation projects deal with a wide spectrum of data, including data collected during site assessment, execution of remediation activities, and environmental monitoring. Appropriate data management is therefore required as a key factor for well-grounded decision making. The Environmental Data Management System (EDMS) was developed to address all necessary data management aspects, including efficient data handling and interoperability, access to historical and current data, spatial and temporal analysis, 2D and 3D data visualization, mapping, and data sharing. The system focuses on supporting well-grounded decisions about required mitigation measures and the assessment of remediation success. The EDMS is a combination of enterprise- and desktop-level data management and Geographic Information System (GIS) tools assembled to assist environmental remediation, project planning and evaluation, and environmental monitoring of mine sites. EDMS consists of seven main components: a Geodatabase containing a spatial database to store and query spatially distributed data; a GIS and Web GIS component that combines desktop and server-based GIS solutions; a Field Data Collection component with tools for field work; a Quality Assurance (QA)/Quality Control (QC) component that combines operational procedures for QA and measures for QC; a Data Import and Export component with tools and templates to support project data flow; a Lab Data component that connects EDMS with laboratory information management systems; and a Reporting component with server-based services for real-time report generation. The EDMS has been successfully implemented for Project CLEANS (Clean-up of Abandoned Northern Mines), a multi-year, multimillion-dollar project aimed at assessing and reclaiming 37 uranium mine sites in northern Saskatchewan, Canada.
The EDMS has effectively facilitated integrated decision-making for CLEANS project managers and transparency amongst stakeholders. Keywords: data management, environmental remediation, geographic information system, GIS, decision making
Procedia PDF Downloads 163
27288 An Efficient Approach for Speed up Non-Negative Matrix Factorization for High Dimensional Data
Authors: Bharat Singh, Om Prakash Vyas
Abstract:
Nowadays, applications dealing with high-dimensional data are tremendously common in popular areas, and various approaches have been developed by researchers over the last few decades to handle such data. One of the problems with NMF approaches is that random initialization does not reach the global optimum in a limited number of iterations, but only a local optimum. We have therefore proposed a new approach that chooses the initial values of the decomposition to address this computational expense. We have devised an algorithm for initializing the values of the decomposed matrices based on Particle Swarm Optimization (PSO). Through experimental results, we show that the proposed method converges much faster than other low-rank approximation methods such as simple multiplicative NMF and ACLS techniques. Keywords: ALS, NMF, high dimensional data, RMSE
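The role of initialization can be seen in a sketch of the standard multiplicative-update NMF, written so the starting factors are supplied by the caller; a PSO-derived initialization could be plugged in where the random matrices stand in below (the PSO step itself is omitted, and the data are synthetic).

```python
import numpy as np

def nmf_multiplicative(V, W0, H0, n_iter=200, eps=1e-9):
    """Standard NMF multiplicative updates (Lee-Seung) minimising
    ||V - WH||_F; the starting factors W0, H0 are supplied by the
    caller, so any initialisation scheme can be plugged in."""
    W, H = W0.copy(), H0.copy()
    for _ in range(n_iter):
        H *= (W.T @ V) / (W.T @ W @ H + eps)   # update H, keeping it non-negative
        W *= (V @ H.T) / (W @ H @ H.T + eps)   # update W, keeping it non-negative
    return W, H

rng = np.random.default_rng(0)
V = rng.random((20, 15))          # non-negative data matrix
r = 4                             # target rank
W0 = rng.random((20, r))          # stand-in for a PSO-derived start
H0 = rng.random((r, 15))
W, H = nmf_multiplicative(V, W0, H0)
rmse = float(np.sqrt(np.mean((V - W @ H) ** 2)))
```

A better starting point lowers the RMSE reachable within a fixed iteration budget, which is the motivation for replacing the random `W0`, `H0` with swarm-optimized values.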
Procedia PDF Downloads 343
27287 Providing Health Promotion Information by Digital Animation to International Visitors in Japan: A Factorial Design View of Nurses
Authors: Mariko Nishikawa, Masaaki Yamanaka, Ayami Kondo
Abstract:
Background: International visitors to Japan are at risk of travel-related illnesses or injuries that could result in hospitalization in a country whose language and customs are unique. Over twelve million international visitors came to Japan in 2015, and more are expected in the lead-up to the Tokyo Olympics. One consequence is a potentially greater demand on healthcare services by foreign visitors, and the nurses who take care of them have anxieties and concerns about their own knowledge of the Japanese health system. Objectives: Effective distribution of travel-health information is vital for facilitating care for international visitors. Our research investigates whether a four-minute digital animation (Mari Info Japan), designed and developed by the authors and applied in a survey of 513 nurses who take care of foreigners daily, could clarify travel-health procedures and reduce anxieties while making the material enjoyable to learn. Methodology: Respondents to a survey were divided into two groups. The intervention group watched Mari Info Japan; the control group read a standard guidebook. The participants were asked to fill in a two-page questionnaire called Mari Meter-X and the STAI-Y in English and to mark a face scale before and after the interventions. The questions dealt with knowledge of health promotion, the Japanese healthcare system, cultural concerns, anxieties, and attitudes in Japan. Data were collected from an intervention group (n=83) and a control group (n=83) of nurses at a hospital for foreigners in Japan from February to March 2016. We analyzed the data using Text Mining Studio for open-ended questions and JMP for statistical significance. Results: We found that the intervention group displayed more confidence and less anxiety about taking care of foreign patients compared to the control group, and indicated greater comfort after watching the animation.
However, both groups were most concerned about language, the cost of medical expenses, informed consent, and the choice of hospital. Conclusions: From the viewpoint of nurses, providing travel-health information by digital animation to international visitors to Japan was more effective than traditional methods, as it helped them be better prepared to treat travel-related diseases and injuries among international visitors. This study was registered as number UMIN000020867. Funding: Grant-in-Aid for Challenging Exploratory Research 2010-2012 & 2014-16, Japanese Government. Keywords: digital animation, health promotion, international visitor, Japan, nurse
Procedia PDF Downloads 308
27286 Mechanical Properties and Thermal Comfort of 3D Printed Hand Orthosis for Neurorehabilitation
Authors: Paulo H. R. G. Reis, Joana P. Maia, Davi Neiva Alves, Mariana R. C. Aquino, Igor B. Guimaraes, Anderson Horta, Thiago Santiago, Mariana Volpini
Abstract:
Additive manufacturing is a technique used in many fields as a tool for the accurate production of complex parts. It has a wide range of applications in bioengineering, mainly in the manufacture of orthopedic devices, thanks to the versatility of shapes and surface details it allows. This article evaluates the mechanical viability of a wrist-hand orthosis made by additive manufacturing with Nylon 12 polyamide and compares this device with a wrist-hand orthosis manufactured by the traditional process with thermoplastic Ezeform. The methodology is based on computational simulations of stress and temperature using finite element analysis, in order to evaluate displacement, mechanical stresses, and thermal comfort in the two devices. The work was carried out as a case study with a 29-year-old male patient. The modeling software used was Meshmixer and Fusion 360, both from the US manufacturer Autodesk. The results demonstrate that the orthosis developed by 3D printing from Nylon 12 presents better thermal comfort and better response to the mechanical stresses exerted on the orthosis. Keywords: additive manufacturing, finite elements, hand orthosis, thermal comfort, neurorehabilitation
Procedia PDF Downloads 192
27285 Mapping and Mitigation Strategy for Flash Flood Hazards: A Case Study of Bishoftu City
Authors: Berhanu Keno Terfa
Abstract:
Flash floods are among the most dangerous natural disasters and pose a significant threat to human life. They occur frequently and can cause extensive damage to homes, infrastructure, and ecosystems while also claiming lives. Although flash floods can happen anywhere in the world, their impact is particularly severe in developing countries due to limited financial resources, inadequate drainage systems, substandard housing, lack of early warning systems, and insufficient preparedness. To address these challenges, a comprehensive study was undertaken to analyze and map flood inundation using Geographic Information System (GIS) techniques, considering various factors that contribute to flash flood resilience and developing effective mitigation strategies. Key factors considered in the analysis include slope, drainage density, elevation, curve number, rainfall patterns, land-use/cover classes, and soil data. These variables were computed on ArcGIS software platforms, and Sentinel-2 satellite imagery (10 m resolution) was used for land-use/cover classification. Slope, elevation, and drainage density data were generated from the 12.5 m resolution ALOS PALSAR DEM, while other relevant data were obtained from the Ethiopian Meteorological Institute. By integrating and normalizing the collected data in GIS and employing the analytic hierarchy process (AHP) technique, the study delineated flash flood hazard (FFH) zones and generated a suitability map for urban agriculture. The FFH model identified four levels of risk in Bishoftu City: very high (2106.4 ha), high (10464.4 ha), moderate (1444.44 ha), and low (0.52 ha), accounting for 15.02%, 74.7%, 10.1%, and 0.004% of the total area, respectively. The results underscore the vulnerability of many residential areas in Bishoftu City, particularly the previously developed central areas.
Accurate spatial representation of flood-prone areas and potential agricultural zones is crucial for designing effective flood mitigation and agricultural production plans. The findings of this study emphasize the importance of flood risk mapping in raising public awareness, demonstrating vulnerability, strengthening financial resilience, protecting the environment, and informing policy decisions. Given the susceptibility of Bishoftu City to flash floods, it is recommended that the municipality prioritize urban agriculture adaptation, proper settlement planning, and drainage network design. Keywords: remote sensing, flash flood hazards, Bishoftu, GIS
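The AHP weighting step used above can be sketched as follows; the 3×3 pairwise comparison matrix here is hypothetical (e.g. slope vs. drainage density vs. land use), since the abstract does not give the study's actual judgments.

```python
import numpy as np

def ahp_weights(pairwise):
    """Derive AHP criterion weights as the principal eigenvector of a
    pairwise comparison matrix, plus the consistency ratio (CR)."""
    A = np.asarray(pairwise, dtype=float)
    n = A.shape[0]
    vals, vecs = np.linalg.eig(A)
    k = np.argmax(vals.real)          # principal eigenvalue
    w = np.abs(vecs[:, k].real)
    w /= w.sum()                      # normalise weights to sum to 1
    lam_max = vals[k].real
    ci = (lam_max - n) / (n - 1)      # consistency index
    ri = {3: 0.58, 4: 0.90, 5: 1.12, 6: 1.24, 7: 1.32}[n]  # Saaty's random index
    return w, ci / ri

# Hypothetical judgments: criterion 1 moderately more important than 2,
# strongly more important than 3.
M = [[1,   3,   5],
     [1/3, 1,   2],
     [1/5, 1/2, 1]]
weights, cr = ahp_weights(M)
```

A consistency ratio below 0.1 indicates the pairwise judgments are acceptably consistent; the resulting weights are then applied to the normalized factor layers in the GIS overlay.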
Procedia PDF Downloads 38
27284 Investigating Nanocrystalline CaF2:Tm for Carbon Beam and Gamma Radiation Dosimetry
Authors: Kanika Sharma, Shaila Bahl, Birendra Singh, Pratik Kumar, S. P. Lochab, A. Pandey
Abstract:
In the present investigation, nanoparticles of CaF2 were first prepared by the chemical co-precipitation method, and the prepared salt was then activated with thulium (0.1 mol%) using the combustion technique. The final product was characterized and confirmed by X-ray diffraction (XRD) and transmission electron microscopy (TEM). The thermoluminescence (TL) properties of the nanophosphor were studied by irradiating it with 1.25 MeV gamma radiation and a 65 MeV carbon (C⁶⁺) ion beam. For gamma rays, two prominent TL peaks were observed: a low-temperature peak at around 107 °C and a high-temperature peak at around 157 °C. The nanophosphor maintained a linear TL response over the entire range of studied doses, i.e., 10 Gy to 2000 Gy, for both temperature peaks. Moreover, when the nanophosphor was irradiated with the 65 MeV C⁶⁺ ion beam, the shape and structure of the glow curves remained remarkably similar, and the nanophosphor displayed a linear TL response over the full range of studied fluences, i.e., 5×10¹⁰ ions/cm² to 1×10¹² ions/cm². Finally, tests such as reproducibility and batch homogeneity were also carried out to characterize the final product. Thus, co-precipitation followed by the combustion technique successfully produced dosimetric-grade CaF2:Tm for dosimetry of gamma rays as well as carbon (C⁶⁺) beams. Keywords: gamma radiation, ion beam, nanocrystalline, radiation dosimetry
Procedia PDF Downloads 187
27283 Integrating Time-Series and High-Spatial Remote Sensing Data Based on Multilevel Decision Fusion
Authors: Xudong Guan, Ainong Li, Gaohuan Liu, Chong Huang, Wei Zhao
Abstract:
Due to the low spatial resolution of MODIS data, the accuracy of extracting small patches in landscapes with a high degree of fragmentation is greatly limited. To this end, this study combines Landsat data, with its higher spatial resolution, and MODIS data, with its higher temporal resolution, for decision-level fusion. Considering the importance of land heterogeneity in the fusion process, a heterogeneity-based weighting factor is applied to linearly weight the Landsat classification result and the MODIS classification result. Three levels complete the data fusion process: the MODIS pixel level, the Landsat pixel level, and an object level that connects the two. The multilevel decision fusion scheme was tested at two sites in the lower Mekong basin. A comparison test showed that the classification accuracy improved over the single-data-source classification results in terms of overall accuracy. The method was also compared with two-level combination results and a weighted-sum decision-rule-based approach. The decision fusion scheme is extensible to other multi-resolution decision fusion applications. Keywords: image classification, decision fusion, multi-temporal, remote sensing
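The heterogeneity-weighted linear fusion described above can be sketched as follows; the per-class probabilities and heterogeneity values are illustrative stand-ins for the actual Landsat and MODIS classification outputs.

```python
import numpy as np

def fuse_decisions(p_landsat, p_modis, heterogeneity):
    """Linearly weight per-class probability maps from Landsat and
    MODIS classifications; the weight leans on the higher-resolution
    Landsat result where the landscape is more heterogeneous."""
    w = np.clip(heterogeneity, 0.0, 1.0)[..., None]   # per-pixel weight
    fused = w * p_landsat + (1.0 - w) * p_modis
    return fused.argmax(axis=-1)                       # fused class labels

# Two pixels, three classes: hypothetical probability vectors from
# each classifier, plus a hypothetical per-pixel heterogeneity factor.
p_ls = np.array([[0.7, 0.2, 0.1], [0.1, 0.6, 0.3]])
p_md = np.array([[0.2, 0.7, 0.1], [0.1, 0.2, 0.7]])
het = np.array([0.9, 0.1])
labels = fuse_decisions(p_ls, p_md, het)
```

The first (heterogeneous) pixel follows the Landsat vote and the second (homogeneous) pixel follows the MODIS vote, which is the intended behavior of the heterogeneity weighting.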
Procedia PDF Downloads 125